I've always recommended fast and cheap user testing, with as many iterations as you can squeeze into your schedule. My true preference is weekly testing; make, say, every Wednesday your User Day and schedule 4 customers to test.
Unfortunately, I know of few projects that have sustained this pace, and most of those were my own. Recently, I came across another project that employed weekly testing: the redesign of tivo.com.
I've long been a fan of TiVo's usability. Indeed, 4 years ago, The New York Times quoted me calling the oversized yellow pause button in the middle of the TiVo remote "the most beautiful pause button I've ever seen."
As it happens, Nielsen Norman Group's own research confirms the new TiVo website's usability. Before I knew about the design team's methodology, we had selected the site for one of our independent research studies, in which we compare a range of sites to derive general lessons for Web usability. We included tivo.com as an example of a "fancy" design, and it scored in the top 20% of sites in our study. It's definitely possible to have a good-looking, media-rich site, as long as you keep usability as a goal throughout the design process.
After hearing that TiVo employed my favorite usability approach, I decided to find out more about how they did it. Here's what we learned from talking with three key members of the TiVo team.
TiVo Case Study
Although the usability of TiVo's DVR product interface has been celebrated, it was only in a recent redesign that the company applied the same rigorous user research to its website. As a company known for including user testing in its product development, TiVo realized that this situation was incongruous.
"I'm selling you a product where the key differentiator is ease of use," says Margret Schmidt, the company's vice president of user experience and design, "but if the website isn't easy to use, how will you believe that the product is? We tried to bring that to the site."
12 Weeks, 12 Tests
The project team had a short timeline for the comprehensive site redesign. To make the best use of that time, team members devised a research plan to conduct 12 consecutive weeks of user testing leading up to the site's relaunch. "We had a certain timeline for the project," says Deborah Torres, a TiVo user researcher. "We were working backwards from a launch date."
For 8 of the 12 sessions, Torres and her team employed remote testing methods, engaging users with collaborative online tools such as GoToMeeting or WebEx. "We tested users across the country," she says. "[Users in the] Midwest [and on the] East Coast tested designs and content remotely. We did remote usability testing where we shared the mouse."
For the last sessions, the team brought users into the company's usability lab and tested 1-on-1. "We got stakeholders in the room," says Torres, adding that it was good for the team to "see frustrations in people's faces" rather than simply listen over the phone.
Key company stakeholders supported the testing effort, believing strongly that the investments at the project's beginning would pay off in fewer site changes after launch. And indeed, as Schmidt notes, the early testing revealed things that they otherwise wouldn't have found until after implementation — when the problems would have been both more expensive and more difficult to fix. "It would be hard to get those [problems] fixed later," says Schmidt. "The more info we get now, the better."
Actionable Results
One of the major findings of the user testing was the need for a new approach to content, in both wording and presentation. Despite the writers' good intentions, the website's content didn't resonate with users.
"When the agency first wrote the content, it had a voice and personality to bring out the attributes of the brand," says Schmidt. "But when you see 8 people who say, 'I don't read fluff — I read bullets,' it changes things."
Based on the testing results, the design team changed the content. In some cases, they reduced chunks of text to 1 or 2 sentences, followed by bullets. No fluff. In other cases, they changed the content's presentation, wording, or placement, all to good effect.
"The biggest learning was that when people are shopping for a DVR, they need a lot of information and very detailed information, but also want a clean, sleek design," says Torres.
The team balanced these two requirements with inline links that expand to reveal more detailed information without taking users off the page. "This way we are providing the details without muddying it up for those who want high-level information," she says. With this simple interface solution, the team could present the same information to shoppers at different phases of the purchase cycle.
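For readers who want to see the mechanics, here is a minimal, hypothetical TypeScript sketch of that inline-disclosure pattern: a link that expands detailed copy in place, so the page stays clean by default. The function name and element IDs are illustrative assumptions, not TiVo's actual code.

```typescript
// Hypothetical sketch of the inline-disclosure pattern described above.
// A "more details" link toggles a block of detailed copy in place, so
// shoppers who want depth get it without leaving the page. Element IDs
// and names are illustrative; this is not TiVo's actual implementation.

function makeInlineDisclosure(linkId: string, detailId: string): void {
  const link = document.getElementById(linkId);
  const detail = document.getElementById(detailId);
  if (!link || !detail) return;

  detail.hidden = true;                        // collapsed by default: clean page
  link.setAttribute("aria-expanded", "false"); // accessibility hint

  link.addEventListener("click", (event) => {
    event.preventDefault();                    // stay on the page
    detail.hidden = !detail.hidden;            // toggle the detailed copy
    link.setAttribute("aria-expanded", String(!detail.hidden));
  });
}

// Example markup this assumes:
//   <a id="dvr-details-link" href="#">Want step-by-step instructions?</a>
//   <div id="dvr-details">...detailed how-to copy...</div>
makeInlineDisclosure("dvr-details-link", "dvr-details");
```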
Another problem they discovered was that users weren't finding a critical link because it wasn't well labeled or well placed on the page. "Below the fold we had a link for: 'Already have a TiVo DVR?'" says Torres. "It links to how-to articles. Users were completely ignoring this. They thought it looked like an ad. And this information is one of the factors in purchase decisions."
Although team members saw (and heard) users express keen interest in such content, the users couldn't find it. By renaming the link "Want step-by-step instructions?" and moving it up the page, the team made the content findable.
Such content changes are, for the most part, small ones. But their impact on task success was huge.
"Words and phrases matter to the users," says Schmidt. "With content, you think you've got it nailed down, then you see users get stuck on something really simple. Without hearing [the feedback] on the way, we wouldn't know why the site was not working later."
This is perhaps one of the most compelling reasons for undertaking this type of rigorous testing prior to launch. Once a site goes live, you don't know what's behind the stats that you see reflecting user behavior. That is, you don't know the reasons why people act the way they do.
"It's hard to drill down into why people are doing or not doing something," says Schmidt. "It's hard to do with just analytics. When you hear and see people react, you can get into their heads and make decisions based on this."
As Schmidt rightly notes, such information is invaluable. "This is huge!" she says. "It's easy to iterate now, and hard to do after it is built."
Design Team Benefits
At TiVo, iterative testing was a good business decision that surfaced quick fixes and identified longer-term projects for the site. The testing was also a great tool for the designers: it was both useful and encouraging.
"It was amazing: They could do tests, and turn around and get us feedback the next day," says user interface designer Alex Logan. "It's informal feedback, but useful. Timelines are tight, and it would allow me to get my work rolling right away."
Another reason this design approach was so successful for the designers was that the research built on itself. "I never felt like the research contradicted itself from week to week," says Logan. "We got a general validation of a design. In my mind, you never really know about a design, whether it will work, until you put it in front of users."
Logan says that the learning came in "tiny pieces" that let the team immediately make defined changes. "With each test you get closer to the ideal design," he says.
"Even minimal testing can [offer] something substantial," says Logan, adding that sometimes they had two or three options, but didn't know which users would prefer. Instead of just picking one, he says, " We let the users decide. No matter how brilliant I think I am, I'm not secure until [my design] is in front of people."
5 Reasons to Take This Approach
- Costs the company less. Because TiVo's testing approach relied heavily on remote user sessions, the company's investment was mostly staff time, with a limited amount devoted to participant incentives. And the ROI was great. As Schmidt put it, "one to two thousand dollars is not that much money for what you learn."
- Offers motivation. Rapid testing cycles give the design team reassurance that they're on the right path. "Test after test, you see that you're going in the right direction," says Schmidt. "It can be very motivating to the team."
- Helps drive business decisions. Because the feedback is weekly and available in real time, the team can react in real time. Schmidt used the feedback cycles to push back on business owners and ask, "Do we really have to keep pushing forward, or fix the issues? The users are telling you how they will react in the field."
- Creates a testing culture. Admittedly, TiVo's culture is already user-focused on the product side; until this exercise, however, the company wasn't particularly user-focused on the Web side. After seeing the testing cycle results, the company committed to user experience across the board. "Now, user experience is key," says Schmidt. "It has changed how we structure all Web projects moving forward. Each new [Web] project will have the same level of UE testing as products."
- Builds internal knowledge. "Use internal resources, internal research groups that can move quickly," says Schmidt. "Agency timelines can be long, and it takes a while for new people to get up to speed on issues. We present a new design to a researcher who can learn what's [right and wrong] about it in the context of other projects. Designers build a design gut. Every time you listen, you become a better designer."
8 Keys to Successful Outcomes
- Just do it. "Do it and listen to it," says Schmidt. "You have to take the ego out of the design. The design is great when the users say it is."
- Get support from stakeholders and find a project champion. "It is only valuable if you have stakeholders who are willing to listen and act on the feedback," says Torres.
- Encourage participation from across the organization. "Invite everyone who has an interest in it," says Torres. "Every usability test session, the observation room was packed with designers, product managers, and content people. Even [for] remote sessions, there were managers watching and logging in." That kind of participation ensures widespread buy-in across the organization.
- Determine priorities so you get clear results. Ask pointed questions and set clear research objectives for each screen or task so you can turn the findings into actionable results immediately after the test sessions.
- Circulate the results quickly. The morning after each test session, Torres released a top-line summary of teaser bullets highlighting the previous night's results. People across the organization appreciated these summaries.
- Match the research method to what you're trying to uncover. Different test methods suit different goals. The TiVo team used a variety of methods, from focus groups and concept testing to remote user testing and onsite 1-on-1 task-based testing. By matching the method to the testing goal, the team obtained a range of valuable feedback.
- Ask for buy-in on the results. According to Schmidt, the most effective way to get buy-in, especially in a rapid testing situation, is to present the findings along with a solution. "When you share the raw research without proposing a solution," she says, "[stakeholders] worry about it. You don't need 20 people debating how to address the findings."
- Keep a sense of humor and a good attitude. Have a flexible staff that's willing to roll with the pace and unknown nature of the work. "You don't know what you're going to test next," says Torres. "It was really taxing on researchers. At times, I didn't know what we were testing until the night before, and I had to build a protocol and be ready to moderate the next day."
You Can Do It
TiVo may have a stronger user-centered design tradition than most companies, but there's no reason you can't institute weekly testing at your own company. By making user exposure an expected part of the week, you keep everybody focused on customer needs.