By now, most companies accept the need to improve the usability of their websites, intranets, software and hardware designs, and other projects that have a user interface. Many companies also know that user testing is the fastest and easiest method in the usability engineering toolbox. (Unfortunately, many don't know about the other methods in the toolbox or how to effectively combine multiple usability methods throughout the project lifecycle, but that's a story for another day.)
Many people believe in user testing, but in real design projects, not much testing takes place. Why the discrepancy between belief and action? Mainly, it's that most companies are unable to fire off a quick, small test when faced with a design decision. Few organizations can run such tests within the deadlines required by fast-moving development projects. This lack of test-readiness means that testing becomes a rare and precious event that -- at best -- happens once per project.
Single-test projects invariably defer usability testing until the complete design is available. This practice still occurs despite twenty years of experience uniformly showing that most projects require multiple rounds of testing and redesign to achieve acceptable user-experience quality, and the equally strong finding that it is a hundred times cheaper to fix usability problems discovered early in the project rather than at its end.
Simplifying User Testing
To increase the number of companies that apply usability methods correctly, we must make it easier and cheaper to do the right thing.
The three main rules for simplified user testing are:
- Get representative users
- Ask them to perform representative tasks with the design
- Shut up and let the users do the talking
The third rule is surprisingly difficult, while rule #2 requires some experience to execute well. Still, the main obstacle to quick and frequent user testing is the difficulty of finding warm bodies who satisfy rule #1. Most companies have no procedures for getting five customers to show up at specified times next Wednesday, and yet that's what is required for a successful usability study.
Participant recruiting is the unglamorous foundation for all user testing. Without recruiting, you won't have users. Having a systematic recruiting program in place will make a huge difference in how much usability testing your organization conducts, and the quality of your recruiting will immediately increase the quality of your test results.
Recruiting: The State of the Art
To assess the current state of recruiting usability study participants, Nielsen Norman Group surveyed 201 usability professionals. Because we wanted to report how recruiting actually occurs in today's design projects, we deliberately surveyed a biased sample of respondents who were actively involved in usability testing and recruiting. Of course, because most companies don't currently conduct user testing, they also don't recruit test participants. The findings reported here relate solely to the practices of those companies that run usability tests.
Of the survey respondents, 54% were based in the US, 8% in the UK, 7% in Canada, and 5% in Australia. Continental Europe accounted for 14% of respondents, and Brazil, China, Ecuador, India, Israel, Mexico, New Zealand, Singapore, and South Africa were represented as well. Clearly, usability testing, and thus participant recruiting, is a worldwide phenomenon.
Specialized Recruiting Agencies
Most companies recruited their own test participants, possibly because of the cost of engaging a specialized recruiting agency. Only 36% of our respondents used an outside recruiting agency. These companies typically handled some of their own recruiting as well; only 9% of respondents used outside agencies to find all of their test participants.
Recruiting agency costs can be substantial: the average agency fee was $107 per participant. Fees varied significantly based on geography, with the highest fees in the world on the West Coast of the US, where the average was $125 per participant. As I can attest from painful experience, Silicon Valley is not only an expensive place to do business, it's also a place where you must work extra hard to recruit people who have not already been studied to death.
Companies that did their own recruiting reported spending an average of 1.15 hours of staff time for each participant recruited. However, 24% of respondents reported spending more than two hours per participant. If you don't have a streamlined recruiting process in place, with a skilled recruiting specialist, it might not pay off to handle recruiting in-house.
Recruiting fees also varied dramatically by user profile. It's no big surprise that agencies charge twice as much to recruit high-end professionals ($161 per person) as to recruit average consumers or students (about $84 per person).
Incentives for Test Participants
Companies offered monetary incentives to only 10% of participants in internal studies, such as intranet or MIS system tests. This finding corresponds well with my recommendation that companies not pay their own employees extra money to participate in usability testing, because they're already being paid for their time.
About a third of companies (35%) did offer non-monetary incentives to internal test participants. Typically, this was a small gift, such as a coupon for a free book or lunch in the company cafeteria.
In contrast, companies typically offered cash to participants recruited from outside as incentive to participate in a test: 63% of external users received monetary compensation, 41% received non-monetary incentives, and 9% received nothing. (The numbers total more than 100% because a lucky 13% of external participants were given both monetary and non-monetary incentives.)
The average incentive paid to external users was $64 per hour of test time. Again, the US West Coast was the most expensive, with an average incentive of $81 per hour.
Incentives varied even more by user profile: High-level professionals received almost four times as much as nonprofessional users ($118 vs. $32 per hour, on average).
No-Show Rates
Survey respondents reported an average no-show rate of 11%, meaning that roughly one in nine users fails to show up as promised. However, because of uncontrollable events like weather, traffic, and personal emergencies, no-show rates vary considerably from one study to the next. Thus, if you're running a "standard" simple test with five users, you might easily be hit by one or two no-shows.
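A quick back-of-the-envelope calculation shows why a five-user test is so exposed. This sketch treats each no-show as an independent event at the 11% average rate reported above (a simplifying assumption; real no-shows can cluster, for instance during bad weather):

```python
from math import comb

def p_at_least(k, n, p):
    """Probability of at least k no-shows among n scheduled
    participants, assuming independent no-shows at rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

NO_SHOW_RATE = 0.11  # average rate from the survey
SESSIONS = 5         # a "standard" simple five-user test

print(f"P(at least 1 no-show): {p_at_least(1, SESSIONS, NO_SHOW_RATE):.0%}")
print(f"P(at least 2 no-shows): {p_at_least(2, SESSIONS, NO_SHOW_RATE):.0%}")
```

Under these assumptions, a five-user study has roughly a 44% chance of at least one no-show, so scheduling a backup participant is usually worth the extra cost.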
There are many tricks for minimizing no-shows and alleviating their impact when they do occur, but unfortunately we cannot completely eliminate the problem. No-shows are highly annoying and, therefore, one of the main reasons I recommend paying fairly generous incentives to test participants, even when their normal hourly salary is relatively low.
Initiating Systematic Recruiting
I strongly recommend that you treat participant recruiting as an important component of your user experience process. The more you establish and systematize your recruiting approach, the easier it will be to run studies when you need usability data.
If you're a small company, or if you haven't done much user testing in the past, I recommend that you consider hiring a professional recruiting agency. If you can't afford the agency fee or if you conduct too few studies to employ an in-house recruiting specialist, you can certainly recruit your own test participants, and it will become easier to do so as you gain experience.
In either case, following best practices for recruiting will reduce your usability program's overall cost and increase the validity of your test findings. If you get the wrong users, or if you don't get enough users, your usability studies will not generate the results you deserve, and your credibility could suffer as a result.
Full Report (Free)
The full report on recruiting test users for usability studies is available for free download.
(Update added 2013: Recruiting agency fees and recommended user incentives are now around 20% higher than the numbers reported here, but the relative size of the expenses remains the same.)