Research Methods Articles & Videos

  • Contextual Inquiry Pitfalls

    Contextual inquiry is a UX research method where you shadow people as they do their job (or leisure tasks), allowing you to ask questions in context. This video provides advice on overcoming the main challenges with this method.

  • Sympathy vs. Empathy in UX

    Sympathy acknowledges that users are having difficulties, but empathy goes further by understanding the users' needs and motivations.

  • UX Research Made Agile

    Testing early and often is a key recommendation for UX research. Dora Brune shares her approach, including regular Open Test Labs to engage more product teams and make user research more agile. Kinder Eggs make for a nice warmup task, even in remote tests. (Recorded at a participant panel at the UX Conference.)

  • Ethnography in UX

    Good UX design requires understanding the context and patterns of human behavior, especially in new products or features that solve real needs. The 5 steps to rapid corporate ethnography lead you to these discoveries.

  • Recruiting and Screening Candidates for User Research Projects

    Know the inherent biases in your recruiting process and avoid them in order to recruit study participants who are representative of your target audience.

  • How Many Participants for a UX Interview?

    In the early stages of a UX-design project, recruit enough people to gain an in-depth understanding of users’ experiences and needs. The number of people needed for an interview study is often smaller than you think.

  • Context Methods: Study Guide

    Unsure where to start? Use this collection of links to our articles and videos to learn about ethnographic methods like field studies and diary studies — methods that help you learn about your users’ context.

  • Better UX Deliverables

    Communicating UX work and findings to the full team, stakeholders, and leadership requires engaging deliverables. Amanda Gulley shared her experience improving the design and usability of UX deliverables at a UX Conference participant panel.

  • Advanced User Testing Methods for Accelerating Innovation

    Two user research methods allow you to quickly test a large number of design alternatives, thus accelerating UX innovation. Rapid iterative design and within-subjects testing of multiple alternate designs aren't for every project, but are great when they do apply.

  • Triangulation: Combine Findings from Multiple User Research Methods

    Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.

  • Quantitative Research: Study Guide

    Unsure where to start? Use this collection of links to our articles and videos to learn about quant research, quant usability testing, analytics, and analyzing data.

  • Identify and Document Your UX Methods

    For each research or design method you employ, create a document that defines this method and can be used to educate other team members on UX activities.

  • Partner with Other Research Teams in Your Organization

    To gain a holistic picture of your users, exchange data with the non-UX teams in your company who are collecting other forms of customer data, besides the user research you do yourself. You gain; they gain.

  • Data Is More than Numbers: Why Qualitative Data Isn’t Just Opinions

    Systematically gathered qualitative data is a dependable method of understanding what users need, why problems occur, and how to solve them.

  • Four Factors in UX Maturity

    Improving UX maturity requires growth and evolution across 4 high-level factors: strategy, culture, process, and outcomes.

  • How Many Participants for Quantitative Usability Studies: A Summary of Sample-Size Recommendations

    40 participants is an appropriate number for most quantitative studies, but there are cases where you can recruit fewer users.
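
    As a minimal sketch of why sample size matters here (all numbers are illustrative, not from the article): the margin of error of a mean estimate shrinks with the square root of the sample size, so a study with 40 participants yields usefully tight confidence intervals.

```python
import math

def margin_of_error(std_dev, n, t_crit=2.02):
    """Half-width of a 95% confidence interval for a mean.

    t_crit ~ 2.02 is the two-tailed critical value for df = 39
    (i.e., n = 40); for large n it approaches 1.96.
    """
    return t_crit * std_dev / math.sqrt(n)

# With n = 40 and a (hypothetical) task-time standard deviation of
# 30 seconds, the mean estimate is precise to within ~9.6 seconds.
print(round(margin_of_error(30, 40), 1))
```

    Because precision scales with the square root of n, quadrupling the sample only halves the margin of error, which is why quantitative studies need far more participants than qualitative ones.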

  • Remote Usability Testing Costs

    We compare the budgets needed for different kinds of qualitative user research: in-person usability testing vs. remote studies run by software (unmoderated) or run by a human moderator.

  • Why 5 Participants Are Okay in a Qualitative Study, but Not in a Quantitative One

    Qualitative usability testing aims to identify issues in an interface, while quantitative usability testing is meant to provide metrics that capture the behavior of your whole user population.

  • 5 Facilitation Mistakes to Avoid During User Interviews

    Some common mistakes to avoid in UX interviews include poor rapport, multitasking, leading, insufficient probing, and poorly managed observers.

  • How Useful Is the System Usability Scale (SUS) in UX Projects?

    SUS is a 35-year-old and thus well-established way to measure user satisfaction, but it is not the most recommended approach in user research.

  • Usability Testing for Content

    Usability testing can yield valuable insights about your content. Make sure you test with the correct users, carefully craft the tasks, and ask the right follow-up questions.

  • Comparing Qualitative and Quantitative UX Research

    Qualitative and quantitative are both useful types of user research, but involve different methods and answer different questions for your UX design process. Use both!

  • Is A/B Testing Faster than Usability Testing at Getting Results?

    If A/B testing can quickly show which design is best, why should a UX team bother doing usability studies and other user research?

  • Diary Studies

    Asking users to keep a diary over a fairly long period is a great way to research customer journeys and other bigger-scope issues in user experience that go beyond a single interaction.

  • Field Studies vs. Ethnographic Studies vs. Contextual Inquiry

    What is the difference between a field study, an ethnographic study, and a contextual inquiry in a user experience design project? Not much. The main difference is that between field methods and lab-based user research.

  • Findability vs. Discoverability

    Locating features or content on a website or in an app happens in two different ways: finding (users look for the item) and discovering (users come across the item). Both are important, but they require different user research techniques to evaluate.

  • Running a Remote Usability Test, Part 2

    Learn how to run a remote moderated usability test. This second video covers how to actually facilitate the session with the participant and how to end with debrief, incentive, and initial analysis with your team.

  • Running a Remote Usability Test, Part 1

    Learn how to run a remote moderated usability test. Part 1 covers starting the session with your participant and observers.

  • Catching Cheaters and Outliers in Remote Unmoderated Studies

    In remote usability studies, it's hard to identify test participants who should not be in the study because they don't fit the profile or don't attempt the task seriously. This is even harder in unmoderated studies, but it can (and should) be done.

  • Voodoo Usability

    Focus groups and surveys study users' opinions, not their actual behavior, so they are misleading for the design of interactive systems like websites. Automated usability measures are just as misleading.

  • Collecting Feedback From Users of an Archive (Reader Challenge)

    How to collect usability data from site users, using a historical archive as the case study. Keep surveys simple, collect data from real-world usage, and get feedback from friends of the site.

  • Tracking the Growth of a Site

    Website usage must be tracked to plan server capacity needs and future business models. Examples show use of regression statistics to predict future traffic patterns.
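
    The regression approach mentioned above can be sketched as an ordinary least-squares line fit to monthly traffic, extrapolated forward for capacity planning (the data and function below are hypothetical illustrations, not from the article):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical monthly page views (millions) over six months.
months = [1, 2, 3, 4, 5, 6]
views = [1.0, 1.3, 1.5, 1.9, 2.1, 2.4]
a, b = fit_line(months, views)

# Extrapolate to month 12 to plan server capacity.
print(round(a + b * 12, 2))
```

    For fast-growing sites, fitting the line to log-transformed traffic (i.e., assuming exponential growth) often matches reality better than a straight line.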

  • The Use and Misuse of Focus Groups

    Focus groups can be a powerful tool in system development, but they should not be the only source of information about user behavior. In interactive systems development, the proper role of focus groups is not to assess interaction styles or design usability, but to discover what users want from the system.

  • Discount Usability for the Web

    Discount usability engineering is our only hope. We must evangelize methods simple enough that departments can do their own usability work, fast enough that people will take the time, and cheap enough that it's still worth doing. The methods that can accomplish this are simplified user testing with one or two users per design and heuristic evaluation.

  • Technology Transfer of Heuristic Evaluation and Usability Inspection

    Participants in a course on usability inspection methods were surveyed 7-8 months after the course. Factors which influenced adoption were cost, rated benefit of the method, relevance to current projects, and whether the methods had active evangelists.

  • Usability Testing for the 1995 Sun Microsystems' Website

    Paper prototyping, card sorting, and traditional usability testing were all employed to guide the design of the 1995 Sun Microsystems' Web site.

  • Overview of the 1995 Design of Sun Microsystems' Website, Using Iterative Design and User Testing

    Extensive usability testing was conducted to guide the 1995 Sun Microsystems' Web site design. This series of articles describes in detail the methods and findings of the design team.

  • Characteristics of Usability Problems Found by Heuristic Evaluation

    Heuristic evaluation is a good method of identifying both major and minor problems with an interface, but the lists of usability problems found by heuristic evaluation will tend to be dominated by minor problems, which is one reason severity ratings form a useful supplement to the method.

  • Summary of Usability Inspection Methods

    Usability inspection is the generic name for a set of methods that are all based on having evaluators inspect a user interface. Typically, usability inspection is aimed at finding usability problems in the design, though some methods also address issues like the severity of the usability problems and the overall usability of an entire system.

  • Severity Ratings for Usability Problems

    Rating usability problems according to their severity facilitates the allocation of resources to fix the most serious problems. Severity ratings are a combination of frequency, impact, and persistence.
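
    As a hypothetical illustration of combining the three factors (the summary names frequency, impact, and persistence but does not prescribe a formula, and teams often assign a single holistic 0–4 rating instead), one might average three 0–4 scores:

```python
def severity(frequency, impact, persistence):
    """Combine three 0-4 ratings into one severity score.

    The equal-weight average is a hypothetical scheme, shown
    only to illustrate combining the three dimensions.
    """
    for score in (frequency, impact, persistence):
        if not 0 <= score <= 4:
            raise ValueError("ratings are on a 0-4 scale")
    return (frequency + impact + persistence) / 3

# A problem hit by many users (3), with moderate impact (2),
# that users learn to work around on repeat visits (1):
print(severity(3, 2, 1))
```

    Sorting the problem list by such a score makes it easy to direct fixing resources at the most serious issues first.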

  • Usability Laboratories: A 1994 Survey

    A summary of statistics for 13 usability laboratories in 1994, an introduction to the main uses of usability laboratories in usability engineering, and survey of some of the issues related to practical use of user testing and computer-aided usability engineering.

  • Goal Composition: Extending Task Analysis to Predict Things People May Want to Do

    This essay describes a technique for extending a task analysis based on the principle of goal composition. Basically, goal composition starts by considering each primary goal that the user may have when using the system. A list of possible additional features is then generated by combining each of these goals with a set of general meta-goals that extend the primary goals.
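
    The combination step described above can be sketched as a simple cross-product of primary goals and meta-goals (the goals below are hypothetical examples for an email client, not taken from the essay):

```python
from itertools import product

# Goal composition: cross each primary goal with each general
# meta-goal to brainstorm candidate features worth considering.
primary_goals = ["send a message", "find a message", "delete a message"]
meta_goals = ["undo the action", "repeat it for many items", "do it later"]

candidate_features = [f"{meta}: {goal}"
                      for goal, meta in product(primary_goals, meta_goals)]

print(len(candidate_features))  # 3 goals x 3 meta-goals = 9 candidates
```

    Each generated pairing (e.g., "do it later: send a message" suggests scheduled sending) is then reviewed by the design team to decide whether it is worth building.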