Research Methods Articles & Videos

  • Contextual Inquiry Pitfalls

    Contextual inquiry is a UX research method where you shadow people as they do their job (or leisure tasks), allowing you to ask questions in context. This video provides advice on overcoming the main challenges with this method.

  • Sympathy vs. Empathy in UX

    Sympathy acknowledges that users are having difficulties, but empathy goes further by understanding the users' needs and motivations.

  • UX Research Made Agile

    "Test early and often" is a key recommendation for UX research. Dora Brune shares her approach, including regular Open Test Labs to engage more product teams and make user research more agile. Kinder Eggs make for a nice warmup task, even in remote tests. (Recorded at a participant panel at the UX Conference.)

  • Ethnography in UX

    Good UX design requires understanding the context and patterns of human behavior, especially in new products or features that solve real needs. The 5 steps to rapid corporate ethnography lead you to these discoveries.

  • Recruiting and Screening Candidates for User Research Projects

    Know the inherent biases in your recruiting process and avoid them in order to recruit study participants who are representative of your target audience.

  • How Many Participants for a UX Interview?

    In the early stages of a UX-design project, recruit enough people to gain an in-depth understanding of users’ experiences and needs. The number of people needed for an interview study is often smaller than you think.

  • Context Methods: Study Guide

    Unsure where to start? Use this collection of links to our articles and videos to learn about ethnographic methods like field studies and diary studies — methods that help you learn about your users' context.

  • Better UX Deliverables

    Communicating UX work and findings to the full team, stakeholders, and leadership requires engaging deliverables. Amanda Gulley shared her experience improving the design and usability of UX deliverables at a UX Conference participant panel.

  • Advanced User Testing Methods for Accelerating Innovation

    Two user research methods allow you to quickly test a large number of design alternatives, thus accelerating UX innovation. Rapid iterative design and within-subjects testing of multiple alternate designs aren't for every project, but are great when they do apply.

  • Triangulation: Combine Findings from Multiple User Research Methods

    Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.

  • Quantitative Research: Study Guide

    Unsure where to start? Use this collection of links to our articles and videos to learn about quant research, quant usability testing, analytics, and analyzing data.

  • Identify and Document Your UX Methods

    For each research or design method you employ, create a document that defines this method and can be used to educate other team members on UX activities.

  • Partner with Other Research Teams in Your Organization

    To gain a holistic picture of your users, exchange data with the non-UX teams in your company who are collecting other forms of customer data, besides the user research you do yourself. You gain; they gain.

  • Data Is More than Numbers: Why Qualitative Data Isn’t Just Opinions

    Systematically gathered qualitative data is a dependable method of understanding what users need, why problems occur, and how to solve them.

  • Four Factors in UX Maturity

    Improving UX maturity requires growth and evolution across 4 high-level factors: strategy, culture, process, and outcomes.

  • How Many Participants for Quantitative Usability Studies: A Summary of Sample-Size Recommendations

    40 participants is an appropriate number for most quantitative studies, but there are cases where you can recruit fewer users.
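    As a back-of-the-envelope illustration (my sketch, not from the article), the reason sample-size recommendations land around 40 is that the margin of error for a binomial metric such as task success rate shrinks only with the square root of the number of participants:

    ```python
    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Approximate 95% margin of error for a proportion.

        Uses the normal approximation with the worst-case p = 0.5;
        real studies often use adjusted intervals, so treat this as
        a rough illustration only.
        """
        return z * math.sqrt(p * (1 - p) / n)

    # Doubling the sample from 20 to 40 narrows the interval by only
    # a factor of sqrt(2), which is why gains flatten out quickly.
    for n in (10, 20, 40, 100):
        print(f"n={n:3d}: ±{margin_of_error(n):.0%}")
    ```

    Going from 20 to 40 participants tightens the interval from roughly ±22 to ±15 percentage points; pushing much past 40 buys progressively less precision per participant.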

  • Remote Usability Testing Costs

    We compare the budgets needed for different kinds of qualitative user research: in-person usability testing vs. remote studies run by software (unmoderated) or run by a human moderator.

  • Why 5 Participants Are Okay in a Qualitative Study, but Not in a Quantitative One

    Qualitative usability testing aims to identify issues in an interface, while quantitative usability testing is meant to provide metrics that capture the behavior of your whole user population.

  • 5 Facilitation Mistakes to Avoid During User Interviews

    Some common mistakes to avoid in UX interviews include poor rapport, multitasking, leading, insufficient probing, and poorly managed observers.

  • How Useful Is the System Usability Scale (SUS) in UX Projects?

    SUS is a 35-year-old and thus well-established way to measure user satisfaction, but it is not the most recommended way of doing so in user research.

  • Usability Testing for Content

    Usability testing can yield valuable insights about your content. Make sure you test with the correct users, carefully craft the tasks, and ask the right follow-up questions.

  • Comparing Qualitative and Quantitative UX Research

    Qualitative and quantitative are both useful types of user research, but involve different methods and answer different questions for your UX design process. Use both!

  • Is A/B Testing Faster than Usability Testing at Getting Results?

    If A/B testing can quickly show which design is best, why should a UX team bother doing usability studies and other user research?

  • Diary Studies

    Asking users to keep a diary over a fairly long period is a great way to research customer journeys or other bigger-scope issues in user experience that go beyond a single interaction.

  • Field Studies vs. Ethnographic Studies vs. Contextual Inquiry

    What is the difference between a field study, an ethnographic study, and a contextual inquiry in a user experience design project? Not much. The main difference is that between field methods and lab-based user research.

  • Findability vs. Discoverability

    Locating features or content on a website or in an app happens in two different ways: finding (users look for the item) and discovering (users come across the item). Both are important, but require different user research techniques to evaluate.

  • Running a Remote Usability Test, Part 2

    Learn how to run a remote moderated usability test. This second video covers how to actually facilitate the session with the participant and how to end with debrief, incentive, and initial analysis with your team.

  • Running a Remote Usability Test, Part 1

    Learn how to run a remote moderated usability test. Part 1 covers starting the session with your participant and observers.

  • Catching Cheaters and Outliers in Remote Unmoderated Studies

    In remote usability studies, it's hard to identify test participants who should not be in the study because they don't fit the profile or don't attempt the task seriously. This is even harder in unmoderated studies, but it can (and should) be done.

  • Accuracy vs. Insights in Quantitative Usability

    Better to accept a wider margin of error in usability metrics than to spend the entire budget learning too few things with extreme precision.

  • Interviewing Users

    Despite many weaknesses, interviews are a valuable method for exploratory user research.

  • Testing Expert Users

    It's more difficult to conduct usability studies with experienced users than with novices, and the improvements are usually smaller. Still, improving expert performance is often worth the effort.

  • Card Sorting: Pushing Users Beyond Terminology Matches

    It's easy to bias study participants, whether in user testing or in card sorting, if they focus on matching stimulus words instead of working on the underlying problem.

  • Building Respect for Usability Expertise

    Enemies of usability claim that because 'the experts disagree,' they can safely ignore user advocates' expertise and run with whatever design they personally prefer.

  • Guesses vs. Data as Basis for Design Recommendations

    Even the tiniest amount of empirical facts (say, observing 2 users) vastly improves the probability of making correct UI design decisions.

  • Fast, Cheap, and Good: Yes, You Can Have It All

    The sooner you complete a usability study, the higher its impact on the design process. Slower methods should be deferred to an annual usability checkup.

  • Data Visualization of Web Stats: Logarithmic Charts and the Drooping Tail

    Using a linear diagram to plot data from website traffic logs can lead you to overlook important conclusions. Sometimes advanced visualizations are worth the effort.

  • Traffic Log Patterns

    The relative popularity of a site's pages, the number of visitors referred by other sites, and the traffic from search queries continue to follow a Zipf distribution.
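    A minimal sketch of what a Zipf distribution implies (my illustration, not from the article): under an idealized Zipf law, the k-th most popular page receives traffic proportional to 1/k, so a small head of pages dominates a very long tail:

    ```python
    def zipf_share(rank, n_pages, s=1.0):
        """Fraction of total traffic going to the page at a given rank,
        under an idealized Zipf law with exponent s (assumed values,
        for illustration only)."""
        total = sum(1 / k ** s for k in range(1, n_pages + 1))
        return (1 / rank ** s) / total

    # On a hypothetical 1000-page site, the top page draws hundreds of
    # times the traffic of the median-ranked page.
    print(f"rank 1:   {zipf_share(1, 1000):.1%}")
    print(f"rank 500: {zipf_share(500, 1000):.3%}")
    ```

    This skew is why a linear chart of page popularity "droops" into an unreadable tail, while a logarithmic scale spreads the same data into an interpretable line.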

  • Quantitative Studies: How Many Users to Test?

    When collecting usability metrics, testing with 20 users typically offers a reasonably tight confidence interval.

  • Enterprise Usability

    Usability goes beyond the level of individual users interacting with screens. It's also a question of how easy or cumbersome it is for the entire organization to use a system.

  • Putting A/B Testing in Its Place

    Measuring the live impact of design changes on key business metrics is valuable, but often creates a focus on short-term improvements. This near-term view neglects bigger issues that only qualitative studies can find.

  • Authentic Behavior in User Testing

    Despite being an artificial situation, user testing generates realistic findings because people engage strongly with the tasks and suspend their disbelief.

  • Acting on User Research

    User research offers a learning opportunity that can help you build an understanding of user behavior, but you must resolve discrepancies between research findings and your own beliefs.

  • Card Sorting: How Many Users to Test

    Testing ever-more users in card sorting has diminishing returns, but test at least 15 users -- 3 times more than you would in traditional usability tests.

  • Keep Online Surveys Short

    To ensure high response rates and avoid misleading survey results, keep your surveys short and ensure that your questions are well written and easy to answer.

  • User Empowerment and the Fun Factor

    Designs that engage and empower users increase their enjoyment and encourage them to explore websites in-depth. Once we achieve ease of use, we'll need additional usability methods to further strengthen joy of use.

  • Field Studies Done Right: Fast and Observational

    Field studies should emphasize the observation of real user behavior. Simple field studies are fast and easy to conduct, and do not require a posse of anthropologists: All members of a design team should go on customer visits.

  • First Rule of Usability? Don't Listen to Users

    For good UX, pay attention to what users do, not what they say. Self-reported claims are unreliable, as are user speculations about future behavior.

  • Usability Metrics

    Although measuring usability can cost four times as much as conducting qualitative studies (which often generate better insight), metrics are sometimes worth the expense. Among other things, metrics can help managers track design progress and support decisions about when to release a product.