Research Methods Articles & Videos

  • Contextual Inquiry Pitfalls

    Contextual inquiry is a UX research method where you shadow people as they do their job (or leisure tasks), allowing you to ask questions in context. This video provides advice on overcoming the main challenges with this method.

  • Sympathy vs. Empathy in UX

    Sympathy acknowledges that users are having difficulties, but empathy goes further by understanding the users' needs and motivations.

  • UX Research Made Agile

    Testing early and often is a key recommendation for UX research. Dora Brune shares her approach, including regular Open Test Labs to engage more product teams and make user research more agile. Kinder Eggs make for a nice warmup task, even in remote tests. (Recorded at a participant panel at the UX Conference.)

  • Ethnography in UX

    Good UX design requires understanding the context and patterns of human behavior, especially in new products or features that solve real needs. The 5 steps to rapid corporate ethnography lead you to these discoveries.

  • Recruiting and Screening Candidates for User Research Projects

    Know the inherent biases in your recruiting process and avoid them, so that you recruit study participants who are representative of your target audience.

  • How Many Participants for a UX Interview?

    In the early stages of a UX-design project, recruit enough people to gain an in-depth understanding of users’ experiences and needs. The number of people needed for an interview study is often smaller than you think.

  • Context Methods: Study Guide

    Unsure where to start? Use this collection of links to our articles and videos to learn about ethnographic methods like field studies and diary studies — methods that help you learn about your users’ context.

  • Better UX Deliverables

    Communicating UX work and findings to the full team, stakeholders, and leadership requires engaging deliverables. Amanda Gulley shared her experience improving the design and usability of UX deliverables at a UX Conference participant panel.

  • Advanced User Testing Methods for Accelerating Innovation

    Two user research methods allow you to quickly test a large number of design alternatives, thus accelerating UX innovation. Rapid iterative design and within-subjects testing of multiple alternate designs aren't for every project, but are great when they do apply.

  • Triangulation: Combine Findings from Multiple User Research Methods

    Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.

  • Quantitative Research: Study Guide

    Unsure where to start? Use this collection of links to our articles and videos to learn about quant research, quant usability testing, analytics, and analyzing data.

  • Identify and Document Your UX Methods

    For each research or design method you employ, create a document that defines this method and can be used to educate other team members on UX activities.

  • Partner with Other Research Teams in Your Organization

    To gain a holistic picture of your users, supplement your own user research by exchanging data with the non-UX teams in your company that collect other forms of customer data. You gain; they gain.

  • Data Is More than Numbers: Why Qualitative Data Isn’t Just Opinions

    Systematically gathered qualitative data provides a dependable way of understanding what users need, why problems occur, and how to solve them.

  • Four Factors in UX Maturity

    Improving UX maturity requires growth and evolution across 4 high-level factors: strategy, culture, process, and outcomes.

  • How Many Participants for Quantitative Usability Studies: A Summary of Sample-Size Recommendations

    40 participants is an appropriate number for most quantitative studies, but there are cases where you can recruit fewer users.
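
    As a rough illustration of why sample sizes in this range are commonly recommended (an assumed example, not taken from the article itself), the sketch below computes the margin of error of a 95% confidence interval for a task-success rate measured with 40 participants.

    ```python
    # Illustrative sketch (assumed example): the margin of error around a
    # task-success rate measured in a quantitative usability study.
    from math import sqrt

    def adjusted_wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
        """95% adjusted-Wald confidence interval for a binomial proportion."""
        n_adj = n + z ** 2                        # add z^2 pseudo-trials
        p_adj = (successes + z ** 2 / 2) / n_adj  # and z^2/2 pseudo-successes
        margin = z * sqrt(p_adj * (1 - p_adj) / n_adj)
        return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

    # Hypothetical data: 28 of 40 participants complete the task.
    low, high = adjusted_wald_ci(successes=28, n=40)
    print(f"Observed success rate {28 / 40:.0%}, 95% CI roughly {low:.0%} to {high:.0%}")
    ```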

  • Remote Usability Testing Costs

    We compare the budgets needed for different kinds of qualitative user research: in-person usability testing vs. remote studies run by software (unmoderated) or run by a human moderator.

  • Why 5 Participants Are Okay in a Qualitative Study, but Not in a Quantitative One

    Qualitative usability testing aims to identify issues in an interface, while quantitative usability testing is meant to provide metrics that capture the behavior of your whole user population.
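
    For background, the classic argument for small qualitative samples rests on the problem-discovery formula 1 - (1 - L)^n, where L is the chance that a single user encounters a given usability problem; a commonly cited average estimate is about 31%. The snippet below is a minimal sketch of that calculation; the value of L is an assumed estimate, not data from this article.

    ```python
    # Minimal sketch of the problem-discovery formula 1 - (1 - L)^n,
    # where L is the probability that one user hits a given usability problem.
    # L = 0.31 is an often-cited average estimate; treat it as an assumption.
    L = 0.31

    for n in (1, 3, 5, 10, 15):
        found = 1 - (1 - L) ** n
        print(f"{n:2d} users -> ~{found:.0%} of problems observed at least once")
    ```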

  • 5 Facilitation Mistakes to Avoid During User Interviews

    Some common mistakes to avoid in UX interviews include poor rapport, multitasking, leading, insufficient probing, and poorly managed observers.

  • How Useful Is the System Usability Scale (SUS) in UX Projects?

    SUS is a 35-year-old and thus well-established way to measure user satisfaction, but it is not the most recommended way of doing so in user research.
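
    For reference, SUS consists of 10 alternating positively and negatively worded statements rated 1-5, and the standard scoring rescales the answers to a 0-100 range. A minimal sketch of that calculation (with made-up example responses) is below.

    ```python
    # Standard SUS scoring: 10 items rated 1-5, odd items positively worded,
    # even items negatively worded; the total is scaled to a 0-100 range.
    def sus_score(responses: list[int]) -> float:
        assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
        total = 0
        for i, r in enumerate(responses, start=1):
            total += (r - 1) if i % 2 == 1 else (5 - r)  # odd: r-1, even: 5-r
        return total * 2.5

    # Hypothetical responses from one participant.
    print(sus_score([4, 2, 5, 1, 4, 2, 4, 1, 5, 2]))  # -> 85.0
    ```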

  • Usability Testing for Content

    Usability testing can yield valuable insights about your content. Make sure you test with the correct users, carefully craft the tasks, and ask the right follow-up questions.

  • Comparing Qualitative and Quantitative UX Research

    Qualitative and quantitative are both useful types of user research, but involve different methods and answer different questions for your UX design process. Use both!

  • Is A/B Testing Faster than Usability Testing at Getting Results?

    If A/B testing can quickly show which design is best, why should a UX team bother doing usability studies and other user research?

  • Diary Studies

    Asking users to keep a diary over a fairly long period is great for researching customer journeys or other bigger-scope issues in user experience that go beyond a single interaction.

  • Field Studies vs. Ethnographic Studies vs. Contextual Inquiry

    What is the difference between a field study, an ethnographic study, and a contextual inquiry in a user experience design project? Not much. The main difference is that between field methods and lab-based user research.

  • Findability vs. Discoverability

    Locating features or content on a website or in an app happens in two different ways: finding (users look for the item) and discovering (users come across the item). Both are important, but require different user research techniques to evaluate.

  • Running a Remote Usability Test, Part 2

    Learn how to run a remote moderated usability test. This second video covers how to actually facilitate the session with the participant and how to end with debrief, incentive, and initial analysis with your team.

  • Running a Remote Usability Test, Part 1

    Learn how to run a remote moderated usability test. Part 1 covers starting the session with your participant and observers.

  • Catching Cheaters and Outliers in Remote Unmoderated Studies

    In remote usability studies, it's hard to identify test participants who should not be in the study because they don't fit the profile or don't attempt the task seriously. This is even harder in unmoderated studies, but it can (and should) be done.
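
    As one illustration of the kind of automated screening this implies (a sketch under assumed thresholds, not the article's method), a session could be flagged for implausibly fast task times or straight-lined ratings:

    ```python
    # Illustrative sketch (not the article's method): flag suspicious
    # unmoderated sessions by implausibly fast task times or identical ratings.
    def flag_session(task_seconds: list[float], ratings: list[int],
                     min_plausible_seconds: float = 20.0) -> list[str]:
        flags = []
        if any(t < min_plausible_seconds for t in task_seconds):
            flags.append("task finished implausibly fast")
        if len(set(ratings)) == 1:
            flags.append("identical answer to every rating question")
        return flags

    # Hypothetical session: an 8-second "task completion" and all-4 ratings.
    print(flag_session(task_seconds=[8.0, 45.2], ratings=[4, 4, 4, 4, 4]))
    ```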

  • Rating Scales in UX Research: Likert or Semantic Differential?

    Likert and semantic-differential scales are instruments used to determine attitudes toward products, services, and experiences, but depending on your situation, one may work better than the other.

  • How Well Discovery Phases Are Performed in UX Projects

    Our survey results reveal that many UX practitioners perform discoveries in some shape or form, although many are on the short side, lack user research, and don’t involve the right people.

  • Benchmarking UX: Tracking Metrics

    Quantitatively evaluate a product or service’s user experience by using metrics to gauge its relative performance against a meaningful standard.

  • Remote Moderated Usability Tests: How to Do Them

    The key to good remote moderated testing is to be thoroughly prepared and organized. Follow these 7 steps to ensure your study’s success.

  • What a UX Career Looks Like Today

    Our latest research on UX careers looks into specialization, explores unique backgrounds of practitioners entering the field, and details the skills and responsibilities needed to work in UX today.

  • Remote Moderated Usability Tests: Why to Do Them

    Remote unmoderated usability testing is so fast and easy that some teams make it their only evaluation method. But don’t shy away from its more robust alternative, the remote moderated usability test, which can give you more information and is also inexpensive.

  • The Discovery Phase in UX Projects

    Although there can be many different instigators, roles, and activities involved in a discovery, all discoveries strive to achieve consensus on the problem to be solved and desired outcomes.

  • How to Maximize Insights in User Testing: Stepped User Tasks

    You can learn the right kinds of things, and much more, in user tests if you start with broad tasks instead of immediately leading participants to areas of interest. Prepare additional, focused tasks that can be used to direct users when needed.

  • The Critical Incident Technique in UX

    The CIT is a research method for systematically obtaining recalled observations of significant events or behaviors from people who have first-hand experience.

  • Ethical Maturity in User Research

    How ethically mature are your user-research practices? Assess your current state of ethical maturity by answering these simple questions.

  • The Diverge-and-Converge Technique for UX Workshops

    By first working independently on a problem, then converging to share insights, teams can leverage the benefits of both work styles, leading to rapid data analysis, diverse ideas, and high-quality designs.

  • Tracking Research Questions, Assumptions, and Facts in Agile

    When user-related questions and assumptions go untracked throughout a product’s lifecycle, the result is misalignment and overconfidence. Documenting these questions and assumptions in a knowledge board differentiates them from established facts.

  • Usability Testing 101

    UX researchers use this popular observational methodology to uncover problems and opportunities in designs.

  • Unmoderated User Tests: How and Why to Do Them

    The 6 steps for running unmoderated usability testing are: define study goals, select testing software, write task descriptions, pilot the test, recruit participants, and analyze the results.

  • Iterative Design of a Survey Question: A Case Study

    Through 4 rounds of revisions, we made a survey more specific and usable. Running pilot studies before conducting a full-scale survey ensures you’ll reach your research goal.

  • How to Analyze Qualitative Data from UX Research: Thematic Analysis

    Identifying the main themes in data from user studies — such as interviews, focus groups, diary studies, and field studies — is often done through thematic analysis.
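
    As a rough illustration of how the output of thematic analysis is often organized (the codes and themes below are invented for the example), observations tagged with codes can be rolled up into broader themes:

    ```python
    # Minimal sketch of organizing thematic-analysis output: observations are
    # tagged with codes, and codes are grouped under broader themes.
    # All names here are invented for illustration.
    from collections import Counter

    theme_for_code = {
        "confusing labels": "Navigation",
        "too many menu levels": "Navigation",
        "fear of wrong order": "Trust",
        "unclear fees": "Trust",
    }

    coded_observations = [
        ("P1", "confusing labels"),
        ("P2", "unclear fees"),
        ("P3", "confusing labels"),
        ("P3", "fear of wrong order"),
    ]

    theme_counts = Counter(theme_for_code[code] for _, code in coded_observations)
    print(theme_counts)  # e.g. Counter({'Navigation': 2, 'Trust': 2})
    ```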

  • Tools for Unmoderated Usability Testing

    Many platforms for unmoderated usability testing have similar features; to choose the best tool for your needs, focus on the type of data that you need to collect for your goals.

  • Setup of an Eyetracking Study

    If you’re planning on running your own eyetracking study, pay attention to equipment, supplies, and placement to ensure high-quality data.

  • Cognitive Mapping in User Research

    In cognitive mapping sessions, users are asked to produce a visual representation of their mental models. This type of user interview can provide stimulus for conversation, generate insights, and act as a facilitation aid.

  • Formative vs. Summative Evaluations

    Formative evaluations are used in an iterative process to make improvements before production. Summative evaluations are used to evaluate a shipped product in comparison to a benchmark.