Contextual inquiry is a UX research method where you shadow people as they do their job (or leisure tasks), allowing you to ask questions in context. This video provides advice on overcoming the main challenges with this method.
"Test early and often" is a key recommendation for UX research. Dora Brune shares her approach, including regular Open Test Labs to engage more product teams and make user research more agile. Kinder Eggs make for a nice warmup task, even in remote tests. (Recorded at a participant panel at the UX Conference.)
Good UX design requires understanding the context and patterns of human behavior, especially in new products or features that solve real needs. The 5 steps to rapid corporate ethnography lead you to these discoveries.
Know the inherent biases in your recruiting process and avoid them in order to recruit study participants who are representative of your target audience.
In the early stages of a UX-design project, recruit enough people to gain an in-depth understanding of users’ experiences and needs. The number of people needed for an interview study is often smaller than you think.
Unsure where to start? Use this collection of links to our articles and videos to learn about ethnographic methods like field studies and diary studies — methods that help you learn about your users' context.
Communicating UX work and findings to the full team, stakeholders, and leadership requires engaging deliverables. Amanda Gulley shared her experience improving the design and usability of UX deliverables at a UX Conference participant panel.
Two user research methods allow you to quickly test a large number of design alternatives, thus accelerating UX innovation. Rapid iterative design and within-subjects testing of multiple alternate designs aren't for every project, but are great when they do apply.
Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.
Unsure where to start? Use this collection of links to our articles and videos to learn about quant research, quant usability testing, analytics, and analyzing data.
For each research or design method you employ, create a document that defines this method and can be used to educate other team members on UX activities.
To gain a holistic picture of your users, exchange data with the non-UX teams in your company who are collecting other forms of customer data, besides the user research you do yourself. You gain; they gain.
We compare the budgets needed for different kinds of qualitative user research: in-person usability testing vs. remote studies run by software (unmoderated) or run by a human moderator.
Qualitative usability testing aims to identify issues in an interface, while quantitative usability testing is meant to provide metrics that capture the behavior of your whole user population.
Usability testing can yield valuable insights about your content. Make sure you test with the correct users, carefully craft the tasks, and ask the right follow-up questions.
Qualitative and quantitative are both useful types of user research, but involve different methods and answer different questions for your UX design process. Use both!
Asking users to keep a diary over a fairly long period is great for researching customer journeys or other bigger-scope issues in user experience that go beyond a single interaction.
What is the difference between a field study, an ethnographic study, and a contextual inquiry in a user experience design project? Not much. The main difference is that between field methods and lab-based user research.
Locating features or content on a website or in an app happens in two different ways: finding (users look for the item) and discovering (users come across the item). Both are important, but require different user research techniques to evaluate.
Learn how to run a remote moderated usability test. This second video covers how to actually facilitate the session with the participant and how to end with debrief, incentive, and initial analysis with your team.
In remote usability studies, it's hard to identify test participants who should not be in the study because they don't fit the profile or don't attempt the task seriously. This is even harder in unmoderated studies, but it can (and should) be done.
It's more difficult to conduct usability studies with experienced users than with novices, and the improvements are usually smaller. Still, improving expert performance is often worth the effort.
It's easy to bias study participants, whether in user testing or in card sorting, by leading them to focus on matching stimulus words instead of working on the underlying problem.
Enemies of usability claim that because 'the experts disagree,' they can safely ignore user advocates' expertise and run with whatever design they personally prefer.
The sooner you complete a usability study, the higher its impact on the design process. Slower methods should be deferred to an annual usability checkup.
Using a linear diagram to plot data from website traffic logs can lead you to overlook important conclusions. Sometimes advanced visualizations are worth the effort.
The relative popularity of a site's pages, the number of visitors referred by other sites, and the traffic from search queries continue to follow a Zipf distribution.
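A minimal sketch of the idea behind the two items above, not taken from either article: with page-view counts that follow a Zipf distribution (popularity roughly proportional to 1/rank), a linear chart hides everything beyond the first few pages, while log-log axes reveal the pattern as a straight line. The page count and exponent below are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

num_pages = 1000   # hypothetical number of pages on the site (assumption)
exponent = 1.0     # classic Zipf: popularity ~ 1 / rank (assumption)

ranks = np.arange(1, num_pages + 1)
page_views = 1_000_000 / ranks ** exponent  # rank-1 page arbitrarily set to 1M views

fig, (linear_ax, loglog_ax) = plt.subplots(1, 2, figsize=(10, 4))

# On linear axes, all but the most popular pages collapse into a flat tail.
linear_ax.plot(ranks, page_views)
linear_ax.set(title="Linear axes", xlabel="Page rank", ylabel="Page views")

# On log-log axes, a Zipf distribution appears as a straight line with slope -exponent.
loglog_ax.loglog(ranks, page_views)
loglog_ax.set(title="Log-log axes", xlabel="Page rank (log)", ylabel="Page views (log)")

plt.tight_layout()
plt.show()
```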
Usability goes beyond the level of individual users interacting with screens. It's also a question of how easy or cumbersome it is for the entire organization to use a system.
Measuring the live impact of design changes on key business metrics is valuable, but often creates a focus on short-term improvements. This near-term view neglects bigger issues that only qualitative studies can find.
Despite being an artificial situation, user testing generates realistic findings because people engage strongly with the tasks and suspend their disbelief.
User research offers a learning opportunity that can help you build an understanding of user behavior, but you must resolve discrepancies between research findings and your own beliefs.
Testing ever more users in card sorting has diminishing returns, but test at least 15 users -- 3 times more than you would in traditional usability tests.
To ensure high response rates and avoid misleading survey results, keep your surveys short and ensure that your questions are well written and easy to answer.
Designs that engage and empower users increase their enjoyment and encourage them to explore websites in-depth. Once we achieve ease of use, we'll need additional usability methods to further strengthen joy of use.
Field studies should emphasize the observation of real user behavior. Simple field studies are fast and easy to conduct, and do not require a posse of anthropologists: All members of a design team should go on customer visits.
Although measuring usability can cost four times as much as conducting qualitative studies (which often generate better insight), metrics are sometimes worth the expense. Among other things, metrics can help managers track design progress and support decisions about when to release a product.