A user interview is a UX research method during which a researcher asks one user questions about a topic of interest (e.g., use of a system, behaviors and habits) with the goal of learning about that topic. Unlike focus groups, which involve multiple users at the same time, user interviews are one-on-one sessions (although occasionally several facilitators may take turns asking questions).

UX interviews tend to be a quick and easy way to collect user data, so they are often used, especially in Lean and Agile environments. They are closely related to journalistic interviews and to the somewhat narrower and more formal HCI method called the critical incident technique, which was introduced in 1954 by John Flanagan.

Although you may feel that doing a UX user interview is simple and straightforward, there is more to a good interview than many people realize. Here, I distill some of the best practices.

Why Do User Interviews?

Interviews give insights into what users think about a site, an application, a product, or a process. They can point out what site content is memorable, what people feel is important on the site, and what ideas for improvement they may have. They can be done in a variety of situations:

  • before you have a design, to inform personas, journey maps, feature ideas, workflow ideas
  • to enrich a contextual inquiry study by supplementing observation with descriptions of tools, processes, bottlenecks, and how users perceive them
  • at the end of a usability test, to collect verbal responses related to observed behaviors
    • (Do defer the interview until after the behavioral observation segment of the usability study: if you ask questions before the participant tries to perform tasks with your design, you will have primed the user to pay special attention to whatever features or issues you asked about.)

How to Do a User Interview

First and foremost, think of an interview as a type of research study, not a sales session or an informal conversation. Then, use the following tips to make your interviews most effective.

Set a goal for the interview.

Ask product stakeholders what they want to learn. From their desires, determine the main goal, ensuring that it’s realistic. Too broad a goal, like “learn about users,” is likely to make interviews fail, because it will not focus your questions in a direction relevant to your design needs. A concise, concrete goal related to a specific aspect of the users’ behavior or attitudes can bring the team to consensus and direct how you’ll construct the interview.

Examples of good interview goals:

  • Learn how nurses feel about logging medical data, and what processes they believe they use.
  • Learn how architects share CAD drawings with engineers, and where they feel there are challenges and opportunities.
  • Find out how bicycle couriers get the best route directions, what they feel works well, where they think there are issues, and how they think things could be improved.

Make the user feel as comfortable as possible. Create a rapport with the user.

People are more likely to remember, talk, and let their guard down if they feel relaxed and trust the interviewer and the process. Here are some tips for an effective interview.

  1. Have a video call or phone call (or at least some interaction) with the user before the interview itself.
  2. Before the interview day, and also at the start of the actual interview, explain the reason for the interview, and how the data from it will be used.
  3. Make the user feel heard by taking notes, nodding, making frequent eye contact, offering acknowledgments like “I see,” and repeating the words the user said.
  4. Let users finish their thoughts. Do not interrupt them.
  5. Don’t rush the user. Pause. Slow down your pace of speech. Talking slowly has a calming effect and indicates that you are not anxious and that you have time to listen.
  6. Start with questions that are easy to answer and that are unlikely to be interpreted as personal or judgmental. For example, instead of “What was the last book you read?” try “What do you like to do in your spare time?” The latter is open-ended, while the former assumes the user read a book recently; those who did not may feel stupid.
  7. Show some empathy by asking related questions. But recall that it is difficult to act sympathetic without also being leading or making assumptions. For example, imagine that a user said he could not reach the customer-support team. You can show some concern by asking the user to elaborate: “You couldn’t reach support. Can you tell me more about that?” You could even try a question like “How did that make you feel?” but only if the user did not already indicate how he felt. If the user already verbally or even nonverbally expressed frustration when recalling the event, then asking how he felt would seem as though the interviewer had not been listening. As an empathetic human being, you might want to say, “That must have been frustrating,” or “I’m sorry your time was wasted like that.” But those would be leading statements. Instead, asking a question that relates to the user’s feelings can show that you are listening and feel for their plight. At the absolute end of the interview, you can express some of these more apologetic sentiments.
  8. Be authentic, and don’t fake empathy. Acting can make you appear disingenuous. It is better to be yourself; don’t say something if you don’t genuinely feel it.

Keep in mind that there’s a big difference between rapport and friendship. The user does not have to really like you, think you’re funny, or want to invite you out for a cup of coffee in order to trust you enough to be interviewed.

Prepare questions before the interview.

While you will likely think of questions while sitting with the user, do bring to the interview a list of questions you aim to have answered. A question list ensures that you will:

  • be able to get your team’s feedback about your questions before the interview
  • remember everything that you wanted to know and ask users about as many of the right topics as possible during the interview
  • construct clear, nonleading questions better than you would in the moment
  • overcome your stress or fatigue by having questions on hand to refer to

Anticipate different responses, and construct followup questions based on your research goals.

Of course, the whole reason you are doing interviews is because you don’t already know or feel completely confident about what people will say. Yet, anticipating answers to the best of your ability can help you better prepare for the interview.

Think about what you would do if you hit a dead end — in other words, if the user did not have a response for your question. Are there ways in which you can help the user to find an answer? For example, imagine you are working on a new travel website, and that a participant was recruited because she has booked travel online within the last 6 months. Let’s pretend that some of the research goals of the interview are:

  1. Do people remember how they chose vacation destinations?
  2. What’s memorable about vacations?
  3. What do users feel is easy about booking travel now?

To begin, ask users if they can recall a time when they booked travel. Prepare additional questions in case they can’t remember a relevant event right away. See the image below for a possible flow addressing that situation.

On the left side, the interviewer asks a few questions and gets an answer from the user. On the right side, the interviewer asks several questions before getting the answer.
Examples of how two different people might respond to the same question, and the followup questions (in grey boxes) that the interviewer may ask to get to the same place.

Write dialog-provoking interview questions.

  • In each question, ask for just one thing. Instead of “Do you use a navigation system, and if so, which one?” try “How often do you use a navigation system?” then follow up with “Which one or ones do you use?”
  • Jog the memory by asking about specific events rather than about general processes. Remembering an incident will nudge the user’s memory and enable them to talk about precise occurrences.

For example, imagine the interviewer is a doctor who wants to know the last time a patient had an asthma attack. She reviewed the patient’s history and anticipated some questions. An interview might go like the one in the image below.

A doctor asks about asthma attacks and the patient says she had none. But when the doctor probes about travel and exercise, it reminds the patient that she did, indeed, have an asthma attack.
Example questions (in grey boxes) that a doctor might ask to learn about how a patient’s asthma attacks were triggered.
  • After you ask about an event (e.g., an asthma attack), wait a few moments to give the user an opportunity to think about that event. Then begin asking questions about the event, such as, “When did that happen?” or “What were you doing before that happened?”

Avoid leading, closed, or vague questions.

Ideally, your questions should elicit rich, unbiased answers from the interviewee.

  • Leading questions prime the user by inadvertently suggesting a response. For example, a question like “Why do you enjoy using the Acme product so much?” suggests that the user uses the product and enjoys using it. A better question might be “Why do you use the Acme product?”
  • Closed questions elicit “yes” or “no” answers. For example, if an interviewer asks, “So, you use the Acme product each morning?” then the participant could sincerely respond with just a “yes” and not elaborate. A better question might be “Can you tell me about how you use Acme?”

A caveat: while closed questions are less likely to elicit wordy answers, they are easier for users than open-ended questions. Sometimes, you can precede an open-ended question with a closed one to ease the user into a topic or protect users from feeling stupid when they don’t remember an event.

For example:

  • “Do you remember when that happened?”
  • “Yes.”
  • “When was it?”

(This type of question sequence is okay during a user interview, but it is less appropriate in a usability test, where we want to limit interaction with the user as much as possible.)

  • Vague, ambiguous questions are difficult to understand and often confuse participants. They can also make people feel uncomfortable or guilty for not understanding what you mean. To figure out whether a question is too vague, consider informally testing it with random people to see if they understand it.

Prepare more questions than you believe you will have time to ask.

Some participants like to talk and give very long answers to questions. Others need prompting in the form of followup questions to deliver the same amount of information. Be ready to address both situations.

Practice your go-to followup questions.

Have at the ready some clear phrases to prompt users to elaborate on an answer. I have used:

  • “Can you tell me more about that?”
  • “I want to make sure I understand this. Can you explain more?”

These questions can be used in virtually any situation.

Locations for Interviews

User interviews can be conducted in many different locations — at the user’s site, in a controlled environment like a lab, or remotely, using online-meeting tools.

Consider these factors when choosing locations:

  • User convenience and comfort: Which location will be most comfortable and easiest for the users? Are they less likely to cancel if the session is at their office or home?
  • Team convenience: Do you want your team to observe the interviews?
  • Context and examples: Is it important that users have their own tools and other environmental elements at the interview? Artifacts can nudge the interviewee’s memory and can also paint a better picture of the users’ processes for the interviewer. However, sometimes getting people out of their usual environments can help them think freely and creatively.
  • Bias: Is the location likely to sway the users’ stories? If you brought people to your Acme office and asked about Acme usage, will they say more nice things about Acme than if they were in a different location? (Spoiler alert: the answer is yes.)

Interviews vs. Usability Tests

Panel 1, interview: the interviewer and user face one another; the interviewer asks open-ended questions and may have a design to refer to. Panel 2, usability test: the researcher observes while the user thinks aloud; the focus is on the design.

Some researchers confuse the user-interview method with the usability-testing method. While the methods do have some commonalities and a user-testing session may include an interview at the end, the differences are many and important. Some of these differences are summarized in the table below.

Differences Between User Interviews and Usability Tests

 

  • A design (early sketch, prototype, or working software) is necessary for the study.
    • Interview: No. It’s possible to ask questions in the absence of any design.
    • Usability test: Yes. In a usability test, users interact with the design.

  • User data is behavioral.
    • Interview: No. Users report their beliefs and perceptions in an interview.
    • Usability test: Yes. Researchers observe what the users do.

  • (Some) data is self-reported.
    • Interview: Yes.
    • Usability test: Yes. In a usability test, researchers base their findings not only on what people do, but also on what people say.

  • The participant must talk a lot for the research to be effective.
    • Interview: Yes. Interviews rely on the user giving opinions, recalling events, and discussing them.
    • Usability test: No. A usability test can be informative even if the user doesn’t talk much.

  • Facilitators/interviewers maintain normal eye contact with the user, as they would in any conversation.
    • Interview: Yes. The interviewer often faces the user or sits by her, and looks at her as though they are having a conversation.
    • Usability test: No. Usability-test facilitators avoid the user’s direct line of vision and sit next to and a bit behind the user: ideally, users suspend disbelief and act as if they were on their own.

  • The facilitator creates a somewhat strong rapport with the participant.
    • Interview: Yes. Interviewers usually need to bond at least slightly with the user to elicit information.
    • Usability test: No. Usability-test facilitators should be warm, polite, straightforward, and trustworthy during the session setup, but, during the session itself, they should fade into the background as much as possible.

What You’ll Learn

Before you do a user interview, consider what it is that you want to learn, then choose your research method. To help decide between an interview and a usability test, refer to the table below.

Types of Things Learned in Interviews vs. Usability Tests

 

  • Whether a design is easy to use. Interview: No. Usability test: Yes.
  • What makes a design easy or difficult. Interview: No. Usability test: Yes.
  • Whether people believe they would use a design. Interview: Yes. Usability test: Yes.
  • Whether people would use a design. Interview: Maybe. Usability test: Maybe.

Note that neither user interviews nor usability tests are guaranteed to tell you whether people will actually use a design. Asking users “Would you use this?” prompts them to rationalize their answer and potentially ignore aspects of reality that are likely to affect their behavior but may go against their response. And a usability test encourages participants to engage with a design more than they might normally do (as they complete different tasks); in doing so, they may discover features or qualities that can ultimately affect their willingness to use the design.

A word of advice: don’t choose to do an interview just because you don’t know how to do a usability test or because you can’t stay silent while a participant uses a design. Almost anyone can learn how to do a usability test.

Limitations of Interviews

Unlike behavioral data, which captures how participants interact with a design, data from interviews is self-reported — it reflects users’ perceptions and feelings about a process, a site, or an interaction. Like any self-reported data (including that from focus groups and surveys), interview data is tenuous because:

  1. Human memory is flawed, so people don’t recall events fully or accurately.
  2. Participants don’t know exactly what is relevant for the interviewer, so they sometimes leave out details. They usually don’t think minor interactions are important enough to bring up.
  3. Some people are proud or private, others are shy and easy to embarrass. Thus, not everybody will share every detail with a stranger.

Conclusion

Interviews are a quick and easy way to get a sense of how users feel and think, and what they perceive to be true. Do them, but complement them with observation-based research to attain an accurate and thorough sense of what users really do, and greater confidence in the information you collect.


Reference

Flanagan, John C. (1954). The Critical Incident Technique. Psychological Bulletin. 51(4), 327-358. http://dx.doi.org/10.1037/h0061470