User interviews are a core user experience method. They are a great way to gain foundational knowledge about the problems your users are facing. And a lot of people mess it up.
User interviewing isn’t just chatting with people. It’s not just asking them what they like and dislike. User interviewing is about trying to get to the core of what a user is trying to do and what their problems are.
User interviewing needs to be approached with proper rigor and respect. A lot of people think a user interview is just an informal chat with a person to run an idea or prototype past them. It is not, and treating interviews that way produces garbage data.
User interviews need to be rigorous. The questions need to be well thought out and purposeful, otherwise you won’t get data worth acting upon. The only thing worse than no data is bad data.
User interviews are also a foundational user-centered design method that is best used early in the process, before you have an idea and prototypes. Usability testing and other methods are better for getting feedback from users on prototypes.
User interviewing is not about asking people what they want you to build or do. It’s not even about asking them what they want. User interviewing is about learning what a user is trying to do and understanding what is and isn’t working for them.
User interviewing is a good method for figuring out users’ problems. User-centered design and design thinking are all about making sure we identify the correct problems and then figuring out how to solve them for people.
User interviewing is a core user-centered design method. You’ll notice that working with users is my first guideline for thoughtful product design.
Here are my core tips for user interviewing:
Don’t ask leading or directed questions
Leading questions can take a few different forms — all bad. The most obvious form is a leading question designed to elicit a specific response.
If you are asking those kinds of leading questions, you aren’t ready for user interviewing, or you are not open to honest feedback. But many other leading questions happen because people get nervous asking stark, non-leading, open-ended questions.
Instead of just asking the question, people will ask the question and then suggest a possible answer or two. A common, benign example would be, “Got exciting plans for the weekend? Maybe watching some football?”
In this example, you have primed the person to think about football. They will spend a noticeable amount of time thinking about whether or not they are actually going to watch football, and if so, they’ll tell you about it.
By suggesting answers, you prejudice the response. You want answers to be as expansive as possible, so don’t provide users with possible answers.
People do this all the time. It’s almost a nervous tic, as if they can’t just ask a stark question. Embrace asking stark questions. This is a user interview, not a chat with a friend.
Don’t ask people what they want
Users don’t know what they want. They’re not professional designers.
Focus on trying to find out users’ problems and what they are trying to do.
Your question should not be, “What would you like us to do?” Rather, it should be, “What are you trying to do?”
Think of this possibly apocryphal example from Henry Ford:
“If I had asked people what they wanted, they would have said faster horses.”
Proper user interviewing would have discovered that people may have wanted faster forms of transportation or more reliable ones or ones that don’t require so much feeding and upkeep.
Users aren’t designers. They can’t conceive of the unbuilt. Our job as designers is to find out what people’s problems are, and then build the future.
Ask open-ended questions
Open-ended questions, especially when you are doing foundational user research, are key to finding out what people are trying to do and what their problems are.
Avoid yes/no questions, which I detail more below. You want to ask open-ended questions because they yield expansive responses.
Also avoid vague words. Vague words are open to interpretation, which might lead you to be misled by someone’s response. For example, asking a user “is this a useful feature?” could give you bad data.
What makes for a useful feature? Many people will answer that question through the lens of, “I’m sure someone will find this feature useful, therefore it is a useful feature.” But that’s not what you were trying to find out. You want to know whether this particular user finds the feature useful.
Instead, ask something like, “Is this feature valuable to the work you do right now?”
This is a very similar question, but it has two key distinctions: it explicitly asks whether the feature is valuable to the person’s own work, and it asks whether it is valuable right now. “Valuable” is also more specific than “useful.”
A lot of things have some degree of utility for people, but being valuable goes beyond mere minor utility.
Don’t ask yes/no questions
Occasionally, you may need to ask a yes/no follow-up question for clarification, but in general, user interviewing is specifically about getting expansive responses. Asking yes/no questions will rarely yield good, detailed, expansive ones.
You may not even be intending to ask a yes/no question, but if you give people the opportunity to respond with one word, they often will.
For example, you may ask, “Do you send text messages?” thinking that the person will respond that they do and then detail what kinds of text messages they send and to whom. But it’s actually very likely they’ll just respond with “yes.”
Instead, if what you are trying to figure out is what kinds of messages they send and to whom, you need to be direct. “What kinds of text messages do you send?”
If the person sends text messages, they’ll detail what they send and why. If the answer is that they do not send text messages, they’ll tell you that, and there is a good chance that they’ll explain why they don’t send text messages.
Don’t make assumptions. Ask the stupid questions.
You can miss a lot of key information if you don’t ask basic questions. This often happens because you assume you already know the answer.
Also, basic questions are a great way to ease a person into an interview. I tend to start every user interview with basic questions that a user doesn’t have to think much about. This will get them talking and comfortable. And, hey, maybe I’ll learn something I didn’t know before.
For instance, we start user interviews with a question like, “What does your organization do and how does your role fit in there?”
I largely know the answers to these two questions, but I’ll often learn additional details and nuance that I didn’t already know. This may come in handy for follow-up questions or when I’m putting together the user scenario.
Have a set of questions you use every time
For your user interviews to be methodologically valid, you need to use the same base set of questions every time. Do not wing it and just ask random stuff when you interview people.
Coming up with a good set of user interview questions can take time. It sometimes takes me hours, and I have a ton of old user interview scripts to look to for inspiration. I may also test a script out on a few users and refine it before taking it out to a large set of users.
Our user interview scripts range from about 15 questions to upwards of 40. This depends on how complex the product is and whether we are doing an interview about the entire product or just part of it.
Previously I mentioned that user interviews are more valuable earlier in the user-centered design process. I find that when people don’t have a good set of questions and are just asking generic stuff, it’s usually because they are interviewing too late in the product development process, trying to get people to say that a prototype is interesting.
Ideally, you want to test a prototype with an actual task list to verify that users can do what they need to do with your product. You will follow up that usability study with an interview and survey, but user interviewing cannot take the place of testing.
Ask the same question from multiple angles
A lot of people need to be deep in a user interview before they can really get to the core of what they think. I have found that asking the same general question from multiple angles can yield a lot more useful data than just asking it once.
The key to doing this is to spread these questions throughout your user interview script. Putting them back to back to back will not go well. Rather, you want to ask the question once, ask a bunch more questions to get the person thinking deeper, and then ask a similar question from another angle to see if you can get deeper information. You may repeat this one or two more times.
I only do this with a question or two that gets at the heart of what we are trying to discover with this user research.
Never mention other users
This is another way to prejudice answers. Never tell a user how other users find your product.
Here is an example: “A lot of people say it’s easy to use this search engine. Do you agree?”
You’ve just told them that a lot of people find this easy to use. Basically, you are asking, “are you smart like these other people or are you dumb and unable to figure this out?” Of course a person is going to respond that they agree!
Instead, ask a simple, stark question like this: “Describe your experience with this search engine.”
Notice how this question is also open-ended and can go in a lot of different directions. The version that mentions other users is a closed question that is basically a yes/no question.
Ask follow-up questions
While you should ask the same set of base questions with every user, you should also ask follow-up questions when they can clarify what a user has said or give you additional insight.
If it makes a lot of sense to immediately ask a follow-up question, I will. Other times, a user will say something that makes me think of another question I want to ask, but I don’t want to interrupt the user interview script.
I’ll write down what I want to ask and then ask it later on when it makes more sense.
Make note of the questions that don’t work
Some questions are duds. They may not elicit much information. Other questions just confuse people. Occasionally, you may have a question that causes people to react negatively.
Even if you commit to not swapping out a user interview question during a round of user interviewing, you should make note of which of your questions don’t work so that you can make sure not to use them in future user research.
Find a way to slow down
One of the reasons I insist on always taking notes or filling out empathy maps while I interview people is to force myself to slow down. Even if I have a dedicated note taker, I also take my own notes. This is critical.
The reason is that if an interviewer has nothing else to do besides ask questions, she will move too quickly and often not let a person fully think through an answer.
Embrace awkward silences. Do not fill them, and never fill them with small talk.
Most of the time, the person being interviewed will fill the awkward silence with more info. I often find the best information comes out during these silences.
Find something to do while you interview people that forces you to slow down and process information.
Don’t be afraid of the truth
When you look at the questions a lot of people ask of their users, it’s clear they are afraid of the truth. User interview questions should cause users to think and reflect and that may mean they have some rather negative things to say about your product.
If you find that your user interviewing is not yielding a lot of good data, it may be because you are afraid of the truth, and you are asking a lot of softball questions.
You need to stay neutral and be open to hearing hard truths. If you can’t, user research isn’t for you.
If you are doing user research as a consultant on someone else’s product, it’s pretty easy to be neutral, but many people struggle to be neutral when it is their own company or product.
This is one reason a lot of product managers don’t make for good interviewers. They are too invested in their own work and will often look to conduct research to validate what they have already done — instead of doing user research to find out the truth.
Part of being neutral is not defending your product or company when someone says something during an interview. Your job during a user interview is to be a psychologist intently listening to someone’s problems and not a PR person looking to spin everything.
Ask a user to show you
If a user mentions something interesting that you could actually see, have the user show it to you. I was recently doing some user interviewing about the reporting capabilities in one of our products, and a user mentioned that they take the data from our product and have a designer create a custom report, instead of using the built-in reports.
Seeking to better understand why this is, I asked the user to send us the report they have their designer create and also the report with the same data as it comes out of our system.
The data in both versions of the report was the same. But the user wanted a very stripped down, layperson friendly version that we didn’t offer. Seeing both versions side by side as we talked through it was invaluable.
This is also why interviewing people in person is superior to interviewing them remotely. When you go in person, a user interview can begin to morph into a kind of light contextual inquiry.
One at a time if possible
Focus groups are fairly worthless. Try to prevent your user interviews from devolving into focus groups.
Ideally, you are interviewing one person at a time. Sometimes a company will want you to talk to a few people at once, and you may not be able to talk them out of this. Group dynamics will take over, and you won’t get an accurate picture of everyone’s true feelings, but you should still get an accurate picture of at least one person’s.
Interviewing 2-3 people at a time from one company isn’t that bad. It will, however, make transcription incredibly difficult. You will also need more note takers and more people to help out with empathy maps and user scenarios.
One key tip is to encourage everyone in the room to answer the question. Don’t let one person speak for the whole room.
Record the interview if possible
Not everyone will agree to be recorded, but you should always ask and try. Audio recordings at a minimum. Video is even better.
Video is particularly valuable if there is a chance that this interview turns into light contextual inquiry where a person starts showing you how they use your product (or similar products).
Recording interviews means that people who didn’t attend the interview can see and hear for themselves.
Take detailed notes
Recording or not, I always take very detailed notes, filled with direct quotes. I don’t reference the recordings regularly. I record interviews mostly for other people to listen or watch. Recordings are also needed if you want to transcribe the interview.
Good, detailed notes complete with direct quotes and tagging of key themes will make synthesizing the data later on much quicker.
We also make empathy maps and user scenarios of every user interview we conduct. We try to do these live during the interview, if possible, but having detailed notes to immediately refer to after the interview makes this process much easier and faster.
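To illustrate why tagging key themes in your notes speeds up synthesis, here is a minimal, hypothetical sketch: the note format and theme names are my own invention, not a prescribed tool. It simply counts theme tags across interview notes so recurring problems surface first.

```python
from collections import Counter

# Hypothetical tagged notes from a few interviews. Each direct quote
# carries the theme tags the note taker assigned during the session.
notes = [
    {"quote": "I export the data and have a designer rebuild the report.",
     "tags": ["reporting", "workaround"]},
    {"quote": "The built-in report is too dense for my executives.",
     "tags": ["reporting", "audience"]},
    {"quote": "Search is fine once you know the syntax.",
     "tags": ["search", "learnability"]},
]

# Count how often each theme appears across all notes.
theme_counts = Counter(tag for note in notes for tag in note["tags"])

# Most frequent themes first: a rough starting point for synthesis.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

Even this crude tally makes it obvious which themes to dig into first; the real synthesis work is reading the quotes behind each tag.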
Synthesize findings and make recommendations
The synthesis of your findings and the recommendations you provide are more valuable than the raw materials you have created. Most people are not interested in reading your notes or listening to a recording.
What people are ultimately paying you for is a strong synthesis of all of the data you collected, paired with actionable recommendations. This is the first step in converting user research into change.
It used to be that people made 100-page reports of findings. No one wants to read those. Don’t make one.
What I have found that works the best is a deck of findings that has a lot of bullets and annotated screenshots/photos. I may also provide a written executive summary memo of my findings for people who want to read over something relatively quickly and get the lay of the land.
We also create synthesized empathy maps and user scenarios from all of our rounds of user research. These are another high-level, digestible artifact that non-user research experts can understand and utilize.