7 DEADLY SINS OF HUMAN CENTERED DESIGN RESEARCH
At Peer Insight we think the end user should settle boardroom debates, and that means we spend a lot of time out in the field doing qualitative research. Over countless conversations with end users, we've honed a list of qualitative research no-no's for whoever's conducting the conversation. Here are a few things you might be doing that jeopardize your qualitative research:
1. You’re doing most of the talking
Hopefully you've recorded some audio of your conversation. When you go back and listen, pay attention to who's hogging the airtime. It should be your interviewee, not you! Remember, the whole point is to get out of the boardroom and into the market to hear from the end user. To actually hear from them, you have to give them the mic. One way to avoid this trap is to bring activities and prototypes, such as storyboards, for your end user to respond to. That way you can let the visuals and the end user do the talking. Aim for 90% interviewee, 10% you, and you'll hopefully land around 80:20.
2. You’re asking leading questions
Start very broad. If you asked a narrow question and your respondent answered "yes," it was probably a leading question. If your question starts with "Don't you…," it's a leading question. To avoid leading questions, instead ask open-ended questions that get at the same subject matter. For example, rather than asking, "Would you rather have this on your mobile phone?" ask, "How would you ideally access this information?" Sometimes it is necessary to ask a very broad "yes" or "no" question before an open-ended one so that you aren't assuming too much. For example, before asking how they would like to access the information, ask, "Do you ever want or need to access information on the go? Tell me more." By adding "why" or "how so" to the end of a broad "yes" or "no" question, you turn it into an open-ended question. That way you aren't assuming they want or need to access their information by jumping straight to how they would like to do it.
Also be careful of the multiple-choice trap. For example, you may be tempted to ask, "Would you rather get this information on your [a] desktop or [b] mobile device?" when the end user might really want the info on the [c] display screen of her car! You want your end user to surprise you; that's called an insight! Be sure to give them room to do so.
Also remember that your interviewee is a nice person whom you have likely incentivized. That creates a sense of reciprocity, urging them to tell you what you want to hear. Make sure they can't ascertain what the 'right answer' is in the first place.
3. You’re selling or teaching
Oftentimes, after we've ideated and come up with a potential solution to our end users' pain points, we go back out to the field to share 2D and 3D prototypes of that solution with them. It's all about co-creation and learning "What Wows" the user. It's only human for the interviewer to want the participant to say "Wow!" so when presenting the concept we can't help but 'sell' it at least a little bit. Don't do it! If you just can't help yourself, bring in stimuli that explain the concept for you, so you can't even be tempted. For Peer Insight, this is often a quick two-minute video or, if we need to be faster or scrappier, a storyboard.
Also, check to see if you feel married to your idea. Remember, it's not your idea. You can't own concepts, and it's not a horse race. The potential concepts should be evolving, merging, and morphing. So what's the point in selling something when you want the user to change it?
It is often very difficult not to teach or correct interviewees when they express misconceptions about how something currently works or whether it currently exists. But this is the juicy stuff; it's the reason you are there talking to them. The customer's perception is reality! Whether the customer is right or wrong about facts or details, their perception is what we have to take as fact. Our job is not to correct or teach them. Our job is to design for these perceptions.
4. You’re interviewing your own employees
We've all been there. Your target end user can be found within your own organization, and they're right there ready to be interviewed. You only have to walk down the hall! Here's the rub, though: as colleagues, they have a deep-rooted social obligation to tell you what you want to hear, whether on a conscious or subconscious level. They have a cultural, maybe even contractual, obligation to support your company's endeavors that prevents them from offering valid data. Also, chances are they consume whatever your company offers in a different way because of their insider knowledge, employee discount, etc.
5. You’ve turned off your beginner’s mind
Suspend your expertise during these interviews. At the start of an interview we sometimes say, "Pretend I am from another planet and I don't know anything about how things are done here or why. I will be asking a ton of what may seem like very obvious questions, but just bear with me. I want to make sure I don't misinterpret any of your answers." As beginners, we have permission to ask the simple, basic questions. So use that hall pass to ask lots of questions like "What does 'efficiency' mean to you?" instead of assuming you are both on the same page. We find our definitions are often not aligned. This is another reason to do interviews instead of surveys: with a survey, you have no idea how respondents are interpreting the question or their own answers.
6. Your confirmation bias is showing
My mom used to say I had selective hearing because my ears would perk up at dinnertime but magically go deaf when it was time for dishes. The same phenomenon happens during interviews. We hear and write down all the quotes that support our hypothesis, and we conveniently miss any disconfirming data. To stem this well-documented psychological phenomenon, bring another person or two to keep you honest. Also, explicitly ask whoever's taking notes on the conversation to listen for and document disconfirming data, and then make it a big chunk of your debrief template after the interview. Words to live by: "build as if you're right and test as if you're wrong."
7. You’re chasing “Say” data when you could be grabbing “Do” data
We all know that a customer saying they'd pay for something doesn't mean they'll actually swipe their card when it comes time to check out. Organizations often have trouble imagining tests that fall between a survey or interview and launching an initial commercial version, but there are all sorts of ways to devise experiments in the interim that capture "do" data, rather than surveys that capture "say" data.
Recently, we launched an Alpha I, or early pilot, for a client that wanted to obtain "do" data around willingness to pay. It took some time to stand up that small pilot, but in the meantime we used our recruiting process to obtain great "do" data. We set up real AdWords and other social media ads describing the service that led to a real landing page where people could sign up to be one of our paying participants. Clicking our ad over another in a real-life context was a behavior that increased our confidence in their need for the service. Likewise, the time it took them to fill out the online form necessary to sign up for the pilot had a real cost associated with it (we all hate filling out those online forms, right?). This kind of data makes a room of investors more likely to move a project forward, getting your pilot funded in the first place.
Lastly, remember that your research should be informing the design and build of something. Don’t research for the sake of a beautiful PowerPoint deck that will collect dust on a desk. Make sure that research leads to creation!
- Brandon Chinn
Special Thanks to my colleague Julia Sorzano for all her builds, adds and edits!