Everyone says you should combine “what” and “why,” but it’s not often done. That needs to change.
Human insights tests and surveys together can answer questions that neither could alone, and also make your findings more persuasive. So it’s surprising that companies often fail to combine the two methods. We’ll show you how to mix them to make your work more productive and compelling.
If you go to a market research conference, you may notice a subtle division among the attendees: the “qualies” versus the “quanties.” You can stereotype the quanties, who specialize in statistically rigorous quantitative research, as precise and intense people, perhaps a little nerdy, never far from the calculator app on their phones. For the qualies, who focus on non-statistical qualitative research, picture the host and guests of a television talk show, drinks in hand, trading stories. The two groups eye each other uneasily during the breaks, and when you attend a session you check whether it’s qual- or quant-focused to make sure you’re attending the right church.
Those are just stereotypes, of course. Each individual is unique, and many researchers recognize that there’s value in both methods. But we think it’s fair to say that qual and quant represent two different schools of thought that often mix awkwardly, if at all.
Combining the two methods can feel as difficult as mixing oil and water. But in your kitchen that’s what makes a good salad dressing, and in business mixing quant and qual methods can make for great business decisions. If you do it right, you get the precision of quant research with the emotional understanding of qual, giving you a much deeper understanding of markets and customers. How to bring them together? We’ve talked with companies that do it, and tried it ourselves, and we found two recipes that work reliably.
The most common mixed-methods pairing we see is a quantitative survey combined with human insights tests. There are two specific ways they’re used together:
A good quantitative survey is expensive. Depending on the size of the sample and the type of people you need, you can easily spend many tens of thousands of dollars just purchasing responses. Add in the time needed to analyze and communicate the results, and the cost of a do-over is immense. You need to be sure that the survey is right the first time.
It’s surprisingly difficult to design an effective survey on your own. Companies often develop their own jargon that differs from the language of the outside world, and it can creep into survey questions, creating confusion. It’s also disturbingly easy for survey participants to misunderstand a question or be offended by it, and either can skew the results.
An easy way to avoid these problems is to run a usability test on the survey itself. The steps are simple: set up a human insights test, make the task a link to your draft survey, and ask participants to think out loud as they answer each question.
Your human insights system will send you back a video of the participant taking your survey while explaining what they’re thinking. A single participant is usually enough to identify problems, but you can use two or three if you want to be especially careful.
This is a great approach to use when you’re trying to get a deep understanding of a market change or a rapidly evolving issue. When things are changing quickly, you usually need to understand two things:
A survey can tell you how many people feel a particular way, while human insights tests can tell you why they feel that way and what they’re thinking. Only both methods together can give you a full picture of the trend.
At UserTesting, we used this approach during the Covid-19 pandemic to understand how consumer attitudes were changing. We started with a quant survey, and then used human insights tests to probe for the ideas and attitudes that drove the survey results. By doing the survey first, we were able to focus the human insights tests on the survey results that were most surprising.
Here’s what we did:
The results were remarkably insightful. To give you two examples:
When we presented our findings, mixing charts from the survey with video of people explaining their responses was extremely persuasive. We were able to show motivations and emotions along with exactly how many people felt a particular way. You can see some of the study results here:
A note on open-ended responses in surveys. To get insight into motivations and attitudes within surveys, it’s common practice to use open-ended text questions. Those are better than nothing, but when we’ve compared those responses to the responses in human insights tests, the contrast is striking. People taking an online survey are usually trying to move fast, so you’ll typically get only a few words in the survey’s text field. This is especially true for people taking a survey via smartphone.

There’s also often a difference in the tone of replies. People often seem to use open-ended survey answers as a way to vent, so we see a lot of angry or even abusive answers. In a human insights test, where participants know they’ll be evaluated on the completeness of their answers, they often go out of their way to explain themselves. You’ll still see emotional responses, but they don’t usually have the casual, dismissive tone we see in a lot of surveys. You can decide for yourself whether that’s a bias or a source of strength, but it’s definitely different.
Boutique research firm Harvest Insights wrote a good article on the quant-versus-qual culture clash: https://www.harvest-insights.com/blog/market-research/blurred-lines-the-qualquant-edition
Dovetail describes several mixed-methods techniques: https://dovetailapp.com/blog/mixed-methods-research/
Here’s a test plan we used for our Covid-related tests. (Note that because we planned to show the results to our customers outside of UserTesting, we needed to ask permission to show participants’ faces.)
Screener
In this test, we’ll be asking you to turn on your phone’s camera to show your face. Are you willing to do this?
Yes [Accept]
No [Reject]
Video clips from this test may be used by UserTesting in an online report. Your name will not be used. Do you consent to this use of the video?
Yes [Accept]
No [Reject]
Introduction
This is not a usability test. We’re doing research on the Covid-19 pandemic and want to learn more about your reactions to it. We’re going to show you a survey about the pandemic. There are about 10 questions, and you should spend 1-2 minutes answering each one.
Tasks
Click on the link below to go to the survey. Please follow the instructions there, and remember to THINK OUT LOUD.
<insert survey link here>
About The Center for Human Insight
We created this resource to help you use human insight for business decision-making.