When was the last time you received unexpected feedback? It could be during a brainstorming session, a performance review, or the latest usability study you conducted.
Unexpected product feedback may not always be welcome. You might not agree with someone’s perspective, or you may be taken aback by a customer being apathetic to the new feature you’ve been tirelessly championing to your team. Hearing surprising insights about your product can initially be soul-crushing, especially when you sit close to your work and have invested a lot of time into its development.
While it might not always be comforting to hear, surprising feedback is inevitable, and it can lead to lightbulb moments or discoveries you otherwise wouldn’t have reached. Being conscious of both the intended and unintended consequences of what you’re building is critical. As a researcher or designer, you often don’t have the final say about what to build, but you are the voice of your users.
After all, you don’t know what you don’t know—and assumptions can be costly. But the more you hear surprising insights from your customer feedback system, the easier it’ll get. Here’s what you need to know about actively seeking unexpected feedback instead of shuddering at the thought.
While you might be running a test on one specific feature, your users might pay attention to something entirely different about the product—and have a lot to say about it. But wait—before you hit fast forward, take the time to hear them out.
Kate Thacker, principal product designer at USA TODAY, says, “When I first started using the UserTesting platform, my instinct was to dismiss any feedback that wasn’t directly related to the project. For example, we often receive negative user testing feedback about ads, and I wouldn’t even note that input as it didn’t seem pertinent to the task at hand. But, if someone is distracted by ads, they’re less likely to notice the shiny new feature you’ve added to the page. I’ve learned to note anything a user focuses on during the task so I can reference it later if it begins to feel like a theme.”
During any study, anticipate unexpected feedback and plan accordingly. Your latest study might surface subtle feedback, or it might yield insights that send a project in a completely different direction.
Thacker gives an example of a previous project where the Data Insights team realized that many news subscribers didn’t fully understand their subscription benefits before canceling. A product manager suggested creating a subscriber homepage within account management to boost awareness of those benefits. To put their assumptions to the test, the team conducted a study to assess how subscribers engage with their benefits. They learned that most people only turn to account management when they’re experiencing an issue.
Within a month, the team realized that a new page within account management wasn’t enough to increase retention, which was still a worthy study and conclusion! It teaches the valuable lesson that the best solution isn’t always found on the first try (or even the second or third). Trying and testing the new account management page gave the team a valuable data point and ultimately pointed them in a new direction: helping subscribers engage through small, contextual reminders.
Not all feedback is equal, and just because your customer said it doesn’t mean it’s always worth writing down. Therefore, you’ll have to learn to differentiate between one-off comments and broader insights. More often than not, your project can’t be finished with one study alone. You'll see themes arise by documenting and tracking feedback across multiple tests.
Thacker shares her process for documenting feedback: using Airtable, she puts every test into a “sources” tab and adds micro and macro insights to separate columns.
However, don’t be too quick to dismiss feedback, no matter how minor it may seem. You might write off a piece of feedback as insignificant, only to hear it again and again from other users. A divergent insight can be the tip of the iceberg, pointing to a contradiction of previous research or conclusions, or revealing a problem that hasn’t yet been addressed.
Thacker advises, "If you think this is a new or divergent insight, change one variable at a time and re-test to see how feedback changes (or doesn't). What happens if you shorten the copy, change the color, or reword your test question? You'll need to do a little legwork to see if a minor change leads to a solution before you blow up an entire project with unexpected feedback."
Ever heard the saying that no answer is an answer? Or that no action is an action? The same concept applies to user feedback. You might handle this scenario by holding back on specific instructions and seeing how users respond. For example, suppose you’re running a test on how users perceive a particular email, but your organization owns multiple newsletters. In that case, you might include several emails in a lineup and ask users to pick the one they’re most interested in. Your users won’t know which email you’re prioritizing, which reduces bias in your results. To top off the study, you can reach out to each person individually and ask them to provide feedback on something specific.
If someone is unlikely to take an action in a paid research study, they're probably unlikely to do it in real life, too. Reading between the lines isn't always easy or comfortable, but it’s critical to pay attention to both what's said and what isn't.
As often as our customers receive unexpected feedback, the teams at UserTesting do too. Corey Hatcher, Principal People Strategist at UserTesting, previously conducted a study to clarify the recruiting process for hiring managers by creating a training guide. The feedback her team received showed that managers valued the transparency of the guide and were willing to contribute to help us improve.
However, a surprising bit of feedback was that some talent team members accidentally left out portions of the process when explaining the hiring journey to managers. This insight allowed us to take a closer look at our talent enablement materials to ensure that our employees were being given the tools and guidance they needed to better equip the organization for success.
Most organizations using UserTesting’s human insight platform uncover some unexpected feedback, almost always for the better. Pet insurance organization Everypaw prepped for a TV ad campaign to increase its market share. To make the campaign as effective as possible, the organization turned to UserTesting for feedback on its 30-second ad. Unsurprisingly, the team received transformative feedback. Here’s what they learned.
Early on, the team anticipated that the ad’s use of CGI would be distracting, but test participants were pleasantly entertained. Additionally, the offer of a free dog activity tracker wasn’t as exciting to users as expected. And finally, the ad ended with a shot of a rotating phone, which Everypaw worried would draw attention away from one of the calls to action. Once again, the team was surprised: users liked the phone at the end and still visited the website as intended.
This success story reiterates that you’re not your user, and you’ll only know what your customers are thinking by asking. And while unexpected feedback is often stereotyped as negative, Everypaw’s experience shows that you can expect customers not to enjoy a feature, only to find that they love it.
Gathering real human insight enables your team not just to capture quantitative feedback but also to close the empathy gap (which may be preventing users from becoming customers). The best study results aren’t always anticipated, and the only thing you can predict is unpredictability.
While receiving feedback might always come with a hint of worry that it’ll be negative or lead to rework, the most effective team members are the ones who embrace and appreciate it. Ideally, this is the type of culture you have within your team and your organization.
By understanding the full value of user feedback and being open to changing your mind, you can turn assumptions into answers. So next time, instead of being deterred by unexpected feedback, recognize its potential to drive positive outcomes and welcome it with open arms.