Amplifying impact: The Human Insight Summit 2023

Posted on September 14, 2023
10 min read


The Human Insight Summit (THiS) is UserTesting’s annual customer conference, and this year, we visited the Emerald City in the Pacific Northwest—Seattle, Washington—to sip coffee, admire the Space Needle, and chow down on local seafood. 

Each year, the summit brings together the experience research and insights community to help organizations understand their customers by showing and sharing, first-hand, the possibilities of using human insight throughout an organization.

More than 1,100 in-person and virtual attendees joined to learn, share, and be inspired. They arrived at a space designed to provide the psychological safety to share their real-life experiences, challenges, and questions.

Here’s what happened at THiS23.

What were the top takeaways from The Human Insight Summit?

Across interactive workshops, panel discussions, and casual conversations, there was a lot of valuable discourse at THiS. However, two themes continually bubbled to the surface.

Connecting to business value and showing a return on investment

According to Forrester, CX leaders face difficult investment decisions and pressure to prove their value to their organization and make an impact. To support their organization’s goals, leaders plan to drive customer-centric programs and actions, with more than half planning to build capabilities in technologies that help them understand customer insights and experiences with their brand in 2024.

A central theme at this year’s event was how experience research teams prove their value. UserTesting’s Principal Customer Experience Consultant, Lija Hogan, had three suggestions for achieving this:

  1. Attach your work to KPIs and metrics the organization prioritizes.
  2. Treat the handoff between design and development like a critical moment.
  3. Operate at a scale that will move the organization forward.

In an interview, Matt Menz, VP of Customer Experience at Amazon Web Services (AWS), said, “So, the key on both ROI and really making an impact is finding that sweet spot to what information you need and what research methodology is the best to get that decision made.” 

Yet, inserting yourself into the processes of other teams and the organization can sound scary. How do you avoid becoming a bottleneck? 

That brings us to our next theme. 

Amplifying your impact with trusted, verifiable AI

At UserTesting, our vision is to deliver a movement for empathy. As so much of our life goes digital, keeping people at the center of product design and decision-making is foundational. 

However, we can’t ignore the reality of where we are today. For one, we’re all dealing with a tremendous amount of volatility. For another, the world and the pace of technology have changed.

That’s why we have to show up with data, an ROI, and the ability to demonstrate our value. This is where AI comes in. 

Despite AI making headlines, we know people have varying opinions about it and varying levels of trust in it. At the event, we surveyed THiS attendees to better understand what excites them about AI in experience research. Here's how attendees answered the question, “Which AI insight benefit are you most excited about?”

  • Insights can be surfaced automatically (37%)
  • Easy to accelerate and scale my research efforts (36%)
  • Getting answers at my fingertips (27%)

While experience researchers are excited about the possibility of saving hours by not having to do simple, repeatable tasks, there were still serious reservations about AI, such as: 

  • Believing that human touch is still critical in research 
  • Needing the ability to verify AI’s accuracy
  • Wanting assurance that AI won’t take their job

Ranjitha Kumar, UserTesting’s Chief Scientist, interviewed a panel of industry experts about the challenges of using and building with AI.

Layla Zomarot, Senior Product Designer at Shopify, shared how a new AI feature fell flat with customers when they fast-tracked it to launch without going through the normal process. 

David Evans, Sr. Manager of Customer Research at Microsoft, added that it’s essential to be transparent when generating content with AI and to let users know when something is AI-generated. He emphasized that fact-checking and correcting the output of large language models could be a uniquely human contribution. He asked: what if every company and team had a human who took responsibility for its text and imagery, no matter how it was created?

At UserTesting, we’re listening to these concerns as we develop AI. As we build these models, we must think about diversity and inclusion. It's crucial that the AI is fair and equitable and doesn't do harm. We must manage the possibility of hallucinations and bad data points that take people in the wrong direction. And we understand transparency and control are essential. 

What product features were unveiled at THiS?

At UserTesting, we’ve spent the past 15 years building a solution to make human perspectives accessible and to prove the value of listening to customers. We’re building a unified experience research platform that does it all for you, delivering all methods, all participants, and all analyses in an easy-to-use, intuitive solution.

Here are the product features we announced at this year’s THiS. 

UserTesting AI

Empathy starts with listening to and hearing what users say, but finding the most critical moments in many hours of video is incredibly time-consuming. That’s where machine learning comes in. We empathize with users by hearing what they say and understanding what they do; bringing the two together shows how verbal feedback lines up with actual behavior, so you can feel confident in the insights you gain and the decisions you make. With UserTesting AI, we’ve combined existing features into a new offering so you never miss a crucial insight and can make customer-centric decisions at agile speeds.

  • Sentiment Analysis - Take what participants say and detect how they felt, positive or negative, letting you quickly find moments of delight or criticism.
  • Smart Tags - Detect nuances in participant feedback, like when participants express a specific expectation or talk about something like price, which may be positive, negative, or neither.
  • Keyword Map - When you want an overview of task results, Keyword Map overlays sentiment onto the words participants most commonly used to describe an experience.
  • Friction Detection - Highlights where someone had difficulty using an experience, regardless of what they were saying, or even when they were so frustrated that they weren’t saying anything at all, so you can skip ahead to key moments and quickly pinpoint where you need to iterate.
  • Interactive Path Flows - Get a bird’s-eye view of how participants navigated your experience and see which screens everyone visited or where someone took an unusual path.
  • Intent Paths - See what participants were trying to do when they went from one screen to another: not just the screens they visited, but how and why they got there.

To take this a step further, we’re launching AI Insight Summary. With the click of a button, you can generate a summary of your results that identifies key themes, transforming audio feedback and participant behavior into a text summary.

Let’s take a look at how it works.

AI Insight Summary

With the click of a button, AI Insight Summary quickly summarizes key insights to save researchers valuable time and resources. The AI Insight Summary does much more than summarize transcripts—it uncovers hidden insights by looking for patterns and trends across task results. Easily see if your product delivers a positive experience or quickly pinpoint friction points that must be addressed to improve the experience. 

For example, AI Insight Summary can show how many participants completed a purchase, how many expressed frustration during checkout, and if site components such as images and guides helped optimize the experience. The interactive nature of the feature allows customers to drill down and quickly understand the why behind these actions and sentiments. 


We know it’s powerful to have source videos to back up your research reports and product recommendations. These source videos also give you confidence that what the AI summarizes is grounded in facts.

That’s why, when you click “View source” in AI Insight Summary, you’ll immediately jump to the participants’ video to see the human moments behind the insights. You’ll also see an auto-generated summary of each video. From here, you can quickly figure out where to make clips to share with your team or drill down to do further analysis. 

With this feature, you have answers at your fingertips—and insights ready to share with your team in minutes. 

AI Insight Summary is now available in beta for UserTesting Ultimate plan customers.

ConnectTech

Connecting insights is vital to amplifying your impact. New UserTesting integrations help builders, designers, and creators turn human insight into powerful, customer-driven stories that move decisions forward with confidence. These integrations enhance the solutions our customers already use across collaboration, design and development, customer relationship management, and digital optimization and analytics, making it easier to collaborate and share insights.

That’s why we launched the ConnectTech partnership program to streamline how teams share research and learnings and collaborate with their partners and stakeholders across product, design, research, and marketing.

Coming soon: Survey in UserTesting

During the product keynote, we shared a sneak peek at future capabilities, including an upcoming beta for surveys. Adding survey capabilities to the Human Insight Platform will let you quickly launch surveys, without audio or video recording, directly from UserTesting. With this new feature, we’re also expanding available audience capabilities to help fill tests that require larger sample sizes.

New survey capabilities are heading to beta in December.

Moderated testing improvements

Moderated testing makes up about 10% of all tests conducted across the UserTesting and UserZoom platforms, with about 100,000 tests run this year. In the coming weeks, we’re launching new features in UserTesting’s Live Conversation, such as support for multiple moderators, making it easier for team members to collaborate on live sessions.

Moving forward, we’ll also introduce session chat and notes, giving teams new ways to collaborate and capture key insights during Live Conversation sessions. 

Embedded insights hub

It’s critical for teams to build collective knowledge and share insights and stories inside research and design teams and across their organization. This helps break down silos, reduce rework, connect employees with customers, and accelerate a customer-centric culture. 

That’s why, in the future, the EnjoyHQ insights hub will become more embedded inside our experience research platform, eliminating the need for a separate solution. An embedded insights hub will let experience researchers easily organize their research findings, understand themes, and visualize critical learnings. Last month, we released functionality that brings the EnjoyHQ integration with UserTesting and UserZoom closer. Now, information flows more efficiently, resulting in seamless navigation. Access to past tests, findings, and audiences in a single location allows you to create real-time recommendations while driving research quality.

Who earned an illumi award at THiS?

UserTesting's annual illumi awards recognize teams who use human insight to expertly design products and services, improve their marketing, or deliver impressive customer service. Here’s a quick look at a few organizations that took home an award this year.

Burger King earns the Outstanding Enterprise Success Story award

Utilizing insights from UserTesting participants, Burger King improved their app, which receives hundreds of thousands of orders weekly.

Additionally, they adjusted their drive-through experience and upgraded in-store kiosks based on audience feedback. Improvements to the kiosk experience alone led to a 2.5% increase in order size.

American Airlines earns the Customer-driven Evangelist award

American Airlines tested its digital experiences for searching and booking flights with UserTesting. The organization quantified the usability and enjoyment of its website by using QXscore to measure four essential customer experiences:

  • Booking a flight
  • Changing and managing your flight
  • Check-in
  • AAdvantage program

With results in hand, the airline plans to improve online booking functions to make choosing flights better and easier for customers.

From left to right: Mark Behar, Manager of Customer Marketing Programs at UserTesting, Becky Sherman, Director of User Experience at American Airlines, Lucas Lemasters, UX Research Principal at American Airlines, Asal Johnson, User Experience Researcher at American Airlines, Michelle Huff, CMO at UserTesting

Banco Sabadell earns the Groundbreaker award

Banco Sabadell ran more than 500 studies with UserTesting over the past year. Testing with customers helped the bank launch its first online account signup process. 

From left to right: Mark Behar, Manager of Customer Marketing Programs at UserTesting, Silvestre ‘Silver’ Bruna, Design Director at Banco Sabadell, Alejandro ‘Alex’ de Fuenmayor, DesignOps Lead at Banco Sabadell, Michelle Huff, CMO at UserTesting

NRG Energy earns the Creative Development award

Reliant is the flagship brand for NRG. Insights from UserTesting inform and influence Reliant’s marketing campaigns. Participants helped shape the features of Hugo, Reliant’s adorable armadillo mascot, and refined the Agent Hugo ad, which tested better after getting customer feedback.

From left to right: Mark Behar, Manager of Customer Marketing Programs at UserTesting, Pam Roper, Senior Manager of Market Research at NRG, Karen Harvie, Director of Consumer Insights at NRG, Michelle Huff, CMO at UserTesting
