Track: Designing and building insights-driven products

Harmonizing insights: how to strike the right chord with your customers and win as a team

Presenting:

Aaron Walter

Co-host and Co-founder, Design Better Podcast

Elijah Woolery

Co-host and Co-founder, Design Better Podcast

Jason Giles

Vice President - Product Design, UserTesting

In this discussion, we’ll dive into UserTesting’s journey of turning their UX research process into a “jazz band” act—trading rigid structure for flexibility, improvisation, and learning through doing. We’ll explore how this approach empowers teams to balance freedom with quality, manage risk, and scale insights across the company. From training non-researchers to refining company practices with feedback, we’ll share how this agile, music-inspired model drives better outcomes and what's next as we continue to jam.

And now, please welcome your host of the Design Better podcast, Eli Woolery and Aaron Walter.

Jason Giles is tuned into the habits of successful product design teams. Not only because he's been leading them for fifteen-plus years, but also because his team at UserTesting makes essential tools used by top design teams around the world.

And Jason thinks of collaboration, and the process around it on his team, as similar to a jazz band, where improvisation and exploration go hand in hand. And that piqued our curiosity to learn more.

Jason joins us today for a special live episode recorded on stage in Austin, Texas, at the UserTesting Human Insight Summit.

Look at this amazing crowd. We gotta do an "alright, alright," being in Austin here.

 This is an exceptionally good looking crowd here, Eli.

 This is Design Better where we explore creativity at the intersection of design and technology.

 I'm Aaron Walter. I'm Eli Woolery.

And you can learn more about the show and listen to our conversations with guests like David Sedaris, Eileen Fisher, the band OK Go, and Pixar co-founder Ed Catmull at designbetterpodcast.com.

 Jason Giles, welcome to Design Better.

Thank you, guys. So nice to be here. Welcome to our studio here. Yeah. I know.

 Just so y'all know, this is Aaron's actual office. He happens to have a really great record collection featuring a few things here. Yeah.

 I saw the Mingus up there.

 Yeah. Pretty slick.

Well, speaking of jazz, we've been talking with you for a little while about the work that you and your team are doing.

 And this is top of mind for us too because we just interviewed this amazing jazz musician, Kamasi Washington.

So what we wanted to talk with you about is that you take a jazz band approach to your work with your team at UserTesting, and that favors this flexibility of learning through doing. Why do you think this model works so well for you and your team?

Yeah. So, I've done this a few times at a bunch of different companies. And traditionally, I think about this around how do I scale insights across the organization. I'm a big fan of the people who are making the day-to-day decisions actually getting that direct customer feedback. So it's part of what I believe in, but I've tried a lot of different approaches.

 And, effectively, what I see is there's two major approaches. There's what I call the orchestra model.

 So these are really good for maybe larger organizations, a little higher risk aversion.

And that requires training individual designers or PMs to actually be really proficient at user research, and that's great.

 The other end of the spectrum is the jazz band.

That's good for environments that might have a culture of learning by doing.

Maybe have a bigger appetite for a little bit of risk, maybe sometimes make some mistakes. And, for us, we've played with multiple different approaches. I'm very thrilled with where we've landed, but we're continuing to riff. I mean, this is jazz. So we're constantly refining.

So, as a student of jazz and having talked to a lot of jazz musicians, one thing that we've noticed is that jazz bands are very generous, and they create a lot of space for others to collaborate and explore on their own.

Presumably, like, if you're taking this less structured approach where more people can be involved in the research process and the design conversation, that's gonna have cultural implications. How do you see that changing the culture of a team?

Yeah. For sure.

 And sometimes there's outright resistance to it to have that kind of flexibility.

One of the things that we've done, even down to who we hire: it's part of the hiring process. When we're bringing in a designer, we set that expectation that, hey, we're gonna expect you to do this. For the designers and the PMs, part of what's expected as a competency in their career model is being able to get their own customer feedback. So that kinda helps from a structural perspective. You're bringing in new folks with that attitude.

But then there's also the aspect of, you know, some of our research folks who weren't familiar with this. There's a little hesitancy as well. And it's like, hey, let's give it a go. I think the key thing for those folks is to really focus on the why.

 And we have some amazing researchers, big throbbing brains.

 I don't want to apply them to evaluating prototypes or doing usability. We've got big problems. And so often, the researchers are spending a lot of their time on stuff that just doesn't have the highest impact. So when I frame it in that way, that tends to get them on board at least to try it. Mhmm. And then as you start seeing results, then that kinda clears the way.

Maybe you could talk us through a little bit of the risks and the benefits of this improv approach. I imagine you get broader team involvement, but maybe it comes at a cost: perhaps less rigor or certainty. You mentioned larger companies that might be more risk-averse might have trouble with this approach. So tell us how you balanced all that out.

Yeah. So I've been at UserTesting for five years. And when I joined, this idea of enabling non-researchers to get feedback wasn't new. The founders did that as they built the original product.

 And so I came in and was like, okay, this is great.

 But what I quickly learned was as I looked into design decisions that were being made, like, that doesn't, like, look right.

 And I had access to the tests. And as I cracked them open, I was seeing biased questions.

I was seeing faulty analysis. In some cases, it was almost like they were weaponizing the research. Mhmm. So clearly, just handing a bunch of instruments to a room of kids and expecting that you're gonna make something good probably wasn't the right approach.

And in fact, we went to the other extreme then. We went really into a more documented and structured approach, where we gated activities behind certifications.

And that also didn't really give us the results that we wanted, so we kinda pulled that back. And now we've gotten back into this place where there's enough guardrails, and we're able to have oversight and see the results. But, you know, it took some calibration to get us to that right place.

Yeah. So, I founded the UX and research teams at Mailchimp, and I wish we would have had UserTesting back then, because it would have made a lot of what we were trying to do easier.

We used hacks like Evernote to get all of our data into one place so we could search it. What I noticed was that when we brought more people into the research process, when they could ask crazy questions and get an answer, that caused more people to ask questions about the product. Talk to us about how that works at UserTesting and how that influences innovative thinking.

Yeah. I mean, what we find is that the more folks are involved in the process, the more committed they are, either to solving the customer problem or just to having more engagement and including the customer as part of the thinking. And this has happened over the years, but there are these certain moments where I'll be in a meeting, and I'll hear a tester or a QA person or a developer in a discussion say, well, I remember seeing in that customer interview that the customer did this, this, and this. And to me, that's this huge milestone. I'm like, yes, we're driving that kind of customer centricity through the organization, and they're excited and they're proud and they're more motivated to actually deliver the right experience.

So it's kind of critical in developing that culture, because if you're just keeping it within the context of your design or research or product team, you know, there's just so much opportunity, as we talk about customer-focused companies. Mhmm. You need to think about the entire company. Yeah.

And you start from the middle and work out.

So part of the vision that's clear from the demos that were shown earlier is this idea that you can centralize your research in one place, whereas before it was very fragmented, and you might not even be aware that a study was done over here or over there. With this new system, you can essentially tap into research from across the company, across many years. So how do you see customers mining that research and making it actionable, even if they're not on the research team per se, or they're maybe in adjacent roles?

 Yeah. I mean, knowledge is power.

I think just being able to open up access, be able to ask questions, and know what's available. I know for my research team, we get asked the same questions a lot.

And so, to be able to just have a self-serve interface where folks can come in. But then also just starting to build that muscle in an organization of knowing that answers about your customers are just a click away. Mhmm. And to start changing that mindset and get that kind of habit.

 Traditionally, if you request a research project or have to reach out to another organization to get an answer, it's a lot of friction. It could take weeks, maybe even longer. And that idea that it's right at your fingertips. Yeah.

 I think that really could change the behavior of how companies are making decisions.

So one thing that was alluded to in the product keynote today, and I've heard this so many times in product teams:

That's qualitative data from ten people. How do we know that that is factual, that this is enough to invest our time and resources into developing this or solving this problem?

When more of the company has access to qualitative and quantitative data, do you see any change in perspective on how, say, product managers or engineers or even executives, who might have a more analytical approach to their work, think of qualitative research? Do they think of it differently?

 It's a good question. I mean, I think the role of a designer or researcher or product person is, you wanna get a solution out to market. What does that require? It requires influencing people who make the decisions.

Traditionally, in a business, you're gonna have a lot of folks that are very data-oriented. They're about the numbers. That's where they feel comfortable. They've been trained in understanding quantitative methods. The trick is to couple that with stories.

Yeah. You know, I'm a designer by trade. I'm a big qual-at-heart guy. I like the stories because they bring that data to life, and I think the power is really in the coupling of those two together.

Because, I mean, I remember the first time, when I was at Microsoft. We had the formal labs, and I'm watching through the glass, and I watch a test participant start to cry because my design is so bad. It's so powerful. That emotional response has lasted; I still remember what she looks like.

 Yeah.

 You know?

And these customer stories that we see, this is the power of the video and the storytelling that we all do as designers, and now we couple it with the data. And depending on who you're trying to influence and really support, you lead with one or the other.

 So do you compensate those users with, some sort of mental health gift card?

 Totally.

 It's okay. We've all made people cry from the designs that we've created over the years.

 No. I'm not alone. Yeah.

So we've been running this series of workshops with companies, some of them tech companies that are actually working on artificial intelligence and generative AI.

And, you know, one insight we're getting is that even on those teams, they don't have a lot of time to play with these tools. And part of the exercise we run them through is just mapping AI to different parts of the design thinking process. And people often highlight risks, or maybe opportunities where they wanna use it. And one thing that often comes up is this idea that maybe, eventually, the AI will be smart enough that you can throw a design at it and it can give you more qualitative feedback and see where the errors might lie.

Now, I think you may have already answered this question with your last story, but I'm curious about your thoughts, because it's clear that UserTesting, over its history, has a very human-centered focus. You're actually watching humans interact with products, getting real-time feedback that's not just quantitative, but very qualitative. Like, are they frustrated, angry, delighted? So how do you make the argument that, yes, AI is a great co-pilot and tool, but we have to keep humans in the loop in the design process?

 Yeah.

 I mean, at the end of the day, it's it's human centered design.

That's what we all believe in, and it's how we need to be delivering products. And you can't take the human out of that, either in testing with real people, or in the discernment and creativity that a human brings to ideating solutions or really having an understanding.

That said, these capabilities are going to superpower our staff. It's not gonna be long before it's: oh, by the way, the video happened to notice that this person, even though they said this, was uncomfortable, or was showing these things. That used to be sci-fi. Now we're realizing, with the advancements of technology, things are moving really quickly. So from my perspective, I'm seeing bionic researchers that are just, you know, super-powered. And then at the same time, I think maybe it was Michelle who talked about this yesterday, just applying it to all the time-consuming stuff that nobody wants to do anyway. Yeah.

So I'm quite excited. You know, as a designer, I'm a skeptic by nature, but I've seen enough, and I see the path forward. And I'm really excited about what's coming down the pipe.

So one of the challenges of winning the research battle, of getting more people in the company involved in it and asking questions, is that research teams are bombarded with questions. They get up to their eyeballs in tasks, and they just have to start saying no. That can be frustrating, and it can keep the team focused on all these short-term questions: what are we doing this quarter?

 Yeah.

What's our KPI for this? And, you know, you lose sight of the big-picture thinking that is one of the great powers of a research team. Hundred percent. Looking at horizon two and beyond, how do you think about balancing those? Like, let's work on refinement and sanding off the edges of the product, but let's also keep our eyes on the horizon: changes in culture, changes in devices that could totally disrupt a business.

Oh, hundred percent. I mean, we were just talking about how technology is changing so fast. Well, behaviors are too. It's interesting, because we've been looking at and talking to a lot of folks in this room about attitudes around AI, and I will tell you, six months ago, it was high skepticism and "we'll see."

In a recent study that we just did, we're seeing sentiment of, like, oh yeah, well, we expect there to be an automated summary for this and this and this. I mean, it's happening quickly. So that's all to say, you need your research team focused on changing behaviors and where things are going.

For us, to do that, we run our scale program, where we set a goal that eighty percent of any evaluative research is done by a non-researcher, to allow our research team to focus on roadmap planning and changing behaviors, which is what they're focused on now. And it was really exciting, because in March, in our operations review, where we track who's doing what tests and what type of tests, we hit eighty-one percent. I'd thrown that out there just as kind of a high-level goal.

 It was pretty exciting to know that, now our researchers are spending so much of their time on the stuff that really is impacting the business at a larger scale. That's great.

 So part of the potential, as we've been talking about, for these products is to democratize the design and research that happens within an organization.

At the same time, you know, researchers go through some amount of training to understand how to effectively present insights or build a survey or whatever the task is. So how do you train people outside your research team to contribute without diluting the quality of the insights?

Yeah. I mean, this kinda goes back to the orchestra versus jazz approach. So an orchestra... and, I'm terrible with names, but an awesome dude from US Bank yesterday was talking about... Caleb?

 Caleb. Thank you. Yeah. Was talking about their really awesome structured program. It reminded me of, like, oh, that's a great example of orchestra approach.

On the jazz band side, while we do tend to focus on micro-learnings, we're not a huge team, so we can also personalize it a little bit. And so it's like teaching kids how to learn music.

 Start simple.

Right? And celebrate wins, and then kinda build out from there. From a quality perspective, we've set up a cadence and a structure. I would say the biggest unlock that we had, versus other failures we've had before: we have a really strong relationship between our head of design and one of our product leaders. They meet biweekly to review all the questions that are coming in.

 Together, they do a triage of what is the tactical research. Great. Or we already know that. We don't need research.

 Or here's a big strategic stuff, and we're gonna have the researcher take the lead on that.

Not only is that great for making sure the right people are looking at the right questions, but they also have visibility into the results. So our research team, they're meeting twice a week. They see all the tests that are running. They're doing the coaching.

 There's little pairings between the researcher and multiple design teams. So every designer and PM knows who their researcher is. But there's visibility into what happens. And there's course corrections.

"That was a leading question. Okay, let's work on that next time with this particular designer." So it's a little bit high touch. But again, this is why I love the jazz approach.

You know, we figured out, oh, you hit a wrong note.

 Okay. Let's refine that a little bit, but let's just keep going. Anyone who plays jazz knows there's no wrong notes. It's just something you hit on the way to another note.

 Yeah. As long as you're not making a big business decision.

 That's right. That's right.

Well, we always ask all of our guests: what are you reading, listening to, or watching that's interesting and inspiring outside of your work? Because we're more than just our work. We're curious in other ways.

Yeah. We are. I will say, a couple years ago, I moved to Edinburgh, so I work out of our Scotland office. And we have a very geographically diverse team.

And I should have read this book five years ago, but I didn't. But I finally read it just recently. It's The Culture Map.

 And, again, terrible names. I'm sorry, author. People will Google it.

 And it'll be in the show notes too.

 Thank you. Yes. Thank you.

It was just a game changer, not only in understanding how people process information. I love managing, I love being a leader and building teams, but understanding the nuances of just the geographic differences, but also the why.

Because the German education system teaches people to think like this. Oh, because the Spanish, you know, because of their history. Anyway, I'm sure you've probably all read it. I'm, like, late to the game.

 I have not read it. It sounds fascinating. It is excellent.

Is anything influenced by the weather? I have this running theory that our dispositions are largely shaped by the weather.

You know, it doesn't touch on that. I don't recall that being covered.

 But coming from LA and then moving to Edinburgh, I can tell you my disposition has changed dramatically, based on the weather.

 So for sure. Yeah.

 Great. That's a great book. Anything you're listening to or watching?

I wish I had something, like, really exciting. You know what I've gotten back into? So, I'm a fake drummer at heart, and I'm rediscovering TOOL.

And I'm just... We got TOOL fans in the audience here.

I'm telling you. And I'm also late to the game here: I've finally got vinyl now. So that's what I do in cold Scottish weather: I come home, pour a whiskey, put a TOOL album on, and that's my life. Yeah.

 Jason, it sounds like you're winning at life. Yeah.

You know, I'm not gonna complain. Yeah.

 Yeah.

Fantastic. Well, thank you so much for joining us on Design Better for this live episode. And thank you, wonderful audience, for being here with us.