Chase Clark
Lead UX Researcher, Calm
Join Chase Clark, Lead UX Researcher at Calm, as he shares how their team leveraged user insights to drive innovation in mental health. This session explores their approach to understanding why users disengage from their meditation app and how they identified critical unmet needs. Learn about Calm's methods for understanding users, their innovative testing processes, and how insights into unmet needs were transformed into testable new ideas, ultimately improving product features. Attendees will gain practical techniques for capturing and applying customer feedback to drive innovation.
Hello there. My name is Anissa, and I am a corporate events marketing manager here at UserTesting.
We're really eager to kick off today's webinar. Before we get started, I have a few housekeeping items to cover.
If you need, you can turn on closed captioning. There is a toggle at the bottom right of your screen, and you can also choose your preferred language.
Please use the chat to share your thoughts and take a look, or the raise hand button is at the bottom of your screen. After today's presentation, our speaker will be participating in a live q and a. Sometimes it is easier to talk through a question, so attendees have the option to either drop the question in the chat field or you can use the raise hand button to come on stage.
All that said, I'm excited to invite our speaker today. He is a lead UX researcher over at Calm. Please help me welcome Chase Clark.
Thanks, Anissa.
Hey, everybody. My name is Chase.
I'll do a quick intro here, and we'll get going. But thanks, everyone, for joining the session. And like we mentioned, there'll be opportunity for questions as we go forward. Can you see my screen okay?
Cool.
So we'll go to the next slide and talk a little bit about myself. I know you all love to hear that. So a little bit of experience to set the stage here. These are some of the companies I've worked with.
Going back a couple of places: at Zipcar, we designed to get people in a car and somewhere they wanted to go. At Asics, we designed to get the whole world running. At WorkRise, we designed to get people jobs out in the field. And at Calm, we're designing to make the world happier and healthier through mental wellness.
There's a bit of a pattern here where each of the companies is designing for human behavior change with real-world action. So, like, get somewhere, track a run, get hired, meditate, fall asleep, things like that. For each of these different roles, I've had to really completely understand the customer landscape to help increase engagement through innovation. So we'll be talking about that a lot today. And today we're mostly talking about Calm, where I've been for the last four years as lead researcher.
I started a couple years ago, and I've been building the research function. So a lot of this is kind of like cheat sheets of things that we put in place and learned throughout the last couple of years. And we're a small team doing what I feel is pretty important work.
So it's been crucial for us to be really smart and just targeted about how and why we conduct the research that we do. If we go to the next slide, please.
So we're going to kind of walk through these different phases. This is the framework that we leverage at Calm to innovate based on user engagement, unmet needs, and then the opportunities that come out of that. I'm gonna explain how these phases are different and answer different questions, which can be boiled down to, like, the signals you're trying to get to and how you want to move to the next phase of the framework.
Stepping through this is gonna give you a really solid foundation for innovation, and it also shows you the difference between the phases and why you'd use different methods in different ones. So let's go to the next slide, please.
This is, the high level view of the framework. We'll keep referencing this. I'm not going to go through every part of it right now. We'll step through them individually, but I'll just kind of walk through how it's built.
On the left side, you've got kind of the levels we step through, and that's customer, problem, concept, and feature. And for each of those levels, as you're learning about these things to help inform how you wanna innovate, we've got basically a template of questions we wanna answer for each of those phases, the specific signals we're looking for in each of those, and then the methods that we use to get that signal. And I'd say the goal here for you all is to walk away with this as a template for a framework that you can take back to your own company and either run with it or iterate on it and change it based on, like, what is different about how you try to innovate and do things in your roles.
But it should be a great starting place, and it's something that hopefully is helpful for everybody.
So next slide, we're gonna jump into the first phase here. So that's understanding customers. Go ahead and go to the next one.
What we wanna start to do here is just, like, understand your customer types, their problems, and the sizing and severity of each. You've probably already got a good understanding of this, but now we're actually gonna, like, double down on these things, get really specific, and convert that stuff into artifacts to share out with your team.
For Calm, you know, this looks a little bit different than it probably does for your company, and every company is different. So it's really important here to kind of dive in and start to understand the types of customers you've got. We serve lots of different types of customers trying to solve different problems, all the way from, like, falling asleep to making a habit with mindfulness to, you know, establishing a meditation practice or solving for panic attacks in the moment. So a lot of different customers for us to think about. But, again, this will be unique to every company and every product.
And your customer understanding could be as simple as, like, one customer, a ramen enthusiast, or two customers, like Facebook Marketplace seller and buyer, or, like I mentioned for Calm, many types of different people trying to do many different things.
So whatever your customer landscape looks like, you're gonna wanna understand each of those types of customers, the size of each group, the problems that they're trying to solve, and the size of those problems themselves.
So in the next couple of slides, we'll talk about how to, like, really efficiently understand that, capture that information.
So let's go to the next slide.
So back to our initial framework here, in this understanding-customers phase, the questions we're really trying to ask are, you know, TAM: how big is the audience, and how large is each archetype inside of that? Basic demographic stuff, like how we wanna define the people we're talking to and target them, and then, most importantly, what problems we're trying to solve. The signal we're looking for here is, like, the size of the audience, the problems people are trying to solve, and the size of those problems.
And my favorite part of this, too, is kind of the satisfaction gap: how important the problem is to solve versus their ability to do so. And then the most typical methods we use here are surveys, internal data, secondary research, things like that. But again, feel free to use these as recipes where, if you've got different methods and tools and signals you're looking for, roll with that as you go forward in your own stuff.
Next slide, please.
This is an example of a survey where we'll capture the satisfaction gap. So a question we'd love to ask people is, you know, what's the primary reason for downloading Calm in the first place?
And then we kinda start to see the problems people are coming in to solve. This is an example for trouble sleeping. Sleep is a big problem space for us, and there are a lot of different elements inside of that that people try to solve for.
And so we ask people to rate, for all these different problems that we started to find inside of the umbrella of sleep, you know, what's the most important to you and what's the most difficult for you to get done? And that's just a great way for us to help prioritize where we should go and start innovating first. And you can even boil this down to a simple metric like the satisfaction gap between those two levers and say, like, hey,
clearly we should start focusing on this problem, it's gonna have the most impact.
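To make the idea concrete, here's a minimal sketch of how that satisfaction-gap metric could be computed from survey ratings. The problem names and numbers are illustrative, not Calm's actual data, and a 1-to-5 rating scale is assumed.

```python
# Hypothetical sketch: scoring each sub-problem by the gap between how
# important it is to people and how able they feel to solve it.
# All problem names and ratings below are illustrative assumptions.
problems = {
    "falling asleep":   {"importance": 4.6, "ability": 2.1},
    "staying asleep":   {"importance": 4.2, "ability": 2.8},
    "waking up rested": {"importance": 3.9, "ability": 3.3},
}

def satisfaction_gap(ratings):
    """Gap between importance of a problem and people's ability to solve it."""
    return ratings["importance"] - ratings["ability"]

# Rank problems by gap, largest first: the biggest gap is where to innovate.
ranked = sorted(problems, key=lambda p: satisfaction_gap(problems[p]), reverse=True)
print(ranked[0])  # the problem with the largest satisfaction gap
```

The design choice here is that the gap, not raw importance, drives prioritization: a problem people already solve well scores low even if it matters a lot to them.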
Let's hop to the next slide.
A big thing we like to do in this phase, too, is coming up with personas. Every company I've been at, every team treats these a little bit differently.
So, you know, throughout the years, I've kind of come up with our own internal approach here. But I'm sure you've all experienced, like, overly complicated personas. You know, a team will come up with someone named Jacob who's, like, a forty-seven-year-old Airbnb superhost who makes a hundred eleven thousand dollars and has one point five children. And it's just a lot of detail that really doesn't impact the way you're designing the product. The thing we wanna focus on here is keeping it simple. You wanna make, like, Pokémon cards, not long PDF articles that people can't memorize.
So what we like to do to make really simple, actionable personas is take a customer archetype and the problem we're trying to solve, mash those things together, and come away with the persona.
So it's not complex. It's not overly detailed, and it's something you can really have stick in your head and think about. We don't really use any names for ours, like individual first names, or unnecessary details. It's really just kinda their problem to solve and the behavior they've gotta do to solve it.
In terms of using spectrums to make those archetypes I mentioned, really just think about what the main differentiator is in how someone will either want to use, or will use, your product.
We use for the most part, like, reactive to proactive. Someone who's being reactive to a problem versus someone who's being proactive to solve a mental health or sleep problem is gonna behave very differently, and we use that as our main lever to understand, you know, how should we design things differently for them. That's gonna be different for you. Like, it could be things like time commitment, skill level, knowledge, tenure, lots of different things to think of. Let's go to the next slide, please.
So, again, like I mentioned, we use really simple personas. They're based on combining that archetype with the main problem they're trying to solve. So an example here is, like, someone's trying to fall asleep, and they're being more reactive or proactive.
That helps you really understand, like, what they're trying to do in the moment, capture that mental model, understand what they're trying to solve, and build something that's specifically helpful for them. Next slide, please.
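As a sketch of that archetype-times-problem idea, a persona in this simple style could be modeled as just two fields. The class name, field names, and example personas below are hypothetical, invented for illustration.

```python
# Hypothetical sketch: a minimal persona as archetype x problem, matching the
# "Pokemon cards, not PDFs" approach above. All names here are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Persona:
    archetype: str  # position on the main spectrum, e.g. reactive vs. proactive
    problem: str    # the main problem they're trying to solve

    def label(self) -> str:
        """A memorable handle the team can use in everyday conversation."""
        return f"{self.archetype} {self.problem}"

# Two personas for the same problem can behave very differently:
reactive_sleeper = Persona("reactive", "falling asleep")
proactive_sleeper = Persona("proactive", "falling asleep")
print(reactive_sleeper.label())  # "reactive falling asleep"
```

The point of keeping it to two fields is exactly the simplicity argued for above: anything that wouldn't change how you design the product stays out of the persona.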
So quick tip here: those customer archetypes, problems, and personas should be really simple to remember, and just naturally woven into your team's everyday conversations.
We can go ahead to the next slide, where we're gonna start diving into problems.
This might be a little out there.
Can we go back one, Lisa?
Perfect. Yeah.
Cool. So what we really wanna focus on in this step is just mapping out user processes.
This is, like, hugely valuable for us in terms of, like, understanding where people are, what they're trying to do, and what's getting in the way. So at this point, you should have a list of, like, the main problems your users are trying to solve by using your product. You should also know, like, the rough or specific size of each of those problems. If you get specific, that's awesome, but the point is be able to just kinda prioritize based on what's bigger, what's noisier, what's more painful, things like that. And you wanna know the sizing of those major archetypes as well.
So next, we really wanna do a deep dive into, like, one or more of those specific problems your customers are trying to solve. You can do this kinda one at a time. This this could scale up and down. And to do that, what we're gonna do is, map these things out and really just dive in and understand.
So back to our framework here. The questions we're trying to ask in this phase are, you know: how do you try to solve this problem now? Like, literally step by step, what do you do to fall back asleep at three in the morning, or, you know, order a latte while you're driving through rush hour? What works?
What doesn't work? What does progress look like for you? And then for archetypes, how do they generally behave as a customer, and what can we expect them to do and not do? And the signal we're looking for here is, like, stories, really, you know, just qualitative stories; workflows that people are doing, whether successful or not, probably more so unsuccessful, because that's where you're trying to innovate; unmet needs in terms of what they're trying to do; and then a helpful design constraint we call "it is and it isn't."
So it's like, you know, based on what we've heard, this should be this and shouldn't be that.
Favorite methods here are qualitative interviews, problem mapping, and secondary research.
So on our next slide, we've got an example of how we just talk to people about these problems. There's a couple ways to do this. One is to do interviews live, one on one. We do a lot of that, and that's awesome. But we've been doing more and more just UserTesting sessions where we'll have people talk through how they try to solve these problems. And that's been a great way for us to get ten, twenty, thirty people talking about how they try to, you know, navigate their way through a high-acuity anxiety moment in a matter of hours instead of, like, a week or two. So we'll watch this video real quick as an example of someone talking about, you know, how they have problems with sleep.
What is your relationship with sleep like? Well, as you can see, I don't get much sleep. I'm always fighting to get some sleep.
So when I don't sleep, I usually just work all day or I just find different ways or extra work to do because I don't really sleep well.
I usually get approximately, like, four hours of sleep.
And when I do think I'm gonna get more sleep, my cat wakes me up, and I can't go back to sleep. So that's my relationship with sleep.
People are great at just explaining their process for solving problems. And we'll do things like asking them questions like, what's your relationship with sleep? Just kind of rephrasing problem questions to get these great human stories. You know, like talking about the last time you had trouble with it. Asking the specific steps they go through. Asking what pain points they come up against, and then seeing what kind of progress they're trying to make, what they've tried, what shortcuts they take, what could be easier, things like that.
So these sessions end up being, like, ten, fifteen minutes long and they're just gold mines in terms of, like, understanding, how people solve problems.
Let's hop to the next slide.
So after that, after you've listened to people (you know, ten is great, more is awesome), you usually get to that point of diminishing returns where you've heard ten people talk about how they try to fall asleep at night, or whatever the problem is they're trying to solve. From here, you want to move into what we call problem maps.
These are basically journey maps, but we're just really focusing them through the lens of people trying to solve a problem, very much like a jobs-to-be-done approach.
They're very similar to user journeys, but oriented around the problem specifically: what's this person trying to do, what gets in the way, and what hurts?
The map consists of the phases of solving a problem. So at the top here, you can see, like, kind of those black boxes where it's like, this is the first phase, the second phase, and what we go through. Below that, the individual steps that they do, and we take each of those steps and rank them on a y axis of, you know, what we're trying to solve for. Like, for this one, it's, like, love to loathe. Like, what parts of the sleep process do you just really hate and what parts are great? Things like that.
And below that, all these little color cards are, like, insights, where we're capturing problems, capturing good things, already generating how-might-we's based on problems, notes and questions, things like that.
Next slide, please.
So, again, this is an example of what we did for sleep. You can see there are big ups and downs, and lots of red cards and, like, little purple how-might-we's attached to each of those things. And this is a great resource to keep coming back to. As you're learning things, you can add to it, and it becomes a really good living document that's just really, really valuable. And, again, you'll kinda see how each of these phases flows into the next one. You've started to understand your customer, now you're understanding problems; those things flow into each other, and they'll flow into the next phase too. Next slide, please.
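The problem-map structure described here (phases, steps, a love-to-loathe score per step, and insight cards hanging off each step) can be sketched as plain data. Every phase, step, and card below is invented for illustration; the score range and card types are assumptions, not Calm's actual schema.

```python
# Hypothetical sketch of a problem map as data: phases contain steps, each
# step has a love-to-loathe score (-2..+2) and attached insight cards.
problem_map = {
    "winding down": [
        {"step": "put phone away", "score": -1,
         "cards": [
             {"type": "problem", "note": "doomscrolling delays sleep"},
             {"type": "how_might_we", "note": "HMW make winding down rewarding?"},
         ]},
        {"step": "read in bed", "score": 2, "cards": []},
    ],
    "falling asleep": [
        {"step": "lie awake worrying", "score": -2,
         "cards": [{"type": "problem", "note": "racing thoughts at 3am"}]},
    ],
}

# Pull every "how might we" card out of the map to seed a brainstorm later.
hmws = [card["note"]
        for steps in problem_map.values()
        for step in steps
        for card in step["cards"]
        if card["type"] == "how_might_we"]
print(hmws)
```

Keeping the map as structured data like this is what makes it a living document: new cards can be appended per step, and the how-might-we's can be pulled out directly when the concept phase starts.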
Again, if you're thinking about trying to increase engagement with your product, you've got to understand what's not working for a lot of people, what doesn't make sense to them, what they're trying to do, what they're going through in their life, what kind of better future version of themselves they're trying to become by using your product. And so you're looking for all these stumbling blocks, for what's getting in the way right now. And what that lets you figure out is: what are the unmet needs? And then what we do with those is we take them and start to plot them out onto this kind of design constraint cheat sheet of, like, okay.
Based on what we've heard from customers, this new solution that doesn't exist yet, it needs to be this. It shouldn't be that. It's gotta do this, and it shouldn't do that. And these are high level, and they're not one to one to what we've heard from people, but it's kinda like reading between the lines.
And trying to figure out, like, okay. If you're trying to make progress towards this problem and this is getting in the way, the new solution's gotta look like this.
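That "it is / it isn't" cheat sheet can be as simple as two lists checked during brainstorms. The constraint wording below is hypothetical, loosely echoing the sleep and anxiety findings discussed later, and is not Calm's actual list.

```python
# Hypothetical sketch of an "it is / it isn't" design-constraint cheat sheet
# kept as simple data so it can be referenced and updated regularly.
constraints = {
    "it is": [
        "eyes-open and externally focused",
        "usable in a reactive, in-the-moment situation",
    ],
    "it isn't": [
        "a daily ten-minute sit-down commitment",
        "something that sends you deeper into your own head",
    ],
}

# Print the cheat sheet the way a team might pin it up for a brainstorm.
for kind, rules in constraints.items():
    for rule in rules:
        print(f"{kind}: {rule}")
```

Because the constraints are read between the lines of what users said rather than copied verbatim, keeping them short like this makes it obvious when a new concept violates one.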
So let's jump to the next slide.
So the thing to remember here is, unlike those customer insights from the previous phases, which are meant to be memorized, this is a living document meant to be updated and referenced regularly. So you can keep jumping back into this and solving different parts of the problem.
Cool. Let's hop to the next slide. So, this is a really fun part of the phase we're gonna jump into where we'll be talking about how to test out these new concepts. So let's go to the next slide.
So you're obviously gonna wanna generate concepts. You're gonna wanna test those things. You've been talking to customers. You know what the problem is.
You understand how they approach it. So this is the really fun part. And it's always fun talking to users, but this is where you get to be creative and generate those ideas that you didn't think of before. So, again, the goal of this phase: generate concepts, then prototype a shortlist and test those out.
And we look at this as two phases, really: coming up with a large list of potential concepts, and then choosing a shortlist to prototype and get feedback on.
So let's go to the next slide, please.
For this phase, we're really looking at the questions of, like I mentioned before, what concepts should we even pursue?
Do people understand what the new concept is? Does the new concept solve their problem, whether perceived or actual? So that means, like, if you're picking it up off a shelf, do you think, "oh, this will solve my problem," versus once you've gone home and used it, does it really solve your problem?
Also, how willing are people to try this thing? How willing are they to pay for it? Those are the really big questions we're looking for. So the signals to capture here: comprehension, do they want it, buy-in, and efficacy.
And this is the one where you can use the most creative methods as well. You can do things like card sorting and forced ranking.
Prototype testing is a big one. Customer co-design is cool, and there's what we like to call just general sneaky business. You've gotta understand what signal you're trying to get, because if you just ask someone, like, hey, would you pay for this thing? They're gonna predict the future, or say something nice to make you happy, or they don't really know. So you're really looking for things to help get super sharp signal.
So let's hop to the next slide.
So, again, this flow here looks like brainstorming. Right? So use that customer archetype, the problem they're trying to solve, the problem map you've got, and your unmet needs, and brainstorm a bunch of things through those design constraints.
And I know you've all probably done really successful brainstorms in the past, so we won't dig into how to do that. But, again, you're using what you've learned in those previous phases, and you're just getting sharper and sharper in your focus. So from there, have the internal teams vote on, like, the ideas that are interesting to them based on what we've already learned. But even more importantly, have potential customers vote on the ideas that are most interesting to them.
And this is kinda that sneaky-business signal, where we get it by coming up with creative ways to capture signal in terms of, like, is there buy-in here? Do you want this thing? And we'll talk about that a bit in the case study coming up. But remember, you're trying to capture really specific signals, like, do they understand it?
Do they want it? Things like that.
From there, you wanna start prototyping. You know, you've got a short list of ideas, create prototypes for one or multiple.
One is great if that's all you have time for. Multiple is cool because they can kind of compete against each other.
And then you test those things, using tools like UserTesting, of course, using the same discussion guide against each of them, and you can start to see where things are helping people and not, and just get really good signal there. And, again, you wanna see which concepts score best on those questions we're trying to answer, like, do people understand it? Is it gonna solve their problem? Are they willing to try it, pay for it? Things like that.
So let's hop to the next slide, please.
So from here, you know, don't be surprised if multiple concepts succeed at this stage. I know it still hasn't been, you know, vetted in the real world yet, but all the things you're coming up with are grounded in really actionable customer insights that you've gathered in those previous steps. So you're always starting on a great foundation.
So let's jump into testing features.
Next slide.
The goal of this phase is really to, like, refine those rough concepts now that you've got conviction in them. Let's say you've run some tests and you find that, like, this is making a lot of sense to people. They really want it. They understand it. It seems like it's solving a problem.
Then you wanna go in and just start refining those things in terms of, like, the features. So this should be pretty basic and familiar to you, but, again, one thing we wanna think of is not jumping ahead of ourselves. Right? If we don't know our customers and how they're solving a specific problem, we shouldn't be jumping into features to help them solve for it. We wanna make sure we understand those things before we get here. So just make sure you've stepped through those previous phases before jumping into this.
Cool. So let's go to the next slide.
And again, back to our framework here. The questions we're trying to ask here are very simple. Do people understand how this thing works? Can they use it? You know, maybe, do they love it? That's kind of an optional one. And the signal that we're trying to get is pretty crisp as well: task analysis and sentiment.
Could you get from point a to b? Did you like it? Things like that. And then methods here again, usability testing and then also, live usage data.
From here, let's jump into a case study where we've leveraged this same thing at Calm. We can go to the next slide.
Well, a big project that we worked on was, you know, trying to tackle stress and anxiety.
When I started, we jumped into defining our customer base, and, you know, we all kind of thought, like, oh, it's Calm. It's a meditation app. People wanna come in to meditate. But the more we started digging into it, the more we saw lots of people trying to, you know, solve for anxiety and stress and not specifically trying to meditate. So that was a big flag for us to say, like, oh, we need to come up with some new ideas here. So this case study is gonna walk through how we went from that major challenge to four user-validated solutions in literally four months.
So hopping on to the next slide.
So we wanted to understand our stress-and-anxiety users. Like, we've all been stressed. We've all been anxious. We know what that means. But we saw the size of those cohorts, we started, you know, ranking the problems they have, and then we started mapping out how they try to solve for it and saw lots of things that we could do for people.
We learned from surveys and internal data that there's a huge cohort of people trying to use Calm to solve for those problems of stress and anxiety.
You know, it seems pretty obvious for a mental health app, but the size of the group was something for us to pay attention to.
Those customers also mapped to that more reactive archetype like we mentioned. So they weren't trying to be proactive. They weren't trying to sit down for ten minutes at a time every day and kind of get ahead of the problem. They're trying to put out a fire in the moment.
We saw a really big satisfaction gap between their ability to reduce stress and anxiety in the moment and their desire to do so. So, again, they really wanna do it, and they're not good at doing it. So an opportunity for us to come in and innovate.
You know, at this stage, we could have focused our efforts on making it easier to find meditations about stress and anxiety, or something similar. But luckily, we moved to the next phase of understanding the specifics of the problem, and found some really cool pockets where we could play.
And, again, as it turned out, this is a super different goal compared to someone trying to meditate, so those things just weren't aligned.
So next slide, please.
So we went into that problem-mapping phase. You know? We said, what are people doing to reduce stress and anxiety in the moment? And we ran a series of user interviews. We did remote unmoderated, and we did remote moderated. So a combination of just letting people talk into the phone and having a one-on-one moderated discussion.
So we could really dig into the process people tried to do to reduce their stress and anxiety, specifically in a reactive moment.
And like I kind of mentioned already, we uncovered just a lot of unmet needs, and we turned those into design constraints and compiled that problem map, where we said, oh, people start here, they're doing this, they're trying to get here, and the goals were different. And we learned a lot of things by just talking to, you know, several users and understanding what they're trying to do.
We also found out through some secondary research that people who were responding to acute stress and anxiety needed what they call exteroceptive, and not interoceptive, mindfulness techniques. The difference there being: if you're stressed out and anxious and you close your eyes and go into your head, it backfires, versus if you open your eyes and you look at something to focus on, that helps you calm down.
We learned a lot more than we can get into the specifics of, because this is really fascinating stuff that I could talk about all day, but we'll just kinda keep moving forward and walk through the next step here. So next slide, please.
So we're into the concept phase. We headed into ideation with insight-driven design constraints, and I'll walk you through how we determined which concepts to move into prototype testing.
So, again, we ran that cross functional brainstorm.
We came up with a lot of different concepts to consider.
People had specific design constraints from the earlier phases, so that helped narrow the ideas. And then we just took that list of concepts and said, internally, which do we have the most conviction around that we should go in and test and move forward with?
For that external signal from customers and potential customers, we did, like, a fake-app IA test. So we ran a test on UserTesting where we pretended to be this new startup called Anxly, where we were, like, dunking on meditation being bad, and, you know, we're a different new solution to your stress and anxiety problems.
And we asked participants to help us with our new app we're building, and specifically to kind of, like, build out the information architecture. So we weren't making this new app. We didn't care about the IA. What we wanted to do is have people sort these new concepts out into what should be on their home screen, what should be on the discover tab, what should be on the profile tab, what should be in the more section.
So we were looking for specific signals here, but we're doing it in a sneaky way where we thought, like, okay, if you put something on your home screen up top, that seems important to you. You're gonna wanna use it. If you stuff it over in the more section, we shouldn't even think about that. If you put it on your profile, you're probably never gonna go there. And if it's in discover, you're probably not gonna look for it, because you're someone who's doing things in the moment and you need to get something done. So, again, we're looking for the signals of, like, would this concept solve the problem from a perceived standpoint?
Would they wanna try it? Things like that. So this gave us some really cool signal of people sorting things into different sections, and then we asked specific follow-up questions like, oh, if our app could only build three things, what should we build? And if we had to get rid of three ideas, what should we get rid of?
And so looking for signal there in terms of, like, what to pursue.
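One way to turn that fake-app card sort into a ranked signal is to weight each section by the buy-in it implies. The section weights, participant IDs, and placements below are all hypothetical assumptions for illustration; the talk doesn't specify how Calm actually scored the sort.

```python
# Hypothetical scoring of the fake-app IA sort: where a participant drops a
# concept maps to an assumed buy-in weight (home screen = strongest signal).
from collections import defaultdict

SECTION_WEIGHT = {"home": 3, "discover": 1, "profile": 0, "more": -1}

# One placement per (participant, concept): which section it was sorted into.
placements = [
    ("p1", "immersive breathing", "home"),
    ("p2", "immersive breathing", "home"),
    ("p1", "gratitude journal", "discover"),
    ("p2", "gratitude journal", "more"),
]

scores = defaultdict(int)
for _participant, concept, section in placements:
    scores[concept] += SECTION_WEIGHT[section]

# Higher score suggests stronger perceived want; pursue the top concepts.
print(max(scores, key=scores.get))
```

The weights encode exactly the reasoning in the talk: home screen means "I'd use this," while the more section means "don't even think about it."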
So I'll hop to the next slide.
So, prioritizing prototype efforts based on user feedback. You know, we let people build up this fake app, and we just watched them do it. We made notes about what they were sorting where. So we kind of took that internal data and the external data and said, like, alright, let's start prototyping some of these things out and figure out what to actually test. So next slide, please.
We came up with four different concepts.
We did this pretty quickly, and we wanted to be able to see, like, how well each of these things would perform. So going back to that, like, concept signal of, like, does this solve your problem? That's what we're trying to find out here. And, actually, do you want it?
Do you wanna try it? Things like that. For each concept, the prototype was a different format, but the UserTesting test was the same. So we asked the same questions every time, followed the same format and flow and things like that.
People were able to actually experience the new concepts too, without us having to go to code and spend lots of money and time building it, launching it, QAing it, and things like that.
So we went into testing with four concepts. We did immersive breathing, which was like this guided breathwork video that really focused on, like, hey, you are having a really anxious and stressful moment. Let's walk you through it. Again, leveraging all those design constraints we heard before where, you know, it's eyes open, you're listening to something, you're seeing something, you're doing something.
We also did a different one called escapes, where people were walked through, like, a jungle or a forest, and there's a narrator kind of talking them through things, and they did activities, things like that.
We made a Figma prototype of a photo gratitude journal, because gratitude helps combat anxiety, and we kind of walked people through how that would work. Then we also did these text-based anti-stress and anxiety activities where you just kind of tapped along and read things and did activities.
And we wanted to go through each of those and ask the same questions. And we would do things like have people come in, talk about their relationship with stress and anxiety, rate on a one-to-five scale their stress and anxiety level right in the moment, do the new concept, rate their level again, and kind of talk through it. So that's a really rich signal there. So next slide, please.
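That pre/post rating flow boils down to a mean-delta comparison per concept. A minimal sketch, with invented concept names and ratings (not Calm's actual data):

```python
# Illustrative pre/post analysis: each session records a 1-5 stress rating
# before and after trying a concept; a positive mean delta means stress dropped.
from statistics import mean

def mean_stress_drop(sessions):
    """sessions: list of (pre, post) ratings on a 1-5 scale."""
    return mean(pre - post for pre, post in sessions)

# Compare concepts by average drop (names and numbers are made up).
results = {
    "immersive_breathing": mean_stress_drop([(4, 2), (5, 3), (3, 2)]),
    "gratitude_journal": mean_stress_drop([(4, 4), (3, 2), (5, 4)]),
}
best = max(results, key=results.get)
```

Because every prototype ran through the identical question flow, deltas like these are directly comparable across concepts, which is what makes the "same test, different prototype" setup pay off.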
We asked, like I said, the same questions for each different prototype, and it was cool to have people kind of talk about the story beforehand, experience the actual thing, see how it worked for them, and then ask them, try to capture some of that signal too of, like, do you want this thing? Are you willing to try it? And things like that. And one approach we did there was, again, this is a fake app called Anxly that's not real. People think they're beta testing this thing that's, you know, in development, and we would ask them at the end, like, hey, if you want beta access to this new thing we're building, put your email down.
So this kind of captured that signal of, like, buy-in. It's not the same as, like, putting their credit card number down, but we didn't wanna do that because it was, you know, smoke and mirrors. But if someone puts down an email address on a testing video because they do want this thing you're building out, that's signal you can pay attention to and say, like, oh, this one got seventy percent buy-in versus this one got twenty, and things like that. But what you also wanna listen to is just the tone of people's voice as they're saying these things. This is a great area where UserTesting really shines for us, to capture that. So we'll play this quick video here to kinda just walk through a few people saying how they wanted the product.
Optional. If you would like to request access to the beta of this app, please leave your email address. Otherwise, just type no.
I'm telling you right now, I want this app. Like I said, I'm, like, sad to think it's just a prototype right now. Like, I want it. So, yeah, I would love it.
Please please send it. I actually would like access to this beta app.
I have tried meditation apps. And we got a ton more of those where you just get, like, this level of excitement and, like, a tone of voice, just really great quality of signal that you couldn't get from a survey or things like that.
So let's hop to the next slide.
So, you know, back to that process again: stress and anxiety, big problem.
This is how people try to solve for it now, and it's not working. These are some concepts we've got, a shortlist of concepts, and we tested four of them out. And then we had an unexpected result here, where the simplest little one (and I'm air quoting "one") turned out to be the most effective for people. The concept that got the best buy-in signal was also the highest positive change in that stress and anxiety score, so it seemed to work and solve the problem. And it got the highest beta email opt-in, where something like a seventy percent average of customers wanted access to this thing.
And, you know, running a test on ten people will not be stat sig in any way, but we ran the tests over and over and over and ended up doing hundreds of these. And that average, you know, hovered around seventy, sometimes higher. So just really great signal to see in terms of how to test those things quickly. So, very positive signal in terms of what we wanted to learn, you know, which concepts we pursue.
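Pooling those repeated small batches into one opt-in rate could look like this. Batch sizes and counts here are invented for illustration:

```python
# Rough aggregation of beta-email opt-in across repeated small test batches.
# A single 10-person test isn't significant on its own, but pooling many
# batches gives a steadier estimate of the overall opt-in rate.
def pooled_opt_in_rate(batches):
    """batches: list of (opted_in, batch_size) tuples from repeated tests."""
    opted = sum(o for o, _ in batches)
    total = sum(n for _, n in batches)
    return opted / total

rate = pooled_opt_in_rate([(7, 10), (8, 10), (6, 10), (7, 10)])  # pooled rate across runs
```

Pooling raw counts (rather than averaging per-batch percentages) also handles uneven batch sizes correctly, which matters when tests are rerun many times.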
This was the winner here. Did people understand it? Yes. Did it solve their problem?
Yes. And were they willing to try it? Yes. So those were, like, the green flags for us to move forward.
So next slide here.
We used that feedback from users.
You know, they really valued having a quick, easy, guided solution for stress and anxiety relief. So this has been a really successful project for us.
We built this out. We've launched this. It was scalable. It's text based. You can translate it really easily.
We AB tested it just to double confirm, and have rolled it out since that successful test. And the team has since been prioritizing a lot of non-audio mindfulness solutions based on that feedback. So this is a really cool example of learning all these different things, launching something that worked, and then using that as a launching pad to launch more things. So let's hop to the next slide real quick.
And, you know, the reason why we thought this was so successful was because our new concept was based on those insights. We understood who the customers are, what they needed, what they were looking for, what wasn't working, and that helped us innovate here. We took those design constraints, we brought them into the process of coming up with ideas, and we were able to understand, you know, the real goal of those users and their limitations.
So we were able to design something to actually help them make progress in the moment, which is really cool.
Next slide please.
So like I kind of teased before, innovation fuels innovation. After launching this, Taptivities, we've been developing a suite of non-audio mindfulness solutions geared towards people just like this, who are trying to get more in-the-moment relief and get results, like, quicker. And we're trying to bridge them up to doing longer meditation sessions, or not, you know. We've just been learning more and more how there are so many different people who come into the app trying to solve problems for themselves.
So we've been really busy since then. We'll go to the next slide. These are just kind of images we can hop through real quick. But we're really reinventing the whole way that people interact with Calm.
For a long time, most of Calm was audio based. A lot of it expected you to either be in bed trying to fall asleep or sitting down with your eyes closed for ten-plus minutes at a time. And we're trying to really just meet people where they are and make, you know, quicker-hit, effective things that are targeted based on what they're trying to do. And we've been working on a lot of new different ways to do that.
So we can hop through those next two screens pretty quick, and then jump into key takeaways.
Cool.
I know I've been talking a lot. I'm excited to hear some of your questions and hear from you all and see some faces that are not my own. But the key takeaway here is that each phase has really distinct objectives and outcomes, and it feeds into the next one.
And the cool thing is, like, once you've kind of done the first couple, you don't need to go back and constantly do them over again. They kind of waterfall down into the other ones. So you wanna understand the customers. Talk to people. Build your archetypes or personas or whatever your version of that is. But just get a really good picture of, like, what's the size, what are the noisy problems, where are the opportunities.
Going into that, diving into problems: you know, create a really detailed, evolving problem map that highlights those points, reference it a lot, get stakeholder buy-in, really roll that thing out and have people see it. Then once you've got a good template, people are gonna be used to it, like, oh, I know the problem map for this problem and this problem and this problem. I can read it and just jump in right away.
Concept testing: again, don't be surprised if multiple concepts work well. If you base them on really actionable customer insights, you've got a great foundation.
Work with your stakeholders to really align this on business impact and feasibility of those chosen concepts.
And then again, the last one here, testing features: the focus is on usability and refining your concept into clear, actionable flows. It just helps you make, you know, iterations on top of things you're already confident in.
So that's it. I'm going to turn it over to Anissa to open up for questions, and I'm excited to hear from you all.
Awesome, Chase. I am excited to continue this conversation with some questions. I am just gonna bring up your slide one more time because we have some questions in the chat.
We are gonna include these with the recording. So if you've watched any of the other sessions from The Human Insight Summit, we have all of those decks linked. And Chase also has this awesome template slide towards the end of his deck, so you all will have access to this, plus he recommends some books and frameworks. This was one of the most popular sessions we had in Austin.
Really glad we could bring Chase back for a live conversation today. And we do have several questions loaded that people have submitted. But as Chase mentioned, we would love to see a couple more faces up on these screens. So if you have a good Internet connection today, you will see the raise hand button at the bottom of your screen.
Sometimes it's just easier to talk through those questions. So I will go ahead and get some of the questions entered.
I do see a few people raising hands, so we'll just do a little bit of a balancing act there.
But, Chase, let me just pull up one of the earlier questions that we got in the comment section. So while they agree with avoiding overly detail-oriented personas, without adding simple facets such as names, how are you eliciting empathy for the archetype within project teams?
Yeah. Great question. I think a lot of empathy building comes from the problem itself. Right?
Like, I think it's one thing to say, oh, this is the archetype or the persona and this is what they want. That's helpful to build empathy as well. But once you see the different steps that go through it and which ones hurt the most, that is really helpful, to figure out, like, oh, I'm getting excited about this problem now, because I thought it was just, oh, you come and do this, but it turns out it's several steps and people are really struggling, and the satisfaction gap is huge, but they can't do it. And they've tried multiple things and nothing works.
That's a great way to build empathy with yourself and your team and your broader company, to be able to show people, like, no. This is how bad it is out there. This is the severity. And then, obviously, those, like, user testing clips of people just talking to their phone.
We've got so many clips that we share where we ask people what their relationship with stress and anxiety is, and they'll tell really personal stories. Or what's your relationship with sleep like? And they'll tell a lot of stories. And then also just the size of these things.
Right? It's hard not to empathize with a problem that's really, really big and growing and not getting better. So all those things kind of help layer it in. And that way you're not just, leaning on this one redundant persona to do a lot of that work for you.
It's like, okay. Yeah. You've got it here. You've got it in this phase, this phase, and this phase.
And the more you understand all these things, the more the empathy comes out of it. So great question.
Thanks. Now I'm gonna bring up one of our hand raisers. Logan Cooper actually submitted a few questions in the chat. So if you want to reference that, I can pull it on stage just to help folks follow along, but feel free to take it. It's nice to see you today. Thanks for joining us.
Thank you so much. Thank you to user testing, you, miss Anissa, and, of course, you, Chase, and everybody else for showing up. This has been really, really helpful, for me as somebody who's interested in UXR, so thank you.
I was curious about Chase's perspective, especially given the UX research case study that we just went through, on what research is not. And I was also bouncing this off of this book I'm reading called Just Enough Research by Erika Hall, in which the author states that research is not asking people what they like. So I frequently struggle with this as somebody who's interested, Chase, in doing what you do. And I was curious, and this also links to my other questions as well: you know, what is a good time to use things like AB testing?
How do you know exactly what kinds of research methods to use so that you're pulling insights you can actually use to improve your products and/or services?
Yeah. Great question, and thanks, Logan. I think a lot of it comes back to, like, every one of those phases. What you want to do is work with your broader team to help identify those. Right? I think the phases on the left, like customer, problem, concept, those are pretty stapled down. Like, it's hard to argue with.
You can help templatize those questions with the team, where it's like, do we care about TAM more than we care about the size of the problem, or do we care about this? And then work with those coworkers to identify, like, hey, what's the signal we want for these things? The template I put up is a template, a starting point, but your company might be a little different. They might say, like, well, we don't really trust stories for buy-in, and we don't trust the signal for that.
So it's like, cool. What what signal do you wanna see? What can we get out of research? And that helps you identify the methods you wanna use because, some methods aren't gonna give you the full story.
Like, you mentioned AB testing. AB testing is great for concepts and great for features, but it's really expensive. You've gotta build the whole thing, and you've gotta get it out there, and you've gotta use a lot of different people to build this confidence. And Calm has AB tested a lot of things over the years.
And a lot of things didn't work, and they're expensive to test. I think what you want to have is, like, other ways to really build conviction internally, to say, like, we're really confident in trying this thing out. And then we'll AB test it as well. But if you start AB testing everything, you spend so much time and energy, and then you get back signal that's just, like, thumbs up, thumbs down.
You don't know why those things didn't work. You can follow up with interviews and things like that, but that's also expensive. So I like the idea of getting to where people are, like, seventy percent confident in an idea. You can't be a hundred percent confident with just research.
Like, it's just not possible. And some of that signal, you know, you can trust. Right? But some of it, like, would they try this thing, or do you want this thing?
You can't get a hundred percent correct answer on that from users unless it's in the real world and they can, like, put their credit card down. So it is tricky, but I think the best thing to do is just, like, work with your team on, like, what are the questions we wanna get answered? What signal do we care about to move to that next phase? Because for the top two phases of, like, customer and problem, it's hard to argue with that.
You're just understanding. Right? You're just understanding who the people are, what they're gonna do, what gets in the way. Where it gets really tricky is that concept layer, and that's the hardest one to crack.
But it's where you get the most creative. It's where you gotta be a little sneaky. It's where you gotta ask questions kinda sideways.
And it's not a perfect science. It does come down to, like, you know, okay. We've got all this data. We've got hundreds of data points.
What do we feel excited about? Like, what do we think is gonna work? And then we build the conviction enough to test it out further. Does that answer your question?
It does. It actually makes a lot of sense, and I really appreciate you just taking the time to give such a thoughtful answer. But, yeah, I'll quickly get off the stage so that other folks can ask their questions. Thank you so much for your time.
Cool. Thank you.
Okay, Chase, I'll pull up another question. And then for the audience, if y'all want to keep voting on questions, that will just kinda help us prioritize them. This one has two votes. So, Chase, the question: why do you prefer a problem map versus a journey map?
Yeah. So good question. It is a journey map. Right? But I think there's a couple of things happening here.
One, it's just a branding thing. It's an internal branding thing. Right? Like, if you just say, like, oh, we made a journey map.
It's got a bunch of stuff in it. There's a lot of data. It's great. Right?
But I know we've all been in situations where we've seen these, like, really polished journey maps where it looks really great. It's like a lot of small text, and it's just all these different things.
But going back to the idea of, like, branding it around problems, it helps people build empathy, like we mentioned before. It helps people realize, like, hey, we're not just trying to look at something happening in the world. We're talking about a very specific problem someone's trying to solve.
And the reason why people pay for products is they're trying to make progress on a problem. So why would we map that entire journey when we could map the steps of the problem someone's trying to solve? So they're very, very similar, but again, it's kind of like rolling your own and coming up with a template that works in your organization. For us, we want to be really, really specifically focused on: this is the problem someone's trying to solve, this is where it's not working for them, and this is where our product is failing them and where we need to innovate, versus looking at something bigger.
And it helps you to map out, like, for our example for us, we do a lot of stuff with sleep. It's a big use case for us. We've mapped out sleep problems at multiple different levels. Right?
You've got the entire end-to-end journey of, like, what the different sleep things are, but then you've got specific problem maps for, like, reading more, or trying to wind down if you've been working until midnight, things like that. So it scales really well too. But at the end of the day, it's the same thing as a journey map. It's just, like, really making the team think about the problem, what gets in the way, and what progress people are trying to make.
Perfect. We can pivot to the next question. I just had it pinned.
Sorry. My pinned question went away. So I'm gonna pivot and share this one.
You mentioned getting cross functional teams involved for brainstorming sessions. Is there a set framework that you're using for that?
I wouldn't say a set framework, but I think the big thing here is, in that problem phase, you wanna make sure you're opening that up for people to either, like, sit in on those conversations or watch some of the UserTesting videos, so they're coming into that brainstorm phase with the context. Right? You can wrap it up in, you know, archetypes and things like that, and that helps them kind of just jump in and go. But I feel like the more face time people have had with your customers, whether it's, like, a recording or a live conversation, that's what helps them really understand how to make that brainstorm work.
So, again, this is, like, per product or per team, but we try to get people who are part of the company who maybe think differently from a designer or think differently from a product manager. We'll pull people in from, like, marketing, from content teams, from engineering.
So I don't have a framework for it, but the big things are: get people as close to the research as possible in a, you know, light-touch way where they don't have to spend a ton of time doing it. Get them looking at people's faces as they talk. Give them good constraints, and get a mix of different types of people in your organization so that you get lots of different ideas.
Perfect. And then for the next question, do you have advice on how to approach understanding what customer segments exist in creating archetypes from the ground up?
Yep. So some of that's captured in, like, the signal and the questions and stuff like that. But, high level: I don't love surveys for lots of things, but surveys are good for just understanding, like, how many people are here for anxiety versus meditation, how many people struggle falling asleep at night. So use surveys to kinda capture things like that. What do we do at Calm?
Every several months, we send out surveys where it's like, what was your main reason for downloading Calm? Like, why are you trying to get started with the app? And we've got the same set of options inside of that that we ask people over and over. It's changed a little bit over the years, but surveys are good for that.
And then once you see, like, the size of those different things, you can dial in more of, like, qualitative stuff. But also just starting to talk to customers. Like, you know, Calm is used by millions of people. Not every product is that same size.
Some are bigger, some are smaller. But even just, like, getting ten people on UserTesting calls or, like, a Zoom conversation will start to show patterns. Like, you know, my first project at Calm was defining those archetypes, and we did surveys, but then we just interviewed people. And we just kept seeing this pattern of, like, oh, it's hard to meditate. Okay.
We have to do other things. You know? So talk to people, surveys, start from there.
Awesome. We are coming up on time, so if anyone else wants to raise their hand and join the stage, feel free to do so now. But we do have several questions in the queue.
So I'll pull this one up now. After you've gone through this process, how often do you revisit each phase? What are some examples of when you might retest or go back to start phases?
Yeah. This is a great question.
For customers, you don't go back and revisit that phase a lot, but you'll keep talking to people as you go through the other phases and as you go forward in time. And what that helps you do is either, like, resolidify, like, oh, we keep hearing the same things over and over and we haven't solved them. Or over time, new solutions might have cropped up, whether they're direct competition or adjacent. So the customer phase and the problem phase don't get revisited a ton.
It's almost like as you step through the process, you revisit more. Right? Like, you're gonna have to try out more concepts. You're gonna have to build more features and test features more often.
Not every concept is gonna work. Not every feature is gonna work. So those get revisited a lot, but the first two phases of customer and problem are usually relatively evergreen, where you can lean on them over and over. Like, for instance, some of our personas and archetypes have been the same for four years and they haven't changed.
Like, other competition might crop up, but otherwise, still, people are stressed out. They're anxious because of x, y, and z. They try to solve it like this. They don't have a good solution to their problem.
So those are really valuable rounds of research that you can do in not a lot of time. They yield a lot of results, and they're really shelf stable for a long time.
Thanks. I think you may have answered this next question on a podcast episode before.
But does your team utilize trauma informed research practices when interviewing users about mental health concerns like anxiety or panic attacks? And if so, what does that look like?
I can't get into too many specifics around this, but we just try to be really respectful of people. We're just looking for really good stories, and we do have them test things out, like we saw before in those concepts. But we also have different divisions of the company. We've got, like, a Calm Health operation that's more clinical, and we do more stuff like that over there.
I'm on what we call the D2C side, where it's just anybody who's downloading the app.
We try to keep a lot of the UXR, you know, more casual, just understanding what you're trying to do and what problems you're trying to solve, and then we've got other teams internally that focus on some other elements of that. So I can't get into too many specifics, but, you know, we try to come at it from a lot of different angles and try to come up with a lot of different things that can help people out, because it's a big problem.
We started doing this research, like, during COVID, and things have not gotten better, you know, for that problem. It wasn't just a specific slice of time. It's a big thing that we're trying to do a lot to help with and really wrap our arms around, and there are just a lot of ways you have to approach it.
Okay. Well, we can wrap up early. I will just bring one final question onto the stage. It's from Logan who was with us earlier.
When is the best time for researchers to use AB testing?
Yeah. Again, this is a great question. I think this is almost not a research thing. Like, I, you know, could be wrong here, but it's gonna be, you know, company to company.
We prioritize stuff in terms of research objectives. It's like, is this research heavy? Is it research light? Or is it a product decision?
Or is it a design decision? And then we've done our earlier framework work of, like, defining the customer, the size of the customer base, those satisfaction gaps, what the problem is, getting some concepts at that concept level, testing the features. AB testing is a product job at that point. We say, hey.
You've gotten all the information. Research can't answer everything for you. So I think it's really important to know the things you can and cannot answer with qualitative research. AB testing is one of those things where it's really a product decision at that point.
Awesome. Well, thanks again for your time today, Chase, and thanks for everyone who was engaged in the chat and dropped questions.
With that, I'm happy to give you back to Calm to finish out your job for the day, but we loved having you over here with UserTesting for a few minutes of your day, or almost an hour, really. Anything else to close us out?
No. Just thanks to you all. And, yeah, I appreciate all the questions, and hopefully some of those templates will be helpful in that deck. But I appreciate everybody.
Thanks, Chase. Bye.
Bye.