Victoria Sosik
Senior Director of Experience Research, Verizon
John Hunoval
Associate Director - Consumer Insights, Verizon
Join Victoria Sosik and John Hunoval of Verizon as they share strategies to enhance User Researchers' toolkits. They'll walk through innovative approaches used at Verizon that blend the worlds of User and Market Research, effectively expanding the reach of each individual practice, and share case studies showing how they've implemented these ideas and how doing so has resulted in improved customer experiences and more commercially successful product development.
Thank you. Yes. So I'm Victoria.
I'm John.
And I think that was plenty of intro, so we'll just jump right in.
So raise your hand if you have read or participated in some discussion around this article that Judd Antin wrote on the UX research reckoning. Okay. I see some familiarity.
So, for those of you that were... oh, I'm not seeing any speaker notes. Can we grab those up there?
But for those of you that are not familiar with this, this was an article posted by Judd that made the case that UX researchers were not providing enough business value, and that was evidenced by the fact that we were disproportionately laid off compared to other functions in the past several years. Now... I think we're not quite... there we go. One more slide.
Now we're on. Great. So, he, you know, he makes this argument that the layoffs were kind of telling us we're not providing enough value. And, for him, it meant that we were doing the wrong kinds of research.
And he claims that the wrong kind of research was research that, again, wasn't providing that business value, and that we were spending too much time in middle-range research, defined as user understanding and product development. And that we were doing this at the expense of macro research, which is more strategic in nature, future thinking, providing concrete frameworks, and micro research, which is closer to technical usability, eye tracking, and kind of detailed interaction development. And again, his thesis is that in this middle range, the juice just wasn't worth the squeeze.
So a lot of this article did resonate quite a bit with me, as I had spent the past five years building and evolving what was then a small UX research team within an organization that was traditionally market research and insights.

And, you know, one thing that struck me right off the bat was how there tended to be a different worldview amongst the market researchers compared to the UX researchers. UX researchers tend to be user and customer first, focused on user problems, whereas the market researchers tended to lead with the business problem and what the business was trying to accomplish.
And it was kind of an interesting journey, as I mentioned, over the past few years, where I came to this team not really knowing what market research was, quite frankly, because at Google it really wasn't a thing.
There may be a few kind of market researchers hiding somewhere, but we didn't interact with them. And pretty much all the research was done by UX researchers, all the product research, and things that I think many people would consider market research.
So kind of going from not knowing what it was to trying to stake a claim for this small and growing UX research team within the market research team, trying to show that we could do more than just usability, because again, that was what the team kind of thought we were here for.

And stepping on toes, kind of causing some messiness. That's something that, you know, I'm happy to talk about offline over some wine. Did a lot of growing there.
And then towards where we are now, which is a much stronger collaboration, a valuing of each other's perspectives, methods, worldviews, and what we think is a much stronger value to the business. So, again, we're gonna show where we are now, what we think is the reason to incorporate these different lenses, these different methods, these different worldviews into your practice. But also just wanna acknowledge the fact that it took us a while to get here, and it was messy.

So with that, we're gonna spend the next twenty minutes starting with just a grounding in what's gonna be super familiar to everyone here, what you may see as a kind of typical UX research roadmap, and then highlighting some areas where we can expand that by layering on the market research lens. Then I'm gonna pass it over to John to really go into depth on some methods that are common in market research to answer those kinds of questions, and then share a case study of a strategic project that we collaborated on across disciplines at Verizon, and kind of how we were able to drive impact and what we gained from that collaboration.
So starting with a UX research roadmap. Again, I know that there's no one single one, but this probably looks familiar to many of you, starting with maybe foundational research, whether it be diaries, interviews.

That could then lead into some concept testing, where we evaluate features, maybe prioritize some features, get user reactions to a concept.
As development goes on, we probably do rounds of iterative usability testing where we improve the overall flows and experience. And then hopefully at the end, once we have an actual product in the world, we're doing some evaluative testing of the overall experience.
You know, again, throughout this, we're asking questions like what are our users' needs? What are their pain points? What do they like about an idea? What do they dislike? Can users complete core tasks?
How easy is something to use and are our users satisfied?
Do these feel familiar to you? I'm seeing some... okay. Cool.
But if we're to come back to Judd's argument, a lot of this work really does fall in that middle range, in user understanding and product development. And if we were to try to heed his call and shrink that middle range, what might our roadmap look like? You know, as I mentioned earlier, I'm a very strong proponent of layering on the market research lens. So we use that as an opportunity to highlight some other opportunities for macro and micro research. Again, not saying that there aren't any opportunities within the UX discipline, not saying that these things are not valuable activities at all, but just trying to go through a thought experiment of where we might be able to go beyond.

So again, layering on this lens, we can start asking questions more around our brand: how our brand is perceived, and whether customers think that we have the value and the right to play in certain areas. What's the competitive landscape? Where's an opportunity to innovate? What's the market opportunity, the business value that we think we can gain for our company in this landscape?

And what feature sets might provide the furthest reach to help us reach new customers that can then become, you know, a business opportunity for us? On the micro side, asking questions like, how much money will this thing make? How should we talk about this product in our communications and our marketing in order to really drive traffic? Are those messages resonating? And how might we optimize our pricing so that we can, again, increase our business value?
So now I'm passing it over to John, who's gonna go into more of the methods that are typical in market research to address some of these questions, and then kinda bring it together with how we work together to create this kind of comprehensive UX and market research roadmap within one of our biggest strategic projects.
Very good. Thank you, Victoria.
Alright. So, for the rest of the presentation, we're going to be talking about a few different market research areas and then methodologies that hopefully you can take away from this and utilize to enhance your practice. So we have three types of research up here, really four different methodologies, including attitudes and usage, what we call take rate analysis, and there are two different pieces within that that we'll talk through, and then finishing up with communications testing.
All different ways of expanding outside of the middle range and reaffirming research's value to the business.
Okay. So A&U, attitudes and usage. This is a survey-based methodology that oftentimes represents a macro approach to research, typically something that you would do at the beginning of a product development cycle or at a refresh.
And this really provides one of the widest apertures available in all of market research because you're exploring your customers, your prospects, your product, and your brand, which sets you up to make decisions across a number of different kind of major touch points. One is understanding who to sell to. So thinking about your total addressable market, who is that person? How big is this market? And importantly, what are those users' needs?
From a product development perspective, this is an approach that can illuminate the importance of specific features in your product, how they're resonating with your users, and then also, where you may be at risk given what your competition is doing.
And then from a marketing perspective, this is a method that can really help detail what your prospects look like, the types of things they value, and how they are either similar or different to your existing customer base. So then you as a researcher can make a recommendation on how to best expand your customer base.
So one question before I give you some insight into what we're showing on the right hand side. How many of you guys are iPhone folks? Okay. How many of you are Android users?
This is kind of the dynamic that we see within our customer base as well. And while we love our iPhone users, we also love our Android users. So one of the challenges that we face year in and year out is positioning our Android products in a way that we can stop, or at least slow down, basically, our whole customer base wanting iPhones. Diversity of device choice is a really important tenet of our business.
So we've utilized A&Us to dig into how we might effectively do that. So on the top, you're seeing a profiling, and this is all A&U data, of our Apple customers relative to our Android customers.

How are they similar? How are they different? And how do they self-identify? And then in the bottom example, you're seeing a profiling of our customers relative to how much they spend on their device, and then some key diagnostics and stats based on that profile. All to say, this is data that we use to position our Android devices.
This is data that we use to construct promos to support those products, and this is a method that we use to support how we actually communicate all of that to market.
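To make that kind of profiling cut a little more concrete, here is a minimal sketch of the idea in Python; the survey extract, segment labels, and budget bands below are invented for illustration and are not Verizon data.

```python
import pandas as pd

# Invented A&U-style survey extract: one row per respondent, with device OS,
# stated monthly device budget, and a self-identification item.
survey = pd.DataFrame({
    "os":       ["iOS", "iOS", "Android", "Android", "iOS", "Android"],
    "budget":   ["$20+", "$10-19", "<$10", "$10-19", "$20+", "<$10"],
    "identity": ["early adopter", "pragmatist", "value seeker",
                 "pragmatist", "early adopter", "value seeker"],
})

# Profile Apple customers relative to Android customers: how they self-identify
# and what they say they spend -- the same style of cut described on the slides.
print(pd.crosstab(survey["os"], survey["identity"], normalize="index"))
print(pd.crosstab(survey["os"], survey["budget"], normalize="index"))
```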
So many of us are familiar with the idea of concept testing as a way to gauge interest in an idea. And while we certainly do that too, our product teams are often tasked with this idea of building a business case to rationalize their investment in a potential product. So we oftentimes use concept testing later in the development cycle, when articulation, features, pricing, all of the things are baked, and then the testing becomes our primary input into what we call the financial business case, which is take rate analysis.
So concept testing in this instance centers on action metrics. Because we're using this method later on in the development cycle, we wanna understand how many customers, how many prospects are interested in signing up, subscribing, purchasing something from us.
And like any concept test, overstatement of that intended action is really an expected outcome here.
And that's sometimes fine. Overstatement of action is sometimes fine when you're not utilizing your data as an input into a financial business case. But when it is, we wanna make sure that those estimates are as accurate as possible. So we find the best practice is to calibrate our data, and that's what you're seeing on the left hand side here. These are four different potential new product ideas, all fine-tuned using what we call an awareness-based calibration scheme. Basically, we're detailing the marketing funnel, so all major touchpoints that marketers could impact prior to the decision to purchase.

And we're honing our estimate of how many people will actually follow through on what they're telling us in a survey.

And from utilizing this method, we've gotten a lot more accurate in estimating not only take rates, but the revenue that is attached to those take rates.
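To illustrate the general shape of an awareness-based calibration, here is a minimal sketch; the funnel stages, rates, and intent discount below are hypothetical placeholders rather than the actual scheme described in the talk.

```python
# Illustrative awareness-based calibration of a stated take rate.
# All funnel-stage rates below are hypothetical placeholders.

def calibrated_take_rate(stated_intent: float,
                         awareness: float,
                         consideration: float,
                         availability: float,
                         intent_discount: float = 0.5) -> float:
    """Discount a survey 'would subscribe' share by the marketing-funnel
    stages a real buyer has to pass through before purchase."""
    return stated_intent * awareness * consideration * availability * intent_discount

# Example: 30% say they would subscribe in the concept test, but we assume
# 60% awareness, 70% consideration given awareness, 90% availability, and a
# 50% discount on stated intent based on past launches.
estimate = calibrated_take_rate(0.30, awareness=0.60, consideration=0.70,
                                availability=0.90, intent_discount=0.5)
print(f"Calibrated take rate: {estimate:.1%}")  # roughly 5.7%
```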
So another, much more intensive way to do take rate analysis is Conjoint.
How many of us are familiar with Conjoint?
Okay. That's good. So Conjoint not only allows us to understand how many customers or prospects might be interested in purchasing something, but it allows us to understand the buying dynamic in more detail. So how are consumers actually thinking about making a purchase, and how do the things that we are positioning in our products inform or change the decision making around a purchase?

So we're using an example here from one of the device protection offerings that we've built over the years. And using Conjoint gives us as researchers really two primary outputs. The top example here is a percentage-by-percentage breakdown of how the various component pieces of a product are actually contributing to the decision to purchase that product. A really cool visualization there. And then on the bottom, this is probably what you're most used to getting out of that research process if you've heard the term conjoint before, and it's a simulator.
And a simulator enables us to do something that we call, scenario planning. So we're able to go in and play with different variables and say, hey. This is how one of these products constructed in a certain way might resonate in the marketplace. We're also able to say, hey. We're thinking about offering two things that are kind of similar. Let's measure and look at something like the potential for cannibalization.
Or we're able to say, we're thinking about offering product x. Our competitors, we already know offer product y. How might those two things coexist in the marketplace?
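A minimal sketch of the kind of logic a conjoint simulator runs on, assuming a standard multinomial-logit share-of-preference model over part-worth utilities; the offers, features, and utility values below are invented for illustration.

```python
from math import exp

# Invented part-worth utilities for a device protection product: an offer's
# total utility is the sum of the utilities of the feature and price levels
# it includes.
offers = {
    "Our offer A":      {"screen_repair": 0.8, "same_day_replace": 0.6, "price_$15": -0.4},
    "Our offer B":      {"screen_repair": 0.8, "price_$10": -0.2},
    "Competitor offer": {"same_day_replace": 0.6, "price_$12": -0.3},
    "No purchase":      {},  # the outside option anchors the simulation
}

def share_of_preference(scenario: dict) -> dict:
    """Multinomial-logit shares: exp(total utility), normalized across offers."""
    expu = {name: exp(sum(utils.values())) for name, utils in scenario.items()}
    total = sum(expu.values())
    return {name: value / total for name, value in expu.items()}

# Scenario planning: run the full lineup, then drop offer B and see where its
# share migrates -- a crude read on cannibalization versus competitive loss.
baseline = share_of_preference(offers)
without_b = share_of_preference({k: v for k, v in offers.items() if k != "Our offer B"})
for name in baseline:
    print(f"{name:18s} baseline {baseline[name]:.1%}  "
          f"without B {without_b.get(name, 0.0):.1%}")
```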
And from a financial perspective, this gives us the ability to really understand the trade-off between including certain things within a product relative to the pricing decisions that may be required to offer those things, ultimately providing us a very specific input that we can then take to our finance partners and put into something like a financial business case. Alright. So communications testing. As you're kinda drifting along the road map and approaching the far end, communications testing represents a macro... I'm sorry, a micro research activity that's really critical.
So this is a method that helps you understand further what a customer values and what aspects of a product or service they're resonating with. And I think, importantly, it allows you as a researcher to say how much we need to focus on the product itself relative to the brand itself, so that our message, what we're putting out there in the marketplace, is as impactful as possible. And this is an exercise that we go through for everything that Verizon offers, things that you see on TV as well as things that are marketed and exclusively communicated digitally.

And there are two different versions of communications testing. The top example is the simple version, a self-reported diagnostic approach to comms testing where you're measuring the communication on key categories, things like resonance, things like engagement, things like connection, or how potent the message actually is.
On the bottom, you're seeing the detailed version of this. So in this example, we're actually able to measure, at various points throughout a communications piece, how consumers are engaging with or feeling about what they're experiencing. Think of this as akin to kind of old school dial testing, but in the twenty-first century.
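As a toy illustration of how that moment-by-moment data can be summarized, here is a small sketch with invented ratings, assuming each respondent rates engagement at sampled points while watching a thirty-second ad.

```python
import pandas as pd

# Invented dial-test-style data: one row per respondent per sampled second of
# a thirty-second ad, with an engagement rating from 0 to 100.
ratings = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "second":     [0, 15, 29, 0, 15, 29, 0, 15, 29],
    "engagement": [50, 78, 40, 55, 82, 35, 48, 75, 42],
})

# Average the trace across respondents, then flag the peak and trough moments
# so the creative team knows which beats of the ad to keep or rework.
trace = ratings.groupby("second")["engagement"].mean()
print(trace)
print("Peak second:", trace.idxmax(), " Trough second:", trace.idxmin())
```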
Another thing that this does, if you utilize communications testing, is it positions you nicely as a researcher between your product teams and stakeholders on one side and your comms-based marketing team on the other, so you can kind of speak both languages. But also, if you're an organization that uses marketing mix modeling, you can take a lot of these insights and put them directly into that method. And there's a whole other talk in marketing mix modeling, so I won't get into the details.

But this is a statistical and mathematically based estimation scheme that takes research data, media data, as well as marketing mix data, and it looks to estimate how impactful a certain communication is on sales. So you can say that commercial that you just saw on TV contributed x, y, or z to the revenue that we're seeing show up on a P and L statement.
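Marketing mix models are typically regression-based; the sketch below is a deliberately simplified toy version of the idea, not the model described here, regressing weekly gross adds on media spend and a comms-testing creative score (all numbers invented).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy weekly data (purely illustrative): TV spend ($k), digital spend ($k),
# and a comms-testing creative score, against weekly gross adds.
X = np.array([
    [200,  80, 70],
    [250,  90, 72],
    [150,  60, 65],
    [300, 120, 80],
    [220,  85, 75],
    [180,  70, 68],
])
y = np.array([10500, 11800, 9200, 13500, 11200, 9900])

model = LinearRegression().fit(X, y)
tv_coef, digital_coef, creative_coef = model.coef_

# A rough contribution read: incremental adds attributable to week-1 TV spend.
print(f"Estimated adds per $k of TV spend: {tv_coef:.1f}")
print(f"Week-1 TV contribution: {tv_coef * X[0, 0]:.0f} adds")
```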
So let's bring this to life. We bring the worlds of user research and market research together as often as possible.

But I'd like to use our in-market plan construct, we call it myPlan, as a way to show how we actually do this in practice.
So each year, we go through and we refresh the plans that are available to customers. You know, some years, we might be doing something simple like adding a new tier, adding a new inclusion. Other years, we're talking a much more serious overhaul.
This is always a very intensive effort, includes most organizations within the enterprise. We're talking CX, research, design, strategy, marketing. You name it, they are included. And this is an effort that stretches all the way up to the c suite. So a really, really intensive piece of work.
So rewinding about four years at this point.
No. Two years at this point. Twenty twenty two hits, and our previous plan architecture had been in market at that point for about four or five years. And based on some of our tracking data and a desire to evolve what we were offering in a more major way, leadership makes the decision that it is time for a complete redo of the plan architecture.
So really, really starting from scratch, meaning everything was on the table. The one nonnegotiable here was that we had to center this architecture, this plan construct, around an unmet consumer need. So in order to help our stakeholder teams execute on this challenge, we developed a combined team of user researchers and market researchers to plan a road map that allowed research to stretch all the way across this very, very intensive product development cycle.
We wanted research to act as a connection point between all of these various stakeholder organizations that were involved in this project. And then within our own research organization, really foster, an environment of direct collaboration.
So even if a study was more user research focused or a study was more market research focused, this was a combined team. All aspects of that team were involved in each research activity so insights could translate across everything that we were doing, and do so with velocity. So really resulting in an approach that yielded a lot of insights about the market as well as our customer.
So the road map. The road map started, and the first major phase here is what we call exploration. The major objective here is to identify an unmet consumer need to ultimately center this plan construction around. So we led off with an A&U, market research led first steps. And in looking at the market, we were seeing a lot of signals that indicated that control and customization were two things that consumers felt were underserved, yet they valued them and wanted more of them in their wireless plans.

So we took all of that data, and then our user research team came in, and they led a huge internal work session with all of these various stakeholder groups to digest that information and start the brainstorming process about what our plan construction might look like to help address that unmet need. From there, we conducted some focus groups to pare down our nine finalist options, paired with a bit of concept testing as well, so we could understand where the energy with our consumers was and what we might want to take into future testing.

So once a plan structure was agreed on, initial prototypes were built and both navigation and content testing began.

The nav testing focused on the buying experience, the purchase flows, while the content testing was really looking to evaluate the terminology that was being used to address both core as well as unique aspects of these new plans, and there are plenty of unique aspects of these new plans. And then from there, we finished up this initial plan structure phase by doing some additional focus groups that looked to see if value was being communicated through these plans and if those core tenets of control and personalization were actually coming through.
So once the primary elements of the plan were constructed and buy flows were established, we moved into fine-tuning the value proposition and the experience, and this tends to be where middle-range research creeps in. So you're getting tight to launch, the clock is ticking, and as a researcher you start getting a lot of those middle-range type requests. I'd like to do concept testing. I'd like to do usability testing.

Oh, wait. We need to be doing more usability testing. Those types of things, things that are often check-the-box types of requests, that could oftentimes prompt interesting results but are not seen by the business as delivering as much value as research that plays at the front of this road map or the back of this road map. So our goal here, again, was to develop a road map that spent a lot more time outside of the middle range, and to do so intentionally.
The final testing phase. So this represents the most unique methodology and I think highlights the true power, the true combined power of user research and market research. We call it dynamic stimulus testing. So this is a single methodology that allows us to produce take rates that are high fidelity enough that they can go into the financial business case, while at the same time conduct usability testing. And the way that we do this is we work with our design partners to create a very high fidelity, clickable, and trackable prototype of the entire plan and phone purchasing experience, and we bake it into a survey.
And as a consumer is going through, we can see all from a click through perspective which customers are taking which plans while seeing how well they are actually able to complete a task.
So from a single methodology, we are getting two very diverse and high fidelity outputs.
So, you know, a a really cool application of these two worlds coming together in real life.
And we feel this is, I think, as close as you can get to doing in-market testing and learning while still maintaining an experimental environment.
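A minimal sketch of how both outputs can fall out of one session log, assuming the clickable prototype embedded in the survey records plan selection, task completion, and time on task; the field names, plan labels, and figures are hypothetical.

```python
import pandas as pd

# Hypothetical per-respondent log from a clickable, trackable prototype
# embedded in a survey: plan selected (if any), task completion, time on task.
sessions = pd.DataFrame({
    "respondent":      [1, 2, 3, 4, 5, 6],
    "plan_selected":   ["Plan A", "Plan B", None, "Plan A", "Plan B", "Plan A"],
    "task_completed":  [True, True, False, True, False, True],
    "seconds_on_task": [95, 120, 240, 88, 210, 101],
})

# Market research output: simulated take rate per plan (share of all respondents).
take_rates = sessions["plan_selected"].value_counts(normalize=True, dropna=False)

# User research output: task completion rate and median time on task.
completion = sessions["task_completed"].mean()
median_time = sessions["seconds_on_task"].median()

print(take_rates)
print(f"Task completion: {completion:.0%}, median time on task: {median_time:.0f}s")
```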
Last phase, communications optimization. This was primarily a market research led touch point. But, again, user researchers are part of this team. They are involved in all research activities. So while we went through these various phases of comms testing, our user researchers were intimately involved to make sure that insights, and the ways we were seeing consumers describe certain aspects of these plans, were actually showing up in the communications and ads that we were developing.
So why do this? Why go through the process of looking to, incorporate market research into your practice? There are really three different buckets of benefits that we're seeing. Let's focus on the first two, consumer benefit and business benefit. So because this was a combined, user research, market research led road map, we had a really good and holistic understanding of the market and our customer, which ultimately enabled, our business teams to build a plan construct that for the first time in the industry allowed a customer to build their wireless plan from the ground up, really addressing a key need that we identified and we saw in the marketplace.
Based on some survey research, more than half of customers who have migrated over into this myPlan structure are reporting that it provides them more value than their previous plan construct.
And then one year post launch, so this is about May of twenty twenty four, at the time, more than forty percent of customers had already migrated into myPlan, just reaffirming the construct's value. There's something there that's actually providing them a benefit.
From a business perspective, q one twenty twenty four, as well as a couple of days ago during our q three earnings call, these are the two best quarters that we've had for net additions, net additions being, like, the lifeblood of our industry and really what moves stock price. The two best quarters that we've had for net additions in, I wanna say, four or five years.

And much of that success is attributed to myPlan, which again came out of this joint road mapping research experience.
And then lastly, the new construct has exceeded a number of different KPIs, and goals set by the business. So from a consumer perspective, benefits, and then from a business perspective, we're seeing benefits.
But there's also the benefits to a research organization.
One thing that we're seeing is that because you are providing a higher degree of coverage across the road map, research now can act as a basis for decision making in many different areas that it previously may not have been able to.
It also creates connection across different aspects of the business. It allows research to break down silos, and silos is a huge theme of the day. Break down silos across organizations and, again, enable conversations that otherwise may not have happened.
It also allows you as a researcher to tell a more complete story. Because you're covering more of the road map, you can tell an end-to-end story about how your customer is perceiving something, how they are interacting with something, and ultimately, when it gets to market, how they are feeling about that experience. And because of that, as you think about managing up and kind of furthering your influence as a researcher, telling an end-to-end story is a really effective technique for socializing with executives, something that they really like to see. And then last but not least, we've seen improved business results.
So there's there's some monetary advantage to doing this as well.
Alright. So so how do you take everything that we've talked about today and put it into practice? There are a lot of ways. These are just a few of them. There's the explore and learn side, and then there is the collaborate side. On the explore and learn side, find a market researcher within your organization, chat with them, pick their brain about how they might think about addressing some of the key problems that you're thinking of, or even build on top of a research approach that you might be putting into action.
Try it out. There's a ton of really great DIY tools out there. Don't be scared.
Experiment on your own. It's not brain surgery. It's just research, so try it out.
Seek out formal training. There are a lot of really great learning organizations out there that train on the concepts behind some of the things that we've talked about, or go as specific as training on the different methodologies we've mentioned, something really specific like Conjoint. There are actual training courses for that. And then from a collaborate perspective, once you find a market researcher, see how you can sequence some of your work with their work in a way that iterates and builds on each other.
Try what we've tried. There's a lot here. We understand this is a process that, you know, it's taken us years to actually get to, but create a joint research plan.

You can always say, you know, this isn't working, and kinda back out of it. And then if all else fails, hire a vendor. Again, there are a ton of really great market research vendors out there. I'm happy to talk to you about, you know, our experiences with a number of them, see how they can help have those conversations, and get their perspective on how they might see some of the challenges that you're looking to address.
So what we hope that you're taking away from this is that, you know, we do think that, you know, as UX researchers, we should be looking to provide maximum business value through the work that we do. And that widening your aperture, thinking about other lenses, can really help us deliver value in new ways, and kind of focus out on some more macro and micro research. And, you know, personally, we think that the the market research discipline really does provide some interesting ideas, some interesting kind of inspiration on how we can do that. And I think we have a few minutes left for questions.
Thank you both for this talk, lecture, ideas.
My question is, how do you approach this from a UX researchers perspective, to not step on the toes of market researchers, especially if they're wearing multiple hats at the company?
Yeah. I mean, I think... first of all, I did that for, like, a year. Right. And, like, caused some organizational angst.
Again, I think it was necessary organizational angst, but I think leaning into the collaborate side. Right? I'm sure you can find a friendly market researcher, right, who's willing to kind of talk and, you know, just figure out how you might be able to piece things together. I'd say, like, the holy grail in my opinion is where we kinda landed.
And it takes the right kind of leader for this. But between my team on experience and, at the time we didn't have this hybrid model, all of market research was in a separate team.

We, as leaders, got to the point where we were like, look, there's a huge gray area. Let's just, like, own that. There are a few things that people would agree are market, a few that they'd agree are UX. But realistically, there's a lot that we can both do.
We may do them differently. We may bring different, lenses to it, but, like, we can both do it. So let's just talk about it. Right?
And, like, own the fact that we're going to have some back and forth, and we don't need to clearly delineate. We can show up together in a way that we both get more credit. So, you know, I think it's having open conversations, being willing to give up a little, and being willing to push the boundaries. And you're gonna make mistakes, and you can recover from them.

So not super straightforward, but I think you just, you know, feel your way through it.
And the only thing I would add is this is an exercise in relationship building. And then once you get there, this whole process becomes, much easier. Yeah.
So I have a question about just the overall process. How did you all overcome any roadblocks or speed bumps that popped up during your market research and layering that into your research road map?
I'm trying to think what roadblocks... I mean, the myPlan one was actually pretty smooth. Some of the typical ones that I see are competing timelines. Right? There are times when there's a qualitative exercise that we're gonna do that's gonna be a few months long, and market research is gonna do a survey that's gonna have some overlap, and they're gonna be done and we're not gonna be done yet.
So I think there are some timeline pressures there, but by just, again, talking through how these things are gonna sequence and really mapping them out, you can help alleviate that by calling out other research across market and UX in, like, next steps. Right? As opposed to only focusing on next steps UX or next steps market. That's huge.
It's a great opportunity to triangulate across. And there are times when we find conflicting insights, right?

It happens. Again, different methods, different perspectives. So I think being really honest and transparent about the pros and cons of any given method, and trying to just use those conflicts as opportunities for discussion and to drive conversation and, like, a strategic outcome, even if the insights themselves may seem contradictory. Usually, there's an explanation, and that actually can lead to interesting conversations. Those are two.
Yeah. And I think the tension is actually a benefit in this instance, where insights don't necessarily match up perfectly from the market and user side. It begets more research. It begets questions. It begets conversation, and it might take us down a different path that ultimately could yield an insight that otherwise may not have surfaced. So I view the tension, and just the challenge of all of this, as the real opportunity here.
Alright.
Real quick. So you mentioned market research and user research. Did you have to consider any other groups, such as customer experience or an innovation team, that may also be conducting some other type of research? And there's also, of course, the data analysts. There's always information out there.
So just curious if there was any consideration, or did that happen with you as well?

Every day of our lives, at least, I believe.
Yeah. And this was clearly a bit simplified, abstracted away. Right? Because, yes, as John mentioned, we were interacting across the enterprise.
There was even a b to b version of myPlan happening at the same time, which we didn't even touch on. But we were trying to kinda connect the dots between the consumer and the b to b as well. So, yes. We did.
I think approaching them in similar ways. Right? Thinking about how you add those activities into the roadmap, how you triangulate insights, how you explain conflicts... like, the same things hold, we just couldn't fit it all on the slide.