Those involved with the success of a website have an array of research tools and tactics at their disposal: focus groups, surveys, web analytics, usability testing, tree testing, A/B testing, and more.
It’s easy to drown in a sea of options. Everything from quantitative data (such as web analytics) to qualitative approaches (such as usability testing) can inform our web strategy and design decisions. It’s difficult to know when and how to use each option, and sometimes, there is no “right” answer.
Some methods that focus on collecting user feedback, such as focus groups and surveys, capture self-reported data. They capture what people say they like (or don’t) and whether they consider something to be useful, engaging, or well-designed (or not).
Unfortunately, we can’t use self-reported information as our only data source, because what people say they do is often different from what they actually do.
Also, we have to consider how users may be prompted to provide responses. For example, a required feedback form presented as part of a website experience likely won’t gather the most useful or valid data, and the way a question is written may bias responses and skew the results.
What’s most useful is observing what users actually do. This can be done using methods that capture behavior, such as web analytics or usability studies. Web analytics can tell us what people do on a website; usability testing can tell us why they behave that way. This is much more insightful than reviewing a self-reported average rating from an online survey.
Teams and stakeholders who work on an organization's website often have a few research tools or methods they’re most comfortable with. Researchers love usability testing and other methodologies that involve observing users as they complete tasks. Marketers and optimization experts gravitate toward web analytics, because conversion rates and other measures often speak for themselves. It’s very easy to go with what we know and love, even when there are better alternatives.
It’s important to keep in mind the options we have when approaching a research question.
Here at UserTesting, our clients often ask us to run a usability test to answer a specific question they have. Sure, a usability study will yield some information, but sometimes a different method is more appropriate. In those cases, we suggest an alternative approach (and support that effort), or we run a usability study to complement the data gathered through another research method.
For example, a client recently asked us to run a usability study to test the structure of their site. They wanted to be sure users could find what they were looking for. We suggested that they run a tree test (a method for evaluating the findability of items within a site) with a larger group of users, along with a usability test with a smaller group of users. Using both methodologies provided more insight than a standalone usability test would.
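To make the tree-test output concrete, here’s a minimal sketch of how findability results from a study like that might be tallied. The task names, destinations, and results below are invented for illustration; this isn’t UserTesting’s tooling, just a plain-Python example of turning raw tree-test records into per-task success rates.

```python
# Sketch: tallying tree-test results (hypothetical data).
# Each record is (task, destination_chosen, success).
from collections import defaultdict

results = [
    ("find pricing", "Products > Plans", True),
    ("find pricing", "Support > Billing", False),
    ("find pricing", "Products > Plans", True),
    ("contact sales", "About > Contact", True),
    ("contact sales", "Support > Help", False),
]

tallies = defaultdict(lambda: {"successes": 0, "attempts": 0})
for task, _destination, success in results:
    tallies[task]["attempts"] += 1
    tallies[task]["successes"] += int(success)

for task, t in sorted(tallies.items()):
    rate = t["successes"] / t["attempts"]
    print(f"{task}: {t['successes']}/{t['attempts']} found it ({rate:.0%})")
```

Low success rates on a task like this point to labels or categories worth probing in the smaller usability test.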
In an ideal scenario, you’ll have the option to combine more than one method or data source. For example, running a usability study to find out why a specific design performed better in an A/B test is the perfect marriage of methodologies.
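As a rough illustration of the quantitative half of that pairing, the sketch below checks whether a variant “won” an A/B test using a standard two-proportion z-test. The visitor and conversion counts are invented, and real A/B testing platforms handle this for you; it’s shown only to make the “what” side of the what/why split concrete.

```python
# Sketch: did variant B convert better than variant A?
# Hypothetical numbers; two-proportion z-test with the standard library only.
from math import erf, sqrt

visitors_a, conversions_a = 5000, 400   # invented control data
visitors_b, conversions_b = 5000, 465   # invented variant data

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Pooled conversion rate under the null hypothesis of no difference.
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))

z = (p_b - p_a) / se
# Two-sided p-value from the normal CDF.
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
# The test tells you *whether* B outperformed A; the follow-up
# usability study is what tells you *why*.
```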
There is no “golden child” in our toolbox of research methods or data sources. Every data source has its own set of strengths and weaknesses. The most successful research plans are those that integrate both qualitative and quantitative methods.
We have a lot of irons in the fire: methodologies, tools, and overall strategy. So, where do you go from here?
Try not to get overwhelmed. Start small. What project is coming down the line? What questions do you need answered? How can research help answer them?
Once you’ve tackled this first project, look ahead. What’s coming in the next quarter? What about the next year? And how can you start planning your research approach? Good research is thoughtful and planned ahead of time. If you spend the time to consider the big questions you're trying to answer, you'll be well on your way to a successful research plan.