Common usability testing questions

Posted on September 22, 2022
6 min read


A tricky part of qualitative usability testing is crafting test questions. The way you write or ask a question can shape the answer you get. There are plenty of usability testing tools available to help you. Still, it's easier than you might think if you use the funnel technique, which starts a usability study with broad questions and narrows in as you go, and follow these best practices: 

  • Create specific, thought-provoking questions 
  • Don’t ask leading questions 
  • Remind contributors to think out loud 
  • Don’t take answers at face value 

Before you start, here’s what you need to know about common usability testing questions. 

The two types of usability testing questions

There are two common types of questions for gathering feedback, and you'll need to choose between them: open-ended and closed-ended questions.

Open-ended questions

Open-ended questions encourage free-form answers. They cannot be answered by “yes” or “no” responses. They start with words like “how,” “what,” “when,” “where,” “which,” and “who.” 

TIP: Avoid “why” questions because they lead people to make up answers when they don’t have a response. Instead, say something like, “please tell me more about that.”

When conducting qualitative usability research, you want to ask mostly open-ended questions, because that's how you get human insight. A handful of interviews won't yield statistically significant results, so rather than collecting answers to analyze statistically, focus on getting richer data. 

Here are examples of open-ended questions:

  • How does this product feature make you feel?
  • What, if anything, do you want to change about this?
  • How easy or difficult is this process? Please explain your answer.

Closed-ended questions

Closed-ended questions have a set of definitive responses, such as “yes,” “no,” “A,” “B,” “C,” etc. These questions work well for unmoderated surveys or text-box responses because contributors don’t have to say much; instead, they offer validation (or a lack of it). Because the answers can be analyzed statistically, closed-ended questions are better suited to quantitative research than qualitative research.

Here are examples of closed-ended questions:

  • Does this product feature make you feel empowered?
  • Do you want to change anything about this?
  • Is this process easy or difficult? 

Common usability test questions

Now that we know the types of questions available to us, here are the types of usability test questions you need to know about: 

Screener questions

Screener questions, also known as screeners, are questions intended to evaluate a contributor’s qualifications and target specific groups of contributors. These multiple choice questions eliminate contributors who don’t qualify to participate in your study. 

Screeners let you find contributors based on demographics, statistical data collected about a particular population, by identifying variables and subgroups like the following: 

  • How old are you?
  • How do you describe your gender?
  • What’s your relationship status?
  • What’s your household income?
  • How do you describe your ethnicity?

You can also filter contributors based on psychographics, which categorize a population by characteristics like interests, activities, and opinions: 

  • How do you like to spend your free time?
  • What’s the last big ticket item you bought?
  • Have you ever boycotted a brand? Please explain. 
  • How many hours a day do you spend on your phone?

To get started, identify the right target audience before creating screener questions, which ensures you get actionable feedback. For example, if most of your customers fall into a specific age range or geographic location, these might be the parameters you set. However, if you want to hear from those who may not be so familiar with your product for an unbiased outlook, think about enlisting users from outside the usual demographic. 

The trick to effective screener questions is asking them in a way that identifies your audience without leading contributors to a particular answer or revealing specific information about your test. 

For example, if you’re testing a new mobile app intended for parents who live in the midwestern United States, you want to find contributors who fit that criteria. Instead of asking someone whether they live in the midwestern United States, you would ask, “In which of these regions do you live?” and offer answer choices covering many different areas. Add distractor answers to your screener questions to deflect attention from the correct answer. 

Here’s how we would find our target audience of parents who live in the midwestern United States via screener questions: 

  1. In which United States region do you currently reside?
    1. Northeast (Deny)
    2. Midwest (Accept)
    3. South  (Deny)
    4. West  (Deny)
    5. Alaska  (Deny)
    6. Hawaii  (Deny)
    7. None of the above  (Deny)
    8. I do not live in the United States  (Deny)
  2. Which of the following best describes your current status:
    1. Married with children  (Accept) 
    2. Divorced with children  (Accept)
    3. Never married with children  (Accept)
    4. Married with no children  (Deny)
    5. Divorced with no children  (Deny)
    6. Never married with no children  (Deny)

As you can see, we’ve woven the response we’re looking for with distractor responses to increase our odds of getting the right contributors without revealing details about the test or who we’re looking for. 
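The accept/deny logic above is simple to automate when recruiting from a contributor pool. Here’s a minimal Python sketch of that filtering step; the question keys and answer strings are illustrative, not part of any specific testing tool’s API:

```python
# Hypothetical screener filter: a contributor qualifies only if every
# screener answer falls in that question's "accept" set.
ACCEPT_ANSWERS = {
    "region": {"Midwest"},
    "status": {
        "Married with children",
        "Divorced with children",
        "Never married with children",
    },
}

def qualifies(responses: dict) -> bool:
    """Return True if all screener answers are accepted, False otherwise."""
    return all(
        responses.get(question) in accepted
        for question, accepted in ACCEPT_ANSWERS.items()
    )

print(qualifies({"region": "Midwest", "status": "Divorced with children"}))  # True
print(qualifies({"region": "South", "status": "Married with children"}))     # False
```

Note that distractor answers (Northeast, South, West, and so on) never appear in the accept sets; they exist only in the survey itself to keep contributors from guessing who you’re looking for.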

Pre-test usability questions

Now that you’ve set up screener questions to find the ideal contributors, it’s time to ask questions to learn more about your contributor before the test influences how they might answer. Pre-test usability questions give context to your contributor’s actions and test answers. They can be open or closed-ended questions. 

For example, you might want to know how experienced your contributor is with mobile apps before the usability study. This will help you better understand why they take specific actions. 

Here are some examples of pre-test questions: 

  • Tell me about your current role
  • Describe your family structure
  • What work-related mobile apps do you use?
  • How often do you perform a specific task?
  • How familiar are you with…?

Usability test questions

In-test usability questions are questions directly related to your testing objective. They should start general and get more specific. Always ask open-ended questions during your test. 

Whether qualitative or quantitative, usability testing helps you understand the what, why, and how behind your customers and their actions. You can discover bugs or errors, get customer feedback, know your audience, learn whether something works as expected, and more. 

Unmoderated usability test 

When running an unmoderated usability test, you want to ensure your questions allow for open and honest feedback. Letting contributors know when you’re open to critical or negative feedback is also a good idea. After a contributor finishes a task, here are some common open-ended questions to ask: 

  • What is your first impression of the task you just completed?
  • What, if anything, did you like about the experience?
  • What, if anything, did you not like about the experience? 
  • What, if anything, was unclear or confusing? 
  • Which of the two tasks did you prefer? Please explain your answer. 
  • What do you think about the process for [action]?

Moderated usability test 

When running a moderated usability test, the moderator can probe deeper into the contributor’s responses. A good rule of thumb is to stay quiet and let the contributor do most of the talking, but here are questions for promoting feedback: 

  • What’s your opinion on how the information is laid out?
  • I see that you [action]. Can you explain your thought process? 
  • Based on the previous task, how would you have preferred the information?
  • You seemed to rush through the last step. What were you thinking?

Follow-up usability test questions 

Follow-up usability test questions are a set of questions that end the session. These might include clarifying or probing questions. 

After a usability test, you have another chance to ask contributors about their experience for additional context. Get feedback on the experience overall and see if there’s something they want to talk about that you didn’t ask. Follow-up questions can be closed or open-ended. 

  • What was your overall impression of the experience?
  • What, if anything, surprised you about the experience?
  • If you could change anything about it, what would you change?
  • How did the experience compare to past experiences?
  • Is there anything we didn’t ask you about this experience that we should have?
  • What final comments do you want to make before ending this interview?


As you can see, no test is complete without usability questions. For inspiration, take a look at some more usability testing examples, or browse the UserTesting templates gallery to start testing. Complete with pre-built templates designed by research experts, it offers questions you can use as-is or customize to fit your needs.
