To create user-centered products, the user must be at the forefront of the design and development process. As products are developed, one way to ensure that users will enjoy and benefit from them is to conduct usability tests and to ask better usability testing questions.
Usability testing helps teams create products and services for people, ensuring that what’s produced is fit for purpose, better than the alternatives, and satisfying to use.
To conduct effective usability tests, asking your participants the right questions is critical. This guide can help you create better usability testing questions that deliver more accurate and beneficial feedback.
User-centered design is the process of creating products with human behavior in mind. The process ensures that products are easy to use, solve key issues, and provide seamless experiences.
As part of user-centered design, teams allow users to test products and provide feedback. This helps designers optimize their creations and ultimately produce products that will better satisfy the final customers.
The process of usability testing is based on a number of core goals. These are to:
Ensure users can easily perform intended tasks
Gain awareness of potential issues, challenges, and areas of friction
Discover opportunities for improvements
Learn more about user behavior, needs, and pain points
Optimize products to stay competitive in the marketplace
Commonly, usability testing is used for testing websites, apps, and software. It can also be used to test physical products, services, and IoT experiences.
Usability tests provide a significant amount of essential information for UX, design, and development teams.
Usability tests usually answer internal team questions, including:
Do our users find our product simple to use?
Are our users engaged when using the product?
Did our product satisfy our users?
Are there issues with the product that need to be resolved?
Could we deliver a better user experience?
These questions form the basis of the usability test, which can be thought of as a process for answering them.
Two main types of usability questions exist, and it’s helpful to decide which type you’ll use for the tests you run.
Typically, a combination of both is most effective. With a less forthcoming participant, for example, a closed-ended question can be easy for them to answer, even though it won’t yield the most detailed data.
Mixing the two, and leaving pauses that encourage participants to fill the silence with their own thoughts and ideas, works best.
Closed-ended questions have predefined answers such as ‘yes,’ ‘no,’ or ‘I don’t know.’ They can be helpful for consistency and for collecting accurate data quickly, but they don’t encourage the user to elaborate or offer additional information that could be useful.
An example of a closed question is: Did you enjoy using this product? (Yes/No)
Open questions do not have a defined answer. They encourage the participant to give reasons and additional thoughts when responding.
While these questions can be more difficult to categorize and measure, they give color to the data. Through open-ended questions, you may gain off-the-cuff insights and spontaneous responses that you wouldn’t otherwise.
An example of an open-ended question is: What are some ways we could improve this product?
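If your team keeps its test scripts or discussion guides in a structured format, the two question types are easy to encode side by side. Here is a minimal sketch in Python; the field names and structure are illustrative assumptions, not the format of any particular testing tool:

```python
# Minimal sketch of a test script that mixes closed and open-ended questions.
# Field names ("type", "prompt", "options") are illustrative, not tied to any
# specific usability testing tool.

questions = [
    {
        "type": "closed",
        "prompt": "Did you enjoy using this product?",
        "options": ["Yes", "No"],
    },
    {
        "type": "open",
        "prompt": "What are some ways we could improve this product?",
    },
]

# Closed answers are quick to tally; open answers need qualitative review.
closed = [q for q in questions if q["type"] == "closed"]
open_ended = [q for q in questions if q["type"] == "open"]
print(f"{len(closed)} closed, {len(open_ended)} open-ended")
```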
Whether you’re writing usability testing questions for a website or an app, asking the right ones increases the chance of improving your product.
If your questions are too restrictive, you might miss the bolder insights that lead to a more marketable product. On the other hand, questions that are too vague may confuse participants or lead them to give information that isn’t useful to you.
The right usability questions are critical as they can:
Help you better understand your customers
Highlight core issues that need to be resolved
Lead to optimization that can make or break the product
Getting your usability questions right, then, is essential. Some of the best questions include the following:
Before diving into the usability test, learning about your participants is best practice. That way, you can align your findings with your specific audience and understand what is or isn’t relevant.
To gain clarity on your target market, narrow down your core audience and identify issues that might impact this group. Demographic questions help you to do this at the outset.
Ideal demographic questions include:
What is your age?
What is your occupation?
Who lives in your household?
What is your level of education?
To explore how technically savvy and informed your users are, and how relevant the problem is to them, questions about prior knowledge can help.
Ideal prior knowledge questions include:
How technically savvy do you consider yourself to be?
How do you normally do this task?
Would you consider using a digital product to solve this problem?
How relevant do you think this problem is to you?
Oftentimes, being too direct with your questions can produce misleading answers. For example, a participant may say they are “very tech savvy” yet not understand how to log into their email account. This can come down to differing frames of reference; if none of their friends even have email accounts, the bar they measure themselves against is low.
To mitigate such mishaps, smarter, more specific questions can reveal exactly where participants are coming from. For example, to gauge how tech-savvy they are, you could ask questions like:
What kind of phone do you have?
What’s your favorite app on your phone, and why?
How much time do you spend on your phone on an average day?
What percentage of your work do you have to do on a computer?
Where a tech-savvy person may respond with things like “the latest iPhone,” “Coinbase,” “6 hrs,” and “all of it,” a less than savvy person may say “I don’t know,” “Candy Crush,” “2 hrs,” and “none of it.”
As your users complete the usability tests, observing their behavior is essential—typically monitored by a UX researcher. To gain insights into the user’s thoughts, feelings, and concerns, it’s critical to ask questions, too.
The main question you need to ask in a user test depends entirely on the task you are testing. Keep in mind you’re testing the technology and not the user. Therefore, first, determine the task you’d ideally expect the user to perform in the app. Then decide how to prompt them into trying to do that task themselves.
For example, if we want to see whether a user can sign up without our help, we can ask, “If you wanted to sign up, what would you do on this screen?” We then show them our sign-up screen and wait.
You could also remind them to think aloud. If the user looks to you for help, or even asks you directly, politely remind them that you aren’t there to help but to observe, and that as long as they’re thinking aloud, they’re doing great.
Once you’ve completed the main task you’re testing, the next questions to ask in usability testing could include the following:
What were your first impressions?
Did you find this [app/website/product] simple to use?
Were you able to find the information you needed?
What were you expecting to find on this page/within this feature?
How simple or challenging is the process to navigate?
What are your thoughts when considering the design and layout?
Would you prefer the process to be simpler or smoother?
What motivated you to make this choice?
After the testing is completed, it’s very useful to gain the user’s final thoughts. What did they like and dislike? And what did they wish was better? These questions help you to discover what the user feels after using the product. This can help indicate whether they would want to come back and use it again.
Best practice questions include:
What are your overall thoughts about the product?
What did you not like about the product?
Are there specific things you enjoyed when using the product?
What do you wish would have been better?
Would you recommend this product to a friend or colleague?
Moderated tests involve a person or group of people overseeing the testing process. These tests can be held in person, such as in an office or testing center, or remotely, such as over the phone or on a video call.
Moderated tests are ideal for observing behavior in real-time, for asking a series of questions and follow-up questions, and for a deep understanding of the users. Moderated tests are also ideal if the user needs help at any point to move on to the next stage.
In unmoderated tests, users complete tasks set out for them in advance, remotely and without guidance. The tests are usually recorded so that the team still has the chance to observe behavior.
The benefit of unmoderated testing is that the user can complete the test at a time that suits them—timezones, scheduling, and conflicts are not an issue.
Unmoderated tests, however, can have challenges. If users become stuck, no one can help them move forward. It’s also challenging to ask follow-up questions or gain live feedback as the user progresses. Some of the data collected, then, may be limited.
Given that unmoderated questions must be written and delivered in advance, with no one in person to ask follow-ups, it’s important to tailor them for this testing type.
Unmoderated questions should be efficient for the user to complete—closed questions are favored over open questions—and they should relate specifically to the project's overall goals with no ambiguity.
Some best practice unmoderated questions include:
How simple did you find this process? (on a scale of 1 to 10)
How much did you enjoy using this product? (on a scale of 1 to 10)
Which process did you prefer to complete? (multiple choice answers)
Were you able to overcome any challenges that occurred? (‘yes’ or ‘no’)
What’s your overall impression of the product? (‘positive,’ ‘neutral,’ or ‘negative’)
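If you deliver unmoderated tests through your own tooling, or export the responses for analysis, closed questions like those above translate naturally into structured data. Below is a minimal sketch in Python; the question identifiers, field names, and answer options are illustrative assumptions rather than any real platform’s API:

```python
# Sketch of an unmoderated, closed-question set and a simple tally.
# All identifiers, field names, and options here are illustrative assumptions.

survey = [
    {"id": "ease", "prompt": "How simple did you find this process?",
     "format": "scale_1_to_10"},
    {"id": "enjoyment", "prompt": "How much did you enjoy using this product?",
     "format": "scale_1_to_10"},
    {"id": "preferred_flow", "prompt": "Which process did you prefer to complete?",
     "format": "multiple_choice", "options": ["Flow A", "Flow B"]},
    {"id": "overcame_challenges",
     "prompt": "Were you able to overcome any challenges that occurred?",
     "format": "yes_no"},
    {"id": "impression", "prompt": "What's your overall impression of the product?",
     "format": "multiple_choice", "options": ["positive", "neutral", "negative"]},
]

def average_scale(responses: list[dict], question_id: str) -> float:
    """Average the 1-10 ratings a single question received across participants."""
    values = [r[question_id] for r in responses if question_id in r]
    return sum(values) / len(values) if values else 0.0

# Example: three participants' ratings for the "ease" question.
responses = [{"ease": 8}, {"ease": 6}, {"ease": 9}]
print(round(average_scale(responses, "ease"), 2))  # 7.67
```

Keeping the formats this constrained is what makes unmoderated data easy to compare across participants, since no one is present to interpret ambiguous answers.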
In usability testing, you ideally want to see how participants navigate a series of tasks while you observe their behavior. This shows how they would use the product without any assistance. The aim is to recreate, as closely as possible, the environment they’d be in if you weren’t there, so you can see whether the solution works for the user.
If a participant asks you a question, such as how to use a certain feature, ask them to think aloud and walk you through what they would do if you weren’t there. Note the trouble spot, as well as how the user expected it to work or look. These points of difficulty are the most important output of usability testing.
Understanding those problems, and fixing them to better match expectations, gives design and development teams clear direction on what to build next. Ideally, you want participants to move through the tests on their own while you take notes.
Some questions are best avoided, too. That’s because they can cause conflicting data, inaccurate responses, or answers without much benefit to the team.
One core question type to avoid is leading questions. Leading questions either contain the answer or encourage the user to answer differently from how they would have.
Take the question, “What did you enjoy when using the product?” A moderator asking a leading version might instead say, “It seemed as though you enjoyed using the product.” By implying the answer, the questioner leads the participant to answer in the affirmative, restricting the genuine, unprompted answers that provide true insight.
Vague questions can also be problematic. Ensure your questions relate specifically to the product and don’t drift into generalities. Otherwise, the data you collect won’t support decision-making as well as it could.
Letting users test your products gives you new context and understanding. It can bring a fresh perspective and surface insights and ideas you may never otherwise discover.
Usability tests help ensure that your products are simple to use and will ultimately delight the marketplace. Asking the right questions during a usability test can be the difference between testing that drastically improves your product and testing that doesn’t.
Questions that get into your users' heads will ensure that you can streamline the offering, smooth out any issues, and stay competitive in a crowded product market.
While some may use the terms interchangeably, user testing and usability testing are two distinct concepts.
User testing is a process organizations use to ensure a market exists for their product. This is used to understand whether the proposed solutions solve a real-world problem for users.
Usability testing is a technique used by organizations to promote people-centered products. The process ensures that the products created are easy to use, fit for purpose, and better than alternative options in the market.
Ultimately, user testing asks, “Is there a market for this product?” while usability testing asks, “Is this product usable for our customers?”
Usability testing is used broadly, not just on websites. It’s commonly conducted for apps, digital products, and websites, and it’s also used to gather information about experiences, services, and physical spaces, among other things.
Typically, a UX researcher or a UX research team will conduct usability tests. A neutral moderator—either someone from the team or an external resource—will sit in during the testing process to ask questions and take notes.
Once the UX team has conducted the tests and collated their findings, they’ll pass on a report with key takeaways to the design and development teams for action.