
11 steps for creating a usability testing plan that works

Last updated: 1 February 2025

Author: Dovetail Editorial Team



Users value convenience. They often prefer designs with intuitive functionality and simple navigation to designs with advanced functionality, as they can move through web pages or features without becoming lost, annoyed, or overwhelmed.

It all boils down to the core concept of usability. Ask yourself:

  • Can your target users interact with the tool easily?

  • Can users complete tasks quickly?

  • Is the tool satisfying to use?

  • Have you removed unnecessary complexity to leave a simple, intuitive, and effective design?

Usability gives users a good experience when navigating your product. Poor usability could harm your brand’s reputation by creating frustration and disappointment. It could even make users turn to your competitors for reasons you can’t quite quantify or understand.

Great usability doesn’t mean sticking to a playbook of universal norms and design principles. Instead, your teams should implement usability testing to refine your design and adapt it to your audience’s needs and preferences. It starts with a usability testing plan.

What is a usability testing plan?

Usability testing is the process of evaluating the ease with which your target users (who may be different from your target buyers) can use your product.

It involves assigning users specific tasks within an app, website, or other digital product, or simply encouraging them to explore the product freely. Depending on the type of test and how you observe it, you can gather insights about task completion, navigation, time spent on different elements, frustration cues, and potential challenges. These insights are highly valuable, as they can guide adjustments and help resolve inefficiencies.

A usability testing plan is the internal documentation and planning that aids project management. It guides the test, setup, and preparation, as well as how you assess and translate the results into actionable insights. Here are the most important parts of a usability testing plan:

  • Objective: set the specific goal or area of insight you want to focus on.

  • Timeline: forecast the schedule for every phase of preparation and implementation. Establish deadlines.

  • Target audience: identify who represents your target user group and consider how you will recruit them.

  • Test specifics: outline tasks, activities, and the environment in which you want test users to operate.

  • Analysis: determine the most valuable metrics, the data types you will gather, and how you will use them.
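
If your team stores plans alongside other project documents, it can help to capture these elements as structured data so every plan follows the same shape. Below is a minimal Python sketch of that idea; every field name and value is illustrative, not a prescribed schema:

```python
# A minimal sketch of a usability testing plan as structured data.
# All field names and values are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class UsabilityTestPlan:
    objective: str                  # the specific goal or area of insight
    start_date: date                # timeline: when preparation begins
    report_deadline: date           # timeline: when analysis is due
    target_audience: str            # who represents your target user group
    recruitment_channel: str        # how you will recruit participants
    environment: str = "remote"     # where participants will operate
    tasks: list[str] = field(default_factory=list)    # test specifics
    metrics: list[str] = field(default_factory=list)  # analysis plan

plan = UsabilityTestPlan(
    objective="Find friction in the checkout flow",
    start_date=date(2025, 3, 3),
    report_deadline=date(2025, 3, 28),
    target_audience="First-time buyers on mobile",
    recruitment_channel="In-app signup panel",
    tasks=["Add an item to the cart", "Complete checkout as a guest"],
    metrics=["time on task", "task completion rate"],
)
```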

Why is a usability testing plan important?

Your usability testing plan provides a framework for maximizing the value of your usability testing. For example, you can ensure that the assigned tasks provide relevant data by aligning your goals, data collection, and analysis processes.

You can also use the plan to bring stakeholders on board, ensuring the objectives are top priorities for your design and/or sales teams. Aligning the process’s elements from the start helps you avoid delays and confusing results.

Just as importantly, it standardizes your usability testing processes. Different stakeholders will handle different aspects of the project—the plan ensures your approach is uniform and cohesive. You’ll also have a clear framework for conducting usability tests in the future—whether it’s the same project or another one entirely.

Types of usability testing

Choosing the right type of usability testing for your project is critical in creating a usability testing plan that works for you. Consider these different types:

  • Moderated vs. unmoderated tests: facilitated sessions with a trained moderator guiding participants vs. independent testing without direct supervision

  • Qualitative vs. quantitative: gathering feedback such as reactions and opinions vs. collecting numerical data

  • Remote vs. in-person: conducted online from any location vs. face-to-face sessions in a controlled environment

  • Explorative vs. comparative: open exploration for freeform feedback vs. tests that compare specific results across 2–3 versions of a digital product

  • Desktop vs. mobile: testing on larger screen devices vs. smartphones and tablets to ensure a consistent and satisfactory experience

  • Guerrilla vs. longitudinal testing: rapid and informal testing vs. extended, in-depth usage over time

  • Accessibility vs. standard usability testing: testing with users who have additional needs vs. testing with general users

  • Low-fidelity vs. high-fidelity prototyping testing: testing basic wireframes or sketches vs. a near-final design

You can mix and match different types. For example, you might run a moderated qualitative test in person or remotely or an unmoderated test to gather quantitative feedback.

11 steps to create a usability testing plan

Creating a usability testing plan is a methodical process. Following these 11 steps enables you and your team to create a useful, straightforward plan.

1. Define your goals

Usability tests have a broad focus: to uncover friction and pain points in usability. You’ll need to define what you want to discover and the areas of usability you want to investigate. With specific goals, you can choose the best-aligned testing format. Here are some examples:

  • Measure user satisfaction: quantitative and explorative tests align with this goal, gathering user satisfaction and overall experience feedback.

  • Prioritize features or fixes: identify usability issues that most impact user experience and prioritize solutions. You might run moderated tests across multiple devices to zero in on specific issues.

  • Explore alternative designs: test multiple design options to identify the most effective solution. Comparative tests work best here.

  • Benchmark usability: compare usability metrics against competitors or previous product versions. You may find that longitudinal tests are the best choice depending on the specific uses you’re assessing. Replicate the conditions and methodologies across subsequent tests to get consistent insights over time.

  • Test navigation and information architecture: comparative tests are well-suited for ensuring users can find content or features easily.

  • Test specific user scenarios: standard user testing, accessibility testing, and tests with specific user groups can help validate whether the product meets the needs of particular users or use cases.

2. Outline the logistics

This is the plan’s project management phase, where you outline logistical concerns and define your test’s concrete requirements. For example, you might set deadlines for the test and result analysis and a process for coordinating with the design team to action findings.

Whether the test takes place in person on your premises or virtually is a logistical concern. For an in-person test, you must plan for the required equipment and space.

Don’t worry if you don’t have specifics during this phase. This process will give you a good idea of the tools and other logistics you need to define in detail later on.

3. Consider what tools you’ll need

At this stage, generalizations from step two need to become more specific. Consider what equipment your test users need, what data analysis and communication tools you need, and what programs can assist with the test itself. These might include project management software like Asana or Notion, communication and documentation tools like Zoom, Loom, or Slab, and AI-based analytics tools like Dovetail for pulling insights from raw data.

4. Determine the format

As you’re finalizing the test, it helps to consider the options as binary choices:

In-person or remote?

Will you invite participants to a testing area for a fully supervised test on your hardware under your preferred conditions? Or will you allow remote testing, where users assess your design using their own hardware in their preferred work environments?

Quantitative or qualitative data?

You might choose quantitative, qualitative, or both, depending on the insights you want to collect.

For example, you might time how long it takes users to complete each step of a task or how many clicks it takes to navigate through a design. Alternatively, you might survey users during and after the test to understand their feelings of frustration and uncertainty. Longer tests may have elements of both (though it’s important not to make the test so long and robust that it creates disruptive “noise” in your data).
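
To make the quantitative side concrete, here is a hedged sketch of deriving time on task and click counts from a simple event log. The log format is hypothetical; in practice, your testing tool determines what gets recorded:

```python
# A sketch of computing time on task and click count from an event log.
# The event format below is a hypothetical example, not a tool's real output.
events = [
    {"participant": "p1", "task": "checkout", "type": "task_start", "t": 0.0},
    {"participant": "p1", "task": "checkout", "type": "click", "t": 2.4},
    {"participant": "p1", "task": "checkout", "type": "click", "t": 5.1},
    {"participant": "p1", "task": "checkout", "type": "task_end", "t": 41.8},
]

def task_summary(events, participant, task):
    relevant = [e for e in events
                if e["participant"] == participant and e["task"] == task]
    start = next(e["t"] for e in relevant if e["type"] == "task_start")
    end = next(e["t"] for e in relevant if e["type"] == "task_end")
    clicks = sum(1 for e in relevant if e["type"] == "click")
    return {"seconds_on_task": end - start, "clicks": clicks}

print(task_summary(events, "p1", "checkout"))
# {'seconds_on_task': 41.8, 'clicks': 2}
```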

Moderated or unmoderated?

Moderators are useful in interviews or interactive tests but can change user behavior or be an easy “out” for frustration points. Both moderated and unmoderated test types have pros and cons, so you’ll need to account for how a moderator can clarify or distort collected data.

5. Determine your sample size

How many participants do you want to test? Your chosen method will play a role in determining your sample size.

Interviewing dozens of participants can yield many insights, but interviewing hundreds or thousands brings diminishing returns. That said, with unmoderated remote methods, testing 10,000 participants is not necessarily more difficult than testing 100, provided you can guarantee all the participants are in your target audience.

Other factors to consider when finalizing your sample size are: 

  • The cost of implementation per participant

  • Subdividing user groups into different pools

  • The value of starting small and building your skills

  • Parameters, such as population size, margin of error, confidence level, and variance (using an online sample size calculator may help)
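
For context, most online calculators implement a standard formula (Cochran’s, with a finite population correction) from exactly these parameters. A minimal sketch, assuming a proportion-based metric and conventional z-scores:

```python
# A minimal sketch of Cochran's sample size formula with a finite
# population correction; the defaults below are common conventions.
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                confidence: float = 0.95, variance: float = 0.25) -> int:
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]  # z-score lookup
    n0 = (z ** 2) * variance / margin_of_error ** 2  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)             # finite correction
    return math.ceil(n)

print(sample_size(population=10_000))  # about 370 participants
```

Note that variance here is p(1 − p); using 0.25 (p = 0.5) is the most conservative choice when you have no prior estimate.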

6. Write tasks that match your goals

What tasks or activities best align with your test goals and the type of data you need to obtain?

Broadly speaking, there are two task types in usability testing:

  1. Specific tasks—have a correct completion method

  2. Exploratory tasks—don’t have one right answer

Analysts can get valuable insights from both types of tasks. With specific tasks, you can identify failure points or potential sources of confusion. With exploratory tasks, you can learn how users interact with the design overall and see which types of behaviors are common and/or valuable.
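
As a worked illustration, here is one way the two task types might be phrased for a hypothetical checkout test. Only the specific task has a single success criterion:

```python
# Illustrative task definitions for a hypothetical checkout test.
tasks = [
    {
        "type": "specific",
        "prompt": "Find a pair of running shoes under $100 and add them to your cart.",
        "success": "Item in cart within three minutes without asking for help.",
    },
    {
        "type": "exploratory",
        "prompt": "Browse the store as you normally would, thinking aloud as you go.",
        "success": None,  # no single right answer; observe behavior instead
    },
]
```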

7. Decide your evaluation metrics

When writing your tasks, choose your evaluation metrics. Each metric will help inform the monitoring software and analytics tools you should invest in.

Quantitative evaluation metrics include:

  • Time spent on tasks

  • Incidents of critical and non-critical errors

  • Percentage of users who complete tasks without errors

  • Percentage of completed tasks

Qualitative evaluation metrics include:

  • Likes

  • Dislikes

  • Recommendations and suggestions

  • Concerns (before and after user testing)
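
As an illustration, the quantitative metrics above reduce to simple arithmetic once each session is coded. A sketch with a hypothetical per-participant results format:

```python
# A sketch of aggregating coded session results into the quantitative
# metrics listed above. The results format is a hypothetical example.
results = [
    {"participant": "p1", "completed": True, "errors": 0},
    {"participant": "p2", "completed": True, "errors": 2},
    {"participant": "p3", "completed": False, "errors": 1},
]

total = len(results)
completion_rate = sum(r["completed"] for r in results) / total
error_free_rate = sum(r["completed"] and r["errors"] == 0
                      for r in results) / total

print(f"Tasks completed: {completion_rate:.0%}")           # 67%
print(f"Completed without errors: {error_free_rate:.0%}")  # 33%
```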

8. Incorporate a pilot test to get feedback

Perform a test run to reveal logistical concerns and resolve them. You can carry out a rehearsal with colleagues to run through the technical elements. Or, you can conduct a pilot test with a small percentage of your test pool to reveal potential areas of confusion and unclear wording. 

9. Recruit participants

You have several options for recruiting participants. You may want to request participants from your core client base so they can see the improvements you’re making, or you may offer users the chance to sign up. You can offer incentives for participation.

Ensure you leave plenty of time to get the participants you need and arrange the schedule. Also, consider using a platform to streamline recruitment, from finding participants to keeping everyone organized.

10. Develop a usability report

Create a template that documents, structures, and communicates your findings and insights. Create it ahead of time so that it acts like a blank worksheet for your data analytics team to complete. They will know what insights they need before getting started, and you can do a review round to ensure the test is likely to provide relevant results.
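
One lightweight way to prepare the report ahead of time is a fill-in-the-blanks skeleton the team completes after each test. The section headings below are assumptions, not a standard format:

```python
# A sketch of a pre-built report skeleton; sections are illustrative.
REPORT_TEMPLATE = """\
Usability Test Report: {project}

Objective: {objective}
Participants: {n_participants} | Format: {test_format}

Key metrics:
{metrics}

Top findings:
{findings}

Recommended actions:
{actions}
"""

print(REPORT_TEMPLATE.format(
    project="Checkout redesign", objective="TBD", n_participants="TBD",
    test_format="TBD", metrics="TBD", findings="TBD", actions="TBD",
))
```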

11. Outline the next steps

It’s time to take action and perform checks. Assign team members specific implementation tasks, send invitation emails, and ensure the tools and interview rooms are reserved. Add any other steps relevant to your organization, such as arranging NDAs or getting final approval from directors.

Best practices for creating a usability testing plan

We recommend adhering to these best practices, even if you have a more custom approach to the planning process:

  • Workshop the wording for tasks and interview questions to remove as much bias as possible. Consider likely follow-up questions or have your moderator follow a specific usability testing script that curbs bias as much as possible.

  • Focus on realistic tasks that give you the most insight for a general audience or your core clientele.

  • Hold a review session to revise the plan, documents, and testing protocols when the project is complete. Your usability testing will improve over time.

Usability testing template

The right tools help you identify what’s working, what’s not, and how to improve—without guesswork.

Dovetail’s free usability testing template provides a clear framework for efficiently collecting and analyzing user feedback. Start making confident, informed design decisions today.
