User testing guidelines

Discover the eleven essential steps

Note: In this resource, “user testing” refers to the general process, while UserTesting refers to our organization and platform.

Understanding your customers is key to any successful business: it ensures you provide them with what they want and minimize what they find frustrating or challenging. Whether you’re building a physical product or a digital experience, such as a website or app, you’re more likely to meet, and even exceed, customer expectations if you incorporate user testing throughout the ideation, development, and optimization process. That way, at every step of product development, you’re creating solutions and experiences that match what your customers are seeking.

The eleven steps to successful user testing

1. Define your objective 

While it may be tempting, don’t try to uncover every issue and improvement opportunity for your product in a single study. You risk receiving only general feedback, and exhausting your test participants. By assigning one objective per study, you have a better chance of receiving specific responses that can guide you toward a tailored solution.

Consider asking: 

  • What am I trying to learn? 
  • What outcomes, metrics, or KPIs have the greatest impact on my business?
  • Can users find the information they need? 

2. Create your test 

With the UserTesting platform, you have multiple options for creating a test. You can create one from scratch, with your choice of audience and test plan. Or, you can pick from our ever-growing, pre-made template library for more guidance. 

3. Identify the best method of getting the answers you’re seeking

UserTesting offers the choice between conducting an unmoderated test or a live conversation. While there are no hard and fast rules about which option is the better choice, you should try to pair your expected outcome with the type of study that will support your goal and needs. 

For instance, unmoderated tests are deadline- and budget-friendly, flexible, and well suited when you need a diverse set of contributors. Moderated tests, or live conversations, may be more expensive but allow for lengthier interaction between you and the contributor, which creates opportunities for follow-up questions.

4. Identify what you’re studying 

While your organization may have a prototype, mobile app, website, or service, you’ll receive the most realistic and useful results if you focus on improving one product at a time. The UserTesting platform offers three product options to test: prototype, website, and app.

5. Identify the types of people you want to include in your study

In most cases, the more diverse your participants are, the better. This could be in terms of age, gender, occupation, social networks, or more. UserTesting offers the option of building an audience from scratch or creating a link to share with those outside of the platform. 

And don’t forget to think outside of the box and factor in biases. For example, you may be testing the usability of a newly updated subscription app. Instead of opting for your typical customer, consider enlisting the voices of those who previously subscribed but no longer do. This way, you can receive insight into what made them leave and what it would take for them to subscribe again. 

Remember, if you get too specific with your demographic parameters, or set up too many screeners, you may not find the unicorn contributor you’re looking for. Being clear about what you want in a contributor, while staying flexible, pays off better than checking every box on your list for its own sake.

6. Determine how many test participants to include

It’s been demonstrated that five participants will uncover roughly 85 percent of the usability problems on a website, and that additional users produce diminishing returns. Resist the temptation to “boil the ocean” by doubling or tripling the number of participants in an attempt to uncover 100 percent of your usability problems. It’s easier and more efficient to: 

A. Run a test with five participants

B. Make changes

C. Run another test with a different set of five users

D. Continue iterating until all major challenges are resolved
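The 85 percent figure above comes from a widely cited cumulative-discovery model of usability testing (often attributed to Nielsen and Landauer). As a rough sketch, assuming each participant independently uncovers about 31 percent of the problems (the rate commonly quoted with this model), the expected coverage grows quickly and then flattens:

```python
# Sketch of the diminishing-returns model behind the "five users find
# roughly 85 percent" rule of thumb. The per-participant discovery rate
# p = 0.31 is the commonly cited assumption, not a measured value.

def problems_found(n_participants: int, p: float = 0.31) -> float:
    """Expected share of usability problems found by n participants,
    assuming each one independently finds a fraction p of them."""
    return 1 - (1 - p) ** n_participants

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:>2} participants: {problems_found(n):.0%} of problems found")
```

Running the sketch shows why step C above (a fresh round of five users after making changes) beats one large study: the jump from zero to five participants covers most problems, while each participant after that adds very little.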

If you’re looking for trends and insights beyond basic usability issues, it helps to include a larger sample size. At UserTesting, we recommend five to eight participants for qualitative research and 30 or more contributors for quantitative research.

7. Build your test plan (for unmoderated studies only) 

If you’re creating an unmoderated test, start assembling your study by creating a test plan. Your test plan is the list of instructions that your participants will follow, the tasks they’ll complete, and the questions they’ll answer during the study.

Tip: Consider using both broad and specific tasks in your study. While broad tasks offer better diversity in responses, specific tasks are especially helpful when you’re seeking tailored feedback on a specific area. 

For example, you may want your contributor to go through a hypothetical checkout process on a website or app. Throughout these steps, you may want to sprinkle in questions about usability and whether the contributor was able to complete each task successfully.

8. Launch a dry run test 

Before you launch your test to all participants, we recommend conducting a dry run (sometimes called a pilot study) with just one or two participants. This will help you determine whether there are any flaws or confusing instructions in your original test plan.

For instance, you may find that contributors’ test results are shorter than you expected, which could prompt you to add more detail to your task questions, or more tasks overall. While this is an extra step, making adjustments and improving the test plan before a full launch will save you both time and budget.

9. Analyze your results 

The moment you’ve been waiting for—the results are in. As you review answers to your questions, keep an eye out for similar responses and themes as well as any major deviations. If a significant number of your study participants provide similar feedback, this could signal an issue that impacts your larger customer base and deserves attention. Or, if one or a few participants share a unique piece of feedback, you can home in on those particular videos to better understand why they had such a different experience.

Take note of user frustrations as well as things that users find particularly helpful or exciting. By knowing what people love about your product experience, you avoid the possibility of “fixing” something that’s not really broken. Hearing about things that customers struggle with, as well as enjoy, can support informed discussions on future product and experiential improvements.

10. Share your findings 

After you’ve uncovered your findings, you can share them with your team and stakeholders to begin discussing the next steps. With the UserTesting platform, you can create a highlight reel that aggregates critical video clips. This is especially helpful when you have multiple participants with the same opinion—the curation of this feedback can be compelling.

Or, consider creating charts to represent interesting data and findings from your questions. These can visually convey how many participants say similar things (such as in a word cloud) or display differences of opinion. In addition to videos or charts, you may even feel inclined to share participant quotations from the studies to back up your hypotheses or recommendations. Hearing straight from the voice of the customer is a powerful step in aligning team members and other stakeholders.

11. Champion a customer-focused attitude throughout your organization 

Whether you’re a beginner to user testing or not, we encourage you to gather customer insights throughout your product and campaign development process, and across multiple teams and departments.

The earlier in your process, the better: you’ll gain a clearer understanding of customer pain points and challenges, which makes it easier to brainstorm and support product and solution ideation. You can also home in on the right customers and audiences to assess market opportunities and analyze product-market fit. Additionally, you can surface usability challenges in your prototypes or early-stage products and course-correct before you spend too much time or money on development.

As you launch new digital experiences, you should monitor how customers are reacting to and interacting with these new products. This yields ideas on how to continue evolving your product. Seeking frequent feedback and insights from your customers is the best way to keep your finger on the pulse of customer challenges and expectations. This ensures that you’re making the right decisions and taking all the right steps towards ongoing customer loyalty and satisfaction.

Want to learn more?

Grab a copy of User Tested: How the World's Top Companies Use Human Insight to Create Great Experiences, co-authored by UserTesting’s CIO Janelle Estes and CEO Andy MacMillan.

Image: Cover of the User Tested book