The ins and outs of building a user research education program

Posted on March 12, 2020
5 min read


You care about the customer experience. And you want, even need, other people in your organization to care about it, too. But if only a few people in the organization are experts at collecting customer insights, you’re missing out on a huge opportunity.

You could add more experts, but that would only incrementally increase the ability to collect customer data. A better approach is to educate a wide swath of the organization about the value of customer insights and to increase the user research knowledge and skills of the people already involved in your organization’s projects.

In my recent webinar, we explored the importance of educating people in your organization about customer research. You can watch the full webinar anytime, or keep reading to get the highlights.

What are some core educational principles you should consider when educating your organization about collecting customer insights?

When you think about educating more people in your company about customer research and testing, the first step is to define the purpose of any content you’re going to create. Generally, any content you create will fall into one of three categories: awareness, knowledge, and skills.

  • Awareness: Is the content intended to create awareness about a topic? For example, a document showing that conducting a user interview is a way to collect user feedback is intended to create awareness.
  • Knowledge: Is the content intended to increase knowledge about a topic? An Intranet page describing the steps required to complete a set of user interviews would build the reader’s knowledge. They may not be able to execute the interviews themselves, but they’d come away knowing more about what activities are required and how long it might take.
  • Skills: Is the content intended to build a skill? Skill-building is most successful when it involves hands-on activities. A how-to course with activities along the way can build the learner’s skills.

The second step, before you put any effort into creating content, is to define the desired outcome. These are referred to as learning objectives.

Be as specific as possible: the more specific you are, the more focused and effective your material will be. Learning objectives that are specific and focused help you decide the best way to design your educational content and help you understand whether your content has met them.

Think of this the same way that you think about the objectives of your user research. If you have good research objectives, you’ll be able to create the most effective test plan and to measure whether those objectives were met.

What’s a good approach to creating educational content?

When we created our Learning Navigator, we followed an approach we call the four D’s: decide, discover, design, and deliver:

  • Decide: Decide what you’re going to try to deliver. What topics are most important and who needs to hear about those topics?
  • Discover: Discover more about the needs of that target audience, as well as the mode of delivery that works for them. As with any good iterative process, discovery may make you revisit the decisions you made in the previous step.
  • Design: Design the content and delivery. Use learning objectives identified in the previous step and build your material around those objectives.
  • Deliver: Deliver and iterate. You may want to deliver to a friendly audience first, so that you can refine the content and delivery before expanding the audience.

What topics should we cover in our training?

Four typical topics for customer research training are tool training, methods training, process training, and ongoing support.

  • Tool training: Your team likely has some software tools that it uses, so having everyone on the team get a consistent baseline of training is critical. In addition, you want materials that address the frequently asked questions about each tool, especially the common “How do I get access?” question. Also think about your home-grown tools, such as templates the team should use.
  • Methods training: Educate your team members on the baseline methodologies that you use. You may need some awareness and knowledge training about research 101, such as “what is qualitative testing,” “what is quantitative testing,” and “how is a user interview different from a usability test.” And you’ll want deeper knowledge and skills training on other topics, such as writing research objectives and screener questions.
  • Process training: This covers how you run research at your organization: what your project briefs look like, how to use your internal tool for finding current customers to invite as research participants, awareness of your central repository of research insights, and skills training in the art of submitting a good, meaningful insight into that repository.
  • Ongoing support: Lastly, you want to ensure you provide ongoing support, such as mentoring and office hours, for your learners. Arguably, this is a mode of delivery as well as a topic of content, but you need to think about how to help your team members become proficient at executing research.

How do we evaluate the success of our educational program?

The Kirkpatrick Model is a great framework for evaluating the success of educational content. It includes four levels of evaluation: reaction, learning, behavior, and results:

  • Reaction: The most basic level is measuring the usage of, and reaction to, your program. Look at your consumption and engagement numbers. For example, we look at the number of active users and lessons completed. Also collect qualitative feedback, such as through event feedback surveys. (A minimal sketch of computing these usage numbers follows this list.)
  • Learning: The next tier is learning. Are your learning objectives being met? Knowledge checks and quizzes are good ways to evaluate this.
  • Behavior: Next, you want to evaluate if your education is changing behaviors. Do teams that consume your training complete more usability tests, for example?
  • Results: Which leads to measuring business results. Teams may be completing more tests, but are those tests being done well and driving the company’s business goals?
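
To make the reaction level more concrete, here is a minimal sketch of how you might compute the usage numbers mentioned above. It assumes a hypothetical CSV export of learning-platform events with user_id, event_type, and timestamp columns; the file name, field names, and event types are illustrative, not from any particular tool.

```python
# Minimal sketch: count active learners and completed lessons over a
# recent window, from a hypothetical CSV export of learning events.
# Columns assumed: user_id, event_type, timestamp (ISO 8601).
import csv
from datetime import datetime, timedelta

def reaction_metrics(path, window_days=30):
    """Return active-learner and lesson-completion counts for the window."""
    cutoff = datetime.now() - timedelta(days=window_days)
    active_users = set()
    lessons_completed = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if ts < cutoff:
                continue  # only count recent activity
            active_users.add(row["user_id"])
            if row["event_type"] == "lesson_completed":
                lessons_completed += 1
    return {"active_learners": len(active_users),
            "lessons_completed": lessons_completed}

# Example usage (hypothetical file):
# print(reaction_metrics("learning_events.csv"))
```

Numbers like these only cover the reaction level; the learning, behavior, and results levels still require knowledge checks, behavioral data, and business metrics.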

When you’re embarking on an education program, it’s critical that you set up measurements for evaluating your success. In addition to the above, office hours are another good place to evaluate learning. If learners have consumed your content and still show up at office hours with basic questions, then your content needs improvement.

Ultimately, the key is to spread the knowledge of collecting customer insights so that those insights can inform all the decisions the business makes, helping promote a more customer-centric culture.
