Tips for Using Written Response Questions in UX Research

| February 12, 2015


Imagine this scenario.

You’ve just run a 5-person usability study on your e-commerce site.

User A told you that “…During the filtering process, it seemed like I could only research one brand at a time.”

Then User D mentioned that there was “…no way to sort items…”

Finally, User E explained that he wanted “…to make sure it had a USB port on it but didn’t see that filter. Also wanted expandable memory but didn’t see that filter either.” In fact, he was so bothered by the issue that he admitted he would “…go to a website that has filters for all of the features, not just a few.”

Are you picking up a trend here?

Now, what if I told you that I didn’t have to watch a single minute of video to catch this?

Written Response Questions 101

Previously on the UserTesting blog, we’ve covered how you can use quantitative research questions to supplement your qualitative studies, including Multiple Choice Questions and Rating Scale questions. (Our Customer Experience Analytics feature makes it easy to incorporate these types of questions into your test plan!)

Now it’s time to cover my favorite research question type: the Written Response. This powerful tool is a valuable way to get instant responses from users in their own words. Although the answers to these questions are technically qualitative, they are included in UserTesting’s Analytics tools because they provide instant data for researchers—without having to watch a single second of video.

As with the Multiple Choice and Rating Scale questions, this type of question has a wide variety of applications:

  • Gathering first impressions
  • Determining where users want to click and what stands out to them
  • Verifying that users grasp the concepts presented to them

Within the UserTesting platform, a Written Response question looks like a regular task when you set it up…

[Screenshot: Writing a Written Response question in the UserTesting dashboard]

…but test participants see a field in which they can type short responses.

[Screenshot: The question as the test participant sees it]

Written Response questions are excellent for getting the test participants to use their own words to describe things.

For example, if you want to gather users’ initial impressions, you might try questions like these:

  • Explore the home page. Based on what you see, please describe what this company offers and who the site is for.

  • Spend a few minutes exploring the app. As you go, please type 3-5 words or phrases you would use to describe the app to a friend or family member.

And if you’re curious about how users would behave or what links they gravitate toward, you could try:

  • Please perform a Google search for Waterford chandeliers. Based on what you see, please indicate which link (or links) you are most interested in visiting, explaining your selections aloud.

  • Please tell us what you would do first if you landed on this page during a search for chicken pot pie recipes.

You can also gauge users’ understanding of symbols and concepts by using tasks like these:

  • In the top right corner of the playing screen, there is a small square with three horizontal lines. In the space below, please briefly explain what that button indicates to you.

  • This game uses two forms of currency: gold and magic seeds. How do you spend and earn these forms of currency during game play?

I also like to use the Written Response questions to capture in-the-moment questions and concerns with tasks like this one:

  • Now that you’ve explored the site a bit, please type in any questions or concerns you may have about the site, the company, or its product.

I find the last example particularly helpful when testing multiple features or flows within a single study. When analyzing videos, the UserTesting Research Team often observes a psychological phenomenon known as the recency effect: users recall the most recent issues with a site or app much more vividly than those they encounter early on while testing. Adding an occasional Written Response field to our studies helps us get a more accurate understanding of the issues that occur throughout the study, rather than only focusing on the last part of the experience.

Here are some tips to keep in mind when you write Written Response questions:

1. Use wisely.

Here at UserTesting, when we first decided to incorporate Customer Experience Analytics into our existing tool, our Research Team used them extensively to find out how users reacted to the new types of questions.

More than other question types, Written Response questions fatigued users, especially when they were given 3 or more in a row. By the third question, users offered only short typed responses and no longer bothered to speak their thoughts aloud.

It’s tempting to load up a test with these questions because they’re so valuable, but most remote test participants are accustomed to answering questions verbally. The effort required to articulate the response, type it in, and edit it is much greater than simply clicking a button on a Rating Scale or Multiple Choice question, so it’s more likely to cause user fatigue—especially if they’re testing on a mobile device and have to type on their smartphone keyboard.

Use Written Response questions wisely, and space them out with other types of questions and tasks.

2. Remind users to type their response.

Although many users have encountered a Written Response question before (on the UserTesting platform, as well as in surveys, polls, etc.), it’s helpful to remind users of the task’s expectations—especially if they go beyond the typical think-aloud format.

One line of instruction (such as “Please type your answer in the space below”) can keep users from talking through their answer for several minutes, only to realize that they need to re-type it in written form to proceed with the test.

3. Whenever possible, specify response length.

Hopefully, you aren’t expecting users to type a full essay into a tiny text box, but if you are, you should say so! Same goes for when you want only 3 words.

Clarifying the proper response length of written feedback helps your users avoid over-exerting themselves, and it will make for more consistent data when you get into analysis.

4. Have fun!

Written Response fields are wide open to creative uses, like having users rank 3 sites from best to worst, respond to fill-in-the-blank statements, play a word-association game, or suggest alternate terms.

So what do you do with all that text?

For basic analysis, you can always just read through responses to see if anything seems to be trending as an issue.

But if you want to get fancier, you can always use Wordle or another word cloud tool to turn that text into something a little more dynamic:

[Image: An example of a word cloud]

A lot of teams find word clouds useful for presenting their research to team members and executives. They’re a handy way to instantly convey insights without overwhelming stakeholders with a wall of text.

If your team prefers a more scientific approach, most spreadsheet software allows you to count up the frequency of particular words appearing within cells and translate the totals into charts or graphs that point to common themes in user feedback.
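If you’re comfortable with a little scripting, the same frequency count is easy to do outside a spreadsheet. Here’s a minimal sketch in Python, using made-up sample responses (in practice you’d load the text from your study’s export); the stop-word list is just an illustrative assumption:

```python
from collections import Counter
import re

# Hypothetical written responses; in practice, load these from your
# study's exported data (e.g. a CSV of the Written Response column).
responses = [
    "No way to sort items by brand",
    "Couldn't filter by more than one brand",
    "Wanted a filter for USB port and memory",
]

# A small, illustrative stop-word list so filler words don't dominate.
stop_words = {"a", "an", "and", "by", "for", "to", "no", "the", "of",
              "more", "than", "one"}

words = []
for response in responses:
    words += [w for w in re.findall(r"[a-z']+", response.lower())
              if w not in stop_words]

# The most frequent words hint at recurring themes (here, "brand"
# and "filter" each appear twice).
for word, count in Counter(words).most_common(5):
    print(word, count)
```

The resulting counts can then be charted just like the spreadsheet totals, pointing you toward the themes worth a closer look in the videos.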

However you slice up the written responses, there’s no denying that they can save you time and help you share results from your usability research.

If you’re a UserTesting Pro client, reach out to your Client Success Manager for more ideas on using Written Response questions to get the feedback you need.

Happy testing!