You’ve just run a 5-person usability study on your e-commerce site.
User A told you that “...During the filtering process, it seemed like I could only research one brand at a time.”
Then User D mentioned that there was “...no way to sort items...”
Finally, User E explained that he wanted “...to make sure it had a USB port on it but didn’t see that filter. Also wanted expandable memory but didn’t see that filter either.” In fact, he was so bothered by the issue that he admitted he would “...go to a website that has filters for all of the features, not just a few.”
Are you picking up a trend here?
Now, what if I told you that I didn’t have to watch a single minute of video to catch this?
Previously on the UserTesting blog, we’ve covered how you can use quantitative research questions to supplement your qualitative studies, including Multiple Choice Questions and Rating Scale questions. (Our Customer Experience Analytics feature makes it easy to incorporate these types of questions into your test plan!)
Now it’s time to cover my favorite research question type: the Written Response. This powerful tool lets you collect responses from users in their own words. Although the answers to these questions are technically qualitative, they’re included in UserTesting’s Analytics tools because they provide data immediately, without requiring researchers to watch a single second of video.
As with the Multiple Choice and Rating Scale questions, this type of question has a wide variety of applications:
Within the UserTesting platform, a Written Response question looks like a regular task when you set it up, but test participants see a field in which they can type short responses.
For example, if you want to gather users’ initial impressions, you might try questions like these:
And if you’re curious about how users would behave or what links they gravitate toward, you could try:
You can also gauge users’ understanding of symbols and concepts by using tasks like these:
I also like to use the Written Response questions to capture in-the-moment questions and concerns with tasks like this one:
I find the last example particularly helpful when testing multiple features or flows within a single study. The UserTesting Research Team often sees a psychological phenomenon known as the recency effect while analyzing videos; users recall the most recent issues with a site or app much more vividly than those they encounter early on while testing. Adding an occasional Written Response field to our studies helps us to get a more accurate understanding of the issues that occur throughout the study, rather than only focusing on the last part of the experience.
Here at UserTesting, when we first decided to incorporate Customer Experience Analytics into our existing tool, our Research Team used them extensively to find out how users reacted to the new types of questions.
The Written Response questions fatigued users more than any other question type, especially when they were given three or more in a row. By the third question, users typed only short responses and no longer bothered to speak their thoughts aloud.
It’s tempting to load up a test with these questions because they’re so valuable, but most remote test participants are accustomed to answering questions verbally. The effort required to articulate a response, type it in, and edit it is much greater than simply clicking a button on a Rating Scale or Multiple Choice question, so it’s more likely to cause user fatigue, especially for participants testing on a mobile device who have to type on a smartphone keyboard.
Use Written Response questions wisely, and space them out with other types of questions and tasks.
Although many users have encountered a Written Response question before (on the UserTesting platform, as well as in surveys, polls, etc.), it’s helpful to remind users of the task’s expectations, especially if they go beyond the typical think-aloud format.
One line of instruction (such as “Please type your answer in the space below”) keeps users from talking at length for several minutes, only to realize they need to restate their answer in written form before they can proceed with the test.
Hopefully, you aren’t expecting users to type a full essay into a tiny text box, but if you are, you should say so! The same goes for when you want only a three-word answer.
Clarifying the expected length of a written response helps users avoid over-exerting themselves, and it makes for more consistent data when you get to analysis.
Written Response fields are wide open to creative uses: you can have users rank three sites from best to worst, respond to fill-in-the-blank statements, play a word-association game, or suggest alternate terms.
For basic analysis, you can always just read through responses to see if anything seems to be trending as an issue.
But if you want to get fancier, you can use Wordle or another word-cloud tool to turn that text into something a little more dynamic.
A lot of teams find word clouds useful for presenting their research to team members and executives. It’s a handy way to instantly convey insights without overwhelming stakeholders with a wall of text.
If your team prefers a more scientific approach, most spreadsheet software lets you count how often particular words appear within cells and translate those totals into charts or graphs that point to common themes in user feedback.
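If you’d rather script it than fiddle with spreadsheet formulas, the same frequency count takes only a few lines of Python. Here’s a minimal sketch; the responses and stop-word list are made-up stand-ins, not real study data:

```python
from collections import Counter
import re

# Hypothetical written responses from a study (illustrative examples only)
responses = [
    "I could only filter by one brand at a time",
    "There was no way to sort items by price",
    "I wanted a filter for USB ports but didn't see one",
]

# Words too common to signal a theme; tune this list for your own data
stop_words = {"i", "a", "an", "the", "by", "for", "to", "was", "one", "but", "at"}

def word_frequencies(texts, stop_words):
    """Count how often each word appears across all responses."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in stop_words)
    return counts

# "filter" tops the list with 2 mentions, hinting at a filtering theme
print(word_frequencies(responses, stop_words).most_common(5))
```

From there, the counts can feed straight into a chart, or serve as the input weights for a word-cloud tool.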
However you slice up the written responses, there’s no denying that they can save you time and help you share results from your usability research.
If you're a UserTesting Pro client, reach out to your Client Success Manager for more ideas on using Written Response questions to get the feedback you need. Happy testing!