
The art of content testing and measurement

May 30, 2019

Today’s article is a guest post from Patrick Stafford, Senior Digital Copywriter at MYOB and co-founder of the UX Writers Collective. Enjoy!


For many UX design teams, getting feedback on visual elements is a fairly straightforward affair. Seeing how someone interacts with a set of visual components can provide immediate and meaningful feedback.

Testing the written word is a different beast. It’s a shame that more UX teams don’t place emphasis on it, especially when the results of any test are heavily impacted by content. Too many researchers just opt for lorem ipsum or rough draft copy, without realizing that including meaningful content would alter the psychological framework of a test, thereby altering the results.

Just as users might be slightly confused or thrown off by an unfamiliar interaction or poorly designed navigation, confusing words or jargon can also impact their experience. By not researching, measuring, and testing their content, UX teams rob themselves of the opportunity to learn how users understand and react to various tones, word choices, and content structures.

There are multiple ways to do this, but content measurement and testing happen in macro and micro stages, supported by an understanding of content personas and information hierarchy.

Macro and micro content measurement and testing

At the macro stage, usability testing and a variety of other methods are used to shape your broad messaging framework. Questions might include:

  • Does our main value proposition resonate strongly with our target audience?
  • Do our key phrases and our word choices for messaging speak directly to our audience’s pain points?
  • Are benefits called out in highly relatable customer language versus internal marketing or product speak?
  • Does the user have enough context to move through our main task flows?

At the micro stage, tools like A/B testing help pinpoint which particular phrases and words people respond to. Questions might include:

  • Are our calls-to-action clear, motivating, and effective?
  • Do our explanatory moments give people enough info to move past high-friction or trust points in our task flows?
  • Do users understand our names, functional descriptions, and instructions for features?

Understanding content personas

Research and UX teams hoping to measure and test their content need to understand where the content begins. Content measurement and testing require a control or “base point” from which to measure progress.

This looks much the same as any other UX process. Conversations with customer support representatives, sales reps, product managers, business analysts, and especially customers themselves all provide valuable input.

The goal is to identify the key messages these stakeholders provide. Just like any UX team would create personas, content measurement and testing require putting words to those personas.

What language do these personas use? Listen carefully to the words they use to describe the product or service in question, and think about the words they don’t use—and why.

From there, you can create value propositions and language to articulate the key benefits. Once you have that base point, you can begin actually testing the content with users to see if it makes sense.

Creating an information hierarchy

It’s important to remember at this stage that testing language and word use isn’t just about whether people react to specific word choices. That’s important, but not as important as understanding how users react to the flow of information. Creating an information hierarchy that allows people to find what they need is your first step.

To do that, there are several methods you can use:

Card sorting

Create cards with particular topics written on each, then ask users to sort them into logical “buckets.” You’ll begin to understand how users group information and where they expect to find it.
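
If you run an open card sort with more than a handful of participants, the groupings can also be analyzed quantitatively. Below is a minimal sketch in Python (the cards and participant groupings are hypothetical) that clusters cards by how often participants placed them in the same bucket, using SciPy’s hierarchical clustering.

```python
# A minimal sketch of analyzing open card-sort results. The card names
# and participant groupings below are hypothetical placeholders.
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

cards = ["Pricing", "Invoices", "Payroll", "Contact us", "Help center"]

# Each participant's sort: lists of card indices they grouped together.
sorts = [
    [[0, 1], [2], [3, 4]],
    [[0], [1, 2], [3, 4]],
    [[0, 1, 2], [3, 4]],
]

n = len(cards)
co = np.zeros((n, n))  # how often each pair of cards shared a bucket
for sort in sorts:
    for bucket in sort:
        for i in bucket:
            for j in bucket:
                co[i, j] += 1

# Turn co-occurrence into distance: frequently grouped cards are "close".
dist = 1.0 - co / len(sorts)
np.fill_diagonal(dist, 0.0)

# Average-linkage clustering over the condensed distance matrix.
labels = fcluster(linkage(squareform(dist), method="average"),
                  t=0.5, criterion="distance")
for card, label in sorted(zip(cards, labels), key=lambda x: x[1]):
    print(label, card)
```

Cards that land in the same cluster are candidates to live under the same section of your navigation or page hierarchy.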

Surveys

Use surveys to ask users directly what terminology they would use for particular features and services, and what confuses them.

Lo-fi prototypes

Create a series of prototypes using mock screenshots or pages and guide users through a particular flow. See where they trip up, and what particular language might confuse them.

Highlighter exercises

Share a printout of a screen, or a link to a shareable document like a Google Doc, with a user. Ask them to use a red or pink highlighter to mark words or phrases that confuse them, and a green highlighter for words or phrases they like or that make sense to them. (This method makes it easy to quickly compare results from multiple users.)
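
Tallying those marks across participants can be as simple as counting. Here is a minimal sketch in Python; the phrases and per-user marks are hypothetical.

```python
# A minimal sketch for tallying highlighter-exercise results across users.
from collections import Counter

# Each entry: (phrase, "green" if it made sense to a user, "red" if confusing).
marks = [
    ("smart lists", "red"), ("save draft", "green"),
    ("smart lists", "red"), ("sync hub", "red"),
    ("save draft", "green"), ("sync hub", "green"),
]

red = Counter(p for p, color in marks if color == "red")
green = Counter(p for p, color in marks if color == "green")

for phrase in sorted(set(red) | set(green)):
    print(f"{phrase}: {red[phrase]} confused, {green[phrase]} clear")
```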

Usability testing

Just like any usability test, create tasks for users that involve word use and phrasing. For instance, to measure comprehension, ask users to read content about a feature and then describe what the feature does.

It’s also important to note that you don’t have to wait until after you write to test. Conducting tests or interviews about users’ needs before you write content can help shape your writing process and build a vocabulary that resonates, especially if you listen to the words users actually use.

This also doesn’t have to be in person; there are plenty of platforms and tools available that enable remote testing and interviews.

After just a few tests you’ll begin to see what resonates and what doesn’t, and you can shape your content based on those insights.

Search term mining

There is no better research than seeing the terms people actually use. Investigating Google Trends and keyword usage will guide your language to the popular phrases customers are already familiar with.
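
If you want to compare candidate terms programmatically, one option is pytrends, an unofficial third-party Python client for Google Trends (pip install pytrends). The snippet below is a minimal sketch, and the keywords are hypothetical placeholders; swap in the terms your customers might actually use.

```python
# A minimal sketch comparing relative search interest for two candidate
# terms via pytrends, an unofficial Google Trends client. Keywords are
# hypothetical placeholders.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(kw_list=["invoicing software", "billing software"],
                       timeframe="today 12-m")

# Relative search interest over the past year, one column per keyword.
interest = pytrends.interest_over_time()
print(interest.drop(columns="isPartial").mean().sort_values(ascending=False))
```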

A/B testing

Placing one version of text against another, known as A/B testing, is one of the easiest ways to determine whether a particular phrase or word choice will result in higher engagement or sales.

But before you use A/B testing on your copy, you should understand what it does and doesn’t tell you. While the data from an A/B test might show that users prefer a particular phrase or word, it doesn’t tell you why. For that, you’d have to turn to the qualitative methods outlined above. For example, after conducting an A/B test, you can schedule a live interview with users and ask them to explain why they chose one option over the other.

Remember that A/B testing is simply a quantitative study, and its results should be considered carefully before making any subsequent decisions about how features and benefits are packaged and sold to users.
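
One part of that careful consideration is checking whether an observed difference between two variants is likely real rather than chance. Here is a minimal sketch using SciPy’s chi-squared test; the click counts are hypothetical.

```python
# A minimal sketch of a significance check for an A/B copy test.
# The click counts below are hypothetical.
from scipy.stats import chi2_contingency

# Rows: variant A ("Get started"), variant B ("Start your free trial").
# Columns: clicked, did not click.
table = [[120, 880],   # variant A: 12.0% click-through
         [152, 848]]   # variant B: 15.2% click-through

chi2, p, dof, expected = chi2_contingency(table)
print(f"p-value: {p:.4f}")
```

A low p-value suggests the difference in click-through is unlikely to be chance, but it still won’t tell you why users preferred one variant.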

Designing content for everyone

Content testing is a complex but important subject. Many UX design teams focus on interactive and visual elements, putting off content until after testing has been completed. Content testing is an ongoing, living challenge that any design team should engage with, both during the design stage and after any content has been implemented.

Without that testing, you’re missing out on the opportunity to pinpoint exactly which words, phrases, and content people respond to. That means you’re missing out on sales, engagement, and other important gains that directly impact your bottom line.

Want to learn more?

Check out the UX Writers Collective’s course on Content Testing and Measurement to take a deep dive into testing and measuring your content for better experiences.