OneWorld success story: Capital Public Radio increases engagement, loyalty, and NPS

March 7, 2018

UserTesting recently awarded the nonprofit organization Capital Public Radio a free annual subscription through our OneWorld program for nonprofits. UserTesting OneWorld helps charitable organizations create better user experiences, thereby increasing their impact in the community.

We did a Q&A with Veronika Nagy, UX Designer at Capital Public Radio, to talk about the successes her team has achieved on their website through their partnership with UserTesting OneWorld.

Before UserTesting, how did you know what was working and what wasn’t on your website?

We primarily relied on surveys, analytics data, and feedback from contact forms and calls. We also conducted one customer research session in-house on our own. While we loved learning from our users, setting up, recruiting for, and running studies took a lot of time from an already stretched web team.

How did you feel about using the UserTesting platform?

UserTesting has been a great asset for us. The platform itself is easy to use and has streamlined the customer experience research process, allowing us to conduct more frequent research. The ability to create highlight reels made it easy to share findings across the organization, which greatly increased awareness of the power of human insights.

What were some of the insights that you learned?

We used UserTesting for specific projects to learn customers’ preferences, what was working, and what could be improved in existing and proposed designs for our web audio player and our site navigation.

Web audio player

The ability to stream our stations on our website is a valuable service we provide to a very loyal segment of our audience. We began the responsive redesign of our web audio player and conducted a prototype study in-house prior to receiving the UserTesting grant. In May, we used the UserTesting platform to beta test our finished product and ensure it was ready for launch.

In this study, we wanted to observe first impressions, learn whether participants found the site easy to use, and uncover any unexpected pain points or areas needing improvement prior to launch.

In a day-long planning session, our team watched the recordings together, compiled notes, and created a list of improvements we could implement right away. We learned a lot of valuable information not only from what study participants said but also from what we observed them doing: which aspects of the product they immediately liked, what they found confusing, what didn’t look as intended on their particular devices, and even things they never mentioned but that, through observation, gave us ideas for how an interaction could be improved.

Here are some findings and the changes they prompted:

  • We observed participants pressing or tapping the play button multiple times, even though the loading indicator was present. We realized that leaving the “play” button displayed while the stream loaded was confusing, so we changed the button to “stop” during loading to alleviate the confusion.
  • A question about what was airing at a specific time caused confusion or at least hesitation in some participants. The schedule display only listed the start time of each program, with the assumption that the next program started when the current one ended. To ease confusion and cognitive load, we added an end time as well as stylistic aids.
  • We also observed participants trying to click on a program in the schedule listing in order to listen. We had not anticipated this interaction, but it was a great idea and easy to implement, so we added a play button to the schedule listing.
  • Some participants didn’t recognize the share icon, so we added the word “share” next to it for clarity.
  • A participant’s feedback prompted us to add the time zone for additional clarity.

Web audio player before and after updates implemented from UserTesting studies.

Navigation improvement

During our navigation improvement project, we used UserTesting in conjunction with a card sort exercise to learn how users categorize website content for a public radio station. The UserTesting videos provided the “why” behind the choices participants made in the card sort: the thought process and reasoning. This was significant because it gave us a much greater understanding of how our users think about our content. Specifically, we learned that some users found similarly named content confusing, such as “Classical,” “Classical Music,” and “ClassicalStream.”

As a follow-up to the card sort, we created additional UserTesting studies in which we showed participants mockups of the current and planned navigation structures and asked them questions to learn whether our understanding and planned improvements were working for our audience. The header navigation study confirmed that participants understood the labels and found the categories logical, fitting, and easy to navigate.

The footer navigation study revealed that while participants tended to like the visual changes in the new design, they found the layout of the previous design easier to scan and navigate. These insights led us to a solution that combined the positive aspects of the two designs.

Footer navigation before and after updates implemented from UserTesting studies.

How will the improvements you plan to make (or have already made) impact your bottom line?

We try to focus on projects that will most benefit our customers’ experience with our digital products. While there isn’t always a direct connection between usability improvements and our bottom line, we feel strongly that customer-focused improvements increase engagement and loyalty, which in turn lead to membership.

Our web audio player is a great example. Streaming is the number one activity our website visitors engage in, so we felt it was an important area to focus on. Before we started the project, we sent out a survey to gauge current levels of satisfaction and loyalty toward our brand and the player, and to learn which features were most important to our audience. We used Net Promoter Score (NPS) and learned that while participants thought very highly of our brand, they scored the audio player below average.

We repeated this survey after launching the new player: the player’s NPS increased by 21 points, exceeding our goal, and promoters, the most loyal segment, grew to nearly 60%.
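For readers unfamiliar with the metric, NPS is calculated by asking customers how likely they are to recommend a product on a 0–10 scale and subtracting the percentage of detractors (ratings of 0–6) from the percentage of promoters (ratings of 9–10), yielding a score between -100 and 100. As a purely hypothetical illustration, a survey returning 60% promoters, 25% passives (ratings of 7–8), and 15% detractors would produce an NPS of 60 - 15 = 45.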

Is there anything else you’d like to share about your experience with UserTesting?

This opportunity to try out the UserTesting services has helped prove the value of human insights across the organization. We plan to continue using it beyond the grant period because of the incredible value it provides. The benefits we receive from a greater understanding of our customers will help elevate our products and services to levels we could not achieve before. Thank you!

Want to learn more?

If you’d like to learn more about how UserTesting can help you understand your customers through real-time human insights, contact us at support@usertesting.com.