
A marketer's success story: Leveraging on-demand, live interviews

Andrew Slutzky  |  December 13, 2017

The marketing team here at UserTesting conducts studies constantly. And when our newest offering Live Conversation was launched, we couldn’t wait to try it ourselves.

Below, I share my perspective of what it’s like running a Live Conversation study for the very first time, how it uncovers valuable insights in addition to our traditional desktop, mobile, and prototype studies, and how it ultimately helps us improve our team’s ROI and our audience’s experience.

The challenge

We received feedback from our customer support team that revealed our email subscription center (where subscribers can change their email preferences for marketing content) could benefit from a few updates to improve the subscriber experience.

These updates would also have the added benefit of reducing the amount of time our customer support team spent responding to subscriber feedback.

Setting up and conducting the study

Normally, we’d conduct a remote study to better understand the participants’ pain points and then make iterative changes to the design and user experience.

This time, I wanted a more interactive dialogue with participants, so I could ask follow-up questions while they shared their screens (and activity) with me. Live interviews empowered me to do this.

Scheduling the interviews was fast and easy, even for someone who’d never conducted one before! I was able to schedule five interviews with my chosen demographics (participants aged 25-55, living in the United States) and begin conducting them just over one day after my initial scheduling.

Pro tip: To help get the most targeted feedback in a short amount of time, I utilized screener questions based on our marketing and sales team’s ideal customer personas.

Reviewing the results

To get to the most important insights faster, I took a few steps while reviewing the results.

Step 1: Annotate videos by section

During my interviews, I asked all five of my participants to categorize four pieces of content. When reviewing my videos, I annotated each of the clips with the answers to my questions. For example, I broke up the four pieces of content (Newsletter/blog, Webinars, Whitepapers, and Customer Marketing) as ‘Categorizing Content’ one through four. This would help me quickly jump to the various clips I needed to review, without having to listen to the entire interview all over again.

Step 2: Utilize tags to quickly create highlight reels

Tags and labels helped me interpret the annotation clips and group the clips into highlight reels.

Here’s the top feedback we uncovered on areas that needed improvement:

Content categorization

The category descriptions in our subscription center weren’t clear enough, leaving subscribers uncertain whether they were signing up for the topics they were interested in.

Page layout

The page felt cramped and was difficult to read.


Personalization

Subscribers expected a higher level of personalization when they visited the subscription center.

The results

Because I could interact with participants in a conversational manner, I was able to get to the feedback I really needed, easily and a lot faster. During a standard remote study, participants answer a question or complete a task, then move on. Live Conversation has the added benefit of probing for more insight during the interview. Not only was I able to watch what participants were doing, but I could also interject and ask for more feedback at any time. After analyzing the results, I zeroed in on exactly the feedback we needed to reach our goal of an 80% reduction in customer support feedback and, ultimately, to create a better experience with our subscription center.

Turning insights into action

We took this feedback and began implementing these recommendations right away. We’ll continue to test again and again to ensure we’re living up to our audience’s expectations and needs, and meeting our goal of reducing the number of comments to our customer support team.


About the author:

Andrew is part of the UserTesting Marketing Operations Team. He’s driven by providing cross-departmental insights and solutions to business problems through the mantra of ‘always be testing.’ Nobody has a monopoly on great ideas!