Talking to real people, in real time: how to create scripts for Live Conversation
Talking with people in their own environments is the perfect way to gather information about their wants, needs, and preferences so that you can create great products that resonate with your customers. However, doing that is often time-consuming or expensive. Between travel, finding the right people, and the inevitable hiccups that accompany in-person interactions, gathering human insights can be a challenge to prioritize and pull off.
Too frequently, that means teams working on, building, or improving products leave customers out of the conversation because it’s too difficult to carve out the time and money. Live Conversation, a tool available on the UserTesting platform, enables you to have live, 30- to 60-minute conversations with UserTesting panelists or your own participants. It brings you into your users’ environments, so you get the convenience of quick feedback without having to leave your office.
Live Conversation is often used for prototype or usability testing, but it’s also a great way to expand beyond your office or lab. You can interview users to probe their preferences, attitudes, and behaviors, then leverage that information as you plan new products or updates, or dig deeper into topic areas that might be hard to cover in an unmoderated setting. Note that sometimes, the reasons to choose a moderated conversation over an unmoderated one have nothing to do with logistics. Sometimes, leaving a moderator out of the equation can net you deeper, less biased, more confessional answers simply because a moderator isn't there.
However, for this post, I'm focusing on live interviews because I frequently hear UserTesting customers say they've never conducted a customer interview before and aren't sure how to structure an effective script. Unlike a casual conversation, an interview requires you to plan how you want to spend your time with participants. And unlike an in-person interview, a remote one comes with additional technical and logistical considerations to plan around. Here are ways to make sure you collect great human insights, even if you're not experienced at conducting customer interviews.
Prioritize what you need to know
During a 30-minute discussion, you'll be able to answer fewer questions than you would in a 60-minute session, so decide how long a session needs to be to collect the information you need. For any session, plan for about 5 minutes at the beginning of the call to set the stage, navigate any technical challenges, and get through introductions. You'll spend the next block of time running through your questions. Give yourself another 5 minutes to wrap up the conversation.
Decide who needs to be in the “virtual room”
Having a remote conversation means your team can listen in. Since you are having a conversation, it's best to have a single person speak for the team; if others are listening in real time, though, you should let the participant know. Be aware that this can make some people nervous. Imagine yourself being questioned in front of a group of people: there's a reason why one-way mirrors were invented!
Set the stage
Just as you would in an unmoderated script, it's critical to set participants’ expectations. Confirm that the time still works for them, introduce yourself, and review how you'll be spending your time together. Also, think about what the participant will see during the conversation. Do you want to show them something? What will they see of your environment? Be intentional about the “stage” you will present to them.
Create rapport before you get to the core questions
Because your time together is limited, it's key to set participants at ease and get them used to speaking about what they're thinking and feeling. Make sure you ask one or two questions at the beginning of the conversation that are easy to answer, are open enough to prompt people to speak for several minutes, and help you learn more about their context. Here are a few examples:
- Tell me about how you currently accomplish a critical task. [one that relates to the concept you want to learn more about]
- How do you use the product?
- Describe the environment where you typically complete a task.
- Tell me about your role and how it fits into your team’s area of responsibility.
Question order: most to least important
To ensure that you get answers to your most important questions, front-load the questions your team most wants answered. Think about how long you expect each question will take to answer. For example, if you assume each question takes about 3 minutes and you have an hour-long conversation scheduled, you'll have about 50 minutes to ask questions (after the introduction and wrap-up), so you'll probably get to ask 16-17 questions in total. In some interviews, you might only get through 10. In others, you might get through all 17 and have time for more. In case you do get through everything, prepare about 5 bonus questions you can use to go broader or deeper into your subject matter.
Ask ‘about’ questions
Interviews are best suited for asking questions that prompt participants to tell their stories. This means that questions should be open-ended rather than closed. Reserve closed questions for surveys. Typically, this means that questions you ask start with how, why, describe, tell me about, tell me about a time when—anything that gets people talking in a directed way.
Follow your script
While interview questions and overall flow don't have to be 100% consistent, make sure that everyone who's conducting interviews frames and phrases the questions in the same way. Interviewers should stick closely to the script so that every participant hears the same questions in a similar order. Controlling the conversation this way helps you collect the answers you need from enough people to spot themes and ensure you cover the key points you need insights about.
Be a great active listener
Being a great active listener is a challenge because you must balance listening to what your participant is saying with planning what to say next. Sometimes, people will answer a question you had planned to ask later in the conversation. Sometimes, they won't answer your question completely, and you'll need to probe to understand their response. Practice makes perfect here: knowing when to redirect people, planning which prompts will deepen the conversation, and keeping the conversation flowing all take time. Practice on friends and family, or even with participants on the UserTesting platform.

The value of interviews is that they help you understand why people think, behave, and feel the way they do. Do your homework before you write your scripts. Collect behavioral (analytics) or attitudinal (survey) data and question it. Talk to stakeholders about what they already know and what they'd like to know. Plan what the final deliverable will look like, and make sure the script includes a question for each area you'd like to know more about. And, finally, enjoy the conversations. Interviews give you the a-ha moments that can move your experience from good to great.
Want to learn more?
If you’d like to learn more about how UserTesting can help you understand your customers through on-demand human insights, contact us here.
Lija is a Customer Experience Consultant at UserTesting. When she's not helping UserTesting customers understand the wide variety of topic areas they can cover using the platform, she teaches a usability research methods class to undergraduates at the University of Michigan - Ann Arbor.