Our recent webinar on how fast human insight is revolutionizing marketing resulted in some great questions from the audience. They mirrored things we hear from many other customers, so we wanted to share the answers here.
We offer free online training to our customers on a regular basis, and as a subscriber, you can also purchase specialized training with us. But you don't have to wait for training to get started. Our system includes about 40 pre-made templates for various types of studies. The templates were all created by our professional services team, so they're a great way to learn. You can learn more about using templates here.

In one of the case studies we discussed in the webinar, a company had discovered that its in-person customer interviews (conducted at company headquarters) were skewed: the people living around headquarters (in Los Angeles) were more technophilic and more affluent than the average person in the rest of the country. As a result, the company had been making its products too expensive and complex. Recruiting through UserTesting eliminated the local bias and enabled the company to spot the problem. That prompted a question about whether the UserTesting panel has biases of its own:
There's no such thing as a totally bias-free sample. Even if you had a way to randomly pick anyone in the population, there would be bias, because some people aren't willing to talk to a researcher. The important thing is to understand what the biases are so you can correct for them. In the case of UserTesting, we don't think there's much of a technophile bias. Our panel by definition includes people who know how to use computers and smartphones, but that describes most of the population these days. Our panel does, however, skew a little younger than the population as a whole. That's great if you're looking to conduct research on millennials, but if you want feedback from older people, it's important to use the age filter when setting up your study.
We get this question from time to time. You're welcome to use the videos in the course of your normal business for informational purposes. In other words, you can show video clips in internal documents and even external presentations (for example, we used some of them in our recent co-presentation with Harry's at the IRCE conference). If you want to use a video in a promotional sense—for instance, in an advertisement—you should definitely get permission from the participant. You can use the UserTesting platform to reach out to participants to request their permission.
This is an excellent question: What do you do if you're testing a marketing deliverable (for example, an email message) and you find that some people like it and some people don't? There are a couple of answers.

First, before you start the study, be really clear on the questions you're trying to answer. Do you really just want to see whether people like what you're testing, or do you have more specific issues? For example, as part of the prep for our webinar, we ran studies on the written description we were going to use to promote it. My biggest question was whether the language we were using was over the top. Did it come off like hype? Some people liked our language a lot more than others, but no one said it was over the top. So I was comfortable that I had a clear answer to my question, and I moved forward on that basis.

Second, keep in mind the old saying in marketing: if your message doesn't drive away some people, it isn't specific enough to attract anyone. When testing marketing material, I look for things that are spot-on for the right people rather than blandly acceptable to everyone. You should expect disagreement. Even if you were running a quantitative survey, you wouldn't get 100% agreement on anything, and most of the time the split would be closer to 60-40. This is especially common if you're dealing with a new market or product that people haven't had a chance to form strong opinions about yet. So it's OK to ask, "What am I hearing from the majority of participants?" rather than, "Is everyone happy?" If the responses are so scattered that you can't tell what's happening, run a few more sessions until the responses start to converge.
I was a consultant for a few years and had to deal with a lot of situations like this. The answer depends in part on your relationship with those marketing execs. If they ask to see the raw videos, I wouldn't hold them back; many marketing people love listening to customers whenever they get the chance, so don't stand in their way. However, executives can be pretty busy, so I'd always create a report with key video clips plus my conclusions about what they mean. Give the execs a quick answer, supported by evidence, and let them access the full videos if they want to.

Be sure that your report includes those supporting video clips. The clips will help the execs understand your conclusions emotionally rather than just rationally, and chances are they'll share the clips with others.

Thanks very much to everyone who attended the webinar and asked such great questions. If you'd like to watch the full version, you can access the on-demand webinar here.
If you’d like to learn more about how UserTesting can help you understand your customers through on-demand human insights, contact us here.