Margarine Doesn’t Cause Divorce: Why Qualitative Insights Help You Understand Your Data

Posted on October 27, 2014
6 min read



You can learn a lot from statistics.

For example, did you know that from 2000 to 2009, there was a near-perfect correlation between the divorce rate in Maine and the per capita consumption of margarine in the U.S.?

Don’t believe me? Check out this chart created with data from the U.S. Census and the USDA. The correlation is over 99%.

Chart of margarine consumption and divorce rate
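If you're curious how a number like that is computed, here's a minimal Python sketch of the Pearson correlation. The yearly figures below are rounded approximations for illustration, not the official Census and USDA values.

```python
# Pearson correlation between two yearly series, 2000-2009.
# The numbers below are illustrative approximations, not official figures.
from statistics import correlation  # requires Python 3.10+

margarine_lbs = [8.2, 7.0, 6.5, 5.3, 5.2, 4.0, 4.6, 4.5, 4.2, 3.7]  # per capita, lbs
divorce_rate = [5.0, 4.7, 4.6, 4.4, 4.3, 4.1, 4.2, 4.2, 4.2, 4.1]   # per 1,000 people

r = correlation(margarine_lbs, divorce_rate)
print(f"Pearson r = {r:.4f}")  # a value near 0.99: strongly correlated, still not causal
```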

So, if we want a lower divorce rate, we should just cut back on margarine, right?

I’m guessing you aren’t convinced that margarine causes divorce (or vice versa).

The two statistics have a strong correlation, but no causation. It could just be a coincidence---or maybe there’s another factor that happened to affect both statistics. The point is that just because two things are related doesn’t mean that one caused the other.

Why are we talking about silly, coincidental statistics?

As marketers, we love our data, and we rely on it when we make decisions. But it doesn’t always tell us everything we need to know. We’re guilty of mistaking correlation for causation all the time. (Yep, I do it too.)

Maybe our site traffic skyrocketed on the same day we announced an important product update. Hooray! That must mean everyone is excited about the update… unless there was another reason we didn’t think of. Maybe an extremely influential customer wrote a blog post about us. Or maybe the hashtag we were using got hijacked by a much more popular event.

Data can make us overconfident in our conclusions

Here’s another example. Let’s say we launch an A/B test and look at the results. If the B version wins, we assume we know why. Surely it’s because we changed the headline copy in the B version. That must mean our audience prefers that type of headline, and we should use it from here on out. Right?

Maybe, but maybe not. Even when we’ve reached statistical significance, we can’t assume we know everything. It’s also critical to think about the length of the A/B test, as well as any external factors that could have an impact.

Do we know that the audience was split into two truly random groups? Was there something in the news this week that subconsciously caused readers to prefer one version over the other? It could even be that the weather affected the audience’s mood, and their mood affected their behavior.
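As a reference point on significance itself: a common way to check a conversion-rate A/B test is a two-proportion z-test. Here's a minimal sketch with made-up counts; it isn't tied to any particular testing tool.

```python
# Two-proportion z-test for an A/B conversion test (hypothetical counts).
from math import sqrt
from statistics import NormalDist

conv_a, n_a = 120, 2400   # conversions and visitors for version A (made up)
conv_b, n_b = 156, 2400   # conversions and visitors for version B (made up)

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0: no difference
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-tailed

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
# Even with p < 0.05, significance only says the difference is unlikely
# to be chance; it doesn't tell you WHY users preferred B.
```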

Averages can be misleading

One of my favorite stats in Google Analytics is Average Time on Page. Ideally, it lets me know which pages people spend a lot of time reading. On a lengthy blog post, it’s great to see a long Average Time on Page because I know people are reading (and hopefully enjoying) the whole article.

On the flip side, we don’t want a long Average Time on Page on the Contact Us page, for example. We just want people to be able to quickly find the info they need. So if I see a 19-minute average on the Contact Us page, something is probably wrong… but maybe not. Maybe there are a couple of people who opened the page, forgot about it, and left it open for hours, bringing the average up.

This is especially a problem for pages with low traffic volumes. If a page has an abnormally high Average Time on Page but only a handful of unique visitors, one or two people’s unusual behavior may be skewing the average.
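A quick sanity check is to compare the mean with the median, which ignores extreme outliers. Here's a toy sketch with made-up session times showing how one abandoned tab distorts the average:

```python
# How one abandoned tab can distort Average Time on Page (made-up data).
from statistics import mean, median

# Nine normal visits (in seconds) plus one visitor who left the tab open.
times_on_page = [22, 31, 18, 45, 27, 35, 19, 40, 25, 10800]  # 10800 s = 3 hours

print(f"mean   = {mean(times_on_page):.0f} s")    # ~1106 s (about 18 minutes): alarming
print(f"median = {median(times_on_page):.0f} s")  # 29 s: what most visitors actually did
```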

But data is still important!

Let’s be clear: analytics are really important. It would be foolish to ignore our statistics just because we don't understand them.

But here’s the thing---and you may have guessed it by now---analytics only give you one piece of the puzzle. To get the complete story, you’re going to need to ask “Why?” Specifically, you’re going to need to ask your prospects “Why?” That's where qualitative research comes in.

Here are some ideas for answering the “Why?” question from the examples above.

If you see a spike in site traffic:

Don’t assume you understand where it’s coming from. Take a deeper dive into your analytics and determine the source of the extra traffic. Was it Twitter? LinkedIn? A paid ad? Someone else’s blog or newsletter? Go to that source so you can see what your visitors see.
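If your analytics tool can export raw sessions, a few lines of Python can break the spike down by source. The file name and column name below are assumptions about the export format, not any particular tool's API.

```python
# Break down a traffic spike by source from an exported sessions CSV.
# The file name and "source" column are assumptions about your export format.
import csv
from collections import Counter

by_source = Counter()
with open("sessions_2014-10-27.csv", newline="") as f:
    for row in csv.DictReader(f):
        by_source[row["source"]] += 1

for source, sessions in by_source.most_common():
    print(f"{source:20s} {sessions}")
```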

Try running a Behavior Flow report in Google Analytics to follow the exact journey users take from their source through your site. You’ll also get to see which landing pages are the most popular and where users are dropping off your site.

Behavior flow with Source

The Behavior Flow report will show you which page visitors are landing on, where they go from there, and where they exit.

If you want to know what’s going through your users’ heads as they enter your site from these sources, try setting up a few user tests in which users start from each source you’re curious about. For example, if you’re getting a lot of traffic from a paid ad on Google, have your users search for the appropriate keyword, view the ad, and click on it. Then ask them whether the landing page met the expectations the ad set. You could do the same with social posts, organic search results, or links from other sites.

If you’ve just gotten the results of an A/B test:

Run a handful of user tests on the winning version to find out exactly what users like about it. It might not be what you think, especially if there are a lot of variables between the two versions.

Better yet, run a user test on the A and B versions side-by-side. Ask your test participants to explain which one they prefer and why. You might uncover an idea for an even better A/B test. For example, maybe users like the design of the A version, but they prefer the copy of the B version. For your next A/B test, try combining the winning design with the winning copy to see if you get even better results.

A/B test of two ads

We tried this on one of our own A/B tests. We learned that users preferred the green background in the A version, but they liked seeing the word "Free" in the B version. So our next ad was green AND said "Free."

Don't forget to double-check how your audience was split up for your A/B test. Was it actually random, or did your marketing software divide your audience based on some other factor, like the date they were added to your database? Folks who have been familiar with your company for a long time might react differently to a change than new prospects who don’t yet have a set of expectations. Make sure that your future A/B tests are truly randomized.
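One rough way to test this after the fact is a chi-square test of independence between version assignment and signup cohort. Here's a sketch with hypothetical counts, using SciPy:

```python
# Chi-square test: was the A/B assignment independent of signup cohort?
# Counts are hypothetical; a truly random split should show no strong pattern.
from scipy.stats import chi2_contingency

#                 version A, version B
counts = [[480, 320],   # signed up before 2013
          [310, 470]]   # signed up 2013 or later

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4f}")
# A tiny p-value means assignment correlates with signup date:
# a red flag that the split wasn't random.
```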

If one page has an unusually high Average Time on Page:

Run a simple user test to see if test participants are able to do what they would normally want to do on that page. Pay special attention to anything that distracts or confuses them.

You could also try implementing a one- or two-question survey on pages with high average times. Ask whether visitors were able to find what they were looking for and whether there is anything they would change. Qualaroo is great for these quick surveys.

You can even set up a heatmap or scrollmap using a tool like Crazy Egg to find out where people are clicking and how far down the page they are scrolling. This can help you pinpoint where visitors get distracted.

Don’t blame divorce on margarine

There’s not much point in collecting data for data’s sake. If we want to be able to actually do anything useful with it, we need to understand why trends are occurring. And to find out why, we have to ask.

For marketers to make a meaningful and lasting impact on conversion rates, we have to get the whole picture. We can’t accept “maybe” as an answer. Let’s take a stand against guesswork and start getting qualitative feedback to go along with our data.
