Once you catch the vision for what usability testing can do for a company, it's painful to be forced to design and develop blindly. If you're facing the need to get buy-in from management or your team for usability testing, this post is for you.
We all know that you have to pick your battles. This is one that must be picked, and must be won. Usability testing isn't just a really good suggestion that your company or team should try. It's the right way to launch better, faster, and with the fewest wasted resources. Analytics alone aren't actionable, but once you see the human experience behind the data, you can determine precisely where to spend your resources.
Don't let anyone convince you that this battle isn't worth fighting, and don't let anyone wear you down.
Of course, getting buy-in doesn't have to be filled with conflict, but it's best to be prepared just in case. After all, if you weren't encountering or expecting any resistance right now, you probably wouldn't be reading this. So we're going to prepare you for both offense and defense.
While getting management buy-in for your usability testing initiative may seem a battle of sorts, you don't have to dress up in chainmail and wield a mace. But if you do, please send us a picture.
If you've already tried to get buy-in for usability testing and have been shot down, you might need to start with the Defense section. But if you're in a position to make the pitch, here's what we recommend.
The best way to make the case for usability testing is to let your users do it for you. It shouldn't take many users to make the case either; 3-5 should do it. (You might even qualify for one of our free proof-of-concept studies; contact sales@usertesting.com to find out.)
Keep the results laser-focused by creating a highlight reel of the most engaging insights. Hearing and seeing actual user experiences on your site or app is incredibly compelling.
Let's look at a real example. Imagine that your boss or team is in charge of Spotify, and doesn't yet understand the value of user testing. Here are a few clips from just 3 users, revealing problems or concerns that users are encountering—problems you might never identify without a usability test.
Want to give your presentation some extra punch? Run a test against your competition. Have each tester try performing the same functions or answering the same questions on your app or site vs. one or two competitors, and show the results. You might discover that you have a competitive advantage you didn't know about, or you might learn why your competitor is winning some customers that you're not. And you'll definitely get some good ideas along the way.
One last thought: sometimes the best way to break the ice is to chip away at it. Start taking 45-90 second highlight reels of usability studies into your weekly executive meetings. It shouldn't take long for the CEO to start giving you resources to fix the issues. It's much more difficult to ignore videos of users struggling than it is to ignore analytics.
Use terminology that your team understands and cares about. Empathize with them, and use their own frame of reference when making your pitch.
Clearly articulate the connection between usability and the KPIs management is concerned about right now. The connections may seem crystal clear to you, but not everyone gets it.
For example, if you're trying to convince the marketing manager that usability testing is a solid investment, talk about outcomes she cares about, like "more sales," "easier to buy," "increased revenue," or "improved brand loyalty." While "easier to use" resonates with UXers, it may be the wrong term for others on your team.
If you want your audience to listen, use terminology and outcomes they use every day.
If there are hot-button issues on the table right now, such as poor retention or low customer satisfaction, connect usability testing directly to those metrics. Enterprises will benefit from features like monthly qualitative KPI benchmarking, which our new Customer Experience Analytics feature makes possible.
In order to calculate the ROI for usability testing, it's tempting to look at only hard numbers like increased revenue or improved conversion rates. "We spent $1000 to identify bug x, we fixed it, and conversion increased by 1.2% to bring in an extra $3000 in monthly revenue." Beautiful. Simple. But you're not always going to get numbers like that, and when you don't, it doesn't mean usability testing was a waste of money.
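When you do get hard numbers, the arithmetic is straightforward. Here's a minimal sketch using the hypothetical figures from the example above ($1,000 study cost, $3,000/month in extra revenue); the 12-month horizon is our own illustrative assumption:

```python
# ROI sketch using the hypothetical numbers from the example above.
study_cost = 1000             # cost of the usability study (USD)
extra_monthly_revenue = 3000  # revenue lift after fixing the bug (USD/month)
months = 12                   # assumed time horizon

gain = extra_monthly_revenue * months
roi = (gain - study_cost) / study_cost  # standard ROI formula

print(f"{months}-month gain: ${gain}, ROI: {roi:.0%}")
```

Even this back-of-the-envelope version is usually enough to make the cost of a study look small next to the upside. The harder part, as the next point explains, is valuing the studies that don't produce a tidy number.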
In fact, usability testing can provide early feedback on UX work whose impact can take 3, 6, or 9 months to show up in analytics.
Therein lies the extra value of usability testing: you learn something every time you test, and learning itself is valuable. Sometimes that value can be calculated, but often it's less tangible. Examples:
There are many more examples, but the point is that you can raise the awareness of learning as a valuable outcome of usability testing.
To defend against any resistance, arm yourself with answers to common objections and usability testing myths.
As a usability testing fan, you've got this one covered. You know that analytics tells you what happened; usability testing tells you why. That's completely true, but that line might not work on someone who loves analytics—unless you have an example.
StubHub's Go Button story is a great example of what usability testing can uncover that analytics can't.
Uncovering your own example is best, but StubHub's "Go Button" story is compelling too. During usability testing, StubHub discovered that the "See Details" link on their event pages was preventing a lot of people from clicking. (Analytics can indicate that people are abandoning the funnel here, but it doesn't explain why, and it certainly can't tell you what to do about it.) Replacing that ambiguous link with a clear "Go" button resulted in millions in increased revenue. Usability testing showed them why people weren't converting, and gave them a clear path forward.
This one is almost too easy these days, but we know this myth still lives on in some organizations, so here's the 3-part answer:
Confront any lingering ideas about usability testing costing thousands of dollars per study.
This is mostly a concern of small businesses. Large enterprises typically have the resources to tackle usability testing; it just becomes a question of priority for them (which all of the points made above should address). But sometimes the in-house marketing/UX/engineering team's time would be better spent on fixing problems than finding them. That's why our enterprise plans come with access to a research team who can conduct usability studies and even watch the videos and summarize the findings.
But back to small businesses. Is usability testing too complicated and time-consuming for them? Eric Ries, pioneer of the Lean Startup movement, doesn't think so.
I scheduled UserTesting.com sessions, making sure that I got participants in all the main branches of the experiments. Within a few hours, I had a dozen 15-minute videos of people using the product. The entire process, including analysis, took about one full day. —Eric Ries
Lean startups wouldn't embrace usability testing if it were a complicated and time-consuming way to get feedback.
Steve Krug, author of the popular Don't Make Me Think, says that UserTesting.com "requires almost no effort and gets you results incredibly quickly."
Let's take a look at what it actually takes to run a simple study once you've determined what you want to learn. The numbers can vary widely here, but these are ballpark estimates for a straightforward 15-minute test with 5 users from the UserTesting.com panel:
Total of your time: 2 hours
Total time from start of project to the time you're ready to present your findings: 3 hours
Ok, this one's tricky. It's such a ridiculous objection that nobody is going to say it out loud. You have to listen to the subtext in order to discover this objection within your company. When you find out that people are resisting testing because they can't come to grips with the possibility of "wasting" work that's already been done (even if usability testing reveals that it needs to be tossed), you need to appeal to Eric Ries's story once again.
Here's an excerpt in which he explains the value of learning early in the process. He even talks about throwing away thousands of lines of code.
For anybody who resists discovering problems after an initial investment of design or engineering time, you might need to remind them of the obvious: Any problems in your product or site will be discovered at some point—either before launch when you test, or after launch when sales dip or customers complain or move to your competitor. Putting your head in the sand doesn't work. It just delays the inevitable, and at a high cost.
You might encounter managers or team members clinging to the popular notion that you need to test only a small number of users to catch all the critical problems.
It's true that you can uncover a lot of actionable information with just a few users, but an iterative approach is best. In Jakob Nielsen's often-referenced article, Why You Only Need to Test with 5 Users, Nielsen actually recommends running multiple 5-user tests.
Jakob Nielsen suggests tests of 5 users at a time, not 5 users total.
"Why do I recommend testing with a much smaller number of users? The main reason is that it is better to distribute your budget for user testing across many small tests instead of blowing everything on a single, elaborate study. Let us say that you do have the funding to recruit 15 representative customers and have them test your design. Great. Spend this budget on three tests with 5 users each!" – Jakob Nielsen
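Nielsen's recommendation rests on a simple diminishing-returns model from that same article: the share of usability problems found with n users is roughly 1 - (1 - L)^n, where L is the proportion of problems a single user uncovers (about 31% in his data). A quick sketch of why three rounds of 5 beat one round of 15:

```python
# Diminishing-returns model from Nielsen's "Why You Only Need to
# Test with 5 Users": share of problems found with n users is
# approximately 1 - (1 - L)**n, where L is the share a single
# user uncovers (~31% in Nielsen's data).
L = 0.31

def share_found(n):
    """Approximate fraction of usability problems found by n users."""
    return 1 - (1 - L) ** n

for n in (1, 5, 15):
    print(f"{n:>2} users: ~{share_found(n):.0%} of problems found")

# With roughly 85% of problems surfaced by 5 users, users 6-15
# mostly re-confirm known issues. Testing 5, fixing, then testing
# 5 more on the improved design uncovers new problems each round.
```

The model is an approximation, of course, but it captures the intuition: each additional user in a single round re-discovers mostly the same problems, while a fresh round on a revised design starts the curve over.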
There are three main factors to consider when determining how many testers to budget for:
Has this happened to you?
Jim: We really should run some usability tests on this.
Steve: We might not make the launch deadline as it is. We don't need another to-do list.
Much of what we've already talked about can be used to get Steve on board, so we'll just focus on a few additional points specific to his objection:
Have you had to convince management or your team to embrace usability testing? What happened? What worked best?
P.S. For a little extra help getting buy-in, you can also download our "Gaining Executive Buy-In for User Research" whitepaper.