Today's post comes from Peter Hughes, usability expert and founder of Ascest. Enjoy!
There are two main ways to run a usability test: moderated, where a researcher guides the participant through the session in real time, and unmoderated, where the participant works through tasks on their own.
Both methods have their merits. But no method is perfect, which is why they are often used at different points in product testing to capitalize on their specific strengths.
This post is for people who are keen to add moderated usability testing to their research programs, but who don’t have much or any experience doing so.
You don’t need to be a professional usability expert to get value from performing your own usability tests. In time, your moderating skills will develop with experience and training. Following these tips will accelerate your progress.
As you will see, good moderating has a lot to do with preparation and not just how you talk to your test participants…
To get good results, it’s important for a moderator to build good rapport with a participant.
The deeper the trust the participant has in the moderator, the more open they’ll be. Many new moderators underestimate the importance of this point, thinking that with good questions and observational skills, most of their work is done.
Good rapport starts from the moment the participant begins to engage with the test project.
Clear instructions and communications about the test are essential. For example, if your test will be conducted in person, it’s important to provide confirmation emails that clearly state the test location and session time, explain any preparations that are needed (e.g., bring your smartphone and ID), and provide reminders and information about who to contact if they have any questions.
If you’re running a remote test, make sure that the test participant knows exactly what time to sign in (taking care to account for any time zone differences), and that they have the correct links or software they will need to be able to access the test.
Another valuable rapport builder is the introduction itself.
No matter what type of day you’re having, take a moment to prepare yourself beforehand. I always take a couple of deep breaths and think of something uplifting and positive to frame my attitude. Forced smiles are never good. But thinking about someone special, an amazing meal, or a wonderful place you’ve been to (and smiling) will always be much more welcoming.
Eye contact and a confident handshake also help to get things off to a good start. A welcoming and friendly tone of voice is always a good idea, and is particularly important if your test is remote.
Staying neutral is something that takes most people a lot of practice, and even pros slip up occasionally.
The key is not to suggest there is any particular way of doing anything, or that anything is right or wrong, or positive or negative. Ideally, there should never be any judgment or surprise in your tone either.
Stay neutral. Try to cultivate a “psychologist’s personality.” Express genuine curiosity.
Avoid leading phrases like, “Do you think that was easy?” or, “Really?” said with raised eyebrows, even if you see something that seems strange.
Try to cultivate a “psychologist’s personality” where you express genuine curiosity for any action taken or comments made by the participant.
Here are some good examples of neutral questions if you’re trying to learn more about something you’ve seen the participant do: “What happened there?” “What were you trying to do?” “What were you expecting to happen?”
In the test session itself, your preparation and experience will show. It’s extremely important to practice with a few dry runs and to spend time going through the tasks and planned interaction flows.
Know your questions and your test materials very well.
Practice, practice, practice… your script and your tech setup.
Make sure you know how all of the technology works. Not just in theory. Really go through the steps of starting and saving a recording, several times. Check that the recording really did save, and that you know where it’s been saved to.
If your test session is being fed into an observation room, make sure the feed works and the observers can clearly see and hear what’s happening.
Often, projectors or large screens in conference rooms have their brightness turned up too high so detail in your screens gets washed out. You also don’t want your observers interrupting the interview because they can’t hear what’s going on. You don’t want them leaving the test after you’ve made the effort to get them there in the first place.
All these preparations will increase your confidence, which will show. If you get flustered, so will the participant. Being thoroughly prepared and confident with your tools will help you keep calm and neutral.
No matter how prepared you are, though, problems happen.
Occasionally, the participant might accidentally close the browser, or the app might suddenly quit, or any number of unexpected things could occur.
Problems will happen. Make sure you can reset your test quickly.
Making sure you are able to reset your test quickly is the best way to save the day.
For example, if you’re testing a website, have every screen saved or bookmarked as a group. That way you can restore your test session with one click after restarting your browser.
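If a browser bookmark group isn’t practical, the same one-click reset can be scripted. Below is a minimal sketch using Python’s standard-library `webbrowser` module; the URLs are hypothetical placeholders standing in for your own prototype screens, not real addresses.

```python
# A rough sketch of a one-command "reset" for a website test.
# The URLs below are hypothetical placeholders -- substitute the
# actual screens used in your own test.
import webbrowser

TEST_SCREENS = [
    "https://example.com/prototype/home",
    "https://example.com/prototype/search-results",
    "https://example.com/prototype/checkout",
]

def reset_test_session(urls=TEST_SCREENS, dry_run=False):
    """Reopen every test screen in order; returns the URLs handled."""
    opened = []
    for url in urls:
        if not dry_run:
            # Opens each screen in a new tab of the default browser.
            webbrowser.open_new_tab(url)
        opened.append(url)
    return opened
```

Calling `reset_test_session()` between participants (or after a crash) brings every screen back in a known order, so you’re never fumbling with the address bar while the participant waits.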
Most web or app prototypes do not contain all the functionality or screens your product will ultimately have. You are normally only testing certain screens or specific flows.
In your introduction, you should let the participant know this by saying something like, “This is an early prototype, so some things might not work, or some screens may not be ready yet. If you do something and it doesn’t work, please try it again just to make sure. In any event, I will let you know if you have come to a dead-end after you have tried to do whatever you were trying to do.”
You don’t need to say more than this. No explanations are needed about which parts aren’t working or complete. (More on this in the next tip!)
The main purpose in saying this in your introduction is to avoid surprising the participant if nothing happens when they do something. It’s also to encourage them to do whatever they feel is normal and natural, so they don’t hold back.
Don’t explain how the system you’re testing functions or let the participant know about any special functionality it contains.
You’ll almost certainly bias your test if you do explain anything. For example, you might say something that hadn’t occurred to the participant, which affects the way they use the product.
Don’t explain your product… you’ll bias your test.
In real life, the user will be operating the computer or mobile device on their own, and they will have to figure things out based on what they see in the interface.
Present a task and see what they do. Ultimately, the interface needs to “speak for itself.”
This is one of the most important rules for any moderator. It’s closely related to the “don’t explain” tip.
Once you have presented the task, everything should be led by the participant.
If the participant tries to go in an unintended direction in your app, wait and see what happens. As long as it isn’t taking them too long, they might realize the trail has gone cold and double back, or take some other corrective action. Don’t try to correct them yet! This is valuable learning.
But they might get stuck and stop. In that case, don’t explain what to do straight away. Instead, use the opportunity to explore what they were trying to do and any expectations they had at that moment. See if they have any thoughts about what they could try to get going again.
You’ll be getting a read about the participant’s mental model, or how they believe your product works. This is often a powerful insight if you see this pattern repeated with other participants.
After they explain their thinking, if they can’t figure out what the next move should be, you should give the smallest hint you can to get the participant started again.
Say something like, “Thanks for your thoughts. I’d like to show you what the designers intended.” (This phrase works well because there’s no suggestion that anyone in the room is responsible for the product “not working.”)
Show them the next step. But before they continue with the task at hand, ask their thoughts on the new information.
Your hint may make sense to them. It may not. Either way, you’re learning.
Here are some of the more common participant questions you will encounter in a usability test: “Am I doing this right?” “What should I do next?” “How am I doing compared to everyone else?”
Your focus should always be on the participant, on their thoughts and observations, and on what they do. Your views (or anyone else's) are irrelevant during the test.
A moderator’s favorite question: “What are your thoughts about…?”
The simplest way to counter these types of questions is to respond with a question yourself. One of the most versatile and effective is, “What are your thoughts about [where to go, what to do next, this feature or function, etc.]?”
If a person thinks they are going too slowly compared to others and asks you for feedback, you could say, “How do you think you’re doing?” Whatever they think of their speed is what’s most important. Other people’s views are not relevant.
It can be hard to resist the temptation to answer, particularly if you’re being pressed on whether the participant is having a similar experience to the others. A good way to deflect these types of questions is to say, “I can’t answer that right now. But we can talk about that at the end of the session if we have time.”
Place more weight on what the participant does rather than what they say. You should certainly encourage them to think aloud, which will help you to understand their thoughts and mindset as they use the product. This will be useful.
Focus on what they do, not what they say. Actions do speak louder than words.
But occasionally, you’ll watch a participant struggle through a task, and when you ask them about what they have just done, they will tell you it was easy. Clearly, these two observations are at odds with each other.
Some people will find it difficult to be brutally honest with you, and they will be tempted to “soften the blow.” If you have a participant like this, pay attention to what they do, rather than what they say.
From time to time, you might encounter brief periods of silence from your participant.
But you need to know when to give the participant space to think, and when time’s being wasted.
If a participant is having a hard time and you notice “hunt and peck” behavior, for example, it’s a good sign they are having difficulty, especially if it’s accompanied by grimaces or sighs of exasperation.
Let things develop for a few seconds in case the participant can correct themselves. As soon as your gut tells you the participant is stuck, and it doesn’t look like they are going to get back on track on their own, I’d recommend chiming in with a “What’s happening here?” Don’t let the participant dangle any more than necessary.
Recognize the difference between “time to think” and wasting time.
In the case of a participant who’s just staring at the screen, you should look for signs that they’re thinking. The deep-in-thought look vs. exasperated look vs. “checked-out” look is usually pretty obvious.
If it’s the thoughtful look, you want to be sure they’ve had reasonable time to finish processing what’s on their mind before interrupting. Once you feel they should have moved on you could ask them, “You haven’t said anything for a while. Please tell me what you are thinking about.”
In remote moderated sessions, you won’t be able to see the participant’s face. As a result, you’ll need to regularly remind them to verbalize their thoughts, so that you’re not guessing.
On occasion, you’ll experience a participant giving you what seems like contradictory information. For example, they may like the filtering options when trying to locate one product, but not for another.
Anytime something like this happens, you should not hesitate to clarify what’s unclear to you. You don’t want to leave your test with contradictory information.
Asked in the right tone, these questions will make the participant feel like you’re trying to understand them. Be careful not to make it sound like you’re accusing them of anything or finding fault.
Your participants will work at different speeds. That’s real life.
So long as slower participants are providing useful information, don’t “hurry them up” as the pressure can make them act unnaturally. You could potentially lose the opportunity for some great learning.
This is why your tasks should be prioritized so that the most important ones are at the beginning of your task list.
Don’t “hurry up” slower participants if they’re giving you good info.
Sometimes, you’ll get a participant who gives thoughtful responses but goes faster than most.
For this reason, it’s always worth planning your test with a few more tasks than you expect to use. If you still run out, and there is a competitor site/app that offers similar functionality, you could repeat a task on the competitor’s product for comparison.
Follow these 12 tips and you'll have a strong foundation for conducting your own usability tests. Soon you, too, will be moderating like a pro!
About the author:
Peter has conducted thousands of hours of usability testing for a wide variety of companies and organizations, from startups to foundations to Fortune 100s.
He founded Ascest, where he loves helping organizations obtain the user feedback needed for creating exceptional products.