12 Tips To Successfully Conduct User Testing
I recently wrapped up over 80 hours of user tests on UserTesting.com. I learned a lot along the way and wanted to share my key takeaways. Here’s everything I wish I’d known to save time and $$$.
The results are indisputable, which helped when presenting to the client.
It can sometimes be tricky convincing clients of your ideas or the direction your team has decided to take with a product. User testing takes (most of) the guesswork out of creating products by providing genuine feedback to base decisions on. Our clients had no rebuttals when we presented, for example, that 9 out of 12 users prefer x over y.
Putting more effort into a solution doesn’t always make it better.
I tend to get attached to designs and experiences I’ve spent hours constructing. However, just because I spent a lot of time on something doesn’t make it the best direction. Without user testers, it’s easy to default to the direction you’ve invested the most time in, but user testing allowed us to verify that our direction was correct, regardless of our own biases.
Testers provide their best feedback when presented with a few options and are asked to compare them.
We quickly found that users viewing a product without another product to compare it against had little opinion and would generally say something like “yeah I like it, it’s good.” However, when we sent testers through two different flows and then asked what they thought, they were much more opinionated and could give better reasoning for why they preferred one over the other.
All tests should be previewed before being sent live.
This might be a given, but all tests should be reviewed and previewed before being sent live — and viewed in an incognito window to ensure nothing is password protected. We had one test that was a complete waste because our prototype was locked to outside viewers.
Most testers will do EXACTLY what you tell them to do, no more, no less. Write very specific instructions informing them exactly what they are expected to do.
We quickly realized the testers were not exactly going to view our product and immediately hand us a list of specifications on how to make it better. We had to more or less squeeze it out of them through specific steps and questions. The more hand-holding we did, the better the results were.
Ask users what they expect before starting the test to get an unbiased view of what they think the experience will look like.
Before sending a tester through our flow we would ask, for example, what they expected from a hotel booking app. Testers would occasionally provide ideas that we hadn’t thought of before.
Not all users are tech savvy so it’s important to create tests that don’t require overly complicated tasks.
Break each task down into many small steps. Rather than telling testers to “book a hotel” — we would say “on the homepage, select a random date and destination for a hotel room, then click view hotels.” Being specific makes testers feel much more comfortable and ensures they’re completing the right steps.
Ask detailed screening questions to ensure you get the right testers.
When creating a test on UserTesting.com we had the option to create screener questions to ensure that we received the right demographic of testers. It’s important to target specific demographics, otherwise you’ll get random testers from all over the globe.
Always use simplified phrasing as opposed to technical terms.
Being in a bubble of techies can make it seem that terms like UX, UI, Chatbot, Cookies, and Hamburger Menu are commonly known and that most people are familiar with them — but erring on the side of caution is always best when using technical terms.
Giving users a chance to voice their thoughts and feelings after each step helps you understand what frustrates and delights them with greater detail.
We gave our user testers a final chance to voice any opinions or thoughts they may not have expressed during the test. This is an opportunity for them to get anything off their chest that they’d been thinking but weren’t asked a question that let them share.
Testing 7 users allowed for a consensus to be drawn without having too many results to review.
Our user testing videos were generally between 10–13 minutes long, and we ran 10 studies in total — that’s a lot of footage to review. So, to avoid going insane, we limited each study to 7 testers. Having an odd number also gave us a tiebreaker in the studies that compared different flows side by side.
Always emphasize in tests that users’ feedback is important and their responses really matter.
We wanted our user testers to know that their feedback was important, and we made a point to emphasize this throughout the test. This message highlights that the responses they provide will impact the outcome of the product, and it gives them a sense of purpose in the test. We would give users tasks like “enthusiastically sell your stance on why you prefer team x over team y!”
Those are my 12 learnings from this round of user testing. The results were incredibly insightful, and after our studies we found ourselves with a product completely reimagined from the original — all thanks to the feedback of our testers.