User Testing: What They Say vs. What They Do
Pavan Kumar Narkulla
Private equity investment professional, Entrepreneur, Founder @BigLynx and @RealtySlices. Over the years, I've actively promoted diversity of entrepreneurs in the tech ecosystem
People are complicated — they don’t always act the way they say they will.
That can really complicate software testing. You expect people to self-report to the best of their knowledge, and you rely on those reports being reasonably accurate. Unfortunately, individuals are not always the best judges of their own experiences.
It’s like the classic taste test: Coca-Cola vs. Pepsi. Lifelong Coca-Cola fans chose Pepsi in blind taste tests, and vice versa. People think they know what they want, but that isn’t always true.
Sometimes, they simply change their minds and their behaviors based on the conditions. This means that people can perform very differently in a testing scenario than they would in real life.
One thing you should do is compare the feedback you get from users with the way they actually behave. These are two entirely different types of user testing: one captures what people say, the other captures what they do.
For the best results, perform both. More data is almost always better.
Thankfully, Eyece is an app-testing platform that is able to host a wide range of different testing methods, including those two. After running both types of tests on the same user group — which you can find and sort by demographic and device within the platform itself — you can easily store both sets of results and compare them in the same simple, easy-to-use interface.
In terms of order, it usually helps to have users try your product first. You can collect real data from their firsthand performance: what they click on, how they navigate, and how they complete tasks.
Then, ask them how they liked it. Ask for critical feedback and what they would improve. Ask them to rate their own performance and how easy the product was to navigate.

With both tests, decide from the outset what you are looking for and what your final set of questions will be, so you can compare the answers directly. See if answers consistently differ in one area (for example, if users say it was easy to fill out the ‘Contact’ form when you actually saw them click several incorrect links before locating the correct ‘Contact’ page). Or, see if people consistently rated the product better or worse than their performance seemed to indicate, which might point to a problem with perception or branding. Resolve what you can before running an identical set of tests with a new user group.
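The comparison described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not part of the Eyece platform: the task names, the 1–5 ease scale, and the wrong-click threshold are all assumptions chosen for the example. It flags tasks that users rated easy even though their observed behavior suggests they struggled.

```python
# Hypothetical sketch: comparing self-reported ease scores with observed
# behavior for the same tasks. All data and thresholds are illustrative.

def flag_mismatches(reported, observed, max_wrong_clicks=2, easy_threshold=4):
    """Return task names users rated easy (>= easy_threshold on a 1-5
    scale) despite being observed making many incorrect clicks."""
    mismatches = []
    for task, score in reported.items():
        wrong_clicks = observed.get(task, 0)
        if score >= easy_threshold and wrong_clicks > max_wrong_clicks:
            mismatches.append(task)
    return mismatches

# Example: users said finding the Contact form was easy, but the session
# recordings showed several incorrect clicks along the way.
reported_ease = {"find_contact_form": 5, "checkout": 2}
wrong_clicks_observed = {"find_contact_form": 4, "checkout": 1}

print(flag_mismatches(reported_ease, wrong_clicks_observed))
```

Here "find_contact_form" would be flagged: a high self-reported score paired with several wrong clicks is exactly the say-versus-do gap worth investigating.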
If the answers conflict without any consistent pattern, it’s time to try a different kind of test; thankfully, Eyece offers a wide range of testing methods to help you troubleshoot customer experience.
Make sense of user behavior and user feedback with Eyece.
#apptesting #usertesting #customerfeedback #customerexperience #CX #eyece