The key to understanding your users is asking better questions
Morten Rand-Hendriksen
AI & Ethics & Rights & Justice | Educator | TEDx Speaker | Neurodivergent System Thinker | Dad
Have you ever heard of theory dependence? It's a term used by philosophers to describe how we tend to pick and choose the facts we observe, and interpret them to match our preconceived notions. Theory dependence makes us ask questions and gather data that support our own ideas, and works alongside cognitive dissonance to make us resist any data that does not support our beliefs.
Scientists are trained to detect and avoid theory dependence in their work to ensure impartial data gathering, but outside of science, theory dependence is often overlooked, resulting in bad decisions guided by skewed and biased data.
The key to understanding your users is asking better questions so the data you get is what you need, not necessarily what you want.
For absolute clarity: This post is not about politics or policy but about how to ask questions that produce unbiased results.
Confounding Questions and Useless Data
Yesterday, a question flashed across my screen that gave me pause. It was part of a survey issued by a political party aiming to gather information about public attitudes toward the media:
"Do you believe that contrary to what the media says, raising taxes does not create jobs?"
If you're confused, you're not alone. It's a real mind-bender, worthy of an advanced Turing test. And whatever data is collected from it is invalid. What may surprise you is that I see these types of questions all the time: in user surveys and interviews, beta tests and design sprints. Designers and developers (myself included) are masters at creating intricate theory-dependent questions and expecting them to produce valid, actionable results. Case in point: after an event last year I was asked to fill out a survey. One of the questions read as follows:
"This year [event] was hosted at an awesome new venue. Would you recommend we use this venue in the future or not?"
The options to answer were: Yes, No, and Other with a text field for further info. What would your answer be?
Theory Dependence
To see why theory dependence matters, and why asking the right questions is important for any project to be successful, let me break down the original question step by step:
"Do you believe that contrary to what the media says, raising taxes does not create jobs?"
If we ignore everything after the comma and just look at the beginning statement, the theory dependence in the question is already clear: The questioner believes that the media is wrong and wants the person answering to agree with them. The same thing happens in my other example: "This year [event] was hosted at an awesome new venue." The questioner clearly believes the venue to be awesome and wants the person answering to agree.
If either of these questions were asked in a court of law, the opposing lawyer would stand up and shout, "Objection, your Honor! Leading the witness!" And rightly so. By asking a leading question, you prime the person answering to agree with your statement rather than provide their own uninfluenced opinion.
Using these questions as a template you can get people to agree to pretty much anything.
"We placed the hamburger menu in the top left hand corner to make it easier to see. Do you think it is in the right place?"
"Our online magazine relies on advertising for income so you don't have to pay to read our articles. Are you OK with seeing ads at the bottom of your screen?"
Clarity is key
Another common problem when asking questions is making the question too complex. The question from the survey is a severe example of this. Just try to answer it with a simple "yes" or "no":
"Do you believe that contrary to what the media says, raising taxes does not create jobs?"
It took me a while. To make sense of it I fell back on years of training in logic and reasoning. The reason this question is so confusing is that it appears to be a triple negation but is in fact only a double negation. If we strip some of the content out, you'll see what I mean:
"Do you believe raising taxes does not create jobs?"
This question is still confusing, but it's better than the original: If you believe raising taxes creates jobs, the answer is "no". If you believe raising taxes does not create jobs, the answer is "yes". This is a double negation.
What makes this particular question so hard to grasp is the non-sequitur in the middle, "that contrary to what the media says". This sounds like another negation, e.g. "contrary to popular opinion, chocolate-covered peanut butter is a food crime", but in the context of the question it is irrelevant and only adds confusion.
If the intent of the question is to gauge public opinion on the relationship between taxes and jobs, a better and clearer question would simply be:
"Do you believe raising taxes creates more jobs?"
This is a simple yes/no question everyone will understand.
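If you like to think in code, here is a minimal sketch of why the double negation trips people up: the same underlying belief maps to opposite answers depending on which phrasing you use. (The Python below is purely illustrative; the function names are my own, not part of any survey tool.)

# Purely illustrative: how the same belief maps to answers under the
# two phrasings discussed above.

def answer_double_negative(believes_taxes_create_jobs):
    # "Do you believe raising taxes does not create jobs?"
    # Answering "yes" means you do NOT believe taxes create jobs.
    return "no" if believes_taxes_create_jobs else "yes"

def answer_clear(believes_taxes_create_jobs):
    # "Do you believe raising taxes creates more jobs?"
    # The answer matches the belief directly.
    return "yes" if believes_taxes_create_jobs else "no"

for belief in (True, False):
    print(belief, answer_double_negative(belief), answer_clear(belief))

If the person taking your survey has to do that mental inversion before they can answer, the data you get back is already suspect.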
The event survey above shows a different lack of clarity:
"This year [event] was hosted at an awesome new venue. Would you recommend we use this venue in the future or not?"
Because of the last two words, "or not", you can no longer answer yes or no to this question. Rather than asking the person taking the survey to agree or disagree, this question requires them to provide a recommendation: "I would recommend" or "I would not recommend". To see why the original question doesn't work, just add your answer to it:
"Yes, I would recommend you use the venue in the future or not."
If the intent of the question is to gauge attendee opinion on the new venue, a better and clearer question would simply be:
"This year [event] was hosted at a new venue. Would you recommend this venue for future events?"
Again, this is a simple yes/no question everyone will understand.
So how do I ask better questions?
The keys to asking better questions are to a) be aware of your own cognitive biases, and b) keep your questions simple and impartial. This is not easy, which is why people who write surveys for a living typically have university degrees in the subject, but it can be done.
To get started, I recommend learning about cognitive bias. The Cognitive Bias Cheat Sheet by Buster Benson is a great primer. So is the book Thinking, Fast and Slow by Daniel Kahneman.
There are also lots of great resources on how to create better surveys including one from Grammar Girl and another from Constant Contact.
Finally, to get a better understanding of how much we are influenced by our own biases, take 20 minutes out of your day to watch this TED talk from the late great Hans Rosling and his son Ola: