The Reality Check: Myths & Biases Holding Back Data Quality
What to expect:
Rethinking Data Quality: Insights from Our Roundtable Dinner
- hosted by ReDem, Ayda & the Market Research Society (MRS)
How can the market research industry tackle data quality issues in a meaningful way? That was the focus of a recent roundtable in London, co-hosted by ReDem, Ayda, and the Market Research Society (MRS). Senior industry leaders gathered to discuss the growing challenges of data quality—ranging from survey fraud and AI-generated responses to poor participant experiences and the role of suppliers, clients, and researchers in implementing solutions.
An initial survey of attendees underscored the urgency of the problem. 80% said the amount of bad data they received had increased in the past year, with the rest stating that it had stayed about the same. Half of attendees thought 5% to 10% of interviews could be affected by fraud and should be removed, while 30% put the share of interviews needing removal at 30% to 40%.
In terms of how actively organisations are addressing AI-powered survey fraud, 40% said it was a key topic at their business and that robust measures were in place to tackle the issue. In addition, 40% of attendees said their organisation knew AI survey fraud was an issue and had started to implement new measures to tackle the problem, while 20% said that the organisation knew it was an issue but had yet to act.
By the numbers: How researchers' mental shortcuts open the door to online survey fraud
- Dr. Sebastian Berger (Head of Science, ReDem)
Sebastian Berger's article in Quirk's Media explores how cognitive biases among market research professionals contribute to online survey fraud, ultimately compromising data quality. He identifies five cognitive biases that leave the industry vulnerable.
“From my work with market research agencies, fieldwork providers and research buyers, I have noticed that cognitive biases – mental shortcuts we as humans take – can make us more vulnerable to online survey fraud. To address this situation, it is crucial to critically challenge five prevailing mind-sets that leave us exposed.”
Five cognitive biases that make us more vulnerable to online survey fraud:
1. Denial of the problem
2. Prioritization of cost over quality
3. Misplaced trust in brands
4. Emotional bias in evaluating results
5. Oversimplification of fraud threats
Client-side researcher strategies for protecting panel data integrity (Quirk's)
- by Karine Pepin, Tia Maurer, Efrain Ribeiro, Mary Beth Weber & Carrie Campbell
The online panel landscape has fundamentally changed—and not for the better. What was once a structured, well-managed ecosystem has become a commoditized marketplace, where speed, cost, and volume often take priority over data quality, transparency, and respondent authenticity. As a result, fraud, disengaged participants, and misrepresentation are more prevalent than ever.
Yet, many researchers continue to operate under false assumptions about how sample is sourced, verified, and protected. This Quirk's article dives deep into five major myths that many research buyers still believe. It brings together leading voices to reveal the hidden flaws in today’s online sampling practices and what researchers must do to take back control.
Five myths about online panels:
“Most sample providers run almost no checks on their respondents as they sign up to the panel or throughout the lifetime of the respondent on their panel. The onus is instead on you, the researcher, to make sure that you are building in sufficient checks to your study.” – Andrew Gordon, Prolific
"How many surveys is too many? What about 21.8 survey attempts per day? This is the average number of survey attempts per survey entrant we captured across 26,000+ survey entrants on a study we ran earlier this year for research-on-research purposes.”?– Marc Di Gaspero
“While many suppliers may promote their proprietary panels, most have transitioned into an aggregation model, sourcing from various providers to meet quotas, timelines and budget constraints.” – Mary Draper, EMI Research Solutions
“With simple prompt adjustments, AI-generated answers can be shorter, less formal and include intentional spelling mistakes, making them no longer easy to identify.” – Florian Kögl, ReDem
“The vast majority of all respondents registered on online panels are inactive and the size of the pool you can actually recruit into your study is often as much as 5-10x lower than the number the panel will advertise.” – Andrew Gordon, Prolific