The Truth About AI Hallucinations: Why Transparency Matters
Adam Morton
Empowering businesses to harness the full potential of data | Best-Selling Author | Founder of Mastering Snowflake Program
Thank you for reading my latest article, The Truth About AI Hallucinations: Why Transparency Matters.
Here at LinkedIn I regularly write about modern data platforms and technology trends. To read my future articles, simply join my network here or click 'Follow'. Also feel free to connect with me via YouTube.
We're working in a world increasingly shaped by the capabilities of artificial intelligence. As we grow more comfortable with these tools, it's easy to assume they're infallible. But as Snowflake CEO Sridhar Ramaswamy recently pointed out on The Logan Bartlett Show, that's far from the truth. The AI industry has a trust issue—and it starts with the overlooked topic of "hallucination rates."
For those unfamiliar, AI hallucinations refer to instances where models generate information that isn’t based on reality. These aren’t minor slip-ups; we're talking about rates anywhere from 1% to a staggering 30%, according to third-party estimates. Yet, despite this, it seems most tech firms would rather spotlight their "magic" than admit these flaws.
Ramaswamy's perspective is refreshing because it acknowledges an uncomfortable reality: transparency is necessary, especially as AI becomes embedded in critical applications like financial analysis. The true risk isn't that an AI might be wrong 5% of the time, but that users don’t know which 5% that is. This, he aptly noted, is a trust issue that can have real consequences.
Let’s talk trade-offs for a second. Sam Altman, OpenAI’s CEO, and Anthropic’s Jared Kaplan have defended these hallucinations, essentially framing them as the cost of maintaining the "magic" and dynamism that people enjoy in AI interactions. While I get it—no one wants an overly cautious model that constantly responds with "I don't know"—I can’t help but feel that these arguments miss a key point: reliability matters. Magic is great until it leads to misinformation or, worse, a lawsuit (looking at you, OpenAI).
Baris Gultekin, Snowflake’s head of AI, emphasized that while accuracy issues remain the “biggest blocker” for wider adoption, the tide is turning. Companies are putting in place more robust guardrails to manage what AI can and cannot say, with improvements gradually rolling out thanks to better access to diverse datasets and enhanced tuning techniques.
But here’s the reality check: We need to push for more than vague assurances. Transparency around hallucination rates should be non-negotiable. Ramaswamy nailed it when he suggested that critical applications demand 100% accuracy, whereas less consequential use cases might tolerate the odd error. The key is informed choice.
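To make that "informed choice" idea concrete, here's a minimal, purely illustrative sketch of one common guardrail pattern: gating AI answers on a confidence score so that low-confidence outputs go to human review instead of being presented as fact. The confidence score, threshold values, and function names here are assumptions for illustration, not any vendor's actual API.

```python
def route_answer(answer: str, confidence: float, threshold: float = 0.95) -> str:
    """Return the answer only when confidence clears the threshold;
    otherwise flag it for human review.

    Hypothetical sketch: real systems estimate confidence in various
    ways (self-reported scores, retrieval grounding checks, ensembles).
    """
    if confidence >= threshold:
        return answer
    return f"[NEEDS HUMAN REVIEW] {answer}"


# A critical application (say, financial analysis) might demand a
# stricter threshold than a low-stakes chatbot -- the user, not the
# vendor, decides how much error to tolerate.
print(route_answer("Q3 revenue grew 12%", confidence=0.99))
print(route_answer("Q3 revenue grew 12%", confidence=0.80))
```

The point isn't the code itself but the principle: once hallucination rates are disclosed, users can set that threshold deliberately rather than trusting the "magic" blindly.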
Let’s wrap this up with a simple thought: AI will continue to evolve, and with it, so will our expectations. But as users, developers, and decision-makers, we shouldn’t be lulled into complacency by the “magic.” We need to demand transparency and take these numbers seriously—because trust, once lost, is hard to regain.
Until next time, stay informed and question the tech before trusting it.
To stay up to date with the latest business and tech trends in data and analytics, make sure to subscribe to my newsletter, follow me on LinkedIn and YouTube, and, if you're interested in taking a deeper dive into Snowflake, check out my books 'Mastering Snowflake Solutions' and 'SnowPro Core Certification Study Guide'.
About Adam Morton
Adam Morton is an experienced data leader and author in the field of data and analytics with a passion for delivering tangible business value. Over the past two decades Adam has accumulated a wealth of valuable, real-world experience designing and implementing enterprise-wide data strategies and advanced data and analytics solutions, as well as building high-performing data teams across the UK, Europe, and Australia.
Adam’s continued commitment to the data and analytics community has seen him formally recognised as an international leader in his field when he was awarded a Global Talent Visa by the Australian Government in 2019.
Today, Adam is dedicated to helping his clients overcome challenges with data while extracting the most value from their data and analytics implementations. You can find out more information by visiting his website here.
He has also developed a signature training program that includes an intensive online curriculum, weekly live consulting Q&A calls with Adam, and an exclusive mastermind of supportive data and analytics professionals helping you to become an expert in Snowflake. If you're interested in finding out more, check out the latest Mastering Snowflake details.
Data Architect
1 week ago
It seems to me that a 5% error rate means that everything produced by AI needs to be validated by human analysts. So at this point, the AI that's available is best seen as a good productivity tool, but not as a replacement for the information economy workforce.