“The Kryptonite of Modern AI”: Tampering with Bad Data.
Jay Deragon
Providing strategic roadmaps for the implementation of AI. A systemic view and unbiased narratives are imperative if AI is to flourish throughout the organization.
Last month, a story by Cade Metz appeared in the New York Times titled “Who Is Making Sure the A.I. Machines Aren’t Racist?”. The article states, “The big thinkers of tech say A.I. is the future. It will underpin everything from search engines and email to the software that drives our cars, directs the policing of our streets, and helps create our vaccines. But it is being built in a way that replicates the biases of the almost entirely male, predominantly white workforce making it.”
Metz continues, “In the nearly 10 years I’ve written about artificial intelligence, two things have remained a constant: the technology relentlessly improves in fits and sudden, great leaps forward. And bias is a thread that subtly weaves through that work in a way that tech companies are reluctant to acknowledge.”
Timnit Gebru wrote, “I’m not worried about machines taking over the world. I’m worried about groupthink, insularity, and arrogance in the A.I. community — especially with the current hype and demand for people in the field. The people creating the technology are a big part of the system. If many are actively excluded from its creation, this technology will benefit a few while harming a great many.”
The Weak Spot of AI Is Bad Data
Ian Rowan, CEO and Principal Data Scientist of MindBuilder AI, says, “Every single projection or prediction model being used today, be it Finance, Sales, and even Climate, has failed miserably at this point.” The reason is that machine learning models make inferences based on past trends. “At the same time, most models remained shuttered in a remote cottage with only their knowledge of the current domain-specific data,” Rowan says. When the global pandemic hit, it generated new data for which there was no precedent. “Unprecedented phenomena like Covid-19 may prove to be the kryptonite of modern artificial intelligence approaches that are aimed at predicting the real world,” Rowan states.
"It will take months or years to adjust the data models, a time when humans are likely to be relied on more for forecasting, as they have been in the past. In finance, for example, stocks were at all-time highs and came crashing down. Nearly every earnings and price target prediction was taken over by humans who have been replaced or assisted by AI algorithms a short time ago. New predictions are often prefaced with “Covid-adjusted” terminology. It might be more accurate to say “. New predictions have been tampered with to accommodate a change in the predictions which end up getting masked under the label “Covid Adjusted”.
All of the issues shared above remind me of a quote from W. Edwards Deming: “Every system is perfectly designed to get the results it gets.” In today's era of management, it seems that some will go back to manipulating the data so they get the results they need to satisfy stakeholders. The problem is that their stakeholders don't know the results were created with bad data. Worse yet, they will allow their algorithms to bias the outcomes without knowing why. Tampering with data means a disaster is pending.