Why we need to put AI-washing through the wringer
Dan Sandhu
CEO EDT | Investor, Founder, Plc NED | AI in Education Panel | SaaS, Education, Ed-Tech, AI, Digital
When Sir Anthony Seldon wrote in The Times recently that AI will make the UK’s school system the ‘envy of the world’, my reaction was torn. Yes, on one hand, fantastic strides are being made by UK companies using AI and machine learning to enhance teaching and learning. On the other hand, plenty of companies are cashing in on the cachet of AI with a rather smoke-and-mirrors approach.
AI is the buzzword of the moment across numerous industries, from healthcare to retail, and the education space is not immune. UK investment in artificial intelligence companies reached record levels of over £1.3bn in the first six months of 2019. The AI in education market is projected to surpass $6bn by 2024.
There’s just one major problem. Many companies are splashing ‘AI’ across their messaging, playing on our collective confusion about what AI is and trading on the future-forward kudos it lends to new solutions. This phenomenon is being labelled ‘AI-washing’:
"It's really tempting if you're a CEO of a tech startup to AI-wash because you know you're going to get funding," said Brandon Purcell, a principal analyst at Forrester speaking to Axios.
As Xiaoyu Xu, managing partner at Amino Capital, told Synced, “Some people are smart at making up stories. When startups have no market traction, they will instead add hype to drive attention from venture capitalists. And the recent hype is AI.”
When it comes to assessing whether a company genuinely uses AI, the problem starts with the fact that there is no single definition of ‘Artificial Intelligence’. So, while some companies may simply be making use of data or automation, they are skirting the line between what can be claimed as AI and what real experts would define as AI.
At Sparx, we want to be clear about what we mean by AI and how we use it: Sparx Maths uses statistics and machine learning (simple AI based on recognising patterns in data) to support teachers in providing personalised homework. We know this makes a positive impact on student learning, because we have measured it and collected the evidence.
We also want teachers to understand how the AI we use is implemented in our solutions. We call this ‘explainable AI’. Education is too important to be entrusted to a black box algorithm and so we strive to make explainable any aspect of our AI that affects students’ learning.
This idea of ‘explainable AI’ was recognised in the Varkey Foundation’s recent report, ‘System Failure’, which demands government do more to “Ensure that the use of AI is transparent, unbiased and accountable.”
I am truly excited that the UK is helping to lead the way in innovative edtech solutions, but I urge caution about further hyping technologies that are shrouded in mystery. Whether AI or not, we want to ensure the UK leads the way in clear, transparent, evidence-based technologies – no matter the platforms or algorithms underpinning them.
Read more about this issue in my article for the February edition of Education Investor.