The acute and chronic problems of AI in education
Jason M. Lodge
Deputy Associate Dean (Academic) & Professor of Educational Psychology
Academic integrity as the acute problem
It goes without saying now that the rapid emergence of generative artificial intelligence (AI) into our lives poses significant challenges for academic integrity. The urgency of these challenges cannot be overstated, as they directly impact the fairness and validity of the educational assessments upon which academic progress and certification depend. The focus on these issues over the last year is warranted.
Collectively, these issues are what I have been calling the acute problem. In fairness, generative AI has surfaced or exacerbated existing issues and risks as much as it has generated new ones. Cheating on assessment tasks is hardly new, after all. Nonetheless, educational institutions and systems are increasingly being asked to respond to the risks posed by these new technologies. For example, in Australia, the Tertiary Education Quality and Standards Agency (TEQSA) is about to issue a request for information (RFI) to higher education providers to better understand how institutions are adapting to the risks posed by generative AI.
However, as we alluded to in Assessment Reform for the Age of Artificial Intelligence, there are deeper, more chronic issues that AI is creating and amplifying within our education systems. These are not just technical problems or issues with the validity of assessment tasks but are deeply intertwined with curriculum and pedagogy. There are also equity issues emerging, and I will have more to say about those in future. The risk is that focusing heavily on the acute problem (i.e. cheating) means we may be overlooking more profound shifts needed in education policy and practice. Every time I talk to my colleagues (people I have been referring to as generative AI superusers) Aneesha Bakharia, Jason Tangen, Leon Furze, or Danny Liu, I am left with no doubt that there is a range of significant problems we are facing that go beyond assessment and academic integrity.
Curriculum challenges: The chronic problem
There is a growing debate over whether the current trajectory of machine learning and AI development is sustainable or even desirable. In recent weeks, I have heard compelling arguments from many people who are experts in AI (for the record, again, I am not, so I rely on the opinions of those who are). These arguments run the gamut from the view that we are reaching the absolute limits of current machine learning approaches through to the suggestion that GPT-5 will be here any day and will be conscious, Skynet style (yikes).
The wide range of expert views leaves us with an increasing level of uncertainty about the role of technology in education in the future. As educators, our challenge is not only to integrate technology into our teaching practices but also to critically assess its impact on learning outcomes and student engagement. We also need to carefully consider how technology will feature in students' lives and work. Given the uncertainty and pace of change, this task is becoming progressively more difficult.
While considerable debate exists about reaching the limits of current AI technologies like machine learning and deep learning, their influence is already pervasive. The assumption that these technologies will continue to advance exponentially may not hold, yet the impact of existing AI capabilities is already significant and growing. We must address the full range of implications of these technologies, whether they advance further or not. As has been said many times, the genie is already out of the bottle.
So, the chronic problem concerns what and how we teach now that AI is everywhere. Thankfully, much work and thinking is going on around the world to address this problem. Shout out here to Sean McMinn for a recent compelling example of this work.
Addressing the acute by understanding the chronic
Delving deeply into the chronic issue is critical in our thinking about how to address the acute problem. The rapid changes and uncertainties in technology development make predicting specific future needs difficult, if not impossible. Education should, therefore, focus on equipping students with flexible, adaptive skills suitable for a variety of future scenarios rather than preparing them for a specific future.
Our primary role should not be to attempt to predict the future and prepare students for a narrowly defined outcome but to prepare them for any and all possible futures. This preparation involves fostering a deep understanding of themselves and the world around them. AI and technology should be tools for enhancing this understanding, not a major focus of their education (in most cases).
I am also suggesting here that perhaps there is too much emphasis on preparing students for day 1 of their working lives and not enough on day 3,000 (with no apologies whatsoever to the Australian ‘Job Ready Graduates’ policy package). I am sympathetic to employer priorities and to the emphasis many students place on employment, which is completely understandable when many parts of the world are experiencing a cost-of-living crisis. However, education needs to prepare students for a life and career in an increasingly uncertain and complex world, not just for day 1 on the job.
The level of uncertainty about where all this is headed is why I remain sceptical about arguments that suggest that the key to solving both the acute and the chronic problem is to double down on AI literacy and/or explainable AI. Humans are notoriously bad at understanding our own level of understanding, particularly in new knowledge domains (even those of us who have been studying this issue for decades). Dunning–Kruger effect, anyone? There is little point in focusing on how the machines work (particularly given how rapidly they are evolving) when we often don't understand how we, ourselves, work. My sense is that there is something in this that we also need to consider as we attempt to address the acute problem.
Where to from here?
Influenced in particular by the data we have collected over the last year, I have come to think that understanding AI is less crucial than students understanding themselves and how they can best learn and adapt. As we continue to navigate the challenges posed by AI in education, our focus must remain on developing robust, flexible educational practices that can withstand the uncertainties of future technological developments. Similarly, the necessity of adaptability extends to our students, who will need to be able to navigate their way through unknown and unknowable futures.
To me, this need for adaptability means balancing a necessary emphasis on foundational knowledge with a greater focus on helping students understand how to make good judgements about their own level of understanding and then make good decisions about how to enhance that understanding [1]. In my corner of the academic archipelago, we would call this self-regulated learning. Our data (to be published soon – hopefully) strongly suggests that students who are good self-regulated learners are getting the most out of generative AI in their learning, irrespective of their technical understanding of these emerging technologies.
I’m increasingly convinced that both curriculum and assessment practices need to move towards a better balance between acquiring knowledge and enhancing self-regulated learning to help create adaptive learners for the age of AI. This shift will also help address the acute problem, so both must be considered in parallel. Of course, this is all far easier said than done, as I have discussed previously. However, emphasising the process of learning (particularly the metacognitive aspects) over outputs, performance, or artefacts is a good place to start.
Acknowledgement: ChatGPT (GPT-4) was used to edit this article for clarity.
[1] I am mindful that skills like digital literacy, critical thinking, etc., are also in the mix and that the transferability of those skills is debatable. I didn't want to go there in this article, but these will be important elements to investigate. In particular, we need to develop a good sense of general, transferable adaptive skills for the age of AI.