How AI Can Be Embraced By Teachers
Rebecca Rothwell
Writer & editor, skilled in biographical & technical writing. Experience: Center for AI Safety; International Olympic Committee; BBC; Welsh Gov; Ofsted; Dame Stephanie Shirley CH; parkrun founder Paul Sinton-Hewitt

Artificial intelligence, or AI, refers to software or computer programmes that can learn, solve problems and generate data. With AI, computers can perform tasks that are typically done by people.

AI is in widespread use today, and computer scientists predict that it will become even more prominent in the future.

Learning and AI

Learning about AI is important for all of us – and the earlier we learn about it, the better we can understand how to apply this tool to improve our world. For a child in school today, AI will be connected to their future job – be it athlete, gardener, artist or astronaut.

According to a BBC article, while we are using technology more than ever before, it isn’t making us more productive – at least not as measured by current economic metrics. High productivity growth, much like AI proficiency and mastery, will come only to those who learn how to use the technology best.

About 40% of the world’s population is under 24. Schools are best placed to prepare this generation for the age of thinking machines, and to help ensure that its social and economic consequences remain positive.

But how can teachers, those who nurture the hearts and minds of future generations, support children to learn about AI whilst navigating a world that’s as new to them as to anyone else?

This academic year has been heavily affected by teachers and lecturers striking over pay and working conditions.

AI can reduce teacher workloads, especially in environments where teachers’ capacity and headcount are low. Teachers are overworked, with a typical list of responsibilities in which teaching and learning barely feature. It’s not a stretch to say that AI could pick up these ‘extra’ responsibilities, leaving teachers free to do what they really want to do – teach. This may be optimistic, but it is also realistic.

AI in the classroom

While any powerful new technology can disrupt established practices, it also holds the promise of improvement and innovation. Education technology is continuously changing, and the emergence of AI tools is another step in this evolution. It is a teacher’s bread and butter to adapt to the diverse and ever-changing needs of learners, learning contexts and new resources.

AI is already being used in the classroom. From virtual tutors to assessment and feedback software, AI is helping to enhance the learning experience for students, and the working experience for teachers.

Students learn better when teaching and learning are personalised. In practice, this means AI supports teachers to assess students’ current knowledge and competencies, identify gaps, deliver content at the right level and provide feedback – all to improve learning outcomes.

And AI-powered teaching tools are already available. For example, Microsoft’s PowerPoint has Presenter Coach, which helps students improve their presentation and public speaking skills. Students receive on-screen guidance on pacing, inclusive language, use of profanity, filler words and culturally insensitive phrases.

Other AI-powered teaching tools that may be coming soon to a classroom near you include:

- Systems that give guidance and support specific to an individual student.
- AI-powered computer vision and voice-to-text apps that can boost school accessibility for learners with visual and hearing impairments.
- Educational data analysis to generate insights about student performance and identify trends.
- Tools to overcome language barriers, facilitating effective communication for students and teachers who speak different languages.
- Content creation – for example, lesson plans, curriculum design and assessment topics.

AI can automate the assessment process, too. This saves teachers’ time and enables faster feedback to students. Multiple-choice and one-word answers are easily handled by existing software using a straightforward binary mark scheme. However, assessments or projects that ask students to ‘explain’, ‘describe’ or ‘debate’ are looking for explorative, creative answers – answers that reward personality as well as understanding. These require a more complex assessment process, often using a rubric to help determine an answer’s quality.

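To make the distinction concrete, here is a minimal sketch in Python of the binary mark scheme described above. The question IDs and answers are invented purely for illustration; real marking software is, of course, far more sophisticated.

```python
# A binary mark scheme: each answer is simply right or wrong.
# The answer key below is invented for illustration only.
ANSWER_KEY = {
    "q1": "b",               # multiple-choice option
    "q2": "photosynthesis",  # one-word answer
}

def mark(submission):
    """Award one mark per exact match against the answer key."""
    score = 0
    for question_id, correct in ANSWER_KEY.items():
        # Normalise whitespace and case before comparing.
        given = submission.get(question_id, "").strip().lower()
        if given == correct:
            score += 1
    return score

print(mark({"q1": "B", "q2": " Photosynthesis "}))  # → 2
```

Open-ended ‘explain’ or ‘debate’ questions clearly cannot be reduced to a lookup like this, which is why rubric-based marking tools are so much harder to build.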
Hamish Arnold, a Computer Science teacher at Bassaleg School, Newport, is on the case. He is developing an AI-powered assessment tool that can mark, assess and evaluate his students’ learning and progress.

“It’s starting to get really exciting,” he tells me on a call towards the end of what’s been an incredibly busy summer term. “Assuming this is a viable project – and there’s still a way to go before I can roll it out to other subjects, such as English – this will create a much more reliable form of assessment, providing consistency and reducing time spent marking and moderating.”

It works by having students complete their assignments in Google Docs; the software then marks and analyses their efforts. This method also means that subjects can go paperless, reducing the cost to both schools and the environment.

Hamish explains that AI has the potential to underpin positive transformation in education, with staff working to understand how their classes can prepare students for a life with AI. This would mean reviewing curricula, syllabi and teacher professional development programmes. “We know that the pace of change will accelerate. The skills landscape will shift. Education evolves.”

The future we are living in

Some of the UK’s top universities have drawn up a set of guiding principles to ensure that staff and students are ‘AI literate’, so that they can capitalise on the technology while avoiding some of its pitfalls. The Russell Group, which includes the likes of Cardiff, Cambridge, Oxford and Durham, said the principles will shape institution- and course-level work to support the ethical and responsible use of generative AI.

It comes after dozens of university lecturers said they were concerned students would use generative AI tools like ChatGPT. Developed by OpenAI, which is backed by LinkedIn-parent Microsoft, ChatGPT can be used to produce essays and other pieces of text when given a prompt. According to itself, “ChatGPT uses artificial intelligence and a vast amount of pre-existing knowledge to respond to questions, provide information, and hold interactive discussions with users, simulating natural language interactions.”

Because these AI text-generation tools can mimic natural human language in far more sophisticated ways than previously possible, there is a genuine concern about the impact they may have on academic integrity and originality.

Since the launch of ChatGPT in November 2022, university departments have had to determine how vulnerable particular assessments are to ChatGPT. Philosophy, an inherently written, essay-based subject, is particularly exposed.

Dr Alex Carter is a philosophy teacher at the University of Cambridge’s Institute of Continuing Education, and Study Skills Fellow at Fitzwilliam College.

“Factual learning is less in demand since the advent of the internet – we have known this for some time. We no longer have to teach facts that are available to all of us just by picking up our phones.” Curriculum design is now about developing higher-order thinking skills instead, which is where ChatGPT may seem scary, since it appears to replace these very skills. “But it doesn’t,” says Alex. “ChatGPT doesn’t do anything creative or critical.”

In keeping with the subject of philosophy, with the right prompts you can get ChatGPT to write you an essay entitled ‘Why did Kant come up with the synthetic a priori?’ You’ll get a fairly good answer, using a persuasive combination of words. That’s because so many essays have already been written on this topic and fed into the AI machine. But you can get an equally good answer by asking it, ‘Why did Homer Simpson come up with the synthetic a priori?’

“ChatGPT doesn’t care about the truth. For this reason, it might be good at re-writing essays but it can’t and it won’t think critically or creatively for itself. Even so, it will get better at pretending to think critically and creatively.”

The appearance of work isn’t enough: you still have to do the work. Take this example: AI produced a new Vivaldi score based on Vivaldi’s canon of work. “That’s not what Vivaldi did,” says Alex. Vivaldi did not look back at the work he’d already done in order to create a new piece of music. The same applies to fiction writing – all creative work is in a process of becoming, so it isn’t creative until it’s made. AI works differently to humans: it recreates based on what’s already been created. It’s not real. It’s not original.

Importantly, as AI advances, we must not relinquish all things cognitive to machines. Doing so would not only exacerbate tech dependence but also undermine critical thinking and reflection, which are essential aspects of the human experience. “We must continue to teach people how to think,” says Alex. “For humans, creative intelligence is biological before it is logical. It is this which gives us an edge over machines.”

As machines become better at answering questions, teachers should guide students to ask better questions. This will go beyond writing good prompts for conversational AI. Teachers need to continue to inspire students to be curious, as this is where humans have the edge over AI.

Technologically deterministic

Here in the UK, the education sector has been chronically underfunded for years. Class sizes are too big and teacher-to-student ratios are stretched. This has created a space for technology companies specialising in AI to fill, with affordable, off-the-shelf solutions available for schools to buy in. It shows that the Government is doing something, and it’s cheaper than fixing the system.

However, we need to be wary of such companies coming into schools, because their values may not align with those of the learners. The future of education will not be determined by technology but, as it stands, by who the main interest groups are. It’s the agendas of those stakeholders that will determine education’s – and therefore our children’s – future.

You only have to read case studies of events such as the Google chaos in the Danish city of Helsingør to see how this can play out. It can be very difficult for schools to stop using certain tech, as using it in the first place creates a dependency on its structure and service. The longer we are locked into any product, the more difficult it is to get out. Robust policies and GDPR training can help here, although that responsibility lies with an already overloaded, under-resourced teaching community.

And there are mixed answers from teachers themselves. More technology can be helpful, although only a few teachers so far are embracing it. There is research to show that more tech actually increases rather than decreases a teacher’s workload: data has to be entered into some sort of system before it can be gathered, processed and interpreted. If a student is late, for example, that needs recording in the software. Teachers become data-inputters, shifting from quality engagement to a tech-management role.

Recognising that there is a gap in our understanding of user experience, Dr Lulu Shi, a lecturer at the University of Oxford’s Department of Education and a research associate at the Oxford Internet Institute (OII), is looking at things from the perspective of just those individuals.

“What we don’t know much about yet is what it’s like for teachers and learners in the deployment of AI tech. We don’t necessarily need more and more tech, because we don’t yet understand the particular experiences of those involved.” She adds that these experiences “may vary substantially across the population depending on their socio-economic and cultural backgrounds.”

Lulu wants to look at how people imagine the future of digital education, and what the best methodology is for this type of research. “It is impossible to know what the future of education will be without first understanding the perspectives of the different main stakeholder groups.” She imagines that their visions will be very different from one another, and this matters. “We have to have the different voices of these different interest groups if we want to shape the future in a democratic way.”

Her research will also extend to understanding how much ed tech is being used across the UK compared with other countries, and how that usage has changed over time. “We know that there is a huge amount of investment in this area, and that there may be a discrepancy between what investors believe is happening and what is actually happening.”

Robot teachers?

In much the same way that you don’t have a robot GP, it’s unlikely that robots will take over as teachers. Education is more than just the transfer of information: it’s about human connection and emotional support.

Teachers, human teachers, will always remain pivotal to learning. They will still set ambitious learning goals, lead instruction, and motivate and inspire their learners.

A good teacher is one who asks themselves, Am I a good teacher? A machine will never, ever ask itself that question, because it will never, ever care.

And that’s what we need to remember.