Seven Steps to More Gain, Less Pain with AI in Higher Education
Dr Andrew Clegg, SFHEA, NTF, IFNTF
Head of Academic Development at the University of Portsmouth
In the last 18 months, Artificial Intelligence (AI) has become one of the most discussed topics in Higher Education, with an explosion of commentary on how AI can reshape education. Yet, in my mind, much of this conversation is still happening at a high level, focused on policies and overarching models of engagement and regulation. While these emergent narratives are valuable, and will no doubt help to shape the long-term relationship with AI, what many educators are still searching for is something more practical and tangible. Quite simply, they are asking: what does using AI look like on a day-to-day basis in the classroom, and how do I integrate AI into effective learning and assessment design?
For the last year, I have been supporting the development of AI at the University of Portsmouth, leading workshops on AI and academic integrity. I’ve also had the opportunity to present nationally and internationally on the topic. Using this experience, I want to use this latest blog to offer some practical guidance on how AI can be incorporated into learning, teaching and assessment. And I will preface this blog by making it very clear from the start that I am in no way an expert on AI – certainly not compared to some of my Portsmouth colleagues. But what I have tried to do when developing AI workshops is to adopt the persona of an educator who has yet to fully appreciate and exploit the opportunities represented by AI.
AI and the Higher Education Landscape – What’s Really Happening?
So, let me start by setting the stage and sharing some insights I’ve gathered over the last year or so. At the beginning of my AI workshops, I have consistently asked participants to reflect on their state of AI readiness and share their confidence levels regarding AI through a series of reflective questions based around the emergent UNESCO guidelines for using AI in Higher Education. These questions (see Figure 1) aim to gauge how confident educators feel when integrating AI into their teaching practice. Figure 1, taken from my latest workshop – and typical of the results I have had over multiple AI workshops – highlights an important point: awareness of AI is perhaps not as high as we might first think. As shown in Figure 1, participants averaged a score of 6.5 on a scale of 1 to 10 when asked if they are aware of the opportunities and risks AI presents in the educational context. Scores in other workshops have been similar, and indeed often lower. When we look deeper, confidence in actually applying AI within teaching, learning, and assessment drops significantly. For example:
1. When asked if they could proficiently identify and apply appropriate AI tools to facilitate student engagement, the average score was just 2.9.
2. The ability to employ pedagogic practice in AI scored slightly higher at 3.1, but still shows a considerable gap in confidence.
3. Interestingly, the ability to integrate AI into formative assessment was rated even lower at 2.7, showing a clear area where more support and guidance are needed.
OK, while this might appear to be somewhat anecdotal, I’ve seen the same pattern repeated again and again in different workshops with different audiences. So, what I have seen and heard over the last year is that while most educators see the potential of AI, there’s a wide gap in practical understanding and application. Most significantly, there is a stark juxtaposition in emerging levels of expertise and confidence. Some colleagues are racing ahead, fully embracing large language models and other AI tools. Others are cautiously experimenting, while many more are still unsure about how to get started. I call this my ‘motorway analogy’ (see Figure 2) – a term I first coined when reflecting on staff engagement with remote learning during COVID. And forgive me – I do love a good analogy, as colleagues at Portsmouth will verify.
The situation, as shown in Figure 2, is this:
- Fast lane: These staff are extremely confident, racing ahead with AI, already using it to enhance their teaching practices – often making a personal investment to access the latest AI platforms.
- Middle lane: Here we have those staff cautiously experimenting and integrating AI, moving forward at a steady pace and still gaining experience.
- Slow lane: These are the educators just starting to explore AI, getting up to speed slowly.
- Not even in the car yet: Finally, and most worryingly, there are those staff who have not even started the journey yet. They know AI exists but don’t yet see how it fits into their teaching and lack the confidence to engage. Therefore, the biggest challenge we face is helping this last group – they aren’t lacking interest, but they need practical support and confidence to take that pivotal first step.
The Seven-Step Framework for Using AI in Teaching
In response to this situation, I’ve developed a seven-step framework that simplifies AI integration for educators (see Figure 3), with a particular focus on assessment design and academic integrity. It’s by no means rocket science, but the simplicity of the steps has proven extremely effective in starting conversations and moving the AI narrative forward.
Let me quickly provide an overview of the key steps.
Steps 1 and 2: Reflect on Module Learning Outcomes and Assessment Types
For me, the first step is to go back to basics and revisit your module’s learning outcomes and the assessment type, rationale and purpose. Ask yourself what you are trying to achieve with the module and what knowledge or skills your students should gain by the end of it. How are you assessing students? What is being evaluated through the assessment? How is evidence of learning being gathered? Once you’ve thought about this – and I would suggest doing this through a team-based approach – consider how AI can help you meet these goals. This reflective process is essential in understanding how AI can be a helpful tool in advancing your teaching goals. AI can be leveraged to personalise learning, automate routine tasks, and create new ways for students to engage with content (something I will come to later in this blog). But be sure to frame this discussion against the level of AI usage you would deem acceptable and how this is influenced by your own University regulations and guidance around the use of AI.
How you integrate AI will also differ depending on the academic level. At lower levels, AI can be used to introduce basic concepts and scaffold learning. At higher levels, AI might support students in more complex tasks, such as research projects or higher-level critical analysis. Using AI to assist in the development of skills according to respective taxonomies helps ensure that AI doesn’t just enhance learning but also encourages students to grow cognitively.
It is also important to consider when and how students are equipped with the knowledge and skills to use AI effectively. Whilst AI can be integrated into individual modules, it is paramount that expectations and guidance around the use of AI in academic work are clearly embedded into induction programmes as students transition into Higher Education. Students need clear guidelines about what constitutes acceptable AI use and where the boundaries lie to avoid academic misconduct. AI guidance should also be clarified for students transitioning between level 4 and level 5, and level 5 to level 6, as well as when moving into postgraduate study.
Are you Constructively AI-Aligned?
Steps 1 and 2 focus on setting up your teaching in a way that ensures constructive AI Alignment - which stresses the importance of making sure your pedagogical approach is aligned with how AI can enhance learning outcomes. It’s essential to ensure that your module’s learning outcomes, teaching methods, and assessments align with the way AI is being used. This alignment is what guarantees that AI not only supports the learning process but also fits into the broader pedagogical goals.
Testing Your Assessment Briefs with AI: A Crucial Early Step in the Process
When considering the first two steps of AI integration, one of the most important actions you can take is to run your assessment briefs through an AI. What continually surprises me in my AI workshops is the number of academics who haven’t yet tried this relatively simple task. Doing so can be incredibly illuminating and will help you identify:
- The AI’s ability to meet the assessment brief: Can the AI-generated response fulfil the assessment criteria you've set? If so, this might signal that your assessment is vulnerable to AI, raising concerns about academic integrity. Try rephrasing the prompt across several iterations and look for any significant variations in the AI's responses. For instance, slight changes in how you frame the question or the level of detail you provide could lead to more accurate or creative answers from the AI. This process helps you identify how easily AI can be guided to produce work that aligns with the assessment criteria. It also reveals the limitations of AI, and that vague or complex prompts can produce incomplete or incorrect results. By experimenting with different prompts, you gain a better understanding of how well AI can interpret and fulfil your assessment requirements, and can decide whether an alternative assessment is needed.
- The limitations of AI-generated output: More often than not, I have found that the AI-generated output is not as good as expected. AI, for now at least (well, until next week), still struggles with tasks that require deep judgement, referencing, or a nuanced understanding of the topic. This opens up opportunities for learning, particularly by leveraging the AI’s shortcomings as a foundation for student engagement and reflection. For me, the fact that AI doesn’t always get it right is where the real learning opportunity lies: reframing AI’s fallibility as a valuable educational tool.
For example, after putting an assessment brief through an AI tool like ChatGPT, you could use the generated output as a first draft or a starting point for student engagement. The process involves two key stages:
1. Initial Evaluation: Students begin by analysing the AI-generated output. What’s wrong with it? Where are the gaps? What is missing? This critical evaluation serves as an excellent way for students to practise reflective judgement, identifying areas where the AI's work doesn't align with the assessment criteria.
2. Improving the Output: Once the students have evaluated the initial output, their next task is to improve it. They should seek additional resources, data, and research to more effectively meet the brief. This transforms the AI output into a collaborative, research-based process where students refine and restructure the work to align with the assessment criteria.
Steps 3 and 4
Now, let's delve into the next steps. Step 3 is all about identifying the defining elements of your assessment tasks and considering the associated complexity and role of human judgment. The principal question here is: What makes an assessment more or less vulnerable to AI? And equally important, how can AI actually support the assessment? Step 4 then asks if the integrity of the assessment could be compromised by ChatGPT.
For these steps, I would recommend looking through the AI guidance resources that have been produced by Monash University – some of the best in my opinion! In summary, Monash provides guidance on how to structure assessments to mitigate academic integrity risks in three key areas: Assessment Formats, Forms of Knowledge Assessed, and Assessment of Product vs. Process.
- Assessment Formats: Academic integrity risk decreases as assessments move from using freely available texts to paywalled resources, from monomodal (single format) to multimodal (e.g., text and image), from assessing submitted work to assessing real-time performance, from single discrete assessments to longitudinal ones, and from monologic (one-way) to dialogic (interactive) formats.
- Forms of Knowledge Assessed: The risk of academic integrity breaches is lower when assessments require higher-order thinking (e.g., evaluating or creating) compared to lower-order thinking (e.g., remembering). Additionally, tasks assessing contextualised and personalised knowledge present lower risks compared to those assessing purely factual and abstract knowledge.
- Assessment of Product vs. Process: Including the process within assessments and making the process visible to the assessor decreases academic integrity risks. Additionally, collaborative work (students working as teams) tends to have lower integrity risks compared to individual work, as collaboration often introduces peer accountability.
Step 5: Using AI to Aid Engagement in Formative Assessments
Brace yourself – here’s where it gets really exciting! Once you've evaluated the vulnerability of your assessments, you can start thinking about how AI might actually enhance student engagement, notably through formative assessment design. This is where AI shines as a creative tool, and I’ll admit, this is the part where I could write multiple blog posts, as there are just so many possibilities for embedding AI in formative assessment! So, let me share some practical examples of how I’ve played with AI to create engaging tasks, and others that I have been experimenting with to help colleagues:
Example 1: Sustainable Destination Management in the Coastal Zone
Earlier this year, I was running a session for undergraduates and postgraduates focused on coastal zone management - one of my areas of expertise. For the undergraduate class, things went well, but I felt I needed something more dynamic for the postgraduate session. That’s when I turned to AI. Using ChatGPT and DALL-E, I generated fictional images of coastal landscapes that included a variety of tourism management challenges. It took a few prompts to get it right, but the resulting images (see Figure 8) served as the perfect visual aid for students. Of course, the images were not perfect and there were notable oddities but they succeeded in showcasing many of the key issues in coastal tourism management. The students loved it. They not only engaged with the images but also started discussing the oddities and how those might reflect real-world management problems. What began as a simple 'spot the issue' exercise turned into an interactive, critical discussion on the management of tourism in the coastal zone.
Example 2: Crime Scene Analysis for Criminal Justice Students
Another example comes from a collaboration with a colleague in our criminal justice team at Portsmouth who wanted something more engaging than the standard textbook cases to help students map and record a crime scene. Once again, we turned to AI. We initially had a problem - when I asked for ‘crime scene images’, the AI kept generating images of crime scenes that had already been labelled by the police, which defeated the purpose of the exercise. So, we refined the prompt, instructing the AI to create a scene as if the students were the first officers on site. The resulting images were perfect for what we needed (see Figure 9). The students could now label the scene themselves, identifying evidence and discussing how they would process the scene. This not only created a more hands-on learning experience but also helped the students develop critical thinking skills in a practical forensic setting.
We also thought about how to integrate AI into the development of the crime scene report that investigators create for use in criminal proceedings. As we explored this, we discovered that AI tools like ChatGPT can indeed generate a detailed draft crime scene report. But these AI-generated reports often have deficiencies – they are missing details, poorly structured, or lack key elements that are crucial to successful criminal prosecutions. That’s when we realised the fallibility of AI could actually become a learning opportunity. Instead of treating the AI output as a final product, we reframed it as a draft produced by a new team member named ‘Al’ (our personified AI). The idea is that students take Al's draft and improve it, acting as mentors or line managers refining their colleague’s work.
The task then becomes two-fold:
1. Improving the report: Students add the missing evidence, restructure the document, and ensure it meets the required standards in line with the assessment criteria.
2. Reflecting on the process: Students critically reflect on what was missing or incorrect in the original draft and document how they corrected it.
This approach does more than just teach students to create better crime scene reports; it also engages them in reflective practice, encouraging them to think about the criteria and quality of the reports. By working through AI-generated mistakes, they build deeper understanding of what a good report looks like and how to improve it.
Alongside improving the AI-generated output, students also keep a reflective log documenting their thoughts on what was wrong, what the AI missed, and how they fixed it. This reflective practice not only demonstrates their understanding but also enhances their ability to critique and improve work in line with the learning outcomes and assessment criteria. This can also be framed as a group activity, where students work together to assess the AI output, discuss its limitations, and collaboratively develop a more accurate version. As they go through this process of critical evaluation, they will be able to articulate the gaps in the AI's reasoning and explain the steps they took to correct those gaps. In this way, the assessment focuses on both output (the final product) and process (the critical evaluation and reflection). AI is no longer a threat to academic integrity but rather a critical tool to enhance learning experiences.
Example 3: Creating Dummy Datasets for Statistics Modules
Another exciting use of AI that I’ve experimented with is in statistics - a subject I’ve taught for years and which many students find quite challenging. AI, in this case, offers an innovative way to engage students with statistical approaches. The idea is simple: AI can create dummy datasets that students can use in SPSS or similar statistical software. But instead of providing students with one pre-generated dataset, they create their own. This adds a layer of personalisation to the learning process and makes the exercises more relevant to the students' own interests.
Here’s how it works:
1. Students write an AI prompt to generate a dataset for a specific statistical test, such as a t-test. To do this, they must demonstrate their understanding of what a t-test is, the type of data it requires, and write the supporting hypotheses.
2. For example, if students wanted to examine differences in content engagement between two streaming platforms, they would create a dataset that includes categorical (nominal) data and ratio data. They would also need to create hypotheses that reflect their understanding of key trends and patterns in streaming and what differences they might expect to find.
By writing the AI prompt themselves, the students are not just passively learning; they are actively demonstrating their understanding of statistical approaches. The prompt they generate reflects their knowledge of the test’s specific requirements and the data they need the AI to generate. Week by week, as new statistical methods are introduced, students can create prompts tailored to these statistical methods, turning prompt engineering into a creative and highly engaging exercise. And just like with the crime scene reports, AI doesn’t always get the datasets exactly right. This is where students’ critical thinking comes into play. They need to evaluate the AI’s output - does it meet the requirements for the test? If not, what’s wrong with it? By critiquing the AI’s mistakes, students demonstrate their grasp of the statistical method itself and identify how to correct it and then apply the changes.
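To make the streaming-platform example above concrete, here is a minimal sketch of the kind of dummy dataset a student prompt might produce, and the test it would feed. The platform names, sample sizes, and engagement figures are entirely invented for illustration; in class the data would go into SPSS, but here a small hand-rolled Welch's t statistic keeps the sketch self-contained in Python.

```python
# Illustrative sketch only: an invented dummy dataset of the kind students
# might prompt an AI to generate for an independent-samples t-test.
import math
import random
from statistics import mean, variance

random.seed(42)  # fixed seed so the invented data is reproducible

# Nominal variable: streaming platform; ratio variable: weekly viewing hours.
platform_a = [random.gauss(14, 3) for _ in range(30)]  # hypothetical "Platform A"
platform_b = [random.gauss(11, 3) for _ in range(30)]  # hypothetical "Platform B"

# H0: mean weekly viewing hours are equal across the two platforms.
# H1: mean weekly viewing hours differ between the two platforms.
def welch_t(x, y):
    """Welch's t statistic for two independent samples (unequal variances)."""
    se = math.sqrt(variance(x) / len(x) + variance(y) / len(y))
    return (mean(x) - mean(y)) / se

t_stat = welch_t(platform_a, platform_b)
print(f"Platform A mean: {mean(platform_a):.1f} h/week")
print(f"Platform B mean: {mean(platform_b):.1f} h/week")
print(f"Welch's t = {t_stat:.2f}")
```

Critiquing output like this is exactly the exercise described above: does the dataset actually contain a nominal grouping variable and a ratio measure, are the group sizes sensible, and do the hypotheses match the test being run?
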
Example 4: Experience Management Strategies for Attractions
In the past, while teaching a module on experience management, I would task students with creating an experience management strategy for a new visitor attraction in a specific destination. This is where AI can now play a pivotal role in shaping the learning process. If you ask ChatGPT to create a visitor experience for a new attraction in Bognor Regis, it will generate a basic report. However, as with previous examples, you’ll find that the report is likely incomplete. The structure may be there, but the depth and quality often fall short of what is required for a robust experience management strategy.
Just as we’ve done with crime scene reports and statistics, the students can use this AI-generated report as a starting point. Their task is to identify what’s missing and to further develop and enhance the report. In doing so, they are not only critiquing the output but also demonstrating their understanding of experience management concepts by refining the AI’s work. Moreover, this method opens up greater creativity in class. Instead of all students working on the same example of a new attraction, groups can generate multiple types of attractions using ChatGPT, each with their own unique strategies. This leads to a more dynamic learning environment where students share, critique, and refine their concepts collaboratively, deepening their understanding of key experience management principles. For formative assessment, students can use AI to experiment with ideas before moving on to a more summative exercise. This flexibility allows students to engage with the content contextually, which aligns with the philosophy of focusing on context over content - a central tenet at Portsmouth (see enABLe.port.ac.uk).
Example 5: Reflective Practice for Work Placements
Reflective practice is another area where AI can offer innovative support. I’ve been collaborating with colleagues to look at the feasibility of integrating AI into work placement assessments, specifically in the reflective component. While the final placement reports are excellent, students often struggle with reflecting on their placement experience and the learning and skills development they have gained from it. Here, ChatGPT can be used as a placement coach. At the end of each week during their placement, students can use ChatGPT to simulate an interview that asks them about the tasks they completed, what they learned, and how they plan to transfer those skills. The AI can provide tailored questions based on the students’ responses, prompting deeper reflection (see Figure 10). Students can then use the AI-generated output as a foundation for their reflective reports, helping them articulate their learning experience more effectively. This approach also allows students to engage in reflective practice in real-time, turning their placement experience into an ongoing learning journey, rather than a task to be completed after the fact.
Leveraging AI’s Limitations as Learning Opportunities
Throughout these examples, the fallibility of AI serves as an educational tool. Instead of seeing AI’s limitations as a drawback, we can turn them into learning opportunities, encouraging students to think critically about the outputs they receive. This approach gives them hands-on experience in refining and critiquing work, all while building essential skills for their respective fields and disciplines. By integrating AI in this way, we also shift from using AI as a simple generator of content to using it as a partner in the learning process. The students become active collaborators in improving the AI’s work, honing their own skills in the process.
Steps 6 and 7
OK, let me try and bring this together. Having been through Steps 1 to 5, Step 6 now asks you to consider, having taken everything into account, whether an alternative assessment form is needed. If it is, you need to think about the relative complexity of the task to safeguard academic integrity (see Figure 11).
Step 7 asks you to think about the support that is needed to help you make the required changes and how this aligns to university policy and related regulations. When thinking about how you might integrate AI and respond to any concerns over academic integrity, be creative and work collaboratively with both departmental colleagues and Academic Development teams to craft really innovative and effective solutions.
Summary
By flipping the narrative around AI in education, we move from seeing it as an ‘ominous threat’ to academic integrity to recognising it as a valuable ally. AI, when used effectively, can become a pivotal tool for both formative and summative assessment. It allows educators to emphasise process just as much as output, helping students build the critical thinking and reflective skills now central to narratives about graduate attributes. This approach not only prepares students for assessments but also engages them in a deeper, more reflective learning process - one that encourages them to critique, analyse, and improve the work that AI generates, demonstrating their own mastery of AI along the way.
And finally - can you name the AIs in the following images? Please add to the comments if you can name them all. An ice-breaker activity I use in my AI workshops.
If you would like more information or would like an AI workshop, please get in touch.