In an earlier post, Margaret Bearman shared some of the work that a three-day symposium produced around generative AI and work-integrated learning (WIL). Here's another glimpse: a design provocation I brought to spark our thinking, which was then tested and evolved in our deliberations. (Acknowledgements: Dave Cormier, Michael Tomlinson, Rola Ajjawi, Bonnie Amelia Dean, Franziska Trede, Simon Buckingham Shum, Phillip Dawson, Jack Walton, Thomas Corbin, Joanna Tai, Kelli Nicola-Richmond, Nicole Crawford, Helen Partridge, Elizabeth Johnson, Jessica Lees, Susie Macfarlane, David Eckstein, Carlo Perrotta, David Boud & David Farrugia.)
Establishing and maintaining good WIL is labour-intensive, and the resources available to support WIL in higher education are limited. The idea that a 'three-way contract' between employers, students and universities should govern what takes place during WIL is ubiquitous and longstanding, yet all too often it endures only in well-intentioned hearts and minds rather than in actuality. Even when such a contract is in place, employers may not understand what was expected or be able to offer it; organisations are dynamic, and people and constraints change.
An input into the symposium's conversations was a design provocation: the concept of a "WIL-bot", a chatbot co-designed for and with stakeholders to assist in managing WIL placements before, during and after placement, engaging conversationally with each stakeholder group.
This concept evolved through the symposium:
- Focusing on the chatbot concept draws attention to the potential of conversation to build common ground: between very different stakeholders, or between sometimes abstract, decontextualised principles (such as academic outcomes or a professional competency framework) and the unique, messy WIL context with the practices students experience and must learn to navigate. Remember that we have now moved beyond texting to agents with hyper-realistic, emotionally expressive voice recognition and synthesis, sometimes with user-defined avatars, running on every student's mobile.
- In one mode of working, the WIL-bot is a “coordinating tool” which could elicit, compare, and as far as possible align the expectations about WIL placements brought by students, educators and placement sites, synchronising these perspectives with the placement coordination platform used by the university. As explained below, however, in other modes, it must be very clearly a learning support coach.
- While a generic chatbot grounded in a foundation model (GPT, Claude, Gemini, etc.) has impressive conversational capabilities, such models' training has produced a phenomenon known as "LLM sycophancy": a 'desire' (i.e. a training bias) to please the user that can trump the truth. These models will not challenge or stretch the user out of their comfort zone without explicit prompting, and will often back down if the user disputes their replies. In contrast, the envisaged WIL-bot would be grounded in WIL theory, evidence, pedagogy and practice, able to hold expertly designed conversations with a given stakeholder.
- AI vendors promote GenAI as a productivity aid: the same or higher quality work produced with less effort. Our students must be equipped to wield such tools on graduation (or they may not be able to compete), but minimising effort is hardly a recipe for learning. A WIL-bot for students should therefore be a learning tool: the emphasis is on extending skills, not efficiency.
- As part of the university’s learning ecosystem, it would be authenticated and privacy-respecting.
- The WIL-bot would use appropriate language for each stakeholder, and might engage:
  - Academics: feedback on how a draft assignment brief might be improved to help students voice the many informal modes of learning they have experienced in the workplace.
  - Employers: realistically, what time commitment can they offer a student for feedback? Encouragement to take a moment to recall the student's mindset and confidence? As a productivity aid, these could be included in the draft report on the student, potentially mapped against relevant competencies.
  - Students: 24/7 support as they prepare, undertake and review WIL, plus advice on their rights, responsibilities, and who to turn to for support. The bot might nonetheless challenge the student to step out of their comfort zone, and it must not be so seductive that we breed student dependency on it.
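To make the grounding and stakeholder-tailoring ideas above concrete, here is a minimal sketch of how a WIL-bot's system prompt might be assembled per stakeholder, combining a shared pedagogical grounding (to counter sycophancy) with a role-specific frame. All names and prompt wording are my own illustrative assumptions, not a design agreed at the symposium, and the resulting string would simply be passed as the system message to whichever foundation model the university hosts.

```python
# Hypothetical sketch of a WIL-bot prompt builder. The grounding text and
# stakeholder frames below are illustrative assumptions only.

# Shared grounding: prioritise learning over efficiency, and push back on the
# user where warranted rather than defaulting to agreement (anti-sycophancy).
WIL_GROUNDING = (
    "You are a WIL (work-integrated learning) support coach. "
    "Ground every reply in WIL pedagogy: prioritise the student's learning "
    "over task efficiency, and do not simply agree with the user; "
    "constructively challenge assumptions where the evidence warrants it."
)

# Per-stakeholder framing, echoing the modes sketched in the bullet list above.
STAKEHOLDER_FRAMES = {
    "student": (
        "Support the student before, during and after placement. Explain "
        "their rights and responsibilities, and nudge them out of their "
        "comfort zone rather than doing the work for them."
    ),
    "employer": (
        "Help the supervisor make realistic commitments: how much feedback "
        "time can they offer, and how might observations map to competencies?"
    ),
    "academic": (
        "Give feedback on draft assessment briefs so they let students voice "
        "informal workplace learning."
    ),
}


def build_system_prompt(stakeholder: str) -> str:
    """Combine the shared WIL grounding with a stakeholder-specific frame."""
    try:
        frame = STAKEHOLDER_FRAMES[stakeholder]
    except KeyError:
        raise ValueError(f"unknown stakeholder role: {stakeholder!r}")
    return f"{WIL_GROUNDING}\n\n{frame}"
```

The design choice worth noting is that the anti-sycophancy instruction lives in the shared grounding, not the per-role frames, so every stakeholder conversation inherits it by construction.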
Stepping back to imagine the future WIL landscape with such AI agents, our conversations emphasised that WIL is a sociotechnical, political and cultural system. Introducing a WIL-bot with such abilities would have ripple effects on workload and stakeholder communication, power and practices, both predictable and unpredictable, good and bad.
We need design-based research into these dynamics, moving to formal processes that account for and monitor these consequences.
Acknowledgements: Image by Kseniya Lapteva from Pixabay