Why AI tools, like ChatGPT, pose an existential threat to learning and assessment in higher education

When I worked as a video editor between 2013 and 2017, I noticed that many of my most time-consuming daily tasks, like creating title cards or intro animations, were repetitive and didn't require deliberate thought. So, even though I enjoyed designing them, I took steps to automate these processes.

At the time, 'Artificial Intelligence' (AI) was only an abstract buzzword in creative circles, even though it was already quietly working its way into many of the technologies we used. Many of us creatives felt our work was immune to its threat, believing that what we offered was "special" and could not be automated or artificially generated.

However, as I recognised that many aspects of my work could, and arguably should, be automated, I realised that, given the exponential growth AI technologies had already demonstrated, no one would be willing to pay inefficient and indecisive humans for a job that AI programs could potentially complete in seconds.

As a result, I decided to introspect and identify what I would like to do professionally that aligned with my values and had the least chance of being automated within the next 25 years.

Cut to 2022, and some of my AI-related predictions have already come to pass in some form: AI-driven video editing tools like Descript are making video editing simple and accessible to all. Other developments, however, such as chatbots like ChatGPT, which can generate thoughtful, refined and comprehensive answers and even write complex computer code, have admittedly blindsided me.

As a graduate-level student currently studying social work, I initially felt that tools like #ChatGPT would neither affect nor help me, as I believed the level of research, writing, and analytical reasoning I engage in could not be replicated by AI, at least for the foreseeable future. However, after creating a ChatGPT account last week and tinkering with it since, I believe higher education is in for a seismic disruption too.

Many professors on subreddits such as r/Professors and r/AskProfessors are already discussing these tools uneasily. They feel the tools threaten the assessments they have already devised, and they understand the proclivity of a vast swath of students to take short-cuts, though most believe ChatGPT is currently only capable of producing C to D-level undergraduate assignments if its output is submitted 'as is'. However, they also concede that these tools give students a scaffold to develop and refine thoughts and to create work they may not have been able to produce on their own. Further, ChatGPT is based on a relatively small large language model (GPT-3.5). When GPT-4 is released soon, a model that has already been estimated to be 100x more powerful than GPT-3.5, the leap in generative capabilities available to ChatGPT could be hard to quantify, and the quality of arguments and depth of content it generates is bound to improve too.

[Screenshot of a Reddit poll. Full link: https://www.reddit.com/r/unimelb/comments/ztfvrg/will_you_use_ai_tools_like_chatgpt_when_working/]

Above is a screenshot of a poll I conducted on the University of Melbourne (r/unimelb) subreddit, where roughly 45% of students professed that they might or will use #ChatGPT to work on or write their assignments next semester. It's true that many university students already use essay-writing services and employ "tutors" to help refine their work (or write it for them), and that these services already disadvantage honest students. Still, this simplification and democratisation of cheating across the university system, enabled by free AI tools like #ChatGPT, can make grades seem even more arbitrary and even less grounded in merit.

While I acknowledge that many universities are already (arguably unwittingly) doing things to mitigate this threat, such as reverting to in-person and handwritten exams, these AI-driven tools will remain accessible to students for take-home essays and coding assignments. Unless faculties acknowledge this threat, firmly announce that using these tools constitutes an academic misconduct violation (with appropriate consequences), and stay up to date with the latest AI-writing detection software, students like myself, who do not intend to use these tools on assignments, will be disadvantaged.

As a result of these developments, I feel that unless educators acknowledge that some students will use these tools to game the system, and adjust assessments to test students' comprehension of content -- without the aid of computers -- the whole higher education exercise will become even more of a farce.

What do you think?

#highereducation #ai #chatgpt

Dylan M. P. Nanayakkara

Mental Health Advocacy @Mind Over Matter | Psychology & Social Work @The University of Melbourne | Content Creation ~> Writing, Video Editing & Graphic Design

1 year ago

Chris Groot Rebekah Anderson Brock Bastian As academics, is this kind of issue discussed, in any capacity, within universities in Australia right now? Or am I, like many early adopters of tech, fearing a rate of adoption of these programs that may not materialise for some time? Your opinions on this emerging topic would be greatly valued, as y'all are the lecturers, tutors, and subject coordinators who defined my undergraduate psychology experience at the University of Melbourne.
