23 Things I Learned Co-Editing a 650+ Page Book on Generative AI and Its Impact on Education
Stefan Bauschard with MidJourney


I spent a lot of time over the past 6 weeks co-editing Chat(GPT): Navigating the Impact of Generative AI Technologies on Educational Theory and Practice, a 650-page volume about the impact of generative AI on education. That involved reading at least a few thousand articles, following what many of the most highly qualified and influential people are saying on LinkedIn, and providing a content edit of most of the very thoughtful chapters written by 32 incredible contributors. Based on that, I’ve reached the following tentative conclusions about where we are with generative AI in education and what we should think about going forward.

  1. Despite many bans issued by schools, teachers, and professors, students are learning the very basics of how to use generative AI (GAI) tools and are using them both appropriately and inappropriately in their academic work. GAI and other AI technologies are starting to permeate every application, academic and otherwise, that students (and all of us) use, and continued advances in the technology will undermine the value of a lot of the instruction and assessment that is occurring in schools now. Many will find this reality unfortunate, but it is a reality.
  2. Learning bots have incredible potential. I’ve spent most of my professional life as a debate coach. I think the students I’ve impacted the most are those I’ve engaged with intensely in small coaching environments. I saw my own son go from “regulars” math in 8th grade to being a year ahead, in an honors class with mostly 10th graders, after he took advantage of a 1:1 math tutor. Bots that get to know what students know and don’t know at a very detailed level (e.g., exactly which math concepts they did and did not learn in unit 3), that get to know them as learners, and that can support them any time, day or night, are going to have a huge impact on student learning. Teachers can provide many things that bots cannot, but this granular tutoring is something a teacher cannot practically provide to the 100+ students they are likely responsible for in an academic year. Check out this post by one of the book's authors, Bonnie Nieves, about the capabilities of a particular bot. In our book, Char Shryock, a former public school district Instructional Superintendent, writes about the tremendous power of bots in supporting instruction.
  3. Professors and teachers want to learn about the technology. Whether or not they support student use in the classroom, they want to learn about it so they can manage its use in a way they find appropriate, and currently they do not think they have the knowledge to do that. Many teachers and professors who have banned student use are quickly becoming aware that these bans do not work and that they are being forced to manage the fallout from students using it potentially inappropriately (there are more gray areas than clear cases of unethical use), producing a lot of frustration. This is magnified by a lack of student training on appropriate use.
  4. Professors are torn about the value of the technology. Some are embracing it out of necessity; others are embracing it because they think it’s a good tool that can improve student work and have done a lot of their own innovative work with it. Many are fighting hard against it because they still want to keep traditional tests and protect the desirability of "original" student writing.
  5. Complete bans are common at the K-12 level, creating a gap between K-12 and university education and leaving K-12 students unprepared to enter university knowing how to use the technology properly. The bans also contribute to slow to non-existent knowledge of the technologies at the K-12 level, leaving students unprepared not only for life as university students but also for entering the workforce.
  6. Schools at all levels haven’t started thinking about the importance of teaching students about the technology. There are important things students need to start learning about the technology, even if schools resist students using it: deepfakes and the risks of societal and even in-school destabilization, which are magnified in a highly armed society; the incredible dangers of personalized attachments to bots; and the societal disruptions (unemployment at a minimum) that are likely forthcoming as a result of the technologies. Privacy and discriminatory patterns in both writing and art need to be discussed, though I recognize the latter may be hard to do in more and more states. These are not reasons to ban the technology in schools. These are reasons to teach students about it, to properly integrate it, and to teach students to use it responsibly.
  7. K–12 educators, especially 7–12 educators, need to start thinking about how learning standards will change, especially English writing standards, in the world of generative AI. In other words, what skills will students need when writing with AI (as a “copilot”) as opposed to writing without it? What are the learning standards for writing with AI? Writing with AI is an employable skill; it’s not clear that one could make money as a writer without the skills to use AI in writing. Would an employer be more likely to hire someone with mediocre grades and a high school degree who doesn’t have copilot skills, or someone without a high school degree who has strong copilot skills?
  8. The gap between driven students and their peers will widen. If students are motivated to learn, they have tremendous opportunities both inside and outside of school to do so, and those opportunities will expand rapidly. There are incredibly low-cost learning opportunities all around them, and those who use them to learn, and who learn how to use generative AI tools properly in work and school, will have enormous advantages over their peers.
  9. Students who rely on public schools to provide these services, at least in the US, are going to be at an enormous disadvantage if schools do not start taking this technology seriously and allowing student use on their school-issued computers, especially since lower-skill jobs can be easily automated and the bots may be more capable than many low-skilled workers at completing many tasks.
  10. International schools appear way ahead. My sample size is small, but I had no problem engaging the heads of international schools about this technology and their plans. If I had more time, I could have included more interviews with international heads of school than are included in this book. The best I could do in the US was an anonymous interview with a teacher (which is fascinating and focuses on the difficulties teachers are having managing widespread and arguably inappropriate student use in a district where the technology has been banned from school-issued computers and devices).
  11. Foreign governments, at least the UAE and Vietnam, have public and aggressive plans to integrate GAI technologies into their school systems. I recently heard that more than 500 schools in Singapore are also working on integrations of generative AI systems.
  12. Teachers need professional learning. They need it in many areas:

(12a) They need to learn the basics of AI and how LLMs function, both to understand what these tools can do and what they cannot do. If one more person says ChatGPT is useless because it can’t produce bibliographies, my head will explode.

(12b) They need to understand the basic difference between tools, including which ones can access the web, which ones include calculators, etc.

(12c) They need guidance for how they are expected to manage student use in the classroom.

(12d) Ideally, they need guidance on how to teach students to use it productively.

(12e) They need to learn how to use these tools to strengthen their own capacities as teachers, including saving themselves enormous amounts of time generating lesson plans, materials, and other forms of assessment. I have a theory that once they learn how to use it to assist with their own work, they'll be more comfortable with students using it to help with theirs (and they'll even be able to help students do that).

(12f) They need to have their feelings heard and their opinions accounted for in decision-making related to the technology.

13. The biggest effect on education is on how teachers evaluate students, since they still rely on "original" work in the form of essays. Many teachers and professors falsely think “AI writing detection tools” are useful. And they falsely assume this problem (the ability of the tools to generate text output) is not going to get a lot worse: currently, yes, teachers who are familiar with GAI output can distinguish it from student work, but we are not far from the point where individual bots will be able to replicate students’ current writing styles and abilities. It’s also the case that many teachers are not familiar with the existing and detectable language patterns produced by tools such as ChatGPT, and thousands of students are passing off AI-generated material as their own work. This is why teachers are inevitably struggling to manage the situation.

14. Thinking is too reactive. There is an astonishing lack of appreciation for how much this technology will continue to advance and for the fact that the current GAI tools are really nothing more than prototypes designed to see how people use them. These tools will overcome many (or all) of their limitations and improve, continuing to be able to accomplish more and more of what humans can do. There is even strong reason to believe the most advanced tech these companies already possess is being held back and slowly released. Educators need to stop saying this technology is no big deal and that it won’t impact the classroom.

15. Artificial general intelligence (AGI) discussions are relevant. AGI refers to the idea that machines will eventually have the same average intelligence as humans. No one knows precisely what that means. Does it mean the intelligence of your average remote office worker (Altman)? Does it mean a machine has enough intelligence to do what most workers can do? Does it mean it can do what every remote employee can do (that would be impressive!)? Does it mean a machine can do what most students or teachers/professors can do? And no one knows exactly what ChatGPT-5 and other emerging LLM and non-LLM models will be able to do, but we know that soon these technologies will be able to do a lot more (e.g., write in our own voices rather than generic (though grammatically perfect) GPT-speak, and talk like us). WHEN that happens (perhaps at the start of the spring 2024 semester), can education (especially K-12) be any more prepared than it is now?

At a minimum, though, it makes sense to start thinking this through, as more and more highly qualified individuals think AGI, whatever your definition, will arrive by the fall 2025 semester (Shapiro). And even if you refuse to believe that (it is debatable), figuring out exactly what the criteria are for determining AGI, and then trying to figure out whether a machine meets them at any point in time, is a waste of time from the perspective of an educator, because AGI or not, these technologies will continue to grow and will be capable of doing things many humans currently do and are trained to do in school. That is going to undermine a lot of both the value of what we are teaching students and how we are teaching it.

16. What is a useful, employable skill will change radically. A lot has been written in the last few years about how students should learn how to code so that they’ll always have job security. Now we know that one of GAI’s greatest strengths is coding and that millions of coders will likely lose their jobs. Educators need to think about how to help students develop skills in a world where many of the jobs that exist now will either not exist or will only exist in limited quantities in the future.

17. The discussion about this technology in education is focused on extremes: use it everywhere or don’t use it at all. This isn’t helpful. The discussion needs to be about where, when, and how to use it. I think everyone would agree that first graders need to learn to write sentences and that second graders need to learn to write paragraphs. At the same time, it’s not clear that there is value in teaching technical writing to college students, as machines are probably better at that (or they soon will be). Until brain-computer implants are widespread or VR glasses can instantly put relevant information in front of us, we are all going to need foundational knowledge, such as very basic math. Educators need to invest time and energy into figuring out what that essential foundational knowledge is and how best to teach students to acquire it. They should think through that in their own subject areas. And they need to think about how to assess this knowledge in this new world and reduce their reliance on the essay.

18. If we continue to sit back and pretend that we don’t need to engage this technology, and engage it quickly and directly, we are in a lot of trouble. There is no way to keep it out of school buildings, as technology is porous in many ways. And attempting to isolate students from it only (further) isolates schools from the “AI World” (Bill Gates) that we are starting to live in. We can lock students in the buildings and track their every movement while they are there, but we can’t lock networked society out of schools, either physically or as part of the larger world our students are continually connected to. Students need to be educated to thrive in the world they live in, not the one we wish they (and we) lived in.

19. There is tremendous value to AI integration into schools. As we outline in the book, this includes individualized instruction and assessment, critical thinking, career readiness and support for college admissions.

20. No matter what teachers/professors and administrators think of the technology, they have to manage the growing impact it is having on education. I think that at a minimum they have a responsibility to teach students about the dangers it can present when not used properly, just as they talk with students about the harms of social media and potential addictions. Ideally, they would work on some integrations in order to properly prepare students for the AI World and help teachers lessen their own workloads.

21. AI has enormous potential to strengthen human capacity as long as humans take advantage of it properly.

22. Humans need to retain control of the technology and educators need to retain control over how it will be used in our space. We need to welcome the technology, but on terms that enable it to work to benefit us and our students. We do not exist to benefit the technology and its developers.

23. Educators need to wake up and start asking hard questions. AI is here; this is no longer a future projection. We can post all we want on social media about the limitations of these technologies, and we can debate forever about what AGI is and when it will arrive (and whether we can even know when it does), but the reality is that even if there were no improvements at all to current technologies, the number of people we will need to do any of the jobs below will massively decline in the near future, as AI can already do them as well as or better than most people.

Copy editors

Paralegals

Computer coders

Software engineers

Data analysts

Technical writers

News writers

Financial analysts

Traders

Graphic designers

Customer service agents

This list could be much longer; I just don’t think it needs to be to make the point. Yes, people who are exceptionally brilliant, talented, and able to use these technologies well in these areas will survive; those who are not brilliant, talented, and able to use the technologies well will not, because it’s inconceivable that they could do one of these jobs better than a machine, and in any company only a limited number of people are needed to leverage these technologies.

Given this truth, how will the educational system respond? Are universities really going to collect $300,000 from students, putting many of them in a lifetime of debt to train them to be coders and copy editors? Are high schools going to continue graduating students who can write 10 page reports in their own words but who don’t know how to use a copilot? Is K-16 education going to start teaching students about the social repercussions of this technology and the potential mental health downsides? What’s the next step?

Justin Suran

Writer and Tutor @ justinsuran.com

10 months

I wonder if in 20 or 25 years the merging of human and AI will be so complete that we no longer try to maintain boundaries between the two.

Howard Moskowitz

Generative AI Learner ; Federal & State Education Grant Consultant, Reviewer, Evaluator

10 months

Wow - so much information in a few pages... I have got to get the entire book!

Miriam Scott

Educator focused on integrating technology into the mainstream learning experience. Tiny fish in this expanding pond…gulp! Education | GenAi | Change Management | Business

1 year

This is a great snapshot of everything that is happening in the education & AI space right now. It is an exciting time to be a teacher. Where can I find a copy of the 650+ book?

James J. Mischler, PhD

Director, Human Research Protections Program and Chair, Institutional Review Board

1 year

Thanks very much for this! I especially like #17–the need to think about and talk about AI and its place in education. We either decide for ourselves or it will be decided for us.

Letty Rising

Montessori Entrepreneur | School Development and Leadership Consulting | International Speaker | Writer | Teacher and Parenting Coach | Trainer | Course Developer | Content Creator

1 year

Great article! I recently wrote an article on ChatGPT for Montessori teachers, and as a person who is focused on Montessori elementary education, I’ll be curious to read about the thoughts on its usage with elementary-aged students. In Montessori spaces in general there has been lots of debate on computer usage in general. Most say it doesn’t have a place in preschool settings, and many feel that upper elementary is the time to introduce computers for research and projects, so that the early elementary years can be more focused on the tangible experience of using books, paper and pencil, and of course the Montessori materials. Having started a Montessori homeschool program with a virtual component, I see that students in grades K-3 do better with tangible materials whereas students in grades 4-6 do well with manipulating the corresponding virtual materials. When it comes to AI tools, I’m guessing that the consensus will eventually land in a similar way…students around ages 9 and up having some training and access to these tools. I’m guessing there will be many conversations by fellow Montessori thought leaders before a general consensus regarding best practices is established.
