Many Schools Believe They’re Teaching AI Literacy. Unfortunately, They’re Not.

As I shared in the latest episode of Safe and Innovative Schools (https://youtu.be/oykY8eowomw):

Schools and districts across the United States—and globally—are working hard to integrate AI into classrooms and their overall operations.

This is fantastic progress, but I’m also seeing a critical misunderstanding play out over and over.

Many educational organizations believe they’re teaching AI Literacy when, in reality, they’re only providing AI Awareness.

And while awareness is a great starting point, understanding the difference between AI Awareness and AI Literacy is crucial.

The Goal of AI Literacy

AI literacy requires that educational leaders fully prepare students, teachers, and staff to:

  • Learn From, About, and With AI
  • Understand how to Work with AI, including machine learning, automations, generative AI, and agentic AI
  • Critically evaluate AI outputs to ensure accuracy, fairness, and ethical use

This graphic highlights the gap between AI awareness and AI literacy, showing where many schools and districts are currently falling short.



AI Awareness vs. AI Literacy

Many schools and districts are stopping at awareness, and as the graphic shows:

  • AI Awareness = Knowing AI exists, recognizing AI-generated content, and using some generative tools like OpenAI ChatGPT or Microsoft Copilot.
  • AI Literacy = Understanding how AI works, questioning its outputs, identifying biases, and applying ethical frameworks to ensure responsible use.

Teaching basic AI Awareness concepts and believing that you’re teaching full AI Literacy is like only teaching the alphabet and then believing that your students can now write an essay.

Put simply, it’s not enough.


The Risks of Getting This Wrong

Mistaking AI Awareness for AI Literacy leaves schools, educators, and students vulnerable in critical ways, particularly in instruction, school operations, and decision-making:

Instructional Risks

  • Over-Reliance on AI – Some educators may lean too heavily on AI for grading, lesson planning, and content generation, without critically reviewing outputs.
  • Loss of Critical Thinking and Creativity – If students use AI only for quick answers rather than engaging deeply with concepts, they may struggle with problem-solving and higher-order thinking skills.
  • Gaps in AI Access and Readiness – Not all students have the same exposure to AI tools, especially paid ones. Schools that fail to teach AI Literacy and provide required AI tools risk creating disparities, where some students graduate AI-proficient while others lack even basic AI fluency.

District and School Operations Risks

  • Poor Decision-Making Based on AI Outputs – Many schools and districts are using AI for predictive analytics in areas like student performance, behavior monitoring, and resource allocation. But if the underlying data and models are flawed, the resulting predictions are unreliable and can drive bad decisions.
  • Unintended Consequences of Automation – AI-driven automation is being used for hiring, student discipline, and operational logistics. But again, if not built and used properly, the impacts can be disastrous.
  • Lack of AI Governance and Policy Development – Many schools and districts are integrating AI tools without clear guidelines for ethical use, governance, or transparency. We need to go slow to go fast here.

Privacy and Data Risks

  • Some schools and districts are adopting AI tools without fully understanding how student and staff data is collected, stored, and used.
  • Laws like FERPA, COPPA, CIPA, and even HIPAA set clear regulations, but many AI-driven tools operate in gray, or blatantly improper areas, exposing schools to compliance risks and potential data breaches.

AI Bias and Misinformation

  • AI isn’t neutral, and it is not possible to remove all bias—its outputs reflect the biases in its training data.
  • If students and educators don’t know how to identify and question bias, they risk amplifying misinformation rather than challenging it.

Cybersecurity Threats

  • AI is being used to power more sophisticated phishing scams, deepfakes, and misinformation campaigns.
  • Without AI Literacy, schools are unprepared to defend against these evolving cyber threats, leaving students and staff more vulnerable to deception.


What Real AI Literacy Looks Like

If we want students, teachers, and staff to be truly AI-literate, we must go beyond surface-level awareness and focus on understanding, evaluating, and applying AI critically:

  • Knowing How AI Works – AI isn’t magic—it’s technology built on neural networks, algorithms, training data, and approaches like machine learning and deep learning. Students, teachers, and staff should learn how AI systems process data and make decisions, and why they sometimes fail.
  • Understanding AI Model Training and Bias – Again, it is not possible to remove all bias in AI systems. AI reflects the data it is trained on, and biased data leads to biased outputs. Schools should teach students, teachers, and staff where biases come from and how to recognize and mitigate them.
  • Using AI as a Thought Partner – Students, teachers, and staff should go beyond basic prompting and interact critically with AI—refining outputs, analyzing responses, and recognizing where AI lacks nuance or understanding.
  • Building AI-Driven Critical Thinking Skills – AI isn't always right. Students, teachers, and staff must verify AI-generated content, identify hallucinations, and fact-check using multiple sources.
  • Applying AI in Real-World Scenarios – AI Literacy isn’t just about understanding AI—it’s about using it responsibly. Students, teachers, and staff should evaluate AI-powered tools for reliability, security, and ethical implications before trusting their outputs.
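To make the bias point above concrete, here is a minimal, hypothetical Python sketch (the roles, pronouns, and skew are invented for illustration): a toy "model" that simply predicts whatever it saw most often in its training data. Because the training examples are unbalanced, its predictions inherit that skew—exactly the kind of behavior students and staff should learn to recognize and question.

```python
# Toy illustration: a "model" that only knows its training data.
# The data below is deliberately skewed; the predictions reflect
# that skew, not reality.
from collections import Counter

# Hypothetical, intentionally unbalanced training examples
training_data = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "he"),
    ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "she"),
    ("nurse", "he"),
]

def predict_pronoun(role: str) -> str:
    """Return the pronoun most frequently paired with `role` in training."""
    counts = Counter(p for r, p in training_data if r == role)
    return counts.most_common(1)[0][0]

print(predict_pronoun("doctor"))  # "he"  -- the skew talking, not the truth
print(predict_pronoun("nurse"))   # "she"
```

Real AI systems are vastly more complex, but the principle is the same: biased inputs produce biased outputs, which is why AI Literacy must include examining where training data comes from.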


Final Thought: Schools and Districts Need to Step Up

AI is already reshaping education, whether we’re ready or not.

But right now, too many schools and districts are stuck in AI Awareness, believing they have built AI Literacy—and that’s going to be a problem.

We need real AI Literacy programs that prepare students, teachers, and staff not just to use AI, but to think critically about it, challenge it, and work with it effectively.

How do we get there? What’s missing in AI education today? Let’s discuss.

#whateverITtakes #AIinEducation #ArtificialIntelligence #AILiteracy #EdTech #FutureofEducation #ResponsibleAI #DigitalLiteracy

Jaime Alberti, MPA

Safety, Security and Emergency Preparedness Executive | Law Enforcement Officer

4 days ago

This is a great insight. Your point about AI-driven automation in hiring is spot on. Schools and districts must be careful not to rely solely on AI algorithms, which can miss human qualities that a resume scan cannot capture. Automating the hiring process without human oversight risks overlooking traits like passion, adaptability, and a "can do" attitude—all qualities we want in our workforce.

Kevin LaBranche

Safety Security and Facilities Manager

1 week ago

Although AI will evolve to levels we can only imagine at this time, I like how you caution against using AI-generated data without human thought, understanding, and engagement to ensure accuracy, reliability, and compliance.

Victor Rivero

Editor-in-Chief at EdTech Digest

1 week ago

great distinction

Chiwon Song

Realism Programmer

1 week ago

Good point!

Bibimariyam Dange

Internet marketing analyst at AI CERTS | Digital marketing | PGDM |

1 week ago

Dr. Phillips, your insights on AI literacy are pivotal for our education system. I thought you might be interested in AI + Educator related events. Join AI CERTs for a free webinar on "AI + Educator Demo Session – Transforming Teaching with AI" on Feb 26, 2025. Anyone interested in this event can register at: https://bit.ly/m-ai-educator and will receive a participation certification. Your leadership in this area is inspiring.

