Higher Ed’s Fear of AI: A Convenient Excuse?
AI is here, higher education—whether your campus is ready or not. Yet instead of embracing the opportunity, many leaders are finding every possible reason to avoid it. The concerns?
Let’s break these down. Not to dismiss them—but to expose the irony of using them as a defense for keeping things exactly as they are.
1. AI Will Replace Humans
Throughout history, every major technological shift—from the industrial revolutions to the rise of the internet—has sparked the same fear: Jobs will disappear. And yet, what happens? Humans adapt. We evolve into roles that demand creativity, judgment, and complex decision-making—the things AI can’t do.
The truth? AI won’t replace humans. It will replace the repetitive, soul-draining work that clogs up campus operations.
If your job consists entirely of tasks that AI can automate, the real question is: Why is that still your job?
2. Ethical Concerns
If AI is a threat to ethics, then let’s talk about the ethical lapses already happening on campuses—ones that require zero technology.
If higher ed is so concerned about ethical risks, let’s start by addressing the ones happening in broad daylight. Better yet, why not use AI to audit and enforce ethical standards, working alongside human oversight? AI will shed light on data-driven truths.
3. AI Will Reduce Human Interaction
This argument would be funny if it weren’t so ironic. When was the last time higher ed made human interaction easy for students?
Students today struggle to find anyone who can actually help them—stuck navigating a maze of policies, departments, and dead-end emails. If anything, AI could increase human interaction by automating administrative bottlenecks, freeing up staff to focus on actual student engagement.
The real problem isn’t AI. It’s that higher ed has built a fortress of bureaucracy that keeps students at arm’s length.
4. AI Can Misinform or Hallucinate
Here’s a reality check: AI isn’t the only thing giving students bad information.
The solution? AI and humans working together—leveraging machine learning to refine responses and improve accuracy over time. Unlike humans, AI can actually be trained systematically, reducing errors instead of repeating them.
5. AI May Make Mistakes
Yes, AI can make mistakes. So can humans—constantly.
The difference? AI can be refined. It learns. It gets better. Humans, on the other hand, continue making the same mistakes, with little accountability.
AI shouldn’t operate independently—it should be a collaborative tool, integrated into teams to improve decision-making. The goal isn’t perfection; it’s reducing errors to near zero.
The Real Issue: Higher Ed Doesn’t Want to Change
This is why I shake my head when I hear these so-called “experts” throwing obstacles in front of AI. If these concerns were truly a priority, why haven’t we fixed them in human-driven processes first?
Higher ed doesn’t have an AI problem. It has a status quo problem.
It’s easier to blame AI than to admit the real issue: campuses have been running inefficiently for decades, and AI is simply exposing what was broken all along.
So, instead of fearing AI, maybe it’s time to fear staying exactly the same.
This is one of five articles I will post before Alliance'25.
If you are coming to the Alliance'25 Higher Education User Group conference in New Orleans, find me on the MainStage on Monday and Tuesday at 9:45 am, where I will be talking about Deploying AI and the path to a modern SIS. Attendance is also required to earn your Deploying AI and SIS Modernization certificate.
Lauren Wass and Christopher Cameron will be joining me on stage.
Aspires to be an Exemplar of a Purpose-Driven, Inspirational, and Compassionate (EPIC) Leader. My posts are my own and do not represent my employer.
Comments

2 weeks ago: Ouch.

Associate Vice President | Information Technology | Program & Project Management | Enterprise Resource Planning
2 weeks ago: I am so sad I will miss this opportunity to hear you live and meet you. I appreciate your perspectives and insights.

Author
2 weeks ago: Matt, once again you address the reality of higher education as we both observed it over the years. What a different world higher education might be if decision makers, faculty, and staff acted and made decisions as if their own children were the students affected and as if they were spending their own money to run their institutions. I believe they would look at what they fear more closely and creatively because they had something at stake. I have learned to become interested in what threatens me, because threats are teachers, the whetstones against which we might sharpen ourselves. The earth seemed flat until someone had the courage to question what everyone thought.

Free Agent | Strategic Growth & Transformation Leader | Driving Innovation in Higher Ed, Nonprofit & Corporate Sectors | Cloud, CRM & Digital Strategy Expert | Ex-GE, Ex-IBM, Ex-Goldman Sachs, Ex-Northwestern Univ
2 weeks ago: Change is difficult in Higher Ed. People get comfortable, and that can lead to complacency. Often I will assess the technology portfolio for an institution and find legacy systems that have been there relatively unchanged for decades. It’s difficult to deliver great experiences if your toolsets don’t innovate with you.

Head of Final Pixel Academy, PhD | MFA | FHEA | PG Cert
2 weeks ago: Some interesting insights here.