Can AI cause Tech Trauma? – Part One

Welcome to data uncollected, a newsletter designed to enable nonprofits to listen, think, reflect, and talk about data we missed and are yet to collect. In this newsletter, we will talk about everything raw data is capable of – from simple strategies for building equity into research+analytics processes to how we can build a better community through purpose-driven analysis.

I have found myself more angry, more untrusting of the world lately.

Sure, the ongoing wars, elections, hurricanes, inflation, and general injustices against humanity have an effect, but this seems deeper.

The AI deepfakes that spark riots, widespread false news, dangerously polarized journalism, tech companies continuing to disregard public safety in digital life, inauthenticity in favor of capitalism, and new Gen-AI models seemingly taking over the world every day… yes, I think all of that is what's making me feel this nonstop tiredness, mistrust, and anger.

So, today, I want to talk to you about something I still haven't found the language for yet.

I want to talk to you about something called tech trauma.

Over the years, technology has transformed nearly every aspect of our lives, from how we communicate and work to how we socialize, shop, learn, and earn. Artificial Intelligence (AI) has amplified this shift, offering unprecedented efficiency, personalization, and innovation. Note that flavors of AI existed long before Gen-AI tools like ChatGPT. But as technology grows more sophisticated, I wonder if you, and many of us, are experiencing a hidden, growing crisis called tech trauma.

This subtle form of distress arises from mistrust, inauthenticity, and a deepening unease about how AI and other technologies affect our lives.


Let us explore what tech trauma is.

I express this "tech trauma" as the emotional and psychological strain caused by the unstructured (and sometimes unethical) use of black-boxed AI, and by the mistrust and uncertainty that often accompany it. Unlike traditional trauma, which stems from discrete, often physical, experiences, tech trauma is more insidious and can accumulate over time. It manifests in feelings of anxiety, overwhelm, disconnection, and a loss of control in a world increasingly dominated by algorithms and machine-driven systems.

At its core, tech trauma results from the way technology—say, AI—creates a sense of inauthenticity in our relationships, decision-making, and even our understanding of ourselves. This strain can come in many forms, such as:

  1. Mistrust: As AI becomes more integrated into daily life, many people feel uneasy about its capabilities and the lack of transparency around how these systems work. High-profile failures, such as biased AI algorithms or breaches of privacy, fuel this mistrust.
  2. Inauthentic Interactions: The rise of AI-driven interactions, from chatbots to recommendation engines, has shifted the way we communicate with brands, institutions, and each other. While AI can offer efficiency, it often lacks the authenticity of human connection, leading to feelings of disconnection.
  3. Data Exploitation: Many people feel powerless in the face of data harvesting and surveillance. Knowing that personal data is constantly being collected and used for purposes beyond their control erodes the sense of privacy and security, contributing to tech trauma.
  4. Moral and Ethical Dilemmas: AI and automation raise difficult questions about fairness, bias, and accountability. As AI makes decisions that impact our lives, from loan approvals to job screening, we may experience frustration or a sense of injustice, especially when these systems are opaque.


The problem is, if we leave this unaddressed, tech trauma can contribute to long-term emotional harm. For some, tech trauma is subtle, manifesting as low-grade anxiety or a reluctance to engage with new technologies. For others, it can lead to a complete withdrawal from tech-driven environments or a mistrust of the institutions that adopt these tools.


Let's look a bit more deeply at why this is happening.

One of the most pressing issues fueling tech trauma is the growing mistrust around AI and related technologies. Many AI systems operate behind a veil of complexity that most people outside data science cannot penetrate. When an algorithm decides which news stories you see, whether you qualify for a service, or which ads target you, the decision-making process is often invisible. People are left wondering: How was this decision made? Why was I excluded? Or why was I included?

This lack of transparency, compounded by reports of AI bias, can cause distrust. Additionally, even well-meaning AI systems can perpetuate harmful stereotypes and reinforce social inequalities when their algorithms rely on flawed or biased data. For example, facial recognition software has been shown to misidentify people of color at higher rates, leading to concerns about racial bias in access or denial of services.
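To make the "flawed or biased data" point concrete, here is a deliberately toy Python sketch (not any real system's code, and all numbers are invented): a score threshold tuned on a well-represented majority group yields a much lower acceptance rate for an under-modeled minority group, even though nothing in the code ever mentions group identity.

```python
import random

random.seed(42)  # deterministic toy data

# Hypothetical match scores for genuine users in two groups. The
# "minority" group's scores skew lower because the imagined model
# saw far less training data for that group.
majority_scores = [random.gauss(0.70, 0.10) for _ in range(1000)]
minority_scores = [random.gauss(0.55, 0.10) for _ in range(1000)]

THRESHOLD = 0.60  # cutoff that happens to work well for the majority group

def acceptance_rate(scores, threshold):
    """Fraction of genuine users the system accepts at this threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

maj_rate = acceptance_rate(majority_scores, THRESHOLD)
min_rate = acceptance_rate(minority_scores, THRESHOLD)
# One "neutral" threshold, two very different outcomes by group.
```

Notice that the code never references race or identity at all; the disparity comes entirely from whose data shaped the scores, which is exactly why opaque systems can discriminate without any explicit rule saying so.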

Naturally, one can feel powerless in the face of technologies they don't understand, which seem to have an outsized impact on their lives.

Another critical component of tech trauma is the feeling of inauthenticity brought on by our interactions with AI. In recent years, AI has become deeply embedded in everyday life, from social media algorithms curating our news feeds to virtual assistants like Siri and Alexa anticipating our needs. While these tools offer convenience, they often come at the cost of real human interactions.

Take online social platforms, for instance. AI algorithms decide what content we see, who we engage with, and how our online communities are shaped. This hyper-personalization can create an echo chamber where we are constantly fed information that aligns with our beliefs, reinforcing biases and limiting exposure to diverse perspectives. Over time, these filtered experiences, without intentionality, risk eroding our sense of authenticity as we lose touch with the broader, likely more complex realities of the world.
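A tiny sketch can show how quickly "more of the same" narrows a feed. This is a hypothetical toy recommender, not how any real platform works: it simply serves whichever topic the user has clicked most, and within a few rounds the feed collapses to a single topic.

```python
from collections import Counter

TOPICS = ["politics", "sports", "science", "arts"]

def recommend(click_history):
    """Naive engagement-maximizing rule: serve the most-clicked topic."""
    if not click_history:
        return TOPICS[0]  # cold start: arbitrary default
    return Counter(click_history).most_common(1)[0][0]

feed = ["science"]  # a single early click on a science story...
for _ in range(5):
    feed.append(recommend(feed))  # ...and the loop feeds on itself

# feed is now six "science" items in a row: a one-topic echo chamber.
```

The design choice to optimize only for past engagement, with no diversity term, is what produces the echo chamber; the narrowing is a property of the objective, not of any one user's behavior.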

Recommendation engines suggest what to watch, read, and buy, often leaving us feeling like passive participants in our own lives. As AI takes on more roles traditionally filled by humans, from customer service to caregiving, we may question whether our interactions are real or artificial.

This is dangerous: this growing sense of inauthenticity can leave people feeling alienated, disconnected, and emotionally adrift in a world increasingly mediated by algorithms. The more our interactions are filtered through AI, the harder it becomes to distinguish between genuine human connection and automated responses, leading to a sense of emotional dissonance.


The question is – can we try to heal from this oncoming tech trauma?

With all the optimism I know, I believe we can. We just have to.

But let us explore that in the next edition.

***********************


So, what do I want from you today (my readers)?

  • Share with us: what comes to your mind when you hear "tech trauma"? Are there any parts of this piece that resonate with you?

Lucy Ruiz

Helping Non Profits achieve OPERATIONAL EXCELLENCE| Personal Mantra: Little steps make a difference #movetheballforward

Saving this post to read with a good cup of coffee. This article will require my undivided attention. Thx

Tasha Van Vlack

Helping Communities Create Meaningful Engagement | CEO @ Community Hives/The Nonprofit Hive, Chief Engagement Officer @ Ember2Action

“At its core, tech trauma results from the way technology—say, AI—creates a sense of inauthenticity in our relationships, decision-making, and even our understanding of ourselves.” It’s so interesting Meena that at the same time we were still all in some state of recovery from Covid (or at least wishing it was time to recover from Covid) AI came on the scene. We were at the depths of our disconnect and looking deeply for a way to come back to community and connection after hiding out in our houses to protect each other. Grappling with what it means to trust technology at a time when we are so disconnected from each other, seems like a lot of psychological burden to put on humanity. Amazing newsletter as usual, my friend.

Michelle Shireen Muri

Co-Founder of Community-Centric Fundraising / Public Speaker / Fundraising Consultant, Facilitator and Coach / Host of The Ethical Rainmaker podcast

I love you so much for leading the global conversation around all the aspects of AI we should be wary of. Thank you Meenakshi (Meena) Das for sharing with us.
