As the Mask Drops, It's Time to Face the Politics of Tech

"Is it really?" She gestured at my hoodie and the bold text across my chest reading "Code is Political."

"Profoundly so," I answered. "When designers and developers build the apps and services, they decide what capabilities and agency other people get to have. Technology is political because it shapes how people act, what people know, and what futures we build."

I designed the hoodie to spark conversations like this. When I wear it at tech conferences, people often approach me—sometimes amused or confused, sometimes annoyed—saying, "Stop trying to make everything political! Software is neutral; it has nothing to do with politics." I respond to them as I did to my friend, adding:

"The apps and services we build shape how people find information, who they talk to, where they work, play, and relax. We measure the success of technology by how much it influences our behaviour. In just a few decades, our work has transformed society, so much so it's now impacting the outcomes of elections!"

At this point, one of three things happens: some people realize they’re not ready for this conversation and change the subject, others walk away, and many respond like my friend did: “Wow. I never thought of it that way. That’s a lot of power—and responsibility!”

"Life is political,?not because the world cares about?how you feel,?but because the world reacts to?what you do." —Timothy Snyder, On Tyranny

Technology happens when someone sees the world, has a vision of how to make it better, and builds that future through augmenting the capabilities of other people. The more successful the technology is, the more closely the future aligns with that original vision.

This raises three vital questions:

  1. Whose visions shape our future?
  2. Do their visions include all of us?
  3. Are their values in line with our own?

If the last few days (and years, and decade) have taught us anything, it's that technology and the online spaces it creates directly influence us—as individuals, communities, nations, and a global society. The mask of value neutrality, worn by tech workers and billionaires as a shield against critique, is now off, leaving us to face the truth:

Technology is a political project, and we need to start treating it as such.

Social media, podcasting, ad tech, health apps, facial recognition, deepfakes, AI—all of these technologies profoundly impact our lives. Yet unlike other industries that chose to self-regulate as their influence grew, the tech industry has resisted any meaningful regulation, claiming neutrality while in the same breath warning that regulation would result in competitive disadvantage (and usually pointing at China as the most imminent threat).

It's time we do what doctors and machinists and lawyers and engineers and psychologists and every other profession that impacts the lives of people have done: Set ethical standards and practices for our work and urge our peers to join us.

If I don't clearcut this forest, someone else will.

The stories we tell shape our shared intersubjective realities. Today, those stories are broadcast on platforms controlled by billionaire oligarchs with political ambitions, on podcasts by men with ideologies that actively promote the stripping away of rights and dignity for entire groups of people, and in chat-based echo chambers where only aligned voices are heard.

Technology reflects the values and vision of its creators, and many of the technologies we rely on today manifest futures that serve those closest in power and appearance to their creators, at the cost of everyone else.

It’s easy to feel that technology is inevitable. It’s not.

Technology is a choice.

Today, those of us who work in tech have a choice to make: Do we keep perpetuating the mask of value neutrality, or do we step into our power as the means of production and take responsibility as designers of our collective future?

I choose the latter, and I urge you to join me. Here's how we get started:

  1. Build and use tech that aligns with your values. Invest your time and work in building futures you believe in, and protect your values as you do so.
  2. Make Privacy by Design the standard. Privacy is a fundamental right and an essential protection against personal, corporate, and government overreach.
  3. Work under a Veil of Ignorance. Tech workers are the privileged one-percenters of technology. To anchor your work in the real world of your users, assume you may become the one most disadvantaged by your own work, the one whose data will be used against them by a corporation or an authoritarian government. Design and build solutions that protect everyone's rights and dignity.
  4. Listen to people when they tell you their truth. When your work creates conflict, harm, or injustice, your users will feel it before you do. So talk to them, and inform your empathy by listening and learning and accepting their truth.
  5. Approach your power with humility. In the words of philosopher Michael J. Sandel, meritocratic hubris is "the tendency of winners to inhale too deeply of their own success, to forget the luck and good fortune that helped them on their way."

As you do these things, understand them as steps to making ethics part of your practice. To make this more explicit, challenge your work using these four frames:

The Privilege of Action

Embracing the politics of technology starts with speaking up: naming your concerns, highlighting threats and possible harms, and proposing better solutions. We’ve seen what technology can do, and many leaders are now ready to listen when we raise issues.

But remember: Ought does not imply can, no matter how much it should.

For many of us, speaking out or stepping away from harmful projects is not an option. Our social safety nets have been systematically unraveled, and speaking up can mean career setbacks, safety concerns, even job loss and loss of access to essential services like healthcare. Being able to speak up is a privilege, and the burden of action falls on those who have that privilege. So as you push ahead against the forces of unregulated utopian techno-solutionism, grant yourself the same empathy, compassion, and grace you extend to those you love. Hold yourself accountable to your own values, keep true to what you believe is right, and forgive yourself when you’ve done what you can and that wasn’t enough.

We are social creatures. We shape the world by building community and society and telling stories where everyone is included. We owe it to ourselves to use hope as a catalyst as we work toward a future where everyone is valued and every person’s dignity is upheld.

Be the enemy of oppression. Keep your candle burning.


Some of the work that inspired this article, or was quoted or referenced in it:

--

Cross-posted to mor10.com

Bas Beekmans, PhD

AI consultant at DIKW… on the path towards sustainable AI

1 month ago

Indeed.

Matt Mullenweg

CEO of Automattic; Co-founder, WordPress

2 months ago

Great post.

Daniel Knauss

Designing, building, and sustaining enterprise WordPress solutions.

2 months ago

I'm intrigued and baffled by your apparent endorsement of both Rawls and Sandel on exactly the points where they are in fundamental contradiction: neutrality/values and fairness for the weakest versus the strongest (or at least the majority or definitive "community") should define the master values that can trump the rights of individuals. How do you hold both together? And how does one know one isn't one of the baddies? Is it possible (even universal) to be both victim and victimizer?

Saad Aftab

Researcher, Ph.D. Candidate at Iowa State University CBE | Aspiring Scientist, Engineer, Entrepreneur

4 months ago

Morten Rand-Hendriksen: This is brilliant. The use of AI tools for marketing or political purposes is often predatory or ill-intentioned.

Ayşegül Güzel

AI Auditor & Evaluator | AI Governance Consultant & Trainer | Social Entrepreneur | Community Facilitator | Interdisciplinary Researcher | Public Speaker | Writer

4 months ago

Great article and great questions: Whose visions shape our future? Do their visions include all of us? Are their values in line with our own?
