As the Mask Drops, It's Time to Face the Politics of Tech
Morten Rand-Hendriksen
AI & Ethics & Rights & Justice | Educator | TEDx Speaker | Neurodivergent System Thinker | Dad
"Is it really?" She gestured at my hoodie and the bold text across my chest reading "Code is Political."
"Profoundly so," I answered. "When designers and developers build the apps and services, they decide what capabilities and agency other people get to have. Technology is political because it shapes how people act, what people know, and what futures we build."
I designed the hoodie to spark conversations like this. When I wear it at tech conferences, people often approach me—sometimes amused or confused, sometimes annoyed—saying, "Stop trying to make everything political! Software is neutral; it has nothing to do with politics." I respond to them as I did to my friend, adding:
"The apps and services we build shape how people find information, who they talk to, where they work, play, and relax. We measure the success of technology by how much it influences our behaviour. In just a few decades, our work has transformed society, so much so it's now impacting the outcomes of elections!"
At this point, one of three things happens: some people realize they’re not ready for this conversation and change the subject, others walk away, and many respond like my friend did: “Wow. I never thought of it that way. That’s a lot of power—and responsibility!”
"Life is political,?not because the world cares about?how you feel,?but because the world reacts to?what you do." —Timothy Snyder, On Tyranny
Technology happens when someone sees the world, has a vision of how to make it better, and builds that future through augmenting the capabilities of other people. The more successful the technology is, the more closely the future aligns with that original vision.
This raises three vital questions:
Whose visions shape our future?
Do their visions include all of us?
Are their values in line with our own?
If the last few days (and years, and decade) have taught us anything, it's that technology and the online spaces it creates directly influence us—as individuals, communities, nations, and a global society. The mask of value neutrality, worn by tech workers and billionaires as a shield against critique, is now off, leaving us to face the truth:
Technology is a political project, and we need to start treating it as such.
Social media, podcasting, ad tech, health apps, facial recognition, deepfakes, AI—all of these technologies profoundly impact our lives. Yet unlike other industries that chose to self-regulate as their influence grew, the tech industry has resisted any meaningful regulation, claiming neutrality while in the same breath warning that regulation would result in competitive disadvantage (and usually pointing at China as the most imminent threat).
It's time we do what doctors and machinists and lawyers and engineers and psychologists and every other profession that impacts the lives of people have done: Set ethical standards and practices for our work and urge our peers to join us.
If I don't clearcut this forest, someone else will.
The stories we tell shape our shared intersubjective realities. Today, those stories are broadcast on platforms controlled by billionaire oligarchs with political ambitions, on podcasts by men with ideologies that actively promote the stripping away of rights and dignity for entire groups of people, and in chat-based echo chambers where only aligned voices are heard.
Technology reflects the values and vision of its creators, and many of the technologies we rely on today manifest futures that serve those closest in power and appearance to their creators, at the cost of everyone else.
It’s easy to feel that technology is inevitable. It’s not.
Technology is a choice.
Today, those of us who work in tech have a choice to make: Do we keep perpetuating the mask of value neutrality, or do we step into our power as the means of production and take responsibility as designers of our collective future?
I choose the latter, and I urge you to join me. Here's how we get started:
As you do these things, understand them as steps to making ethics part of your practice. To make this more explicit, challenge your work using these four frames:
The Privilege of Action
Embracing the politics of technology starts with speaking up: naming your concerns, highlighting threats and possible harms, and proposing better solutions. We’ve seen what technology can do, and many leaders are now ready to listen when we raise issues.
But remember: Ought does not imply can, no matter how much it should.
For many of us, speaking out or stepping away from harmful projects is not an option. Our social safety nets have been systematically unraveled, and speaking up can mean career setbacks, safety concerns, even job loss and loss of access to essential services like healthcare. Being able to speak up is a privilege, and the burden of action falls on those who have that privilege. So as you push ahead against the forces of unregulated utopian techno-solutionism, grant yourself the same empathy, compassion, and grace you extend to those you love. Hold yourself accountable to your own values, keep true to what you believe is right, and forgive yourself when you’ve done what you can and that wasn’t enough.
We are social creatures. We shape the world by building community and society and telling stories where everyone is included. We owe it to ourselves to use hope as a catalyst as we work toward a future where everyone is valued and every person’s dignity is upheld.
Be the enemy of oppression. Keep your candle burning.
Some of the work that inspired this article, or is quoted or referenced in it:
--
Cross-posted to mor10.com
AI consultant at DIKW… on the path towards sustainable AI
1 month ago: Indeed.
CEO of Automattic; Co-founder, WordPress
2 months ago: Great post.
Designing, building, and sustaining enterprise WordPress solutions.
2 months ago: I'm intrigued and baffled by your apparent endorsement of both Rawls and Sandel on exactly the points where they are in fundamental contradiction: neutrality versus values, and whether fairness for the weakest or the strongest (or at least the majority or a definitive "community") should define the master values that can trump the rights of individuals. How do you hold both together? And how does one know one isn't one of the baddies? Is it possible (even universal) to be both victim and victimizer?
Researcher, Ph.D. Candidate at Iowa State University CBE | Aspiring Scientist, Engineer, Entrepreneur
4 months ago: Morten Rand-Hendriksen: This is brilliant. The use of AI tools for marketing or political purposes is often predatory or ill-intentioned.
AI Auditor & Evaluator | AI Governance Consultant & Trainer | Social Entrepreneur | Community Facilitator | Interdisciplinary Researcher | Public Speaker | Writer
4 months ago: Great article and great questions: Whose visions shape our future? Do their visions include all of us? Are their values in line with our own?