Facebook, Cambridge Analytica, and the ethical reckoning for tech

At a conference two years ago, I sat at a lunch table with one of the pioneers of web standards. We were discussing the human impacts of web technologies and how the technologies we built for information, connection, and entertainment are influencing our lives in increasingly problematic ways. At one point he looked down at the table, let out a long sigh, and said:

“I thought we were making world peace.”

The Reckoning

Over the weekend, the world responded with fury to revelations of how the data analytics firm Cambridge Analytica accessed the user profiles of millions of Facebook users and used that information to influence the political views of those users. The story of Christopher Wylie, the “Data War Whistleblower”, hit the airwaves on Saturday March 17, 2018, one day after Facebook announced it was suspending Cambridge Analytica and SCL Group from its service.

Somehow overlooked in the outrage is the fact that this story originally broke in December 2016, in the German magazine Das Magazin, and was republished by Vice’s Motherboard in January 2017. Which means that from the moment the public was told what was going on, it took Facebook a full 15 months to suspend the company from its service.

I’ll let that sink in.

This is a moment of reckoning for the tech industry. We have spent the last decades moving fast and breaking things, and now those things are breaking us. What the tech industry lacks, and what most other industries whose work directly impacts people and society have, is a practice built on a solid foundation of ethics.

The tech community needs to start putting “ought” before “can”

Last night Yonatan Zunger put together a Twitter thread outlining this situation and juxtaposing it with similar reckonings in other industries:

Allow me to be crass for a moment: For the tech industry, playtime is over. We, the people who make decisions for everyone else through our designs and digital services, need to take responsibility for the creations we put into the world. We can no longer sit back and say “it’s just design” or “I’m just writing code”. Every design decision is a decision made on behalf of the end-user, and every designer, developer, system architect, researcher, and content creator has a duty of care to that user to protect their best interests.

If we don’t establish an ethical framework for our industry, governments can and will impose restrictions on our work as is the case with the upcoming GDPR in Europe. And with good reason: The situation with Facebook and Cambridge Analytica is not unique. It is the story of technology used as designed to create outcomes its creators did not consider. Behind it stands a long line of equally troublesome stories of technology unbound by ethical guidelines laying the groundwork for future world-changing events we will read about in similarly shocking exposés.

Allow me to put together one probable example. I call it:

The Three Horsemen of the AI-Pocalypse

Part 1: Say Anything

In November 2016, Adobe demoed a new application called VoCo. It analyzes existing recordings of someone’s voice to allow the user to make that voice say anything. In an on-stage showcase (video below), an engineer demonstrated how, using the software, he could type in a sentence and play it back in the voice of actor Keegan-Michael Key:

While Adobe may never ship VoCo to market, the science behind it will no doubt be picked up and used by another software vendor at some point.

Conclusion: Technology will make it close to impossible to know if a voice recording is genuine.

Part 2: Lip-sync Nightmare

In July 2017, researchers from the University of Washington released a paper and a video demo (below) of an algorithm that can turn audio clips into a realistic lip-synced video.

Conclusion: Technology will make it close to impossible to know if the video you watch of a person talking is that person actually saying the words you see their lips forming.

Part 3: Face/Off

Over the past year, several so-called “face-swap” apps powered by neural networks have emerged, allowing users to quite literally swap the faces of people in video clips for other people’s faces. While this technology has spawned an amusing meme where people swap out the faces of actors in famous movies for that of Nicolas Cage (video below), it has also resulted in the emergence of face-swap porn (Motherboard article).

Conclusion: Technology already has made it close to impossible to know if the person in the video you are watching is the actual person in the video.

Combined, these three technologies create a scenario where all the salient parts of a video can be faked by anyone with access to a computer. And what do all these technologies have in common? They were made with the best intentions, with the aim of solving real technical problems using technology.

Building an Ethics for Tech

The problem the tech community faces today is the problem countless other professions have faced in the past: While we want to be able to do anything and everything, our decisions have real-life consequences and we need to take responsibility for them. To turn a classic philosophical problem on its head, the tech community needs to start putting “ought” before “can” and consider whether something should be built before it is built. It may be time to retire “move fast and break things” and replace it with something slower and more mindful.

In 2009, Simon Sinek famously explained how great leaders succeed by asking “why.” His three concentric circles of what, how, and why have become a staple in design and management thinking and have transformed how we approach problems in the tech industry. But something is missing from Sinek’s diagram, something Aristotle and Heidegger pointed to in their studies of the Four Causes: At the center of the diagram there should be a fourth ring asking “To what end?”

As I said earlier, every design decision is one made on behalf of the end-user. From this follows a bigger and more important realization: With every design decision, a path to the future is carved out for the end-user to follow. In other words, designers build the future with every decision they make. When we evaluate our decisions, it is not enough to ask “with what should we build it” and “how do we do that” and “why are we doing it”. We also have to ask ourselves “to what end? What world are we creating with this decision?”

That’s what an ethics for the tech industry is all about.

More to come

I have been working on this problem since 2009 and over the past two years I’ve put together a framework for a process of ethical evaluation to give tech workers the tools they need to identify ethical issues in their work and make sound decisions. That framework is summed up in my article "Using Ethics in Web Design" published by Smashing Magazine.

Alex Fryer

Senior SaaS Product Lead at Zebra Technologies

6 years ago

Shareholders aren’t too worried about how the $s “ought to be made”. If you doubt this, it’s worth looking into Herbalife’s funding support vs ethics. The issue is that regulators are too far behind. And why? Because Facebook, Google, and Twitter make it much more attractive for the smartest of us to work on their side than on the regulator’s.

Simone Scelsa

Design HSE Manager, Technical Safety

6 years ago

If you don't deploy a particular technology that is beneficial for some users, somebody else will. You cannot regulate technological development based on some ethical principles, as not all the 'actors' involved, be they manufacturers or beneficiaries, will share the same ethical framework. The only option is to develop more sophisticated countermeasures.

Aceite Las Cruces SpA

Cold-Pressed Vegetable Oil

7 years ago

Ethics and conscience are subjective concepts.

Thang Cao (He/Him)

ESL Teacher | Technologist | Veteran

7 years ago

Nice job, putting this article together. Thank you.


Interesting read. Agree, we need to take it slow if it ought to be thoughtful, ethical.
