Everything is about to change. We are not prepared.
While we are drowning in headlines about AI and technology, too often we fail to confront the bigger picture. The Coming Wave is my attempt to change that. Amidst unprecedented peril and extraordinary promise, my book is both a stark warning and a hopeful guide to where society goes next.
Here is an excerpt for the #LinkedInBookClub. Pre-order here: https://lnkd.in/ebpy2rPz
The Containment Problem
Alan Turing and Gordon Moore could never have predicted, let alone altered, the rise of social media, memes, Wikipedia, or cyberattacks. Decades after their invention, the architects of the atomic bomb could no more stop a nuclear war than Henry Ford could stop a car accident. Technology exists in a complex, dynamic system (the real world), where second-, third-, and nth-order consequences ripple out unpredictably. Thomas Edison invented the phonograph so people could record their thoughts for posterity and to help the blind. He was horrified when most people just wanted to play music.
Understanding technology is, in part, about trying to understand its unintended consequences, to predict not just positive spillovers but “revenge effects.” Any technology is capable of going wrong, often in ways that directly contradict its original purpose.
As technology proliferates, more people can use it, adapt it, shape it however they like, in chains of causality beyond any individual’s comprehension. One day someone is writing equations on a blackboard or fiddling with a prototype in the garage, work seemingly irrelevant to the wider world. Within decades, it has produced existential questions for humanity. As we have built systems of increasing power, this aspect of technology has felt more and more pressing to me.
Technology’s problem here is a containment problem. If this aspect cannot be eliminated, it might be curtailed. Containment is the overarching ability to control, limit, and, if need be, close down technologies at any stage of their development or deployment. It means, in some circumstances, the ability to stop a technology from proliferating in the first place, checking the ripple of unintended consequences (both good and bad).
The more powerful a technology, the more ingrained it is in every facet of life and society. Thus, technology’s problems have a tendency to escalate in parallel with its capabilities, and so the need for containment grows more acute over time.
Does any of this get technologists off the hook? Not at all; more than anyone else it is up to us to face it. We might not be able to control the final end points of our work or its long-term effects, but that is no reason to abdicate responsibility. Decisions technologists and societies make at the source can still shape outcomes. Just because consequences are difficult to predict doesn’t mean we shouldn’t try.
Please comment with your reactions. Pre-order here: https://lnkd.in/ebpy2rPz