Revenge of the Machine
Image courtesy of John Holcroft Prints.


It’s nearly 30 years since Tenner introduced us to Revenge Effects and, with the pace of technological change, we are not lacking for examples. The rapid adoption of low-carbon technologies alone is providing many real-life cases. To me, the value of Revenge Effects is the reminder that we do not operate in a vacuum; everything we do is within a system and, like Newton’s Third Law, for every action there is an equal and opposite reaction. There is more to Tenner’s idea than perhaps first meets the eye. Below I explore the depth of his idea, providing some examples. Please do share your own examples of where you see technology’s Revenge Effects.

Revenge, a play in four parts

According to Tenner, ‘Revenge effects are ways that technologies can solve one problem while creating additional worse problems, or create new types of problems, or shift the harm elsewhere altogether’. In short, technology ‘bites back’. I love the image Tenner conjures of technology ‘biting back’, like a vengeful snake. My world of safety and environmental risk management is full of examples.

Tenner helpfully provides four categories of the Revenge Effect:

  1. Repeating effects refer to how more efficient processes cause us to do the same thing more often, so no actual time is saved. This is the paradox of efficiency: we introduce technology to save time, but the promise is not realised. Tenner gives the example of improved domestic appliances, which have raised our expectations for cleanliness, meaning we actually spend similar or often more time on housework. Some writers have argued the repeating effect is evident in healthcare: medical procedures have reduced the time we spend in hospitals and hastened recoveries, but the number of procedures and medications a patient undergoes has risen significantly. I can also think of many digital processes, both at work and in our everyday lives, that promise time savings but, when considered in the round, take longer.
  2. Recomplicating effects refer to how technology, as it advances, often increases in complexity, and this creates significant drawbacks. Just look at TV remotes (plural) for an everyday example. I visited a hotel recently where I had to download the hotel app to access the room, operate the A/C, switch the room lights, close the blinds and operate the TV. I am not known for my patience, but my cries for a manual light switch and a simple room controller fell on deaf ears when I interacted with the room chatbot. My chilling Room 101 nightmare was an example of the recomplicating effect; the forward march of technological complexity cost me 30 minutes of my life that I’ll never get back. The recomplicating effect – there’s an app for that…
  3. Regenerating effects occur when attempts to solve a problem end up creating additional risks. Safety, health and environment abound with examples, and the interesting ones involve conflicts between two or three of them. The use of DDT and similar pesticides to eliminate insects has long been known to breed resistance or to eliminate the insects’ natural predators. Similarly, while the development of antibiotics has been among the most important forces extending human lifespan in the last hundred years, their overuse has led to strains of antibiotic-resistant bacteria that are harder to treat. I’d argue Jens Rasmussen’s “Defence in Depth Fallacy” is an example of a regenerating effect: Rasmussen argued that adding safety controls and defences can have the counter-effect of making a safety system more complex, fragile and thus prone to catastrophic failure. Dare I mention how working from home is often presented as lower risk and greater benefit, whereas research suggests decentralising work from the office to the home often leads to greater captivity to work rather than greater freedom, with typically longer working hours.
  4. Rearranging effects occur when costs are transferred elsewhere, so risks shift and often worsen. The widespread use of asbestos as a fire retardant led to its use in ships, buildings and infrastructure (marketed as ‘the magic mineral’, no less), but this created a deadly health risk that workers, families and communities paid for. Similarly, the use of lithium-ion batteries is reducing carbon emissions and air pollution but leading to a sharp rise in domestic fires. And the large-scale adoption of air-conditioning units in workplaces and homes to address rising temperatures is increasing the heat load on adjacent properties and communities.


Unpacking the idea

Tenner offers a number of reasons for Revenge Effects. He argues that it isn’t technology itself at fault, but rather our tendency to “anchor it in laws, regulations, customs, and habits,” coupled with an inability to anticipate the unintended and unpredictable interactions between individual components acting as a system. The individual components of a vehicle, factory, computer, airplane or power grid are understood easily enough, but the parts combine to form systems and subsystems that behave in unpredictable ways. This is really a critique of technical rationality and determinism: the managerial faith that we can predict and control everything through rational, logical and empirical means. There is an interesting alignment here with James C. Scott’s critique of ‘Authoritarian High Modernism’, Charles Perrow’s ‘complex, tightly-coupled systems’ and, latterly, John Downer’s notion of ‘Rational Accidents’. In his seminal work ‘Seeing Like a State’, James C. Scott puts it well:

“A recurrent theme of Western philosophy and science, including social science, has been the attempt to reformulate systems of knowledge in order to bracket uncertainty and thereby permit the kind of logical deductive rigour possessed by Euclidean geometry”

Tenner advises that if we learn from revenge effects, we will not be led to renounce technology but to refine it: watching for unforeseen problems, managing what we know are limited strengths, and applying no less but also no more than is really needed.

The Revenge Effect is not new. In the great DisasterCast podcast series, Drew Rae reveals how the introduction of the Safety Lamp in the early 1800s, invented to provide lighting without igniting a methane explosion, initially led to an increase in mine explosions and fatalities. Believing that the Safety Lamp had addressed the risk of pit explosions, owners reopened mines that had been closed over concerns of excessive danger. Over-confident in the technical solution, mine owners also encouraged faster and less cautious mining, but the newly adopted Safety Lamps were not fail-safe and could, if damaged, spark an explosion. It is noted that the Felling Colliery explosion (1812, 92 fatalities) and the St Hilda Colliery disaster (1839, 51 fatalities) both occurred after the introduction of the Safety Lamp. The revenge effect, then, was over-confidence in the technology, which increased disasters until the Safety Lamp was improved. Refer to the link to the podcast below.

The Last Word

My overall impression of Tenner’s idea is that as we complicate the systems which govern our lives, revenge effects multiply. This is not some Luddite anti-technology argument: new technologies certainly do improve the quality of our lives. But we need to recognise that more technology often increases complexity, and this demands more, not less, human work and vigilance. More technology can also make a system more fragile. Complex technologies that interact with our human systems always produce revenge effects. Like a game of ‘Whac-a-Mole’, for every acute problem solved, a chronic problem quickly replaces it. When operating in a complex sociotechnical system, we can only foresee and predict so far.

So what is your experience?

  • Where have you seen examples of the revenge effects of technology?
  • Perhaps you introduced a game-changing process, chemical or technology that had huge promise, only to discover it had introduced unintended consequences elsewhere within the system.
  • Maybe you have launched a new environmental technology to reduce carbon, only to find the benefits were offset by significant unforeseen issues or hazards?

I’m always interested to hear examples, so please share where you’ve experienced technology ‘biting back’.

References and further reading

  • Edward Tenner, ‘Why Things Bite Back: Technology and the Revenge of Unintended Consequences’. New York: Alfred A. Knopf, 1996.

  • DisasterCast Safety Podcast, Episode 3 – Risk Acceptance and Coal Mine Disasters. https://www.listennotes.com/podcasts/disastercast/episode-3-risk-acceptance-p_0WZzNtna2/

  • A TED Talk with Edward Tenner on his ideas. https://www.ted.com/talks/edward_tenner_unintended_consequences?subtitle=en




Maybe risk is like the game of whack-a-mole – it never really disappears, but pops up elsewhere in the system.


Minal M.

Sr. Policy Analyst: Systems Thinking and Wellbeing || blending Art and Science

5 months

Sounds a lot like the Rebound Effect in industrial ecology: solving one visible problem creates new hidden effects that aren’t often measured. It looks good on the surface but ends up compounding the mess. https://www.sciencedirect.com/science/article/pii/S2666784321000267

Phil Strong

Managing Director at Ergo Ike Ltd (home of Phil-e-Slide range of products)

6 months

James Pomeroy, “rebound effect” perchance, by any other name?

Mario S.

HSEQ Compliance Professional

6 months

Thanks James. I think the effect of social media on the minds of society is one example of technology playing out in real time. It was supposed to connect people, but people are now lonelier than ever, narcissistic and addicted to likes. Who knows what impact this will have on society in the future.

Patrik Lund

Social and environmental risk mitigation

6 months

James, interesting concept! I disagree with the use of the word ‘revenge’, as this implies the technology has some kind of ‘awareness’, whereas the truth is that all the limited upsides with delayed and more significant downsides are completely caused by our own actions and lack of understanding. The subcategories are much better, especially the first two: repeating and recomplicating effects. The third and fourth seem the same, or at least very similar. In response to your request for examples, your article highlighted the problems caused by smartphones. As more and more service providers assume everyone has a smartphone, they begin to design their services on the basis of this assumption; however, the results of these ‘design improvements’ are frequently focused on increasing the convenience and profitability of the business instead of the convenience of the consumer / client. Your hotel example highlights this perfectly*. (continued in reply)

Agree James Pomeroy, healthcare is awash with bite-backs. The relentless push to shorten the length of stay in hospitals triggers both a Repeating effect and a Rearranging effect. The patients in hospital are on the whole more acute, and the hospital frequently hands the patient back to a primary care sector that is not resourced for post-acute care. And the Recomplicating effect means that technology has allowed medicine to attempt ‘good’ in ways that were once unthinkable: surgery in the 100-year-old-plus cohort at one extreme, and survival of 24-week-gestation foetuses at the other. Society wants medicine to be heroic, and much of medicine willingly puts its hand up for that hero role – it is undoubtedly a rewarding one when all goes well. And all you need is the n=1 survival story to set a new consumer expectation or clinician aspiration. From that point, the decision not to take the system right to the boundary of feasibility and safety is somehow immoral or cowardly (regardless of how that decision impacts the system elsewhere).
