The Road to Hell is Paved With Good IOT Intentions

My previous article, framing the state of the world as BANI, was about how the world intuitively feels, and how that intuition can be correct.

Here, I will touch on a point where intuition is doing us a massive disservice.


5G! IoT! 4th Industrial Revolution! Society 5.0!

For a while now, the tech industry has waxed lyrical about the Internet of Things. The hype around 5G is also reaching deafening volumes, but when you dig deeper, it, too, turns out to be mostly IoT hype in disguise.

Connecting everything, as IoT promises to do, often seems intuitively like a great idea. “Everything that can be connected will be connected,” proclaim vendors and operators alike, presumably intending to arouse awe, inspiration and expectations of a great future.

Instead of awe and inspiration, I’m concerned and anxious at the prospect of indiscriminately connecting everything.

Why? There are many reasons, but I want to address just one now: the systemic risks created by this trend.


Welcome to the Danger Zone

Let me introduce you to Perrow’s Matrix – a method of thinking about systems developed by Charles Perrow some decades ago. It is a simple way of mapping systems of all kinds according to their degree of coupling (loosely vs tightly coupled) and complexity (from linear to complex).

This is roughly what the world looked like 30 years ago:

What’s with the yellow quadrant? That is what is known as the “Danger Zone”, so called because complex and tightly coupled systems are inherently dangerous.

Think about it: when you have a complex system that is loosely coupled, stuff can go wrong, but it doesn’t result in a cascading failure, because of that loose coupling – there is slack, and usually time, to notice and recover from errors or mishaps.

On the other hand, when things are tightly coupled but linear, they are easier to model, easier to understand, and easier to keep under control – and any failures are far more predictable.

It’s when a system is both complex and tightly coupled that it becomes risky; you can’t know what will go wrong – the only guarantee you have is that something will, and often in an unpredictable manner. These are the kinds of systems where a dam sensor malfunction can set off a chain of events that leads to flooding, or a misbehaving computer system leads to jailing the innocent (both cases have actually happened).
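To make the two axes concrete, here is a minimal sketch of Perrow’s classification – my own illustration, not from Perrow or this article. The numeric thresholds and the example system placements are assumptions for illustration only:

```python
# A minimal sketch of Perrow's two-axis classification.
# Thresholds and example placements are illustrative assumptions.

def perrow_quadrant(coupling: float, complexity: float) -> str:
    """Place a system in Perrow's Matrix.

    coupling:   0.0 (loose)  .. 1.0 (tight)
    complexity: 0.0 (linear) .. 1.0 (complex)
    """
    tight = coupling >= 0.5
    is_complex = complexity >= 0.5
    if tight and is_complex:
        return "danger zone: tight + complex"
    if tight:
        return "tight + linear: failures are predictable"
    if is_complex:
        return "loose + complex: slack to notice and recover"
    return "loose + linear: benign"

# Illustrative placements only; reasonable people will disagree.
for name, (c, x) in {
    "postal service": (0.2, 0.3),
    "assembly line": (0.8, 0.2),
    "university": (0.3, 0.8),
    "nuclear plant": (0.9, 0.9),
}.items():
    print(f"{name:15s} -> {perrow_quadrant(c, x)}")
```

The point of the sketch is that the quadrant a system lands in is a property of its architecture, not of any single component – which is why adding connectivity can quietly move a system rightwards and upwards into the danger zone.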

Now, we didn’t have too many systems in the Danger Zone 30 years ago.

Today, however, the picture is different as the danger zone is getting crowded:

Automation and connecting everything to everything, often in the name of efficiency, is making systems both tightly coupled and complex.

Much of the ICT industry is about driving systems into the danger zone.

Oops.


Caveats Galore

A few things need to be pointed out:

  • Nobody is doing this consciously or out of malice. It’s just an emergent property of the system; an unintended consequence that is often not understood.
  • While some systems can be architected differently and brought out of the danger zone, I am not advocating that we have no systems there at all; obviously that is not practical.
  • I have worked for decades in the IT and communications industries, so why am I trying to ruin the party? I’m not; quite the contrary – I’m trying to do my bit to save it. I have no time for the view that one is not allowed to talk about the difficult and uncomfortable aspects of one’s own industry, so if you want to complain about that, save it. History shows that it is ONLY by acknowledging unintended consequences that we can work to avoid them and drive positive change.


Managing the Risk

While systems in the danger zone are inherently, well, dangerous, we can learn to successfully live there. Many industries already do, in fact: nuclear power and aviation, for example, have shown that successful, and by all reasonable metrics remarkably safe, operation is possible in the danger zone. It just took them a while to get there.

Much of the risk for other systems stems from the fact that they didn’t use to be in the danger zone but are now rapidly being taken there. It is new and unknown territory for them, and we regularly see examples of what happens when one is unprepared. The experience, processes, methodologies – even the culture – necessary to successfully navigate the danger zone do not exist in most industries and organisations.

I agree with Bruce Schneier, who urgently calls for more regulation of IoT security because the cost of getting it wrong is becoming too high. The industry needs a security intervention.

But the industry needs much more than that.


Learning to Fly

The good news is that we don’t have to re-invent all that much; we just need to look outside our own patch. In learning to live with systems in the danger zone, we can learn a lot from aviation. Aviation in all its aspects has always fascinated me; early in my tech career, I came this close to becoming a pilot instead, and the interest never went away.

I have spent the past few months studying the safety and resilience aspects of aviation in some depth; from CRM (Crew Resource Management; not the one it usually stands for among tech companies) to Just Culture (not ‘just’ in the ‘only’ sense) to institutionalising learning from failure (we talk the talk on failure, but don’t have anything like NASA's ASRS etc. to actually make the learning effective and industry-wide) and many other aspects. The ICT industry can, and should, adopt and adapt many of these effective methods and approaches and do so broadly.

Instead of just indiscriminately connecting everything that can be connected, we need to learn to fly those systems into the danger zone in a controlled fashion – and where better to learn that from than the organisations that already do?

While we’re at it, we also need to take on a stewardship role and guide our customers there. They are often even less informed about the systemic risks.

It’s not a “nice to have” either; we need to do this.

The cost of getting it wrong is already too high, and is increasing.


Neal Naimer PhD

Technology, BizDev & Policy

4y

While it’s critical to mitigate any perceived risks, systems/institutions that don’t become integrated will be left behind in the dust. Same arguments apply for AI/ML/DL.

Martin Ellis

A CV writer who gives you your voice - Not mine - Since 2011. Ex Candidate, Ex Hiring Manager, Ex Headhunter, Ex Recruiter

5y

"Perrow's Matrix" - I'm using that. Other people will think I'm brighter than I am - and will think I'm even brighter when they discover I can explain it.

Michael Gogos

Senior Product Manager

5y

Great read. I'm looking at this through my own filtered lens of data protection/disaster recovery/business continuity. It's a problem we've (mostly) solved in data center IT, and are solving for cloud. But IoT is just not on the radar. And what you've focused on here is a big part of the reason why...


Thanks for this insightful post. It re-confirms that decoupling is key for successful systems design.

Tuija N.

GROWTH ACCELERATOR | VIRTUAL EXPERIENCES | AI and WEB3 | DIGITAL TWINS | DATA ANALYTICS | BLOCKCHAIN | SUSTAINABILITY

5y

Excellent article. We as humans must put a lot of serious thinking into enabling this new world – 5G connectivity of everything (IoT), AI and ethics. I recently started updating my risk management studies for this reason: how do we manage the transformation and the related risk to be on the winning side with technology?
