In our quest for convenience, are we making Devil’s bargains?
Image by Tumisu from Pixabay


By Lubna Dajani and Peter Dorrington

As a society, we are more connected than ever, and with that connectivity come great opportunities but also risks, many hidden in plain sight, such as those arising from the increased use of Artificial Intelligence.

In this article, we will discuss some of the issues raised by today’s world: our addiction to convenience and instant gratification, where our data is constantly gathered, analyzed, correlated, and used in ways most of us can’t conceive, perhaps exposing us to dangers never before imagined. In doing so, we will use an analogy: the legendary deal made between Faust and the Devil.

Like us, you have probably moved everything you can to the cloud, from your pictures and music to your money; it’s all stored out there somewhere in cyberspace, readily accessible to you and to the people you choose to share it with, all through your smartphone and your many other connected devices. You do so because you find it convenient, think of it as secure, and it fits in with your busy life. It all seems so effortless compared to how you did things a decade or so ago.

With little to no thought, and in exchange for instant access to your friends and your ‘stuff’ at any time of day or night and from anywhere, you readily hit the ‘accept all’ button on terms and cookie notices, or input your personal data on demand: after all, it’s the only way to get what you want done, and everyone else does it. However, you may be unknowingly consenting to, and providing, a constant stream of data to third parties: your network operator, your favorite search engines, social networks, app providers, and an untold number of advertising services and buyers of your data.

In the legend, Faust is a ‘scholar’ - a man at the pinnacle of his career and yet he wants more. In exchange for more knowledge and magical powers, Faust makes a deal with the Devil (through his agent, Mephistopheles) – selling his soul in the future for power now. With his newly-granted powers, Faust is able to indulge every whim and learn the knowledge of the world. However, in the end, and as agreed, the Devil appears and claims Faust’s soul.

For the purpose of this article, we want to focus on one aspect of the legend – what Faust knowingly traded away in exchange for his version of instant gratification.

In our analogy, your digital ‘soul’ (your data) is what you are bargaining away in exchange for free access and more conveniences now. But what are you really giving away? Do you know what is happening to all that data? Do you appreciate the correlation between your digital alter-ego and your real-life wellbeing?

In many stories and myths, the cost of satisfying wishes can be high and appear in unexpected ways. As a consumer, you have little real control over how your data is collected or how it's used, and by whom. In reality, all too often, consumers don’t care – so long as their wishes are granted with little effort on their part for "free". However, in this competitive commercial world, those organizations may be presenting you with choices that prioritize their needs, rather than yours.

What’s more, they use technology and psychology to formulate messaging tailored for you, constantly whispering in your ear that you need this product, should shop at that store, or must ‘buy now to avoid disappointment’.

It all appears so reasonable; they seem to know what you want (or should want) and feed on your insecurities as well as your desire to be happier, better looking, more popular, and so on.

Why care about how it works, as long as it does?

Despite the horror stories about data breaches and the misuse of personal data, most people have little idea of the feeding frenzy around their data and just how much of it is being harvested by governments and businesses. Even fewer know the ways in which that data is analyzed to form insights into who you are, what you do, and why, and how those insights can be exploited.

For this article, Peter explored what a popular search engine provider ‘knows’ about him: that he is a married, middle-aged male, a business owner, who is also interested in astronomy, web-hosting, science fiction, and about 150 other topics. However, it has also incorrectly identified that he is interested in beauty services, coffee makers, country music, and about 50 other things. Two things worry us about this: 1) they know a lot about Peter, and 2) about a quarter of what they ‘know’ is wrong (for now).

Bearing in mind that business decision-making (especially using Artificial Intelligence) is only as accurate as the data used to inform it, this means they are going to make a lot of incorrect assumptions and ill-informed decisions. Also, Peter has no insight into how those decisions are being made, nor control over what data is being collected about him and why (and if you have ever clicked ‘accept all’ on a cookie banner, you are just as vulnerable as Peter and the rest of us).
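To make this concrete, here is a hypothetical sketch (the interest labels, catalog, and proportions are illustrative, not Peter’s actual profile or any real provider’s system) of how errors in an inferred profile flow straight through a naive recommender into wrong decisions:

```python
# Hypothetical inferred ad profile: some labels are right, some are wrong,
# echoing the roughly one-in-four error rate Peter found in his own profile.
correct_labels = {"astronomy", "web hosting", "science fiction"}
incorrect_labels = {"beauty services", "coffee makers", "country music"}
profile = correct_labels | incorrect_labels  # the recommender sees no difference


def recommend(profile, catalog):
    """A naive recommender: trust every profile label equally."""
    return [item for item in catalog if item["topic"] in profile]


catalog = [
    {"name": "telescope", "topic": "astronomy"},
    {"name": "espresso machine", "topic": "coffee makers"},
    {"name": "paperback novel", "topic": "science fiction"},
    {"name": "banjo album", "topic": "country music"},
]

recs = recommend(profile, catalog)
irrelevant = [r for r in recs if r["topic"] in incorrect_labels]
print(f"{len(irrelevant)} of {len(recs)} recommendations rest on wrong data")
```

Because the system has no way to tell a correct label from an incorrect one, every bad label becomes a bad recommendation; scale this toy example up to thousands of automated decisions per person and the errors compound silently.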

In a previous article, we talked about the deliberate misuse of this power to manipulate consumer behavior – so-called ‘dark psychology’, but whether it’s done in the name of good or ill, everyone needs to be careful about how this power is used. AI may eventually solve some of the most intractable problems of our age, or create a whole host of new ones.

Like any innovation, Artificial Intelligence is a tool, a double-edged sword if you will; it is neither moral nor immoral, neither intrinsically good nor evil. That is down to the discretion of those who wield it. It is up to humans to set limits on where AI is applied and how it acts.

If you teach an AI that only the ‘ends’ are important, it will surely come up with some ‘means’ that are unpalatable to us (utilitarianism). As Facebook discovered, unconstrained AI will find the most efficient ways to achieve an end, but few humans would say that they treasure the things they value most because of their ‘efficiency’.

According to behavioral economics, we humans have over 150 cognitive biases, ways in which our logic lets us down, and once these are known, they can be anticipated and used (even manipulated). Marketers have been doing this for decades, and now organizations are teaching machines to focus their huge compute power on the individual, down to the level of individual feelings and motivations.

AI can now anticipate your mood, your daily movements, even your need for healthcare, and make recommendations that directly influence your decision-making process: which route to take as you drive home, which movie to watch, which product to buy, and perhaps even how you vote. But it is still humans who decide how, and whether, an AI is working well (at least for now).

Taking our analogy further: the ‘Devil’ (AI in the hands of unscrupulous big business or government) has tremendous insight and overwhelming power, while you by comparison have very little; it is not a bargain between equals. Your data is indeed the food and fuel that powers and informs these systems, and today most of us just ‘accept all’, giving away our golden goose (to mix metaphors) for the sake of perceived convenience or social ‘popularity’.

That doesn’t mean that you are powerless, just that you need to exercise your power over what is yours: your data. What is more, it needs to be as simple and natural an experience as walking without thinking, or stretching out your hand to grasp your teacup when you want a sip.

To achieve this, we have to make a simple, binary shift in our thinking and reset the supporting frameworks we live by. For starters, we need to humanize the language of technology so that it is intelligible to all, not the incomprehensible jargon and technobabble it currently is. We also need to redirect our investments toward more human- and life-centric projects rather than ‘profit by any means’; good examples are the EU’s Next Generation Internet (NGI) initiative and the important work being done by organizations like the W3C, ISO, the IEEE, and the many other interest groups, where thankfully a great deal of work is being done on ethical AI standards and related technology for good. Perhaps most importantly, we need to put competitive greed aside and foster collaboration.

When we venture outside of our silos and bring our specialized viewpoints together into a co-creation process that questions the limits of what is possible and resets the limits of what is permissible, we will illuminate paths to prosperity otherwise hidden in plain sight. As Lubna always says, “If you only saw red and I only saw blue, how would either of us ever see purple?”

In Conclusion

We know that our analogy is not perfect, but individuals need to consciously safeguard their power (their digital selves). Governments and enterprises also have a responsibility not to abuse their overwhelming advantage; if they won’t do so voluntarily, we will need policy and regulation with the power to back it up.

Let’s be realistic: we can talk all we want and ask all we want, but if we do not reset our definition of success, we cannot expect change to happen. We cannot dream of planetary wellbeing while fiduciary responsibility is to shareholders rather than stakeholders, and corporate success is measured only by profit.

On a wider front, we need to evolve our human socioeconomic and geopolitical systems, as the current models are concentrating knowledge, control, and wealth into the hands of a very few de facto superpowers. Otherwise, as the world becomes increasingly ‘smart’ and connected, individuals are in danger of becoming irrelevant. We would be well served by shifting our thinking from ‘what’s in it for me?’ to ‘what’s in it for us all?’.

We call on you, the reader, to get involved; think about what ‘accept all’ may mean and exercise your right to decide how much of your data to share, with whom, and why.

Remember, just like Faust, be careful what you wish for and what you give away in exchange!

About Peter Dorrington

Peter is the founder of XMplify Consulting and an expert in using a combination of data and behavioral sciences to lead transformation in the field of Experience Management (XM).

Over the last 5 years, Peter has been focused on developing and using Predictive Behavioural Analytics to understand why people do what they do, what they are likely to do next, and how businesses should respond. As the inventor of Predictive Behavioural Analytics, Peter is an internationally recognized expert in the field of Customer Experience analysis. He is also an executive advisor and award-winning blogger.

About Lubna Dajani

Lubna is a pioneering information and communication technology innovator and design thinker with over 25 years’ executive experience with multinational brands. A champion and role model for diversity, inclusion, and women in STEAM, she is a trusted advisor, board member, and mentor to social enterprises and accelerators including SOSV and Springboard Enterprise. Lubna is also an active contributor to several standards and industry bodies, including the IEEE, W3C, and the Sovrin Foundation.

Lubna is committed to applying technology, sciences, and the arts to elevate the human experience and regenerate planetary wellbeing.

