Clear And Present Danger Indeed
Apurva Purohit
Co-Founder Aazol | Independent Director | Author - Lady, You're the Boss | Lady, You're Not a Man
For those of you who are familiar with the woes Facebook is facing today, you’d agree with me when I say the Cambridge Analytica data scandal was probably the proverbial straw that broke the camel’s back. It is widely regarded as a watershed moment in the public understanding of personal data, and for many it was one of the first times that a tech corporation’s wrongdoings had been laid bare so clearly.
Author and scholar Shoshana Zuboff calls this harvesting of personal user data, for business and other ulterior motives, ‘Surveillance Capitalism’. Let’s understand how this works. Corporations provide free services to billions of people and, in return for said offerings, monitor the behaviour of their users in astonishing detail – often without their explicit consent. While some of the data gathered is relevant for service improvement, the rest is classified as ‘proprietary behavioural surplus’. This data is then processed via ‘machine intelligence’ to predict user behaviour, and those predictions are traded in what Zuboff calls the ‘behavioural futures markets’.
Now, for those of you thinking Facebook was the evil-monger who started this, well, prepare to be surprised. The term for the practice might be recent, thanks to Zuboff, but the seeds of surveillance capitalism were sown way back in 2001, by none other than Google, during the dotcom bust. Google’s investors were threatening to pull out, so in a bid to shore up its offerings, Google turned to previously discarded and ignored data logs and repurposed them as ‘behavioural surplus’. Instead of being used for product improvement, this behavioural data was directed toward an entirely new goal: predicting user behaviour. Since then, corporations have smartly cashed in on our increasing dependence on technology and the internet for even our most mundane daily activities, while ensuring that their habit of storing consumer data seems harmless and inconspicuous.
So how deep into this abyss are we now? Unfortunately, pretty deep, because surveillance capitalism is no longer restricted to a few individual companies or just to the technology/internet sector. It has spread across a wide range of products and services, encompassing virtually every economic sector, including insurance, retail, healthcare, finance, entertainment, education, transportation, and much more. It has given rise to whole new ecosystems of suppliers, producers, customers, market-makers, and market players. Nearly every ‘smart’ or ‘personalised’ product or service, every internet-enabled device, every ‘digital assistant’, is nothing but an enabler, a ‘supply-chain interface’ for the unhindered flow of behavioural data, all geared up to predict our futures in a surveillance economy, helping corporations rake in profits.
It’s not surprising then, that with each passing day, we find it increasingly difficult to participate effectively in society without becoming a victim of this practice. For example, the American non-profit organization ProPublica reported late last year that breathing machines purchased by people with sleep apnoea are secretly sending usage data to health insurers, where the information can be used to justify reduced insurance payments. The Google-incubated augmented reality game Pokémon Go is a brilliant example of covert surveillance. When Pokémon Go debuted in app stores in 2016, it was seen as a mostly harmless foray into the world of augmented reality, but what it really did, under the veneer of AR-driven gamification and community-building, was collect vast amounts of data from millions of people.
The rabbit hole just keeps going deeper, doesn’t it?
So, what are the far-reaching repercussions of surveillance capitalism? Well, the writing on the wall is clear about the end goal of this machinery. Surveillance capitalism seeks to make society a place to be modified and controlled, undermining individual self-determination, autonomy and decision rights. It is ultimately a threat to democracy, because the large-scale ‘behavioural modification’ it employs erodes democracy from within: without autonomy in action and in thought, we have little capacity for the moral judgment and critical thinking necessary for a democratic society. We will believe we are making independent judgments, little realizing we have been ‘nudged’ to change our opinion of a politician or our breakfast cereal based on the data gathered about our preferences.
For a moment, think about what this could mean for future generations who will have grown up with these new forms of technology. They might lose the ability to choose their own behaviour and beliefs, becoming unknowing pawns in the hands of corporations and governments that feed them selective information. Information that has been gathered by watching them is already biased towards action. It is high time we understood the worth of the data we are so carelessly leaving behind all around us. It is imperative that we as consumers control exactly what we want to share and how much we want to share, and understand what purpose that data will be used for.
Data has become a valuable transactional unit in this surveillance economy, and it’s time we treated it just like money – sharing it prudently, cautiously and with a sharp eye on what the trade is worth to us.
(This article was published in Financial Express)