Responsible Innovation: Living with Socio-Technical Systems
Chris Leong, FHCA
Director | Advisory & Delivery | Change & Transformation | GRC & Digital Ethics | All views my own
As normality returns to everyday life for many of us in the UK, with large events and music festivals resuming, we are also learning to live against the backdrop of rapidly rising COVID cases. The sudden global shock of this virus over the past three years, and the speed at which governments, scientists and health services were able to respond collectively, provided societies with a range of mechanisms to manage risks and adverse impacts on both individuals and the capacity of health services.
In parallel, over the past decade, a different but impactful societal transformation has been changing the way we live, almost imperceptibly.
Within every large organisation, digital transformation initiatives promise greater efficiency, productivity and a better experience for the consumers of their digital services.
Unfortunately, the reality is that much of the promise is hype. Organisations deploying these socio-technical systems have fallen short in their duty of care to mitigate the downside risks through the implementation of operational safeguards.
On the flip side of that promise is the fact that machines are not as productive as they were expected to be. Productivity in the UK has been falling below historic levels for some time (more so than in other countries that have rapidly digitised services, according to some researchers). The reality is that machines cannot cope with uncertainty or novelty, and they do not ‘think’ beyond the programmer’s code and the narrow set of rules within which they must make decisions; rules that are quite often out of sync with reality and a person’s needs.
The data machines ‘feed on’ is not necessarily accurate, proportional or representative of individuals in any given group. Depending on the frame of reference or time series in use, it may or may not be relevant, and yet important decisions are made on that basis on behalf of individuals who may not receive a satisfactory or indeed any explanation as to how those decisions were made.
Ubiquitous and pervasive
Socio-Technical Systems (STS), by their very definition, are expected to embody the requirements of personal and community aspects in hardware and software, and apply ‘an understanding of the social structures, roles and rights (the social sciences) to inform the design of the system that involve communities of people and technology.’ By this definition, STSs ‘seek to merge people and technology, viewing the integration of computers into societal systems as the next evolutionary step of humanity.‘
However, human societies are complex open systems. Their behaviours are not predictable. There are many uncertainties, which the machines fail to respond to appropriately. Moreover, the complex interactions between human beings and socio-technical systems are not fully understood. It is a big experiment where trial and error is alienating and dehumanising people more and more every day.
STSs are ubiquitous in our lives. Unless you are living ‘off the grid’, stranded on a remote island without electricity, you will interact with STSs in the digital world at home, at work and on the move.
While machine learning algorithms have enabled the rapid analysis of large datasets to uncover patterns and provide insights for further consideration by humans, many organisations have embedded and deployed them in websites, search engines, software applications and the apps on our smartphones to process our personal data, profile us and infer automated decisions that impact us, as consumers of digital services, based on historical data and statistical methods.
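To make that point concrete, here is a minimal, hypothetical sketch in Python of what such an automated decision often amounts to under the hood: a statistical score derived from historical data, compared against a hard-coded threshold. The feature names, weights and threshold are invented for illustration and do not represent any specific organisation’s model.

```python
import math

# Stand-ins for whatever an earlier training step distilled from historical records.
# These values are invented purely for illustration.
WEIGHTS = {"years_at_address": 0.4, "income_band": 0.9, "missed_payments": -1.5}
BIAS = -1.0
APPROVAL_THRESHOLD = 0.5


def probability_of_approval(applicant: dict) -> float:
    """Logistic score: a statistical estimate, not a judgement about the person."""
    score = BIAS + sum(WEIGHTS[k] * applicant.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))


def automated_decision(applicant: dict) -> str:
    """The whole 'decision' is a probability compared with a fixed cut-off."""
    p = probability_of_approval(applicant)
    return "approve" if p >= APPROVAL_THRESHOLD else "refuse"


# Anything about the person that is not captured by these three numbers
# simply does not exist as far as the decision is concerned.
print(automated_decision({"years_at_address": 1, "income_band": 2, "missed_payments": 1}))
```

The sketch also shows why explanations are so often missing: the only “reason” the system has is a number crossing a threshold.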
However, inference is not the truth. It does not reflect who you are, what your beliefs are or what you are thinking. The machines are not sophisticated enough to pick up on meaning. The semantic layers are missing, and even the most advanced language models imitate human interaction by ‘guessing’ the next word based on statistical inference. They are not communicating with you in any real sense. Language is a complex and rich structure as a form of communication, and it cannot be reduced to a mathematical formula. It is not simply a symbolic representation of reality; it is imbued with much deeper meaning and nuance than the formulae used in these statistical and/or probabilistic models can capture. If you are listening attentively to your interlocutor, you cannot second-guess what they are saying or what they are thinking. It takes time to get to know a person. It is not an automatic process.
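For illustration, the sketch below shows that statistical ‘guessing’ in its simplest possible form: a toy bigram counter that picks the most frequently observed next word in a tiny, made-up corpus. Real language models are vastly larger and probabilistic rather than simple counters, but the principle is the same: the output is the statistically likely continuation, not comprehension. The corpus and words are invented for the example.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the vast text that real models are trained on.
corpus = (
    "the customer asked for a refund . "
    "the customer asked for help . "
    "the customer waited for hours ."
).split()

# Count which word follows which: pure statistics, no meaning attached.
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1


def guess_next(word: str) -> str:
    """Return the most frequently observed continuation, or admit ignorance."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else "<unknown>"


print(guess_next("asked"))   # 'for'  - seen twice, so it wins
print(guess_next("for"))     # 'a', 'help' and 'hours' tie; the first one counted wins
print(guess_next("refund"))  # '.'    - punctuation is just another token to count
```

Nothing in this procedure knows what a refund is; it only knows which tokens tended to follow which.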
There is a whole range of skills required to interpret language and the meaning behind its expression; and of course, there is a rich, diverse cultural depth that the machines are not even close to understanding, and may never learn to ‘understand.’ Moreover, the world does not only speak English. It is somewhat ironic that no one appears to be questioning this. Even today, if you use Google to translate text, you will often find the results offered are wide of the mark. This can have very serious implications, as it may cause offence and lead to misunderstandings. To misinformation and disinformation we ought to add ‘mis-translation’, which is causing division and mistrust, and fuelling what has been described with the deeply unsettling term ‘cancel culture’; symptomatic of the alienation and de-humanisation experienced by many in everyday interactions in our societies.
The replacement of humans by STSs in service provision across all industries has been driven largely by the viewpoint of the organisation deploying them. The pervasiveness of their deployment has negatively impacted many in society, mainly those who are not digitally savvy as well as those whose needs fall outside the scope of those STSs. There are well-documented cases where biases inherent in many STSs have adversely impacted minority groups in society.
The machine learning algorithms powering these STSs are not as intelligent as their proponents would like us to think, as Michael I. Jordan explains in his article. So, when the Boards and CEOs of organisations decide to adopt them and deploy them to the human consumers of their digital services, without any operational safeguards or recourse to a human representative of that organisation, all the risks are borne by society. Chatbots, Applicant Tracking Systems (ATS) and online personality assessments are common STSs that have impacted human consumers. This Forbes article lists other areas of adverse impact.
The trade-offs
The pace and progress of innovation, particularly with the advent of more powerful hardware, have enabled alluring features and functions to be developed and deployed in STSs rapidly. Trust leaps are what enable human consumers to decide to engage with an STS in the first instance, and they do so in exchange for the personal data that the STS collects during that engagement.
The common denominator and fuel for STSs is personal data, which holds tremendous value for the organisation collecting it. Data brokers have benefitted significantly from trading personal data collected through a variety of means, including smart devices such as Alexa, which has also fuelled the digital advertising industry.
Where biometric data is required during the engagement, we are asked to trade our most precious personal data in return for the digital services rendered. While biometric data is stored and used by government agencies such as the Immigration and Border Force, not all of us are comfortable with non-government and commercial organisations having similar access.
Incidentally, biometric technologies are being deployed in public places by organisations, and most people entering spaces where they are in use are not even aware of it. Here’s a recent report in the FT which highlights the increasing levels of surveillance enabled by the availability of such technologies and the lack of regulation. There is no trade-off for the human subject in these instances. In the US, lawsuits have been mounting in states with biometric privacy laws such as BIPA, according to this article.
If you are an online video gamer, take a look at this article to understand what data is collected from the time you spent playing those games.
According to this article, Human Rights Watch – an international non-governmental organisation – reported that a ‘total of 89% of 164 government-endorsed education technologies were found to have endangered the privacy of children’ through the sale of data collected from their interaction with these applications during the peak of COVID.
Our interactions with the digital world are currently tracked, including our movements if we allow our location data to be captured by apps connected to our smartphones or IoT smart devices. Interestingly, this report cited an instance where location data collected through the use of an app breached privacy laws. Although this article reports that Google intends to delete specific location data associated with visits to abortion clinics, access to similar data by data brokers will also need to be curbed before our location data is sold to third parties.
Our rights
Existing data protection regulations such as GDPR afford us, as data subjects, eight basic rights:
1. The right to be informed
The organisation you are engaging with must be completely transparent about how it uses your personal data. This is a key transparency requirement.
2. The right of access
You have the right to know exactly what information the organisation holds about you and how it is processed. This is commonly referred to as a data subject access request, or ‘DSAR’.
3. The right to rectification
You are entitled to have your personal data rectified if it is inaccurate or incomplete. This right is closely linked to the controller’s obligations under the accuracy principle.
4. The right to erasure
Also known as ‘the right to be forgotten’, this gives you the right to have your personal data deleted or removed without needing to give a specific reason, although the right is not absolute and only applies in certain circumstances.
5. The right to restrict processing
You have the right to block or suppress the processing of your personal data. This is not an absolute right and only applies in certain circumstances. It is also linked to the right to object.
6. The right to data portability
You have the right to obtain and reuse your personal data for your own purposes across different services. The right only applies to information an individual has provided to a controller.
7. The right to object
Under certain circumstances, you have the right to object to your personal data being used. This includes where the organisation uses your personal data for direct marketing, scientific and historical research, or for the performance of a task in the public interest.
8. Rights in relation to automated decision-making and profiling
You have the right not to be subject to an automated decision, including profiling, where the consequence produces legal or similarly significant effects on you.
If you are engaging digitally with an organisation governed by the UK or EU GDPR, these eight basic rights provide the mechanisms by which you can hold that organisation to account.
The EU Digital Markets Act and Digital Services Act, as well as the incoming EU AI Act, will provide further guardrails around the deployment of machine learning technologies which, when applied alongside GDPR, will further strengthen your rights as human consumers subject to automated decision-making and profiling from STSs and digital platforms.
Additionally, people who have been adversely affected by STSs have resorted to legal advice, as other existing legislation such as the Human Rights Act 1998 and the Equality Act 2010 may also be engaged by those outcomes. This paper provides a legal analysis of 'Discrimination in the Age of Algorithms.’
What we can do differently
There are a host of technology tools recommended by privacy groups and experts that you can use to interact with the digital world and minimise the risks of your personal data being harnessed on the internet. Nevertheless, we would still expect the organisations that you trust enough to engage with digitally to protect your personal data and privacy, as they have a legal obligation to do so.
Apple and Android have both introduced privacy controls for their smart devices, enabling device owners to decide with whom data from their devices may be shared. If you own a smartphone or tablet running their operating systems, you can configure your settings accordingly.
Being aware of your rights, where provided by the regulators in your jurisdiction, such as those outlined above for GDPR, enables you to know what you can do when engaging with STSs deployed by organisations that process your personal data.
Having an awareness of the mechanisms and avenues available to you if you are disadvantaged, discriminated against or harmed by automated decisions and/or profiling from STSs is also beneficial, as it enables you to engage differently with the deploying organisations governed by those regulations.
Opportunities for deploying organisations
For organisations that have deployed STSs as a result of their digital transformation programmes but failed to take into consideration the downside risks and potential adverse human and social impacts, having CEOs and Boards who are prepared to think and act differently is the first step towards responsible innovation.
Adopting a mindset that accepts diverse inputs and multi-stakeholder feedback would be the logical, actionable next step, allowing for the creation of that feedback loop.
If only chatbots provided an off-ramp that connected us with a human when they were unable to respond satisfactorily to our query or resolve our problem.
If only a human representative from the HR department was available to explain how their automated systems processed our job application.
If only the bank was able to explain why our mortgage applications were not approved by its algorithm.
If only we were aware of how an advertisement about an item we were discussing offline suddenly appeared on our social media feeds.
If only the organisation deploying the STS was able to explain how the personal data it collected was used by its algorithms to infer the automated decision that influenced our actions.
If only the well-being of the human consumer of outputs from STSs were at the heart of the decisions to deploy them.
Feedback for awareness
The key differences between the physical world and the digital world are speed, scale and reach.
Where we had more time to think, plan and execute in the physical world, the digital world enables rapid execution of initiatives to large audiences globally. Information travels much faster in the digital world. A much larger audience can be engaged and more markets can be reached simultaneously.
A brilliant and sound idea has the possibility of attaining success in a far shorter time. Conversely, a bad idea or an initiative executed badly can very quickly fail and sadly, any related negative outcomes can also be amplified and human consumers adversely impacted almost instantly.
The failure to truly understand the consumer and socio-political and cultural norms has caused serious damage to high-profile brands, both to their reputation and their revenues. Once a consumer leaves a brand, their actions may become contagious. It is a vicious spiral that companies large and small may wish to avoid.
The negative experiences that we have as consumers of digital outcomes break the initial ‘trust leap’ we took to engage with the respective organisation. You might recall the decision you took to select a courier recommended by a website to ship a gift to a family member, to arrive in time for their birthday, using the next-day delivery service; only to discover that it never arrived and you were left in the lurch by the courier’s chatbots, which were unable to assist.
Most of us research online and look out for customer feedback and ratings. Are they real? Beware of misinformation as well as disinformation as these third-party customer feedback platforms can be gamed. Validate and verify all information when collating feedback from all sources.
Some of the digital service providers request feedback at the end of a transaction or digital engagement. Feedback is a mechanism that allows these organisations to continue learning so that they can improve their services. Whether they act on the information collected or not is a different matter. If you are invited to provide feedback, do so. If you have not been invited to provide feedback, and you were not satisfied with the level of customer service received digitally, it would be beneficial to provide your feedback directly, as well as indirectly through the myriad of consumer watchdog and feedback sites in the UK, such as the list of Consumer Advice Organisations and Trustpilot, so that others can benefit from your experience.
Larger organisations monitor consumer feedback on social media such as Twitter and they are likely to quickly respond and manage negative feedback.
Why is this beneficial for organisations?
Businesses do not exist in a vacuum. They may not be aware of negative experiences their customers have owing to the complex value chain, and third-party providers who may not always disclose their methodologies in collating the data. Organisations who genuinely wish to serve their customers will want to know and do everything in their power to get it right. Goods and services generate revenue consistently only if they can rely on customer goodwill.
If you believe that any of your eight basic rights under GDPR were not afforded to you and you have not been able to obtain a satisfactory response from the organisation deploying the STS, you can lodge a complaint with the ICO if you are in the UK, or with your national data protection authority if you are in the EU.
There are also organisations compiling incidents of adverse outcomes from STSs in their repositories, such as the AI, Algorithmic and Automation Incident and Controversy repository, AlgorithmWatch and the AI Incident Database.
Trust and choice
Trust is critical for engagement in the digital world. Before taking the 'trust leap' that Rachel Botsman describes, we want to be as informed as possible about the organisation we are deciding to engage with through its STS.
In a competitive digital world, we have choices. We decide who we engage with.
STSs are part of the digital world that is also our future world.
Our future world needs to be one where we, as human consumers of outputs, including automated decisions and profiling, have a degree of control about the decisions we take on the choices we have.
We expect governments, regulators and organisations deploying STSs to recognise and continue to support the fundamental rights of humans in a world fuelled by data, powered by technology and respectful of the natural environment that needs to be sustained for the sake of humanity.
If governments, scientists and health services were able to respond rapidly and collectively to provide societies with a range of mechanisms to manage the risks of adverse impacts from COVID-19, we can only hope that all parties can share the same willingness to make our future digital world one that is safe, secure, ethical, fair, privacy-preserving, trustworthy and sustainable.
Standards are not enough to deliver real change; they actually need to be implemented. Effective risk management, identifying all the risks, is a requirement for all organisations. Since sustainability is a systemic concept, we need to take a holistic approach. Embracing context-based materiality where risks are concerned, and leaving no stone unturned, is the best way to ensure thresholds are not crossed and appropriate guardrails are put in place.
Nobody wants to live in a surveillance economy, where citizens are de-materialised and treated as ‘virtual’ commodities to be exploited by unknown forces. Dystopia or utopia? Idealistic perhaps, but our efforts need to go towards building a better world.