Data Protection isn’t just about privacy, because data processing should serve mankind. But how do we find the common ground?
With a legal regime of the size and scope of Data Protection, it’s not surprising that it can be hard to find common ground between all the stakeholders who are invested in the area. At one end of the spectrum there is a community of people who advance privacy as the goal that trumps everything else. At the other end, there are those who fight against the law’s objectives, or try to dodge its effects. Neither extreme can provide the common ground, but they do throw into relief a critical question: what is Data Protection actually about?
The answer is more than privacy. Fortunately, we don’t have to look much further than the fourth recital to the GDPR to identify its real purpose. I believe that the ideas within it can help us to find the common ground:
‘The processing of personal data should be designed to serve mankind. The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.’
Central within these ideas is the thought that society can gain from data processing, if it’s done properly and is respectful of competing interests. For example, processing can deliver or facilitate economic growth, with all the attendant benefits, such as more employment and more taxes that can be invested in things that we value as a society. Or it can help to find cures or treatments for hideous diseases. Or it can help to protect us from those who would harm us. However, if privacy were the single goal, all of those beneficial outcomes would be threatened, risking our collective impoverishment, whether financially or through a reduction in our quality of life.
Conversely, though, the goal cannot just be economic growth, medical research or law enforcement at any cost. That would also be a pathway to impoverishment.
So, the aim of the fourth recital is the achievement of appropriate balances between the competing interests engaged by acts of data processing. It encourages us to search for, and find, the common ground where it exists.
But how easy is it to get there? My personal experience is that it’s one of the routes less travelled in Data Protection. Problematically, these foundational balancing acts are often not attempted at all, or are conducted to nowhere near a satisfactory level.
I was speaking with some friends about this recently, with respect to the nature of professional practice in the Data Protection field, after another long evening spent looking at the broader issues involved in facial recognition for a client. Without any sense of facetiousness, I asked my friends whether we really know the law in this area, or the technological and business issues involved. And what about AI, or even AdTech? If we are unsure, despite being seasoned practitioners with many years of work under our belts, what can we extrapolate about the performance of the balancing acts that should guide us to the common ground?
One extrapolation is that many of the established positions in DP might be resting on unsound foundations, due to unsatisfactory or overlooked balancing. An obvious result could be that things are accepted as right when they are wrong, while things seen as wrong might have been right all along.
Let’s consider again the example of facial recognition. If we take privacy as the starting point, we can easily find the barriers standing in the way of the adoption and roll-out of innovations in this area. However, if we step back a little and examine the full context surrounding the intended scheme, we might be able to find ways through some of the barriers, or perhaps even a new perspective on what the barriers actually are. The UK courts have provided a recent example of what may happen when the issues are viewed from different perspectives. In the case in question, in 2019, a pilot scheme run by the police was challenged, the thrust of the argument being that facial recognition in the situation under consideration offended the privacy rules in this country. The court thought differently, observing that the police must be able to take advantage of technological developments in the fight against crime, a position that has been consistently supported over the years, from the development of fingerprinting, through DNA testing, to public and private electronic surveillance. However, the use of facial recognition has to be properly balanced against competing interests; there is no green light for an ‘anything goes’ attitude. So we come out of that case understanding that there is a common ground to be found on which we can all be winners.
Another context of significance today is the data sharing and sale economy, which I’ve been researching for some clients. Again, when you look at the issues from different angles, you can arrive at different points of view. For instance, if you begin from the position of privacy, you can easily conclude that data sharing and sale are always offensive to privacy rights without consent; but if you begin from the perspective of, say, the wider interest that society has in preventing the build-up of unsustainable household debt, then you can legitimately conclude that data sharing and sale are good for society in the context of pre-credit screening. By extension, a privacy-only, or privacy-first, view of the world of data sharing and sale can carry with it many unintended consequences, whereas a different starting point can reveal hidden benefits.
And what about the world of the free-at-source tech giants? The benefits of free-at-source web search, social media and the like are impossible to exaggerate, but how different would the world be today if they had been forced to charge for their services at the outset, rather than support them through advertising revenue?
But entrenched, opposing views remain in so many areas of DP, and I think that this points to a simple but stark reality: we are failing to build bridges between opposing camps. This failure seems to me to be more suggestive of a communications problem than a Data Protection problem!
If there is a communications problem, then achieving balance and finding common ground requires much more effort to be invested in answering the challenges that are raised against specific acts of data processing, especially where the processing can deliver big gains for society. But what does the GDPR say about this?
I think that the legislation points at the right target, but it doesn’t hit the bullseye. The rules on transparency contain many ideas, such as communicating the purpose of data processing using plain language, but they do not instruct us to proactively address the points of view and challenges that people may raise due to holding different perspectives. Alternatively, if this is implicitly part of the rules, it’s not obvious, especially to people in a hurry.
I’m sure that people will point me to the rules on the use of the legitimate interests ground for lawful processing, and some of the supporting regulatory guidance, to say that the GDPR hits the bullseye. I looked again at the legislation and the Information Commissioner’s guidance on legitimate interests assessments when writing this article, and I accept that they do point clearly to the need to identify whether there are any wider societal gains in the processing. However, I’m trying to make a different point. My argument is that regardless of the grounds for processing that are relied upon and the transparency rules that apply, it would be to everyone’s benefit if the holders and users of personal data could communicate better about the bigger picture that surrounds their activities, cognisant and respectful of the opposing views held by others. Working on the art of communication should be part of everyone’s DP manifesto; it isn’t the same thing as being legally transparent, justifying the legal basis relied upon to support data processing, or dealing with discrete complaints or the exercise of rights.
Good communications should acknowledge and address the positions, concerns, worries and fears of the listener, not just the goals of the speaker. I think this will help to build bridges, bringing people at opposite ends of the spectrum closer together on common ground. This would support the gain agenda in DP and would limit the risks of unnecessary unintended consequences, while ensuring that privacy and other rights are protected properly and to the level they deserve.
I will be publishing some ideas soon on how the DP community can come together to help find the common ground, but don’t hesitate to drop me a line, if you’d like to have a chat in the meantime.
Stewart Room
17th January 2019
CISO || Cybersecurity strategist focused on Cyber Resilience, Risk & Privacy (CISSP, FIP)
5 年"How different would the world be today if they had been forced to charge for their services at the outset, rather than support them through advertising revenue?" Oh so different! I think about this question and remember Zuckerberg's grilling at the Senate in 2018 when asked "How does Facebook make its money?"? My immediate thoughts then were - If only FB had allowed themselves (without being forced) to make money 'legitimately' from user subscription fees, they may not be a billion dollar company today but atleast he won't be caught stammering on live TV describing how his company funds itself. Ultimately, I think DP has to find a common ground that's universal, a position where not just the EU but APAC, The Americas, Middle East can all agree is a common balanced view, maybe via the OECD, so these conversations are accelerated for the benefit of all peoples.
Digital Business and Data Protection Lawyer CIPM, CIPP/E, Fellow of Information Privacy (FIP), Coach, Public Speaker, Trainer
Good points Stewart. It’s a mindset shift for some who ‘hide’ their data practices for fear of being ‘found out’, and for others who treat any processing of personal data as deserving immediate rebuke and impossible standards to reach. So they perhaps aim to avoid or ignore the requirements, forgetting the more philosophical basis of privacy law: to protect mankind. Let’s embrace technology and what we may do with it, not hide it and feign compliance. It’s to be championed. Nothing to be scared of. Communication is key to that.
DataSecOps Evangelist | Author | Public Speaker | Chief Scientist & VP Marketing @ Satori
Interesting. I think the next few years will clarify a lot about the "contract" between people, companies and the privacy of data, and will shape some of the constraints around data protection. The balance between the great innovation and benefits that come from data analysis and the privacy of people is fragile, and not so obvious in all cases.