Is the Right to Be Forgotten Good or Bad? This Is the Wrong Question
Daniel Solove
Professor, GW Law School + CEO, TeachPrivacy + Organizer, Privacy+Security Forum
Is the right to be forgotten good or bad?
This is the question many are asking these days in light of the recent EU Court of Justice (ECJ) decision that requires search engines such as Google to remove certain links to personal data from their search results when people request it. (For more background, I wrote about the ECJ decision last week.)
After the decision was released, critics attacked the right to be forgotten as impractical, undesirable, and antithetical to free speech.
I agree that there are severe problems with the ECJ decision, and I fear that the rather half-baked nature of the decision might lead to the right to be forgotten being dismissed as silly and unworkable, a misguided experiment by privacy idealists.
But before we give a quick thumbs up or thumbs down to the right to be forgotten, we should understand what, exactly, the right to be forgotten is. The answer is quite complex.
The Problem of Having a Catchy Title
The right to be forgotten has a catchy title, but that is not necessarily a good thing. In a recent piece in Wired, Evan Selinger and Woodrow Hartzog make the important point that the use of the word “forgetting” is misleading: “This debate is not and should not be about forgetting or disappearing in the traditional sense. Instead, let’s recognize that the talk about forgetting and disappearing is really concern about the concept of obscurity in the protection of our personal information.”
There is an important interest in protecting people from the burgeoning dossiers of data about them. And privacy these days is not just about keeping secrets hidden away from everyone; instead, it is about modulating the accessibility of personal data and the boundaries of how it can flow. This interest is much broader than having information forgotten.
The Right to Be Forgotten Is Many Different Things
The right to be forgotten is actually not just one thing but a series of different things.
The ECJ decision, ironically, was too narrow. It focused primarily on Google and search engines, and it failed ultimately to address the larger interests at stake. The problems that the plaintiff faced in that case extended far beyond Google.
I’d like to parse the issues a bit to make some important distinctions, identify some key factors that need to be considered in the analysis, and pose some difficult questions that need to be addressed.
1. What Does It Mean to “Forget”?
What does it mean to “forget” data? Ironically, only on occasion does the right to be forgotten involve complete forgetting. Complete forgetting would entail that the data be deleted – wiped away entirely. But this rarely happens – it often doesn’t even happen when court records are expunged. Instead, in many cases, the data remains in existence but is locked up – rendered inaccessible and only available under special circumstances.
Another meaning of forgetting is the imposition of use and disclosure restrictions. In the ECJ decision, Google is being restricted from including certain data in the search results it shows to the public. Google is being asked to help make certain data less accessible. The ECJ decision doesn't require the newspaper that originally published the information to delete anything. This fact demonstrates that the ECJ decision isn't really requiring that information be forgotten. The information is still online. The ECJ is just making it harder to find. Nothing, technically, is being forgotten.
2. Who Has to “Forget”?
The type of entity that owns or processes the data matters a lot. Many people might be more inclined to embrace a right to be forgotten when it comes to government records than to private-sector records.
There are instances when the government should expunge personal data about people. We have long accepted the practice of expunging criminal records under certain circumstances, so the concept isn’t new. But now, one can find the information in various databases, so it is harder to have such information fully expunged.
We also used to have practical obscurity – the fact that finding information in various public records was difficult and time-consuming. But that has changed now too. I wrote an article about this problem in 2002 with regard to public records and how the government must do a better job of protecting privacy in the records it maintains.
Moreover, government entities such as the NSA should not be able to store whatever data about people they want, indefinitely and without adequate justification.
A right to be forgotten is more contested when it comes to private-sector entities, but there are some cases that might have wide support. For example, many might support requiring companies to purge any data they illegally gathered.
I bet there would be widespread support for a rule giving people the right to demand, in certain circumstances, that when they close an account with a company (such as Facebook, Google, LinkedIn, or Dropbox), their data be deleted (assuming there is no compelling reason to retain it).
3. What Are the Circumstances In Which the Data Was Collected?
The circumstances in which the data was collected matter a lot. Was it collected illegally? As discussed above, there is more consensus that a law requiring the purging of illegally acquired data would be acceptable.
There are U.S. privacy torts that allow people to sue when their personal data is disclosed, and they typically protect free speech by not allowing the suit when the information is of legitimate interest to the public. These torts apply even when information is gathered legally, and even when gathered by the media. They have thus generated significant controversy, and raise very tough issues.
Some other circumstances to consider include:
* Was the information part of a confidential relationship, such as doctor-patient or attorney-client?
* Who is being asked to remove the data? We might be more willing to force a doctor to remove patient files she puts online than to bar a reporter from disclosing health information about a person.
* Did a person voluntarily reveal the data? Or was it collected (and possibly disseminated) against a person’s will?
* Did a person provide data when he or she was a minor?
How we decide particular cases will likely depend upon these and other circumstances.
Conclusion
My goal here isn’t to answer all the questions or even identify all the issues. I mainly want to point out that when we speak about the right to be forgotten, many different things are involved, only some of which involve forgetting. Ultimately, the right to be forgotten is just a catchy name for various rights and obligations regarding the collection and use of personal data.
Depending upon what we’re talking about, the desirability, practicality, and free speech implications will be different. We should be careful to avoid lumping together all requirements to delete data or restrict access to data because they involve very different situations.
The debate about the right to be forgotten is not one that can readily be reduced to black-and-white terms. There are several instances in which we already have certain permutations of the right to be forgotten here in the United States, and in some cases they are widely accepted.
* * * *
Daniel J. Solove is the John Marshall Harlan Research Professor of Law at George Washington University Law School, the founder of TeachPrivacy, a privacy/data security training company, and a Senior Policy Advisor at Hogan Lovells. He is the author of 9 books including Understanding Privacy and more than 50 articles. Follow Professor Solove on Twitter @DanielSolove.
The views here are the personal views of Professor Solove and not those of any organization with which he is affiliated.
Image Credit: Pond5