Why the Law Often Doesn’t Recognize Privacy and Data Security Harms

In my previous post on privacy/security harms, I explained how the law is struggling to deal with privacy and data security harms. In this post, I will explore why.

The Collective Harm Problem

One of the challenges with data harms is that they are often created by the aggregation of many dispersed actors over a long period of time. They are akin to a form of pollution where each particular infraction might, in and of itself, not cause much harm, but collectively, the infractions do create harm.

In a recent article, Privacy Self-Management and the Consent Dilemma, 126 Harvard Law Review 1880 (2013), I likened many privacy harms to bee stings. One bee sting might not do a lot of damage, but thousands of stings can be lethal.

In the movie Office Space, three friends create a virus to deduct a fraction of a cent from every financial transaction made from their employer’s bank account, with the proceeds deposited into their own account. The deductions would be so small that nobody would notice them, but over time, they would result in a huge windfall to the schemers. That’s the power of adding up a lot of small things.
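
The scheme described above is simple arithmetic, and a toy sketch makes the point concrete. The figures below are hypothetical, chosen only for illustration; amounts are tracked in thousandths of a cent ("millicents") to sidestep floating-point rounding, the very issue the movie's scheme exploits.

```python
def skimmed_total(transactions: int, millicents_per_txn: int) -> float:
    """Return the total amount skimmed, in dollars.

    Amounts are counted in millicents (thousandths of a cent),
    so there are 100,000 millicents per dollar.
    """
    total_millicents = transactions * millicents_per_txn
    return total_millicents / 100_000

# Half a cent (500 millicents) shaved from 50 million transactions:
print(skimmed_total(50_000_000, 500))  # 250000.0 -> a quarter-million dollars
```

Each individual deduction is invisible; the aggregate is anything but. The same logic underlies the bee-sting analogy.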

The problem is that our legal system struggles when it comes to redressing harms created to one person by a multitude of wrongdoers. A few actors can readily be sued under joint and several liability, but suing thousands is much harder. The law has better mechanisms for when many people are harmed by one wrongdoer, such as class actions, but even here the law has difficulties, as only occasionally do class members get much of a benefit out of these cases.

The Multiplier Problem

The flip side of collective harm is what I call the “multiplier problem,” which affects the companies that cause privacy and data security problems. A company might lose personal data, and these days, even a small company can have data on tens of millions of people. Judges are reluctant to recognize harm because it might mean bankrupting a company just to give each person a very tiny amount of compensation.

Today, organizations have data on so many people that when there’s a leak, millions could be affected, and even a small amount of damages for each person might add up to insanely high liability.
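
The multiplier arithmetic is worth seeing on paper. The figures below are hypothetical, not drawn from any actual case: even a nominal per-person award becomes ruinous at breach scale.

```python
def total_liability(people_affected: int, damages_per_person: float) -> float:
    """Total exposure when modest per-person damages are multiplied
    across everyone affected by a breach."""
    return people_affected * damages_per_person

# A token $10 per person, across a breach affecting 20 million people:
print(total_liability(20_000_000, 10.0))  # 200000000.0 -> $200 million
```

This is the dilemma the post describes: each plaintiff's recovery is trivial, yet the sum can bankrupt the defendant.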

Generally, we make those who cause wide-scale harm pay for it. If a company builds a dam and it bursts and floods a town, that company must pay. But with a data leak, courts are saying that companies should be off the hook. In essence, they get to use data on millions of people without having to worry about the harm they might cause. This seems quite unfair.

It takes a big entity to build a dam, but a person in a garage can create an app that gathers data on vast numbers of people. Do we want to put a company out of business for a data breach that only causes people a minor harm? When each case is viewed in isolation, it seems quite harsh to annihilate a company for causing tiny harms to many people. Courts say, in the words of the song my 3-year-old son will not stop singing: “Let it go.” But that still leaves the collective harm problem. If we let it go all the time, then we have death by a thousand bee stings (or cuts, whichever you prefer).

The Harm of Leaked or Disclosed Data Depends Upon Context

People often make broad statements that the disclosure of certain data will not be harmful because it is innocuous, but such statements are inaccurate because so much depends upon context.

If you’re on a list of people who prefer Coke to Pepsi, and a company sells that list to another company, are you really harmed by this information? Most people wouldn’t consider a preference for Coke versus Pepsi to matter all that much. Suppose the other company starts sending you unsolicited emails based on this information. You don’t like getting these emails, so you unsubscribe from the list. Are you really harmed by this?

But suppose you’re the CEO of Pepsi and the data that you like Coke is leaked to the media. This causes you great embarrassment, and you are forced to resign as CEO. That might really sting (though I’m certain you would have negotiated a great severance package).

Another example: For many people, their home address is innocuous information. But if you’re an abuse victim trying to hide from a dangerous ex-spouse who is stalking you, then the privacy of your home address might be a matter of life or death.

Moreover, the harmfulness of information depends upon the practices of others. Consider the Social Security number (SSN). As I discussed in a previous post, the reason why SSNs are so harmful if disclosed is because organizations use them to authenticate identity – treating them as akin to passwords. It is this misuse of SSNs by organizations that makes SSNs harmful. If SSNs were never misused in this way, leaking or disclosing them wouldn’t cause people harm.
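
The design flaw described above can be sketched in a few lines. This is a hypothetical toy system, not any real organization's practice: the mistake is treating a widely shared identifier (the SSN) as if it were a secret that proves identity, rather than using a real secret for authentication.

```python
import hashlib
import hmac

# The SSN is used correctly here: as a lookup key (an identifier),
# while a separately stored secret does the authenticating.
accounts = {
    "123-45-6789": {
        "name": "Jane Doe",
        "password_hash": hashlib.sha256(b"correct horse battery").hexdigest(),
    }
}

def bad_authenticate(ssn: str) -> bool:
    # The misuse: merely knowing the SSN "proves" identity, so anyone
    # the number leaks to can impersonate the victim.
    return ssn in accounts

def better_authenticate(ssn: str, password: str) -> bool:
    # The SSN only locates the record; a secret the user controls
    # (checked in constant time) establishes identity.
    record = accounts.get(ssn)
    if record is None:
        return False
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return hmac.compare_digest(supplied, record["password_hash"])

print(bad_authenticate("123-45-6789"))                    # True - a leaked SSN suffices
print(better_authenticate("123-45-6789", "wrong guess"))  # False
```

If SSNs were used only as identifiers, as in the second function, their disclosure would be far less damaging, which is exactly the point made above.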

The Uncertain Future

Problems of Proof

Another difficulty with harm is that the harm from privacy and data security violations may occur long after the violation. If data was leaked, an identity theft might occur years later, and a concrete injury might not materialize until after the statute of limitations has run.

Moreover, it is very difficult to trace a particular identity theft or fraud to any one particular data breach. This is because people’s information might be compromised in multiple breaches and in many different ways.

A big complicating factor is that very few identity theft cases result in much of an investigation, trial, or conviction. The facts never get developed sufficiently to figure out where the thief got the data. For example, in one estimate, fewer than 1 in 700 instances of identity theft result in a conviction.

Why are identity theft cases so neglected? Identity theft can occur outside of the locality where a victim lives, and local police aren’t going to fly to some remote island in the Pacific where the identity thief might be living. Police might be less inclined to go after an identity thief if the thief’s victims are not in the police’s jurisdiction. Cases can take a lot of resources, and police have other crimes they want to focus on more.

Without the thief being caught and fessing up about how he or she got the data, it will likely be very hard to link up identity theft or fraud to any one particular data breach.

The Aggregation Effect

With privacy, the full consequences depend not upon isolated pieces of data but upon the aggregation of data and how it is used. This might occur years in the future, and thus it is hard to measure the harm today.

Suppose at Time 1 you visit a website and it gathers some personal data in violation of its privacy policy. You are upset that it gathered data it shouldn’t have, but nothing bad has happened to you yet. At Time 2, ten years from now, that data that was gathered is combined with a different set of data, and the result of that combination is that you’re denied a loan or placed on the No Fly List. The harm at Time 1 is different from the harm at Time 2. If we know about the use of the data at Time 1, then we could more appropriately assess the harm from the collection of the data. Without this knowledge at Time 1, it is hard to assess the harm.

Harm is Hard to Handle

Privacy harms are cumulative and collective, making them very difficult to pin down and link to any one particular wrongdoer. They are understandably very hard for our existing legal system to handle.

* * * *

Daniel J. Solove is the John Marshall Harlan Research Professor of Law at George Washington University Law School, the founder of TeachPrivacy, a privacy/data security training company, and a Senior Policy Advisor at Hogan Lovells. He is the author of 9 books including Understanding Privacy and more than 50 articles. The author thanks SafeGov for its support. Follow Professor Solove on Twitter @DanielSolove.

The views here are the personal views of Professor Solove and not those of any organization with which he is affiliated.

Image Credit: Pond5

Johann du Toit

Labour Specialist at Henk Kloppers Attorneys

10 yr

The broken window syndrome: it starts with the little innocent things....

CJ Burke

Director at Burke & Associates

10 yr

Daniel: I very much enjoyed your posting, and I agree with you on most of this. I'd like to ask a few questions and add a couple of comments. Please excuse the fragmented sound of this reply -- I'll be responding to your post point by point, and as such it tends not to present a smooth flow of thought.

When you speak of the aggregation of data amplifying its potential effect, you're of course right, especially as the presentation of the data can be contextualized to send a less-than-objective message. And you're correct that a class action suit (for instance) directed against a small company for an inadvertent data leak can indeed give the plaintiffs each $1.50 and bankrupt the company, all over data that may or may not be harmful. And this field is rife with emergent, unintended consequences.

The way HIPAA addresses this (I know you know) is that the damage is not to the individual, but instead to the state. While this doesn't provide recompense to the aggrieved party, it serves as a disincentive to those who aren't up to speed on privacy retention. To me, this seems -- while less than perfect -- the way to go. And in this case, there's no good reason for law enforcement not to proceed, is there? In the earlier days of HIPAA, I saw zero enforcement. Recently, however, I am somewhat heartened -- note the $800,000 US St. Louis finding.

I'd like to see an opt-out system applied, where people can choose not to have information recorded. I don't think it'll work, mind you. But I like the direction. And it's worse at the federal government level. But I fully agree the potential for serious harm is immense.

One last thought -- the movie vignette you mentioned originally came from an urban legend that may or may not be real, dating from the 1970s. As the story goes, banks at the time rounded interest accrual, as they had to -- there was no real tracking of fractions of cents, much less repeating infinite fractions. The idea is that, over the course of time, these would statistically round out. The story tells of a programmer who simply truncated the remaining fractions of cents and applied them to an account -- one he could access. The story goes on to say he was detected by accident, when an audit noticed very weird amounts being deposited into an account daily. Real or not? No one seems to know -- not even Snopes. But it was surely possible.

Gerard Smits

Vice president supply chain technology

10 yr

Daniel, sometimes I think that ownership of personal data should reside with the person in question, with usage by others allowed only with consent -- putting personal data on par with, e.g., the ownership of property. Anyone who uses personal data would then have to keep a log of how and when they obtained it. Not only would this make it much easier for enforcement to check for any violation, it would automatically help with purpose limitation. When usage is not allowed and authorities can check more easily, due care will be taken much more seriously. I know it will not be 100% watertight, and it will create an extra need for logging, but at least the burden of proof would be in the corner where it should be: with the people and organizations that use personal data and for whom personal data has economic value.

Mark Lomas

Security Manager (Consulting) at Capgemini

10 yr

Daniel, I agree with your analysis. In addition, I suggest that part of the problem is that US courts are reluctant to compensate for increased risk. If I increase monitoring or security measures to compensate for a breach, I suspect that many courts would consider that to be voluntary expenditure. I consider it to be a consequential cost. Part of the problem is that even if we agree that there is a cost, the actual amount is subjective. Mark

Alex Hobbs

LOOKING FOR NEW ROLE - Legal IT, Lawtech (Software) & Law Firm Cloud Computing Sales & Marketing Consultant / Director

10 yr

Interesting follow up

