Did the LabMD Case Weaken the FTC’s Approach to Data Security?
Co-Authored by Prof. Woodrow Hartzog

On Wednesday, the U.S. Court of Appeals for the 11th Circuit issued its long-awaited decision in LabMD’s challenge to an FTC enforcement action: LabMD, Inc. v. Federal Trade Commission (11th Cir. June 6, 2018). While there is some concern that the opinion will undermine the FTC’s power to enforce Section 5 for privacy and security issues, the opinion actually is quite narrow and is far from crippling. 

While the LabMD opinion likely does have important implications for how the FTC will go about enforcing reasonable data security requirements, we think the opinion still allows the FTC to continue to build a coherent body of privacy and security complaints in an incremental way, similar to how the common law develops. See Solove and Hartzog, The FTC and the New Common Law of Privacy, 114 Columbia Law Review 583 (2014).

Ironically, though, the opinion seems to incentivize the kind of FTC data security micromanaging that the 11th Circuit sought to avoid. If so, we might be left with a more porous and weaker approach to data security. Here is our reading of the case and its implications. But first, some background.

The FTC’s Authority to Regulate Privacy and Data Security

The FTC regulates privacy and data security primarily through the authority it receives from Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices in or affecting commerce.”

In many data security cases, the FTC has alleged that inadequate data security is deceptive because it contradicts promises made in privacy policies that companies will protect people's data with "good," "adequate," or "reasonable" security measures. And in a number of cases, the FTC has charged that inadequate data security is unfair because it creates actual or likely harm that consumers cannot reasonably avoid and that is not outweighed by countervailing benefits.

The LabMD Case

In LabMD, the FTC alleged that the company “failed to reasonably protect the security of consumers’ personal data, including medical information.” One major practice the FTC found problematic was the exposure of consumer information on a peer-to-peer (P2P) file-sharing network.

The LabMD case began as a typical complaint for unfair data security practices. But unlike the vast majority of cases, this one has resulted in a refusal to settle notwithstanding the probable shuttering of the business, a controversial commissioner recusal, a defamation lawsuit brought by a third-party cyber intelligence company against LabMD, a House Oversight Committee investigation into the FTC’s actions, and an entire book written by LabMD’s CEO chronicling his view of the conflict.

LabMD’s CEO, Michael Daugherty, has accused the FTC of acting as a bully. His book about his case with the FTC is called The Devil Inside the Beltway, and it clocks in at nearly 500 pages. Daugherty’s book reads like a novel, chronicling his dealings with the FTC and a number of prominent privacy attorneys, among others. He doesn’t hide his strong feelings. Daugherty views FTC officials as “professional bullies” that are “subjective, devious, and manipulative.” (p. 447).

There are even more rings in the circus, as there is an ongoing feud between LabMD and a company called Tiversa, a feud that affects some of the facts in the case upon which the FTC complaint is based. 

There are numerous cases being litigated here, and this matter is starting to seem like it belongs in a Charles Dickens novel.

The 11th Circuit’s Opinion

The 11th Circuit held that “the Commission’s cease and desist order, founded upon LabMD’s general negligent failure to act,” was unenforceable because it did not specifically list the data security protections that LabMD needed to implement to comply with the order. The court saw the FTC as holding LabMD to an “indeterminate standard of reasonableness.”

The court explicitly sidestepped the issue of the FTC’s power to use Section 5 unfairness to enforce privacy and security violations. Indeed, the court noted: “Because Congress thought impossible the task of legislating a comprehensive list of unfair acts or practices, it authorized the Commission to establish unfair acts or practices through case-by-case litigation.” This case-by-case approach is our focus in The FTC and the New Common Law of Privacy. Although most cases have resulted in settlements, the FTC complaints and settlement orders have been functioning as a kind of precedent with effects similar to those of common law cases. These complaints are not binding precedent, of course. But they do follow a coherent, consistent approach. The FTC’s data security complaints are grounded in a reasonable adherence to industry standards.

The 11th Circuit opinion focuses on the particulars of the order the FTC sought against LabMD, not on the underlying theory of unfairness or on the use of negligence as a standard to find unfairness. The court states: "We will assume arguendo that the Commission is correct and that LabMD’s negligent failure to design and maintain a reasonable data-security program invaded consumers’ right of privacy and thus constituted an unfair act or practice. The second question LabMD’s petition for review presents is whether the Commission’s cease and desist order, founded upon LabMD’s general negligent failure to act, is enforceable. We answer this question in the negative."

By sidestepping the first question, the 11th Circuit has seemingly avoided a split with the 3rd Circuit in FTC v. Wyndham Worldwide Corp.

On the second question, the court holds: “In sum, the prohibitions contained in cease and desist orders and injunctions must be specific. Otherwise, they may be unenforceable. Both coercive orders are also governed by the same standard of specificity, as the stakes involved for a violation are the same—severe penalties or sanctions.” The court notes: “In the case at hand, the cease and desist order contains no prohibitions. It does not instruct LabMD to stop committing a specific act or practice. Rather, it commands LabMD to overhaul and replace its data-security program to meet an indeterminable standard of reasonableness.” 

Implications of the LabMD Opinion

1. On its face, the opinion could have been worse for the FTC.

This was not a strong case for the FTC. The FTC had many opportunities to drop it. There was little precedential value in this action against LabMD, the facts were under a cloud of dispute, the record was a mess, and the case was a wild circus. Strategically, the smart thing for the FTC to have done would have been to drop the case in light of the disputed facts and messy background. There was little to gain by pursuing the LabMD case but a lot to lose.

This opinion could have been far worse for the FTC. The FTC will now have to list specific measures in its cease and desist orders. What remains unclear is how this opinion will affect the FTC’s consent decrees reached in settlements with companies (the vast majority of cases), as these are agreed to by the companies subject to them, so these companies would not be challenging them in the way that LabMD did. Perhaps more companies will fight the FTC rather than settle.   

2. The court overlooks the history and context of the FTC’s jurisprudence regarding privacy and security. 

The court overstates how “indeterminate” the order was. Certainly, the order, standing alone, is vague on its face. But there’s an entire body of FTC jurisprudence and longstanding practices that one can look to for guidance about what the order requires. As we have noted before in our analysis of all the FTC data security cases, the FTC doesn’t just invent its view of data security out of nowhere. Instead, it looks to the most widely used standards, such as NIST 800-53 and ISO 27001. The Gramm-Leach-Bliley Safeguards Rule follows a similar approach.

Moreover, as we noted in our article, The FTC and the New Common Law of Privacy, there is a considerable body of case documents (complaints and consent decrees) as well as reports and other guidance documents that provide a lot of information about how the FTC views data security. 

The FTC’s jurisprudence is no less indeterminate than many other bodies of law. And the FTC order isn’t just an isolated document. There is a context and a lot of background. It is widely available and far easier to digest than the law in most other areas. The court didn’t seem to consider the background, history, and nuance of established data security practices and standards and the FTC’s coherence with them. 

3. The court’s holding ironically incentivizes FTC micromanaging and checklist-like security rules.

The irony is that the FTC’s vagueness here gives companies a lot more leeway in defining what their next steps should be. The court is afraid of the FTC “micromanaging” (p. 30) companies, yet very general and vague requirements involve much less micromanaging than a list of specific measures that companies must undertake.

Industry seems to want both flexibility and specificity in its data security rules. But companies can’t have it both ways. When specific laws like COPPA place detailed restrictions on companies, they often complain they are being micromanaged. They bemoan the inefficiencies created by rules that can be both over- and under-inclusive. They say they need flexibility to respond to contextual threats that are difficult to anticipate ahead of time.

The same has been said of data security. Many in industry don’t want a government agency to force them to use specific measures – they want flexibility to use the measures they deem best, especially in an area of fast-paced change. The minute the ink dries on a specific list of measures, new developments can make some of these measures obsolete. 

The 11th Circuit seems to be asking for singular points of failure and a checklist of data security duties. But that’s not how good data security rules work. Companies seeking to protect data need the flexibility to take context-sensitive actions which, alone, are not a huge deal but collectively are very consequential. In some contexts, specific actions like encrypting data, requiring complex password protocols, and segmenting networks will matter more than in others. It depends upon the relevant threat models, which are constantly changing. The more specificity is required ex ante, the more hamstrung companies are in responding to change and context.

Through reasonableness, the FTC uses industry norms as the measure of a company’s security. But this decision would seem to go in the other direction, channeling the FTC to reduce data security to a specific list of no-no’s instead of giving the company flexibility to respond to context-dependent threats. It is far from clear that a checklist approach is better than a more generalized reasonableness approach. And, it is far from clear that industry benefits from the FTC taking such an approach. 

4. How limited is the FTC after this? 

This opinion can be read narrowly, as just requiring more specific measures in a cease and desist order. Read narrowly, the opinion doesn’t even restrict what the FTC can require in a voluntary settlement agreement. If that’s the case, only a few FTC cases would be affected. The opinion says very little about the FTC’s general power to enforce Section 5 unfairness.

But if the opinion is interpreted more broadly to limit the FTC’s reasonableness approach to data security, the agency might be tempted to retreat and start pulling back on its enforcement. We think that this would be a big mistake. Doing so would cede leadership to the states and to the EU. It would jeopardize the EU/US Privacy Shield, which relies upon the FTC holding companies accountable for sound privacy and data security practices.

Instead, we hope the FTC will see this case as a lesson learned. The FTC’s choice to pursue a cease and desist order against LabMD seems ill-advised. Even if the FTC had won, the case would not have established anything new. The facts are in dispute. FTC actions that don’t have a high return for the risk are best avoided anyway. Perhaps even the specificity demanded by the 11th Circuit can still be flexible enough to avoid harmful micromanagement. Perhaps we’ll finally see some movement on Capitol Hill giving the FTC explicit data security rulemaking power. The court’s decision might ultimately only affect the FTC’s approach at the margins, but if it results in less enforcement and a restrictive, rote, and context-blind approach to data security rules, then we will all be worse off.

Daniel J. Solove is the John Marshall Harlan Research Professor of Law at George Washington University Law School and the founder of TeachPrivacy, a privacy awareness and security training company. He is the author of 10 books and more than 50 articles. 

Woodrow Hartzog is a Professor of Law and Computer Science at Northeastern University’s School of Law and College of Computer and Information Science. He is the author of Privacy’s Blueprint: The Battle to Control the Design of New Technologies, published in 2018 by Harvard University Press. 
