UK court rules on fake consent in gambling ads, Google faces data transfer issues, New York strengthens health data privacy.
Privacy Corner Newsletter: January 30, 2025
By Robert Bateman and Privado.ai
In this edition of the Privacy Corner Newsletter:
Before we begin…
Bridge 2025: A Technical Privacy Summit is happening in just a week!
Tune in alongside thousands of privacy professionals. Join 44 speakers sharing their insights and expertise over two days packed with fast-paced sessions.
Highlight sessions include:
Discover the missing link between privacy and engineering at Bridge, an event designed to help you go from non-prescriptive privacy laws to practical privacy engineering solutions.
The event runs online from Feb 5-6, 2025. See you there!
UK court rules gambling firm breached cookie consent rules
The High Court of England and Wales has ruled that an online gambling company unlawfully used cookies to target a vulnerable person with direct marketing.
What’s the background?
The anonymous claimant, “RTM”, spent thousands of pounds gambling on SBG’s app during the relevant period (2017 and 2018). He alleged that SBG profiled him and targeted him with direct marketing using certain data collected to comply with gambling industry regulations.
SBG was obliged to collect certain data about users’ behavior—such as their total spending, time spent gambling, and other “safer gambling” data—and suppress their accounts once they met a certain threshold.
But unless and until a user met that “suppression threshold”, SBG would also use this “safer gambling” data for marketing purposes.
For example, RTM often gambled in the morning—a tell-tale sign of problematic gambling behavior. SBG used this information to send RTM marketing and special offers in the morning.
Did SBG get consent?
There’s some debate between RTM and SBG about whether the company had obtained consent for these activities.
But the focus of this judgment is whether RTM was, in the circumstances, capable of providing consent.
Based on the two data protection laws in place during the relevant period (the UK Data Protection Act 1998 and the GDPR), the judge provides a three-step test for establishing whether consent meets the required legal standard:
The judge found issues with all three steps of the test and concluded that SBG had used cookies and sent direct marketing unlawfully.
So, do controllers need to understand the mental state of every single data subject?
This judgment has implications for companies relying on consent in the gambling industry and beyond.
However, the case probably doesn’t require controllers to account for every data subject’s potential vulnerabilities before relying on their consent. “Businesses cannot operate… at the level of inquiring into every individual customer's subjective state of mind.”
That said, the judge found that businesses must balance the commercial benefits of data processing with respect for individual autonomy by factoring in “decision points about consent” that optimize people’s decision-making, such as:
It might be difficult for controllers in certain industries to pass the “consent tests” provided in this case. For most businesses, the judgment is a reminder of the importance of:
Data transfers: Google accused of sending personal data from the US to China and Russia
Google faces allegations of violating the Protecting Americans’ Data from Foreign Adversaries Act (PADFAA) by transferring personal data to entities in “foreign adversary countries”.
Hold on… This is about data transfers from the US?
That’s right—since last April, the US has had a (fairly modest) law prohibiting certain transfers of “personally identifiable sensitive data” to “foreign adversary countries”, including Russia, China, North Korea, and Iran.
Unlike Chapter V of the GDPR, the PADFAA is quite limited in scope—it applies to “data brokers”, only covers certain types of data, and is riddled with the sorts of exemptions we’ve come to expect from US privacy laws.
How does Google allegedly violate this law?
The complainants explain how Google’s RTB system collects personal data about US individuals, such as their locations, political views, health, ethnicity, and online behavior.
The complaint references the Interactive Advertising Bureau (IAB)’s Content Taxonomy, implemented by Google, which provides a list of “segments” to which users are assigned based on inferences about their personality and interests, including “defence industry”, “sexual conditions”, and “Judaism”.
The complainants allege that Google has shared such data with businesses such as the Chinese tech firm Tencent and the Russian Demand-Side Platform (DSP) RuTarget, both of which have previously been listed as certified participants in the RTB system.
So is Google still doing this?
While Google has cut ties with some of its partners after becoming aware of their foreign adversary status, the company says non-disclosure agreements prevent it from revealing whether it continues to deal with foreign adversary companies.
According to internal records obtained by the ICCL, Google shares data about US individuals around 31 billion times per day. Due to an alleged lack of control over what happens to that data downstream, the complainants argue that it continues to be accessible to foreign adversaries in violation of the PADFAA.
What happens next?
The complainants have requested that the FTC:
The FTC’s response depends partly on the approach of President Trump’s new pick for FTC Chair, Andrew Ferguson, who has indicated that he might take a less proactive position on privacy enforcement than the outgoing Chair, Lina Khan.
New York passes an extremely far-reaching health privacy law
New York’s State Senate and Assembly have passed S929, the New York Health Information Privacy Act (NYHIPA).
Is this law tougher than Washington’s My Health My Data Act (MHMDA)?
Washington’s MHMDA set the high-water mark for US health privacy laws, but the NYHIPA could cause even more concern for companies that process health-related data.
The most radical part of New York’s new law is how it applies. Under the law, a “regulated entity” means any entity that:
So according to this definition, the NYHIPA could apply to, for example:
It’s not clear that geoblocking New York would help to avoid liability under this law for any company processing “regulated health information”.
But are many companies processing regulated health information?
The law’s definition of “regulated health information” is also very broad, covering any information that is:
The definition explicitly includes:
How does the law protect regulated health information?
First of all, selling regulated health information is outright banned, with the broad “sale” definition US privacy-watchers have become accustomed to.
There are two other conditions under which a regulated entity may process regulated health information:
Obtaining a valid authorization won’t be easy. Among many other conditions, you can’t request a valid authorization within 24 hours of the individual downloading an app or requesting a product or service.
The permitted purposes for which regulated entities can process regulated health information without a valid authorization are limited—and they specifically exclude research and marketing. But they do include providing a product or service requested by the individual.
There’s no private right of action attached to this law, but it includes civil penalties of up to 20% of a company’s New York turnover.
What We’re Reading