UK court rules on fake consent in gambling ads, Google faces data transfer issues, New York strengthens health data privacy.

Privacy Corner Newsletter: January 30, 2025

By Robert Bateman and Privado.ai

In this edition of the Privacy Corner Newsletter:

  • A UK court ruling suggests that controllers should factor in data subjects’ potential vulnerabilities before relying on “consent”
  • Google is accused of violating US law by transferring personal data to China and Russia
  • New York passes an incredibly strict health privacy law
  • What we’re reading: Recommended privacy content for the week




Before we begin…

Bridge 2025: A Technical Privacy Summit is happening in just a week!

Tune in alongside thousands of privacy professionals. Join 44 speakers sharing their insights and expertise over two days packed with fast-paced sessions.

Highlight sessions include:

  • Keynote: The Future of Privacy Technology → From Trust to Evidence
  • Adtech Privacy Risks: Increasing Regulation and Enforcement
  • How Privacy Engineers Deliver ROI
  • The Evolution of the DPO: Reducing Risk at the Speed of Innovation
  • From the Trenches: Solving AI Governance Challenges

Discover the missing link between privacy and engineering at Bridge, an event designed to help you go from non-prescriptive privacy laws to practical privacy engineering solutions.

The event runs online from Feb 5-6, 2025. See you there!




UK court rules gambling firm breached cookie consent rules

The High Court of England and Wales has ruled that an online gambling company unlawfully used cookies to target a vulnerable person with direct marketing.

  • In RTM v Bonne Terre Ltd, a gambling addict sued Sky Betting and Gaming (SBG) over allegations that the company targeted him with marketing based on information about his addiction.
  • The court found that while SBG ostensibly obtained consent for both cookies and direct marketing, the consent did not meet the standard required under data protection law—partly because of the claimant’s vulnerabilities.
  • The case suggests that companies should consider whether their users are subjectively capable of providing freely given consent before relying on this legal basis for processing.


What’s the background?

The anonymous claimant, “RTM”, spent thousands of pounds gambling on SBG’s app during the relevant period (2017 and 2018). He alleged that SBG profiled him and targeted him with direct marketing using certain data collected to comply with gambling industry regulations.

SBG was obliged to collect certain data about users’ behavior—such as their total spending, time spent gambling, and other “safer gambling” data—and suppress their accounts once they met a certain threshold.

But unless and until a user met that “suppression threshold”, SBG would also use this “safer gambling” data for marketing purposes.

For example, RTM often gambled in the morning—a tell-tale sign of problematic gambling behavior. SBG used this information to send RTM marketing and special offers in the morning.


Did SBG get consent?

There’s some debate between RTM and SBG about whether the company had obtained consent for these activities.

But the focus of this judgment is whether RTM was, in the circumstances, capable of providing consent.

Based on the two data protection laws in force during the relevant period (the UK Data Protection Act 1998 and, from May 2018, the GDPR), the judge provides a three-step test for establishing whether consent meets the required legal standard:

  1. An individual's “subjective consent” (in this case, whether RTM’s ability to consent was impaired by his gambling addiction)
  2. The “quality of autonomy” in any decisions they made about consent (whether RTM was given the necessary notices and mechanisms to freely opt into the processing), and
  3. The controller’s “evidential basis” for relying on consent (whether SBG could show that RTM gave consent)

The judge found issues with all three steps of the test and concluded that SBG had used cookies and sent direct marketing unlawfully.


So, do controllers need to understand the mental state of every single data subject?

This judgment has implications for companies relying on consent in the gambling industry and beyond.

However, the case probably doesn’t require controllers to account for every data subject’s potential vulnerabilities before relying on their consent. As the judge put it: “Businesses cannot operate… at the level of inquiring into every individual customer's subjective state of mind.”

That said, the judge found that businesses must balance the commercial benefits of data processing with respect for individual autonomy by factoring in “decision points about consent” that optimize people’s decision-making, such as:

  • Providing “good quality, accessible, relevant and accurate information… to guide the decision-making processes”
  • Taking steps to “focus individuals' minds soberly and separately” on the consent request
  • Avoiding “distracting” people with the “attractions” that might come with consent (such as special offers)

It might be difficult for controllers in certain industries to pass the “consent tests” provided in this case. For most businesses, the judgment is a reminder of the importance of:

  • Providing data subjects with all the information necessary to make an informed decision
  • Clearly documenting consents
  • Avoiding dark patterns in consent requests


Data transfers: Google accused of sending personal data from the US to China and Russia

Google faces allegations of violating the Protecting Americans’ Data from Foreign Adversaries Act (PADFAA) by allegedly transferring personal data to entities in “foreign adversary countries”.

  • The complaint was lodged with the Federal Trade Commission (FTC) by nonprofits the Electronic Privacy Information Center (EPIC) and Irish Council for Civil Liberties (ICCL) Enforce.
  • The groups allege that Google exposes personal data about people in the US, including sensitive information and their “employment with the military and intelligence community”.
  • The complaint focuses on Google’s Real-Time Bidding (RTB) process, which governs how personal data is shared for advertising purposes.


Hold on… This is about data transfers from the US?

That’s right—since last April, the US has had a (fairly modest) law prohibiting certain transfers of “personally identifiable sensitive data” to “foreign adversary countries”, including Russia, China, North Korea, and Iran.

Unlike Chapter V of the GDPR, the PADFAA is quite limited in scope—it applies to “data brokers”, only covers certain types of data, and is riddled with the sorts of exemptions we’ve come to expect from US privacy laws.


How does Google allegedly violate this law?

The complainants explain how Google’s RTB system collects personal data about US individuals, such as their locations, political views, health, ethnicity, and online behavior.

The complaint references the Interactive Advertising Bureau (IAB)’s Content Taxonomy, implemented by Google, which provides a list of “segments” to which users are assigned based on inferences about their personality and interests, including “defence industry”, “sexual conditions”, and “Judaism”.

The complainants allege that Google has shared such data with businesses such as the Chinese tech firm Tencent and the Russian Demand-Side Platform (DSP) RuTarget, both of which have previously been listed as certified participants in the RTB system.


So is Google still doing this?

While Google has cut ties with some of its partners after becoming aware of their foreign adversary status, the company says non-disclosure agreements prevent it from revealing whether it continues to deal with foreign adversary companies.

According to internal records obtained by the ICCL, Google shares data about US individuals around 31 billion times per day. Due to an alleged lack of control over what happens to that data downstream, the complainants argue that it continues to be accessible to foreign adversaries in violation of the PADFAA.


What happens next?

The complainants have requested that the FTC:

  • Launch an investigation into Google’s RTB practices, focusing on whether they violate PADFAA or the FTC Act.
  • Order Google to stop any unlawful collection and disclosure of sensitive data about US individuals.
  • Require Google to adopt a “data minimization, protection, and deletion” program under FTC supervision.
  • Require Google to prove that it complies with the PADFAA.

The FTC’s response depends partly on the approach of President Trump’s new pick for FTC Chair, Andrew Ferguson, who has indicated that he might take a less proactive position on privacy enforcement than the outgoing Chair, Lina Khan.


New York passes an extremely far-reaching health privacy law

New York’s State Senate and Assembly have passed S929, the New York Health Information Privacy Act (NYHIPA).

  • The NYHIPA has a very wide territorial scope, applying to any entity, regardless of its location, that processes “regulated health information” about a New York resident.
  • The law also includes a broad definition of “regulated health information” and a restrictive list of permitted processing activities.
  • The NYHIPA takes effect on January 22, 2026.


Is this law tougher than Washington’s My Health My Data Act (MHMDA)?

Washington’s MHMDA set the high-water mark for US health privacy laws, but the NYHIPA could cause even more concern for companies that process health-related data.

The most radical part of New York’s new law is how it applies. Under the law, a “regulated entity” means any entity that:

  • Controls the processing of regulated health information of an individual who is a New York resident,
  • Controls the processing of regulated health information of an individual who is physically present in New York while that individual is in New York, or
  • Is located in New York and controls the processing of regulated health information.

So according to this definition, the NYHIPA could apply to, for example:

  1. A French health app intended for the EU market with one New York resident among its users (whether the user is in New York or not).
  2. A UK health app with one British user who uses the app while on vacation in New York.
  3. A New York-based healthcare app whose users live exclusively in Australia.

It’s not clear that geoblocking New York would help to avoid liability under this law for any company processing “regulated health information”.


But are many companies processing regulated health information?

The law’s definition of “regulated health information” is also very broad, covering any information that is:

  • Reasonably linkable to an individual, or a device, and
  • Collected or processed in connection with the physical or mental health of an individual.

The definition explicitly includes:

  • Location or payment information that relates to an individual's physical or mental health, or
  • Any inference drawn or derived about an individual's physical or mental health that is reasonably linkable to an individual, or a device.


How does the law protect regulated health information?

First of all, selling regulated health information is outright banned, with the broad “sale” definition US privacy-watchers have become accustomed to.

There are two other conditions under which a regulated entity may process regulated health information:

  • It has obtained a “valid authorization” from the individual, or
  • Its activities fall under the law’s limited set of permitted purposes.

Obtaining a valid authorization won’t be easy. Among many other conditions, you can’t request a valid authorization within 24 hours of the individual downloading an app or requesting a product or service.

The permitted purposes for which regulated entities can process regulated health information without a valid authorization are limited—and they specifically exclude research and marketing. But they do include providing a product or service requested by the individual.

There’s no private right of action attached to this law, but it includes civil penalties of up to 20% of a company’s New York turnover.


What We’re Reading
