TikTok & Reddit Under Scrutiny, AI Transparency in the EU, and US Cracks Down on Location Data

Privacy Corner Newsletter: March 13, 2025

By Robert Bateman and Privado.ai

In this edition of the Privacy Corner Newsletter:

  • The UK ICO announces investigations into three tech companies it suspects of violating children’s data protection rules.
  • The CJEU rules on providing ‘meaningful information’ when making automated decisions under the GDPR.
  • California is conducting an “investigative sweep” into the sharing of location data.
  • What we’re reading: Recommended privacy content for the week.


ICO investigates TikTok, Imgur, and Reddit over kids’ privacy allegations


The UK data protection authority (DPA), the Information Commissioner’s Office (ICO), has announced investigations into three tech companies over their handling of children’s personal data.

  • TikTok faces an investigation into how it “uses 13–17-year-olds' personal information to make recommendations to them.”
  • The probes into social media platform Reddit and image hosting website Imgur concern how those companies “assess the age of their child UK users.”
  • In a keynote at the International Association of Privacy Professionals (IAPP) conference in London on Wednesday, the UK Information Commissioner said the investigations should “serve as a warning shot” to smaller organizations.


How much do we know about these investigations?


In a press release issued last week, the ICO says:

  • Its investigation into TikTok concerns content recommendations among 13–17-year-old users
  • Its investigations into Reddit and Imgur concern the companies’ “age assurance” measures

In Wednesday’s speech at the IAPP Data Protection Intensive conference, Information Commissioner John Edwards asked:

“How do these platforms protect children’s personal information? How do their recommender systems work, how are they using a 13 year old’s preferences, viewing habits, shares and likes to serve them content, and keep them on the platform? And, in the case of Reddit and Imgur, how do they assess the age of their users and tailor their content accordingly?”


What will happen as a result of these investigations?

Edwards did not reveal whether the ICO was pursuing formal enforcement against the three companies. However, he said other companies processing children’s data should take the investigative sweep as “a sign to get your own house in order.”

In 2023, the regulator issued a £12.7 million ($16.5 million) fine against TikTok. That investigation was initiated by the UK’s previous Commissioner, who proposed a higher fine of £27 million ($35 million).

Last year, the ICO concluded an investigation into Snapchat’s “My AI” chatbot without taking formal action after the company presented the regulator with five successive data protection impact assessments (DPIAs).

In its recent press release, the ICO referenced five companies—X, Sendit, BeReal, Dailymotion, and Viber—who had reportedly made or committed to improvements around how they handle children’s data.

“My message is simple,” says a statement from Information Commissioner John Edwards on the ICO’s website. “If social media and video sharing platforms want to benefit from operating in the UK they must comply with data protection law.”

But balancing data protection and online safety obligations is far from simple.

Together with guidance on the Online Safety Act from the UK’s media regulator, Ofcom, the ICO’s new investigations might help UK businesses better understand whether and how they should be assessing their users’ ages.


CJEU rules on ‘meaningful information’ when making automated decisions

The Court of Justice of the European Union (CJEU) has published a judgment explaining the extent of controllers’ transparency obligations when conducting “automated decision-making” under the GDPR.

  • In Case C-203/22 Dun & Bradstreet (D&B) Austria, the data subject was rejected for a phone contract despite her ostensibly strong credit score. She alleged that credit checking firm D&B failed to provide “meaningful information” about the logic involved in the decision.
  • The court ruled that controllers must provide clear and intelligible information to help data subjects understand the link between their personal data, any decision-making algorithms, and the automated decision in question.
  • The CJEU also found that where providing such information would reveal a trade secret or violate the rights of a third party, it must be submitted to a court or Data Protection Authority (DPA) to balance the rights of the parties concerned.


What’s the background?

The data subject (“CK”) applied for a mobile phone contract but was rejected based on an automated creditworthiness assessment by D&B.

CK requested information from D&B under Article 15(1)(h) GDPR, which entitles data subjects to “meaningful information about the logic involved” in automated decision-making that falls under Article 22 GDPR (including credit checks).

However, the information D&B provided appeared to show that CK had good credit. When ordered to provide additional information, D&B refused, claiming that doing so would expose its trade secrets.

The case ended up at the CJEU, with the referring court asking for an interpretation of various issues related to Article 15(1)(h) GDPR.


What is ‘meaningful information about the logic involved’?

The CJEU determined that, in the context of Article 15(1)(h) GDPR, “meaningful information” must:

  • Explain the procedure and principles actually used in automated decision-making about the data subject, such as profiling for creditworthiness
  • Be provided in a concise, transparent, intelligible, and easily accessible form
  • Enable the data subject to verify whether the automated decision is accurate

Meaningful information does not consist of complex mathematical formulas or detailed descriptions of every step of the automated process.


What if the information includes personal data about third parties?

If the information cannot be provided without infringing on the rights of other persons, the controller must provide the information to a DPA or court.

The DPA or court must then balance the rights and interests at stake and decide what information can be disclosed to the data subject.


What if the information includes trade secrets?

If trade secrets are involved, then—again—the controller must submit the information to a DPA or court.

This body will independently assess whether a trade secret genuinely exists and, if so, what information the data subject is entitled to.

National laws that automatically exclude the right of access when trade secrets are involved are not compatible with the GDPR. EU law requires an individual balancing of interests in each case.


California Attorney General announces location data enforcement sweep

California Attorney General (AG) Rob Bonta has announced an “investigative sweep” into the processing of precise location data in violation of the California Consumer Privacy Act (CCPA).

  • AG Bonta says his office has written to “advertising networks, mobile app providers, and data brokers” suspected to be illegally processing location data.
  • The CCPA recognizes precise location data as a type of “sensitive personal information”.
  • CCPA-covered businesses must facilitate California residents’ right to “limit the use and disclosure” of their precise location and other sensitive personal information.


What does the CCPA say about location data?

Under the CCPA, location data is “sensitive personal information” when it is accurate to within a radius of 1,850 feet.
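As an illustrative sketch (not legal advice), a data inventory tool might flag location fields whose reported accuracy falls within the CCPA’s precision threshold. The constant and function names here are assumptions for this example:

```python
# Illustrative sketch only: flags location readings that fall within the
# CCPA's "precise geolocation" accuracy threshold (a 1,850-foot radius).
# Names and threshold handling are assumptions for this example.

CCPA_PRECISE_RADIUS_FEET = 1850


def is_precise_geolocation(accuracy_radius_feet: float) -> bool:
    """Return True if a reading is accurate to within the CCPA radius,
    and therefore likely qualifies as 'sensitive personal information'."""
    return accuracy_radius_feet <= CCPA_PRECISE_RADIUS_FEET


# A GPS fix is typically accurate to ~16 feet; IP-based geolocation is
# often only city-level (tens of thousands of feet), so only the former
# would be flagged as precise.
print(is_precise_geolocation(16))      # True
print(is_precise_geolocation(50000))   # False
```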

One of the CCPA’s more complicated (and, as yet, untested) provisions is the “right to limit the use and disclosure of sensitive personal information”.

If a consumer exercises their “right to limit”, a business may only use the consumer’s sensitive personal information for one of eight purposes, set out in the California Privacy Protection Agency’s (CPPA) regulations.


Here’s a summarized version of each of the purposes:

  1. To perform services or provide goods expected by an average consumer, such as using geolocation for navigation apps but not for gaming apps.
  2. To prevent, detect, and investigate security incidents affecting stored or transmitted personal information, such as sharing login data with security firms investigating breaches.
  3. To resist malicious, deceptive, fraudulent, or illegal actions and prosecute offenders, including investigating discrimination claims using ethnicity or message contents.
  4. To ensure physical safety, such as sharing geolocation data with law enforcement in cases of suspected kidnapping.
  5. For short-term, transient use, like non-personalized advertising that does not involve profiling or data disclosure to third parties.
  6. To perform services on behalf of the business, including maintaining accounts, processing transactions, verifying customer information, and providing analytics.
  7. To verify or maintain the quality, safety, or performance of products or services, such as using driver’s license data to test recognition software in car rentals.
  8. To collect or process sensitive personal information without inferring characteristics about the consumer, such as using search queries for health-related articles without profiling users.

If a business uses precise location data—or any other sort of sensitive personal information—for purposes other than those listed above, the business must establish a way for consumers to exercise their “right to limit”.

Note that the list does not include targeted advertising, selling information to data brokers, and other commercial activities, so businesses must stop using precise location data for such purposes if requested.




What We’re Reading
