TikTok & Reddit Under Scrutiny, AI Transparency in the EU, and US Cracks Down on Location Data
Privacy Corner Newsletter: March 13, 2025
By Robert Bateman and Privado.ai
In this edition of the Privacy Corner Newsletter:
ICO investigates TikTok, Imgur, and Reddit over kids’ privacy allegations
The UK data protection authority (DPA), the Information Commissioner’s Office (ICO), has announced investigations into three tech companies over their handling of children’s personal data.
? How much do we know about these investigations?
The ICO announced the investigations in a press release issued last week.
Speaking on Wednesday at the IAPP Data Protection Intensive conference, Information Commissioner John Edwards asked:
“How do these platforms protect children’s personal information? How do their recommender systems work, how are they using a 13 year old’s preferences, viewing habits, shares and likes to serve them content, and keep them on the platform? And, in the case of Reddit and Imgur, how do they assess the age of their users and tailor their content accordingly?”
? What will happen as a result of these investigations?
Edwards did not reveal whether the ICO was pursuing formal enforcement against the three companies. However, he said other companies processing children’s data should take the investigative sweep as “a sign to get your own house in order.”
In 2023, the regulator issued a £12.7 million ($16.5 million) fine against TikTok. That investigation was initiated by the UK’s previous Commissioner, who proposed a higher fine of £27 million ($35 million).
Last year, the ICO concluded an investigation into Snapchat’s “My AI” chatbot without taking formal action after the company presented the regulator with five successive data protection impact assessments (DPIAs).
In its recent press release, the ICO referenced five companies—X, Sendit, BeReal, Dailymotion, and Viber—who had reportedly made or committed to improvements around how they handle children’s data.
“My message is simple,” says a statement from Information Commissioner John Edwards on the ICO’s website. “If social media and video sharing platforms want to benefit from operating in the UK they must comply with data protection law.”
But balancing data protection and online safety obligations is far from simple.
Together with guidance on the Online Safety Act from the UK’s media regulator, Ofcom, the ICO’s new investigations might help UK businesses better understand whether and how they should be assessing their users’ ages.
CJEU rules on ‘meaningful information’ when making automated decisions
The Court of Justice of the European Union (CJEU) has published a judgment explaining the extent of controllers’ transparency obligations when conducting “automated decision-making” under the GDPR.
? What’s the background?
The data subject (“CK”) applied for a mobile phone contract but was rejected based on an automated creditworthiness assessment by Dun & Bradstreet Austria (“D&B”).
CK requested information from D&B under Article 15(1)(h) GDPR, which entitles data subjects to “meaningful information about the logic involved” in automated decision-making that falls under Article 22 GDPR (including credit checks).
However, the information D&B provided appeared to show that CK had good credit. When ordered to provide additional information, D&B refused, claiming that doing so would expose its trade secrets.
The case ended up at the CJEU, with the referring court asking for an interpretation of various issues related to Article 15(1)(h) GDPR.
? What is ‘meaningful information about the logic involved’?
The CJEU determined that, in the context of Article 15(1)(h) GDPR, “meaningful information” does not mean complex mathematical formulas or detailed descriptions of every step of the automated process.
? What if the information includes personal data about third parties?
If the information cannot be provided without infringing on the rights of other persons, the controller must provide the information to a DPA or court.
The DPA or court must then balance the rights and interests at stake and decide what information can be disclosed to the data subject.
? What if the information includes trade secrets?
If trade secrets are involved, then—again—the controller must submit the information to a DPA or court.
This body will independently assess whether a trade secret genuinely exists and, if so, what information the data subject is entitled to.
National laws that automatically exclude the right of access when trade secrets are involved are not compatible with the GDPR. EU law requires an individual balancing of interests in each case.
California Attorney General announces location data enforcement sweep
California Attorney General (AG) Rob Bonta has announced an “investigative sweep” into the processing of precise location data in violation of the California Consumer Privacy Act (CCPA).
? What does the CCPA say about location data?
Under the CCPA, location data is “sensitive personal information” when it’s accurate within a radius of 1,850 feet.
One of the CCPA’s more complicated (and, as yet, untested) provisions is the “right to limit the use and disclosure of sensitive personal information”.
If a consumer exercises their “right to limit”, a business may only use the consumer’s sensitive personal information for one of eight purposes, set out in the California Privacy Protection Agency’s (CPPA) regulations.
Here’s a summarized version of each of the purposes:
If a business uses precise location data—or any other sort of sensitive personal information—for purposes other than those listed above, the business must establish a way for consumers to exercise their “right to limit”.
Note that the list does not include targeted advertising, selling information to data brokers, or other commercial activities, so businesses must stop using precise location data for such purposes upon request.
What We’re Reading