Generative AI and GDPR, Fines for Location Data Sharing
Privacy Corner Newsletter: May 3, 2024
By Robert Bateman and Privado.ai
In this week’s Privacy Corner Newsletter:
Does noyb’s complaint against OpenAI misunderstand generative AI?
Max Schrems’ campaign group, noyb, has submitted a complaint against OpenAI, alleging that the generative AI application ChatGPT violates the GDPR.
Does noyb expect OpenAI to fix AI hallucinations overnight?
No. noyb appears to understand that certain generative AI applications "hallucinate" inaccurate personal data, and likely prompted ChatGPT expecting it to generate false information.
Essentially, ChatGPT generates text by predicting which strings of characters are most likely to follow one another, based on patterns in its corpus of training data. Given imperfect information, inaccurate outputs are inevitable.
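As a toy illustration of why such models produce plausible-but-unverified outputs, consider a minimal bigram model that picks each next word in proportion to how often it followed the previous word in a (hypothetical, invented) training corpus. This is a deliberately simplified sketch of next-token prediction, not how ChatGPT actually works internally, and the corpus text is made up for the example:

```python
import random

# Hypothetical toy corpus -- the birth details here are invented for
# illustration and do not refer to any real person.
corpus = ("the complainant was born in 1970 . "
          "the complainant was born in vienna .").split()

# Count how often each word follows each other word (bigram counts).
counts = {}
for prev, nxt in zip(corpus, corpus[1:]):
    counts.setdefault(prev, {})
    counts[prev][nxt] = counts[prev].get(nxt, 0) + 1

def next_word(prev):
    """Sample the next word in proportion to how often it followed `prev`."""
    options = counts[prev]
    words = list(options)
    weights = [options[w] for w in words]
    return random.choices(words, weights=weights)[0]

# After "born in", the model emits "1970" or "vienna" -- statistically
# plausible continuations of the pattern, not verified facts about anyone.
print(next_word("in"))
```

The point of the sketch: the model has no notion of which continuation is *true*, only which is *probable*, which is why a confidently stated date of birth can be entirely wrong.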
The complainant (possibly Max Schrems himself) asked ChatGPT for his date of birth. ChatGPT confidently asserted that the complainant’s date of birth was a particular day. On this occasion, its output was inaccurate.
The question is whether this aspect of OpenAI’s activities violates the GDPR.
So does ChatGPT violate the GDPR?
Noyb thinks so, and similar allegations have been made in Italy, Poland, and some other EU countries. OpenAI denies the allegations and has made several changes to ChatGPT since its launch in an attempt to bring the platform into GDPR compliance.
After seeing the incorrect date of birth in ChatGPT's output, noyb's complainant submitted access and erasure requests to OpenAI.
Under the right of access, the complainant requested:
In response to the access request, OpenAI provided the complainant’s account details but not the information requested above.
Under the “right to erasure,” the complainant asked OpenAI to erase “the data subject’s incorrect date of birth from the results displayed by ChatGPT.” OpenAI reportedly replied that “there is no way to prevent its systems from displaying the data subject’s inaccurate date of birth in the output…”
As such, noyb alleges that OpenAI has:
Note that there is no allegation that OpenAI has violated the right to erasure. But noyb requests that the Austrian DPA order OpenAI to rectify or erase the complainant’s inaccurate date of birth in order to comply with the GDPR’s “accuracy” principle.
The complaint acknowledges that OpenAI might be technically unable to meet this requirement, but states that this would not excuse the company from complying with the GDPR’s principles.
What happens next?
OpenAI recently set up an establishment in Ireland. The normal process would be for the Austrian DPA to forward the complaint to the Irish DPA under the GDPR’s “One Stop Shop” process.
However, noyb has requested that the Austrian DPA handle this complaint directly.
The complaint alleges that OpenAI’s Californian entity effectively acts as the controller regarding the processing of the relevant personal data via ChatGPT, and so the One Stop Shop supposedly does not apply.
This argument might be as important as the allegations about ChatGPT itself. Many cross-border complaints, particularly those involving Ireland (the vast majority), have languished for several years under the One Stop Shop process.
If noyb successfully argues that OpenAI cannot benefit from the One Stop Shop process, this could be a significant victory against other US tech companies.
FCC does an FTC, fines wireless carriers $200m for ‘illegally sharing access to customers’ location data’
The Federal Communications Commission (FCC) has issued almost $200 million in penalties to four of the largest wireless carriers in the US for allegedly sharing location data without appropriate notice or consent.
What did these carriers allegedly do wrong?
These four carriers allegedly shared their customers’ location data with third parties without consent.
For example, AT&T ran a “Location-Based Services program” until March 2019. The company allegedly sold location data to two “location information aggregators” (data brokers) who then resold access to the data to third parties, including other data brokers.
AT&T put contracts in place with the two data aggregators with whom it shared location data directly. However, the FCC says audits revealed serious data security issues among these companies (which AT&T said were resolved).
Is that illegal?
Under Section 222 of the Communications Act, wireless carriers:
Here’s how the law defines some of those terms:
The FCC found that the information these carriers shared was not “aggregate” and that people hadn’t consented to its sharing.
How much was each company fined?
Here are the total fines issued against each company:
Note that Sprint and T-Mobile have merged since the investigations concluded.
This all sounds familiar…
Indeed, the FTC took related enforcement action against data aggregators InMarket and X-Mode Social earlier this year.
The FTC’s investigations took aim at a different stage in the data supply chain than the FCC’s, but they show that the FCC is not the only authority concerned about the sale of location data and other sensitive information.
The FCC’s Privacy and Data Protection Task Force makes this sentiment clear, calling privacy and data protection “a top priority” and “crucial to all Americans”.
So if you’re operating in the US—and even if you’re not regulated by the increasingly aggressive FTC—think carefully about how you use personal data.
FTC finalizes updated Health Breach Notification Rule: It’s definitely not just about security incidents anymore
The FTC has finalized amendments to the Health Breach Notification Rule (HBNR).
Why the new rule?
Here’s the background:
Two of the five commissioners opposed the new rule, arguing that it is broader than Congress intended. Nonetheless, the rule has been finalized and will take effect 60 days after its publication in the Federal Register on April 26, i.e., on June 25.
So what’s new?
Here’s a summary of the changes under the revised HBNR:
Here’s the gist: Companies in the health space (and not covered by HIPAA) should now have little doubt that the HBNR covers health apps and the non-consensual sharing of health information via APIs and similar technologies.
Along with the FTC’s enforcement against health companies under the FTC Act and new health privacy laws in Washington, Nevada, and elsewhere, the revised HBNR is another example of how drastically the US privacy landscape is changing—in healthcare and practically every other area.
What We’re Reading