Hey Siri, Pay Me $95 Million

Sometimes it's a discussion of a drug deal, sometimes a conversation about a surgical procedure, sometimes a casual restaurant recommendation. What do these all have in common? Surprisingly, they are all real-life private exchanges that may have been overheard by contractors working for Apple - listening in on conversations accidentally recorded by Siri.

On a couple of occasions when delivering training webinars, I have cited Apple as an example of a company that has used privacy as a unique selling proposition to improve its market position and potentially gain market share - a way of showing that privacy isn't only a cost centre, and can be a profit centre for organizations if implemented properly.

I sometimes included photos of the billboard Apple famously erected during the Consumer Electronics Show (CES) in Las Vegas proclaiming, “What happens on your iPhone, stays on your iPhone,” and referenced the TV advertisements the company has made focusing on the privacy benefits of its devices.

Yet a Guardian exposé in 2019 (yes, we need to start in 2019 to get the full context of this story - stay with me, please) suggested that some of what happens on or around your iPhone might, in fact, end up in a contractor’s headphones.

Back then, a whistleblower disclosed to The Guardian that Apple employed external agencies to review snippets of Siri recordings for “quality control.” Apple contended that these audio clips were anonymized and used only to improve the voice assistant’s accuracy. However, the whistleblower revealed that Siri was easily triggered by routine noises and gestures - the sound of a zipper, say, or simply raising one’s wrist - and so captured far more than users intended. These “snippets” frequently included medical consultations, confidential business calls, and even intimate moments. And despite the anonymization claims, the whistleblower noted that sensitive metadata (such as location or contact details) sometimes accompanied the audio.

In short, it quickly became evident that Apple’s public statements about user privacy were being challenged by the realities of how Siri actually functioned - completely accidentally, of course.

Fast-forward to January 2025, and Apple has now agreed to a proposed $95 million settlement in a class action lawsuit.

Court filings from the case - Lopez et al. v. Apple Inc. (Case No. 5:19-cv-04577) - offer a glimpse into how Apple’s vaunted privacy image wound up at odds with allegations of secret recordings. The class action complaint, filed in the U.S. District Court for the Northern District of California, was led by a mother also acting as a representative for her minor child, A.L. It alleged that Apple’s Siri-enabled devices - ranging from iPhones and iPads to Apple Watches - often activated themselves unintentionally, capturing private moments that were subsequently analyzed by human reviewers, and cited the California Invasion of Privacy Act and the California Consumers Legal Remedies Act.

According to the complaint, many of these “unlawful and intentional” recordings involved conversations that were never preceded by the “Hey Siri” wake phrase. In one section, the plaintiffs highlighted the special vulnerability of minors, who neither purchased the devices nor consented to being recorded (and because of their ages, any purported consent would be invalid anyway).

Interestingly, Apple’s public image as a privacy champion was also a key part of the case against the company. The complaint highlighted how the company repeatedly assured consumers—and even testified before Congress—that Siri would only listen after a clear wake command. The plaintiffs argued that if users had known about these hidden recordings, they might have opted out of purchasing Siri-enabled products.

The remedies sought included $5,000 in damages per violation and the deletion of all unauthorized recordings, along with stronger privacy safeguards and the cessation of all unauthorized eavesdropping - particularly when children are involved.

Ultimately, the settlement document now filed as the terms of the parties’ agreement doesn’t include any admission of wrongdoing by Apple, but the company has agreed to pay $95 million to resolve the case nevertheless.

Who Can Claim? - Any U.S.-based individual who owned or used a Siri-enabled device between September 17, 2014, and December 31, 2024, is potentially eligible for a share of the settlement. Payments are expected to be around $20 per device, for up to five devices per person, so if you’re an Apple device collector, you just may have hit paydirt.

As an aside - the question of who actually makes money from lawsuits like this over privacy (and, more generally, tech compliance) failures, assuming the allegations are true, is a very interesting one. If you're interested in reading more about that, let me know and I will write an article on it.

Going Forward: Siri and ChatGPT Integration

Around the same time as the lawsuit settlement, Apple introduced new AI features under the banner of “Apple Intelligence,” a suite of capabilities that leverage advanced machine learning partly on-device and partly in the cloud. With iOS 18.2, Siri now integrates with ChatGPT, allowing for more advanced, context-aware responses.

But here’s the thing - even discounting the concerns about privacy when it was just Siri processing personal data, combining Siri’s voice recognition with ChatGPT’s generative AI introduces fresh concerns about what data is shared and where it’s stored. For the first time, Apple has gone outside its own “walled garden,” partnering with OpenAI to power certain Siri requests via ChatGPT.

Here are some of the most important questions you’ll need to consider when next you activate Siri/ChatGPT on your shiny new phone:

Third-Party Data Processing

Because ChatGPT is run by OpenAI, users’ Siri requests (including attachments, photos, or documents) may be sent outside Apple’s ecosystem. This possibility is especially concerning if those recordings pick up the voices of other people who are unaware or have not consented to being recorded. If a user speaks to Siri in a shared space (e.g., during a family gathering, in a meeting, or around children), the AI could capture additional voices or personal details that neither Apple nor OpenAI originally intended to collect.
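To make that concrete, here's a minimal sketch in Swift of the per-request consent gate this kind of architecture implies. The types and names are entirely hypothetical (this is not Apple's actual API); the point is who gets asked: only the device owner, never the bystanders whose voices may be in the audio.

```swift
import Foundation

// Hypothetical types - a sketch of a per-request consent gate,
// not Apple's actual API.
enum RequestHandler {
    case onDevice             // processed locally
    case appleCloud           // Apple's own servers
    case thirdParty(String)   // e.g. "OpenAI" - leaves Apple's ecosystem
}

struct VoiceRequest {
    let transcript: String
    let attachments: [Data]   // photos, documents, etc.
}

/// Returns true if the request may proceed. Nothing is forwarded to a
/// third party without explicit approval for this specific request.
func route(_ request: VoiceRequest,
           to handler: RequestHandler,
           userApproved: (String) -> Bool) -> Bool {
    switch handler {
    case .onDevice, .appleCloud:
        return true // stays inside the first party's boundary
    case .thirdParty(let vendor):
        // Only the device owner sees this prompt - bystanders whose
        // voices may be in the recording are never asked.
        return userApproved("Send this request to \(vendor)?")
    }
}

// Usage: a request that would leave Apple's ecosystem.
let request = VoiceRequest(transcript: "Summarize this contract", attachments: [])
let allowed = route(request, to: .thirdParty("OpenAI")) { prompt in
    print(prompt)   // in a real system, a confirmation dialog
    return true     // the owner consents; the bystander cannot
}
```

Even a well-designed gate like this collects consent from only one of the people in the room, which is precisely the gap described above.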

Location Data

Apple asserts it obscures specific IP addresses when transmitting data to OpenAI, but it still provides a general location. In many contexts—especially in family settings where children are present—this location data becomes more sensitive. Even approximate geographical details could reveal patterns about users’ daily routines or about minors’ whereabouts. If Siri triggers itself unintentionally or processes multiple requests in the background, the accumulation of location markers could become a silent repository of personal information.
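To give a rough sense of what “general location” means, here is a small illustrative sketch - my own, in Swift; Apple’s actual mechanism works on IP addresses rather than coordinates. Coarsening trades precision for privacy, but as the sketch notes, a log of even coarse points can still trace a routine.

```swift
import Foundation

struct Coordinate { let latitude: Double; let longitude: Double }

/// Coarsen a precise coordinate by rounding. One decimal place of
/// latitude is roughly 11 km - city-level rather than street-level.
func coarsen(_ point: Coordinate, decimals: Int = 1) -> Coordinate {
    let factor = pow(10.0, Double(decimals))
    return Coordinate(
        latitude: (point.latitude * factor).rounded() / factor,
        longitude: (point.longitude * factor).rounded() / factor
    )
}

let home = Coordinate(latitude: 37.33182, longitude: -122.03118)
let general = coarsen(home)
print(general.latitude, general.longitude)   // 37.3 -122.0
// One coarse point says little; a timestamped log of them, one per
// Siri request, can still sketch someone's daily routine.
```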

Storing User Requests

Apple states that OpenAI is forbidden from using Siri requests for training its models - unless required by law - but verifying compliance with that restriction in real time poses challenges. If a user logs in with a ChatGPT account, OpenAI’s own privacy policies take effect. That raises questions about exactly how user data is stored and whether it might eventually be “de-identified” and fed into larger AI training sets. Users have limited transparency or control over what happens once the data leaves Apple’s direct oversight. This murkiness intensifies if recorded voices belong to third parties or minors who never explicitly granted permission.

Voice Recording and Advanced Analysis

Siri’s evolution toward “visual intelligence” means it could interpret and analyze images, documents, or other media—potentially in tandem with ChatGPT’s generative capabilities. This expansion of data types complicates the question of consent. For instance, if someone uses Siri to scan or describe a sensitive medical document, how does Apple ensure that OpenAI’s models do not extract or store identifiable information from those images? Similarly, if a bystander appears in a video or a child’s voice is in the background, are there safeguards to prevent that data from being retained or cross-referenced with other datasets?
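One mitigation pattern worth knowing about here is client-side minimization: scrubbing obvious identifiers before anything leaves the device. The toy Swift sketch below is my own illustration (nothing in Apple's or OpenAI's public documentation says they do this), and it also shows why the problem is hard: simple patterns catch only the most obvious identifiers.

```swift
import Foundation

/// A toy redaction pass: scrub obvious identifiers from extracted
/// document text before it is sent to any external model. Real PII
/// detection is far harder; this only illustrates the principle of
/// minimizing data before it leaves the device.
func redact(_ text: String) -> String {
    let patterns = [
        "[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}", // email addresses
        "\\b\\d{3}-\\d{2}-\\d{4}\\b",                       // US SSN format
        "\\b\\d{10,16}\\b"                                  // long digit runs
    ]
    var result = text
    for pattern in patterns {
        result = result.replacingOccurrences(
            of: pattern, with: "[REDACTED]", options: .regularExpression)
    }
    return result
}

print(redact("Patient John, SSN 123-45-6789, reach me at jo@example.com"))
// Patient John, SSN [REDACTED], reach me at [REDACTED]
```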

In the end, the tension between convenience and confidentiality is at the heart of every AI-driven feature. As we weigh the benefits of voice assistants that can recommend a restaurant, schedule a doctor’s appointment, or even parse a complex legal brief, it’s worth asking to what extent those conveniences outweigh the risks.

So, what do you think? Are the benefits of a hyper-personalized AI assistant worth the potential privacy risks? Or does the possibility of your personal life appearing in a contractor’s snippet feed outweigh the convenience of hands-free assistance? Do you still trust that “what happens on your iPhone, stays on your iPhone”?

Sources:

https://www.courtlistener.com/docket/16027233/lopez-v-apple-inc/

https://www.theguardian.com/technology/2019/jul/26/apple-contractors-regularly-hear-confidential-details-on-siri-recordings

https://www.reuters.com/legal/apple-pay-95-million-settle-siri-privacy-lawsuit-2025-01-02/

P.S. I've been extremely busy prepping for my solicitor qualification exams, hence the hiatus here. I'll be more regular going forward, so please do share this with others who'll find it interesting so they can subscribe to get future issues directly. Also, let me know if there are topics you'd like to see covered.

Oluwatomisin Okunola Amb.

Legal Analyst|Content Writer|Climate Action Enthusiast|Digital Marketer

1mo

Brilliantly written. I believe that technology cannot be fully controlled. Moreover, these tech companies are always breaking one rule or another in a bid to improve these technologies. We just need to be keen, especially when signing Ts & Cs.

Oluwamurewa Jubee

Corporate law|| Litigation|| Blockchain||Fintech||web3

1mo

This article is brilliantly written and also reminds me of John Grisham's novels, particularly "The King of Torts," in which Clay Carter suddenly becomes rich after winning a class action suit against a pharmaceutical company. With the advent of machine learning, deep learning, and artificial intelligence, I hope we do not sacrifice data privacy on the altar of technological breakthroughs.

Abdulhaleem Salisu Shehu

Compliance and Monitoring | Startups | AI Governance | Intellectual Property | Islamic Finance |

1mo

I enjoyed reading this.

Bibitayo Emmanuel Ojo

Data Protection Analyst | Regulatory Compliance Specialist | AI Ethics

1mo

Your last paragraph suggests risk management at an individual level. However, I've come to realise that risk rating by individuals is subjective. AI assistants like Siri, despite posing a threat to privacy, would still be tolerated by a lot of people who may not really see anything awkward in the embedded exploitation. While most people do not care about privacy, I hope I can also have a share of the settlement fees.
