Apple Intelligence is here! (nearly)
Craig Barraclough
Operational focused CTO | AI Thought Leader | Financial Services | Technology Services | Strategic Business Leadership | Technology and Operations | Risk Committee Member | Scuba Diver
At their annual developer conference (WWDC) yesterday, Apple unveiled lots of new features in the upcoming iOS 18 (see more here), but the main focus was on Apple Intelligence, their new AI system, which is expected to be integrated into the majority of its product set.
The AI features will be powered by Apple's own technology but also, critically, by tools provided by OpenAI – the makers of ChatGPT. In fact, ChatGPT itself will be integrated into Apple devices later this year.
Even before the announcement there were loud calls of concern from security experts worried about data privacy and how user data may be handed over to third parties, like OpenAI, without us knowing it.
Elon Musk even tweeted (X’d) last night: “If Apple integrates OpenAI at the OS level, then Apple devices will be banned at my companies. That is an unacceptable security violation... And visitors will have to check their Apple devices at the door, where they will be stored in a Faraday cage.”
If you listen to the rhetoric, it seems a strange play by Apple, a company that is notoriously security-conscious.
So, what’s the problem?
Modern AI requires a lot of computing power that even the latest smartphones can't handle on their own. This means that for certain tasks (it’s not yet clear which) your data will be sent to remote servers in the cloud for processing. This is where the privacy concerns come in, as your personal information could be exposed to other companies or hackers. In this case the worry is OpenAI, who don’t have a great track record on safety and security.
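To make that concrete, here is a minimal sketch in Swift of the kind of routing decision being described, using entirely made-up types and thresholds rather than Apple's actual logic: lightweight tasks stay on-device, heavier ones are sent off for cloud processing.

```swift
import Foundation

// Hypothetical types, not Apple's real API: small tasks run locally,
// heavier ones are sent to a remote server for processing.

enum ExecutionTarget {
    case onDevice
    case privateCloud
}

struct AITask {
    let name: String
    let estimatedMemoryMB: Int   // rough, made-up proxy for how heavy the task is
}

/// Decide where to run a task against an (illustrative) on-device memory budget.
func route(_ task: AITask, onDeviceBudgetMB: Int = 2_000) -> ExecutionTarget {
    task.estimatedMemoryMB <= onDeviceBudgetMB ? .onDevice : .privateCloud
}

let summarise = AITask(name: "Summarise email thread", estimatedMemoryMB: 800)
let generate  = AITask(name: "Generate long-form reply", estimatedMemoryMB: 6_000)

print(route(summarise))  // onDevice
print(route(generate))   // privateCloud
```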
But if you dig into this further it is clear some very smart people at Apple have been thinking about this problem.
Apple's answer is Private Cloud Compute - a way for your iPhone to securely outsource complex AI tasks to Apple's own data centres. Your data never leaves Apple's tightly controlled environment, and it's processed in a secure, encrypted form that even Apple can't see.
How It Works
When your iPhone needs extra power for an AI task, it connects to special servers in Apple's data centres. These servers use advanced security measures like secure boot, hardware encryption keys, and "trusted execution environments" to ensure your data is isolated and protected.
Your iPhone sends the task to multiple servers simultaneously and one of them takes it on. That server handles the task in an isolated, secure environment without being able to access or view the actual data, and the results are then sent back to your iPhone.
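As a rough illustration of that flow, here is a short Swift sketch. Every name in it (the node type, the attestation flag, the dispatch function) is hypothetical and is only meant to capture the idea of encrypting the request on-device and letting only an attested, isolated server pick it up. It is not Apple's real protocol or API.

```swift
import Foundation
import CryptoKit

// Conceptual sketch with hypothetical types: the request is encrypted
// on-device, offered to several candidate nodes, and only a node that
// passes an attestation check gets to handle it.

struct ComputeNode {
    let id: UUID
    let attestationValid: Bool   // stand-in for checking secure boot / trusted execution measurements
}

struct EncryptedRequest {
    let ciphertext: Data
}

/// Encrypt the task payload before anything leaves the phone.
func encrypt(_ plaintext: Data, with key: SymmetricKey) throws -> EncryptedRequest {
    let sealed = try AES.GCM.seal(plaintext, using: key)
    // `combined` is non-nil for the default 12-byte nonce used here.
    return EncryptedRequest(ciphertext: sealed.combined!)
}

/// Offer the request to several candidate nodes; only an attested one may take it.
func dispatch(_ request: EncryptedRequest, to candidates: [ComputeNode]) -> ComputeNode? {
    candidates.shuffled().first { $0.attestationValid }
}

let nodes = [ComputeNode(id: UUID(), attestationValid: true),
             ComputeNode(id: UUID(), attestationValid: false)]

do {
    let key = SymmetricKey(size: .bits256)
    let request = try encrypt(Data("Summarise my notifications".utf8), with: key)
    if let node = dispatch(request, to: nodes) {
        print("Request (\(request.ciphertext.count) bytes) handled by attested node \(node.id)")
    }
} catch {
    print("Encryption failed: \(error)")
}
```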
Apple has also implemented measures to prevent tracking, such as hiding your IP address and rotating which server handles your requests.
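The anti-tracking side can be sketched in the same spirit, again with made-up names, purely to illustrate routing through a relay that hides the device's IP address and rotating which node handles each request.

```swift
import Foundation

// Hypothetical sketch of the anti-tracking measures mentioned above:
// requests go out through a relay so the compute node never sees the
// device's address, and the handling node is rotated per request.

struct OutboundRequest {
    let payload: Data
    let sourceAddress: String?   // nil = the node cannot see who sent it
}

struct Relay {
    /// Strip the client's address before forwarding, so the node only
    /// ever sees the relay as the source.
    func forward(_ payload: Data) -> OutboundRequest {
        OutboundRequest(payload: payload, sourceAddress: nil)
    }
}

struct NodePool {
    let nodes: [String]
    /// Rotate which node handles each request rather than pinning a user
    /// to one server, which would make traffic easier to correlate.
    func nextNode() -> String { nodes.randomElement()! }
}

let relay = Relay()
let pool = NodePool(nodes: ["node-a", "node-b", "node-c"])

let request = relay.forward(Data("encrypted payload".utf8))
print("Sending via \(pool.nextNode()); source visible to node: \(request.sourceAddress ?? "none")")
```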
OpenAI links
In terms of OpenAI, based on the information shared so far, it seems Apple has taken steps to protect user privacy when integrating ChatGPT into Apple Intelligence, but there are still some potential concerns. Apple has stated that it will obscure users' IP addresses before sending information to OpenAI, and that OpenAI "won't store requests" coming from Apple Intelligence.
This suggests that individual user data will not be directly shared with or stored by OpenAI. However, even without IP addresses or identifying individual users, the anonymised and aggregated data from Apple Intelligence requests could still be useful for OpenAI to train its models.
So while individual user data may not be shared, OpenAI could potentially benefit from the collective data. Additionally, Apple mentioned that users will have the option to log in with their ChatGPT account when using the integration. In that case, users would no longer be anonymised by Apple and would be subject to OpenAI's privacy policy. This could potentially allow OpenAI to access more user data.
Overall, while Apple has emphasised privacy and stated it won't directly send user data to OpenAI, there are still potential avenues for OpenAI to indirectly benefit from or access some user data through the Apple Intelligence integration, especially if users opt to log in with ChatGPT accounts.
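To put that distinction in code form, here is a hypothetical sketch of the two paths. Nothing in it is Apple's or OpenAI's actual interface; it simply illustrates that a request with no linked ChatGPT account carries no user identifier, while a signed-in request does and falls under OpenAI's own privacy policy.

```swift
import Foundation

// Purely illustrative: with no ChatGPT account linked, the request is
// forwarded with the IP obscured and no user identifier; with an account
// linked, an account token travels with the request.

struct ChatGPTRequest {
    let prompt: String
    let obscuredIP: Bool
    let accountToken: String?   // nil when the user has not signed in to ChatGPT
}

func buildRequest(prompt: String, linkedAccountToken: String?) -> ChatGPTRequest {
    if let token = linkedAccountToken {
        // Signed-in path: the request is tied to the user's ChatGPT account.
        return ChatGPTRequest(prompt: prompt, obscuredIP: true, accountToken: token)
    }
    // Default path: no account identifier travels with the request.
    return ChatGPTRequest(prompt: prompt, obscuredIP: true, accountToken: nil)
}

let anonymous = buildRequest(prompt: "Draft a birthday message", linkedAccountToken: nil)
let signedIn  = buildRequest(prompt: "Draft a birthday message", linkedAccountToken: "user-token-123")

print("Anonymous request carries account token:", anonymous.accountToken != nil)  // false
print("Signed-in request carries account token:", signedIn.accountToken != nil)   // true
```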
Concerns and trade-offs
Some security experts worry that, despite Apple's efforts, this system could still be vulnerable to hardware or software flaws that might expose user data. There are also concerns that once this capability exists, Apple might be tempted to use it more broadly, potentially reducing user privacy over time.
However, others argue that for most users, the alternative is often much worse - sending personal data to companies with far fewer privacy protections than Apple, i.e. OpenAI.
As best I can tell, Apple has no explicit plans to tell you when your data is going off-device to Private Cloud Compute, and this won’t be something you can opt in or out of. In fact, you won’t necessarily know when it’s happening. It will just happen.
Ultimately, as with a lot of tech, it's a trade-off between convenience and advanced capabilities enabled by cloud processing, versus the inherent risks of taking data off-device.
Apple believes Private Cloud Compute strikes a reasonable balance, but only time will tell if users agree.
We'll all be too busy creating Genmojis to care!