For the personal AI on Apple's iPhone and Microsoft's Copilot+ PC, you are an open book. But who is reading along?
Marc Hijink
Both iPhones and Windows PCs are getting built-in AI that delves into personal data: emails, browsing history, photos, locations, and app messages. These new, compact AI models work locally, but there are serious privacy risks. No wonder Microsoft is recalling its all-seeing 'Recall' feature.
Look, it just fits in your pocket. That's the message behind Apple's slogan 'AI for the rest of us.' To build artificial intelligence for 'everyone', the tech company is introducing a portable variant of large AI models like ChatGPT, which generate texts, images, and other digital media and automate computer tasks.
The world holds its breath to see how much office work can be replaced by AI. For people with a real job, that is still a distant concern. But now Apple, at the same time as Microsoft, is making AI standard in its operating systems – and it is starting to meddle with our personal lives.
The AI models that companies like OpenAI, Google, and Meta are developing are bulky, far from flawless, and sometimes sloppy with data. Apple steers clear of that. The company designed more compact AI models that work locally, on a phone or a computer. Such a specialized model can do less, but it also makes fewer mistakes. And it works much faster than sending commands to a data center and waiting for an answer. Other AI gadgets, without local inference, are terribly slow.
Supercycle for iPhones
Apple's AI for everyone doesn't apply to every wallet. You need the most expensive iPhone to use the new features, or a Mac with a relatively recent chip (M1 or faster). So it's a way to sell new devices – Apple investors are hoping for a 'supercycle,' a year in which many more users than usual upgrade their device.
What does 'Apple Intelligence' offer? Siri, the aging Apple assistant, is getting a much-needed makeover. And besides common tricks like text suggestions, image editing, translations, and summaries, Apple wants to automate the daily hassle on a phone. The AI model digs into mail and chat messages, searches for forgotten concert tickets and urgent appointments, and calculates the shortest route to the proverbial school performance of your (grand)children.
Apple loves such real-life examples. For instance: if you need to pick up your mother from the airport, your iPhone will figure out when her flight arrives by analyzing her email in your inbox and comparing the details with an up-to-date list of arrival times.
That will save you at least fifteen seconds of your life. But it only needs to go wrong once – 'where were you?' – and you'll never trust automatic iPhone advice, or your family, again.
Phones are full of private photos, health data, financial information, and intimate conversations with family members and colleagues. According to Apple, you can safely let your iPhone analyze that data because the AI models run on the device itself and nothing is shared with the outside world.
However, the Apple model doesn't work entirely locally. For tasks that are too complex for the iPhone, a helpline is called in: a cloud AI service that does the heavy thinking remotely and is shielded from prying eyes.
Even Apple doesn't know what happens on those servers. Says Apple. But even if it is technically sound, others will still try to get their hands on that very personal data. Law enforcement agencies can already demand access to cloud storage services.
Generative AI is a legal minefield for Apple. The company doesn't want any trouble with deepfakes and therefore doesn't let you generate photorealistic images. The AI model is trained on data from which swear words have been removed, so Siri won't swear. For general questions it can't handle itself, Siri calls in the help of ChatGPT. Apple clearly states that Siri is not responsible for the content. Whatever ChatGPT hallucinates, Apple washes its hands of it.
Personal computers are getting really personal
Earlier this month, Microsoft showed its own variant of the local AI models, specifically for Windows. Copilot+ is the name of the new personal computers that are powerful enough to handle (smaller) AI calculations themselves.
How small is small? The entry-level version of Microsoft's own Phi-3 model, for example, has fewer than four billion parameters – the number of variables in the model. By comparison, GPT-4 is said to contain over 1,700 billion parameters.
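To get a feel for what such a compact model looks like in practice, the sketch below (an illustration, not Apple's or Microsoft's code) loads a small, openly released model entirely on your own machine and counts its parameters. It assumes the Hugging Face transformers library and the microsoft/Phi-3-mini-4k-instruct checkpoint, a download of several gigabytes.

```python
# Minimal sketch: load a compact model locally and count its parameters.
# Assumes `pip install transformers torch` and enough RAM for a ~4B-parameter model;
# older transformers versions may additionally need trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # the "entry-level" Phi-3
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Every trainable weight is one "parameter" (variable) of the model.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e9:.1f} billion parameters")  # roughly 3.8 billion

# The prompt never leaves the device: generation runs on local hardware.
prompt = "Summarize in one sentence: the flight from Lisbon lands at 14:35 at gate D7."
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```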
In the new Windows 11 version, you can generate texts and images and translate live with AI. Basically everything you can do with ChatGPT or DALL-E, but faster. Microsoft keeps the door open for other AI models – Apple does that too.
To process as many tasks locally as possible, Microsoft sets steep hardware requirements: 16 gigabytes of RAM and a chip with a dedicated AI unit, the so-called NPU or neural processing unit. These new Windows laptops – now also running on ARM processors – can compete with Apple's MacBooks on speed and battery life.
The PC industry hopes that the new Windows features are reason enough to buy a new device. That goes against the trend: until now you could keep using your computer longer, because most software runs in data centers. Now you have to pay attention to computing power again, expressed in 'TOPS' – trillions of operations per second.
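How do those TOPS relate to a small AI model? A rough back-of-envelope calculation, under two stated assumptions – the 40 TOPS floor Microsoft announced for Copilot+ PCs and roughly two operations per model parameter for each generated token – looks like this:

```python
# Back-of-envelope arithmetic, not a benchmark.
# Assumptions: an NPU rated at 40 TOPS (Microsoft's stated Copilot+ floor)
# and ~2 operations per model parameter for each generated token.
npu_tops = 40               # trillions of operations per second
params = 3.8e9              # a Phi-3-mini-sized model
ops_per_token = 2 * params  # rough cost of one forward pass

ceiling_tokens_per_sec = npu_tops * 1e12 / ops_per_token
print(f"theoretical ceiling: ~{ceiling_tokens_per_sec:,.0f} tokens per second")

# In practice memory bandwidth, not raw TOPS, is usually the bottleneck,
# so real throughput sits far below this ceiling.
```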
The chip market will be busy: the supercycle of PCs and phones follows the supercycle of Nvidia. Due to these AI chips, and all the chips needed to feed the AI models with data, chip machine maker ASML expects explosive growth and has to double its footprint in the Dutch province of Brabant.
The all-seeing eye
Tech companies are so enthusiastic about the new AI features that they quickly overlook the drawbacks. For example, Microsoft choked on 'Recall.' It is an all-seeing eye that takes a screenshot every five seconds of whatever you are doing on your computer. That data is stored and analyzed for three months.
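How hard is it to build such an all-seeing eye? Not very. The sketch below is purely illustrative – it is not Microsoft's implementation – and assumes the third-party mss screenshot library; it simply saves a full-screen capture to a local folder every five seconds.

```python
# Illustrative only: capture the screen every five seconds to a local folder.
# Requires `pip install mss`; this is NOT how Recall works internally, just the idea.
import time
from datetime import datetime
from pathlib import Path

import mss

snapshot_dir = Path("recall_demo")
snapshot_dir.mkdir(exist_ok=True)

with mss.mss() as screen:
    while True:
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        # Saves a full screenshot of the primary monitor as a PNG file.
        screen.shot(output=str(snapshot_dir / f"{stamp}.png"))
        time.sleep(5)
```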
Microsoft gives an example of an application: you glanced at a nice recipe for a goat cheese pizza earlier in the day but then totally forgot where you saw it. To prevent culinary chaos, Recall comes to the rescue and finds the recipe immediately. Just make sure you don't put any glue in it.
Security experts discovered that Recall in the Windows test version is poorly secured. A potential privacy drama, considering that Recall can also capture passwords, trade secrets, sleazy web visits, or secure chat messages. It's a goldmine for spies, criminals and law enforcement agencies.
Initially, Microsoft failed to see the problem, because Recall is stored locally, on the PC itself. But the discomfort kept growing, even before the first Windows Copilot+ laptop hit the store. According to the British regulator ICO, companies like Microsoft must first weigh any privacy risks before releasing a product. That is why Microsoft decided to leave Recall out of the first Copilot+ PCs, which hit the shelves next week.
Although these new AI tricks seem fun, the most important feature is the ability to turn them off. But don't forget to pick up your mom from the airport.
This article by Marc Hijink was published in Dutch on June 13th, 2024, and translated by AI. Read more episodes of 'Achter de schermen / Behind the scenes' at NRC.