Apple is taking its slice of the generative AI pie as well. Apple Intelligence, unveiled today at WWDC 2024, is largely powered by on-device generative models of different sizes, but even the power of the A17 Pro chip isn't always enough to handle every one of your queries.
Sometimes Apple will have to go to the cloud. Not just any cloud, of course, but its own Private Cloud Compute, where your data is protected in ways that Apple says it might not be in other cloud-based generative AI systems.
In a deep-dive session after WWDC 2024, Apple Senior Vice President of Software Engineering Craig Federighi and Apple Senior Vice President of Machine Learning and AI Strategy John Giannandrea explained exactly how Apple Intelligence and the systems it supports, like Siri, will work: when your queries stay on the device, when they go to Apple's Private Cloud Compute, and how Apple Intelligence decides what to share with that cloud.
“We're in the early stages here,” Federighi said as he walked through the AI journey, the challenges Apple faced, how it solved them, and the path forward.
What Apple is doing here is no small feat, and you could argue that Apple dug the hole it now sits in. Apple Intelligence is essentially a series of generative AI models of different sizes that look deep inside your iPhone to learn about you. That knowledge means they can help you in ways that other LLMs and generative AI models probably can't. It's like how a partner or parent can reassure you because they know everything about you, while a stranger can only guess at what might comfort you, and will very likely guess wrong. Knowing you and all the data on your phone is Apple Intelligence's superpower, and also its potential weakness, especially when it comes to privacy.
Federighi explained that Apple created a two-part solution to mitigate this problem and avoid a disaster.
First, on-device intelligence decides which parts of your data are crucial to getting the right answer. It then sends only that data (encrypted and anonymized) to Private Cloud Compute.
The second part of the solution is how the cloud itself is built and how it handles data. It runs on efficient Apple Silicon but has no permanent storage. Security researchers have access to the server software, but not to user data, so they can perform privacy audits. The iPhone will refuse to send those bits of data to any server that has not been publicly verified. Federighi compared the scheme to the keys and tokens used on cryptocurrency servers.
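The two-part flow described above, where on-device logic picks only the data a query needs and refuses to talk to an unverified server, can be sketched roughly as follows. This is an illustrative model only, not Apple's actual protocol; every name here is hypothetical, and the real system relies on hardware attestation and cryptographic transparency logs rather than a simple allowlist.

```python
# Illustrative sketch (hypothetical names) of the Private Cloud Compute
# request flow described in the article.

VERIFIED_SERVER_IMAGES = {"pcc-build-2024.1"}  # stand-in for publicly audited builds

def select_relevant_data(query, device_data):
    """Part one: on-device intelligence picks only the fields the
    query actually needs, rather than uploading everything."""
    return {k: v for k, v in device_data.items() if k in query["needs"]}

def send_to_private_cloud(query, device_data, server_image):
    # Part two: refuse to talk to a server whose software image
    # has not been publicly verified.
    if server_image not in VERIFIED_SERVER_IMAGES:
        raise PermissionError("server image not publicly verified")
    payload = select_relevant_data(query, device_data)
    # (In the real system the payload would also be encrypted and
    # anonymized before leaving the device.)
    return {"sent_fields": sorted(payload), "image": server_image}

result = send_to_private_cloud(
    {"needs": {"calendar"}},
    {"calendar": "...", "photos": "...", "messages": "..."},
    "pcc-build-2024.1",
)
print(result["sent_fields"])  # only the calendar field leaves the device
```

The key design point is that filtering happens before anything leaves the device, and server verification happens before anything is sent at all.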
“No one, not Apple or anyone else, would have access to your data,” Federighi added.
To be clear, your device data is at the center of what Apple is doing with Apple Intelligence and the new Siri. It's a “rich understanding of what's on your device,” and that knowledge base “will only get richer over time,” Giannandrea said.
We also got a sense of how Siri's semantic index, which can analyze data from across the phone, including metadata in photos and videos, is boosted when combined with Apple Intelligence models. All of this helps the system understand what you mean, Federighi said.
Apple has been working on the semantic index for years. “So it's really a story of us building over many, many years toward a really powerful capability in the device.”
The pair also clarified which models you'll be using, and when. It turns out the on-device models, unless you explicitly call out to, say, ChatGPT, are all Apple's own.
“It's important to re-emphasize that Apple Intelligence and the experiences we talk about are built on models made by Apple,” Federighi added.
As is often the case, Apple trained these models on data. Some of it comes from the public web (drawing on Apple's ongoing web-search work), though Giannandrea said publishers can opt out of having their data included. Apple also licensed news archive data and even used some internal data for its diffusion model.
The duo also confirmed that Apple Intelligence will only work on iPhones with the A17 Pro chip. By way of explanation, Giannandrea said that “the core fundamental models require an enormous amount of computing.” Federighi added that the latest A17 Pro neural engine is “twice as powerful as the previous generation” and has an advanced architecture to support Apple's AI. All of which is likely a big disappointment for iPhone 15 (A16 Bionic) and iPhone 14 (Pro and standard) owners.
As for how Apple Intelligence will work with third-party models, Federighi noted that some of them offer expertise you may not find in Apple's own models, such as answering the question: “What can I do with these ingredients?” Federighi then added something that could inadvertently cast the OpenAI platform in an unwanted light: “Even the hallucinations are useful; you end up with a strange meal.”