Unpacking Apple Intelligence: Apple’s vision for personalized AI

Tanay Jaipuria
Author
Apple has announced a welcome new entrant to the AI space. With technological innovation and a continued focus on user privacy, Apple Intelligence is a leap forward for personal AI agents.

Apple made waves at WWDC24 last month with the unveiling of Apple Intelligence, an OS-level agents platform. At its core, Apple Intelligence combines powerful generative models with the user's personal context, all while maintaining Apple's stringent privacy standards.

In this post, I’ll discuss the key technical components of Apple Intelligence and offer some analysis of what Apple’s new AI platform will mean for agent adoption in the future.

Source: CNET

What is Apple Intelligence?

At its heart, Apple Intelligence marks a significant upgrade to Siri, Apple’s flagship digital assistant, adding new capabilities and making it more contextually aware. While the integration with ChatGPT was showcased prominently in the WWDC demo, some observers mistakenly concluded that Apple had merely announced a ChatGPT integration at the operating system level.

But the truth is, Apple Intelligence is more sophisticated than that. You can think of Apple Intelligence as a system that can perform Retrieval Augmented Generation (RAG) across all the data the user has on their device, route it to the appropriate model and then take actions in the right application on the user’s behalf.

The platform has several key components to make that happen.

Models: On-device models that are fine-tuned for use cases like writing text, generating images, prioritizing notifications and taking actions across applications. Larger models on Apple's servers are also optimized for these same tasks and functionalities.

Orchestrator: The ability to determine how to best handle a user request, whether that be through on-device models, Apple’s server models or ChatGPT. In the ChatGPT case, the user must approve the action first.
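Apple hasn't published how the orchestrator decides, but conceptually the routing step might look something like the sketch below. All names, fields and thresholds here are hypothetical, not Apple's API; the one detail taken from the source is that ChatGPT is only used after explicit user approval.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Request:
    text: str
    needs_world_knowledge: bool  # e.g. flagged by a classifier
    complexity: float            # 0.0 (trivial) .. 1.0 (hard)

def route(request: Request, user_approves_chatgpt: Callable[[], bool]) -> str:
    """Pick an execution target for a request (hypothetical logic)."""
    if request.needs_world_knowledge:
        # World-knowledge queries go to ChatGPT, but only after the
        # user explicitly approves sending the prompt off-platform.
        return "chatgpt" if user_approves_chatgpt() else "declined"
    if request.complexity < 0.5:
        return "on_device"  # fast path: the small on-device model
    return "private_cloud_compute"  # larger Apple server models

print(route(Request("Summarize this note", False, 0.2), lambda: True))
# → on_device
```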

Semantic index: An on-device database that indexes the semantic meaning of all the user’s files, photos, emails and other content on the device. This index enables quick retrieval of relevant information when responding to user queries.
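Apple hasn't documented the semantic index's internals, but the retrieval idea can be illustrated with a toy stand-in. This sketch ranks documents by bag-of-words cosine similarity; the real system presumably uses learned embeddings, and all names here are invented for illustration.

```python
import math
from collections import Counter

class SemanticIndex:
    """Toy stand-in for an on-device semantic index (illustrative only)."""

    def __init__(self):
        self.docs = {}  # doc_id -> term counts

    def add(self, doc_id, text):
        self.docs[doc_id] = Counter(text.lower().split())

    def query(self, text, k=1):
        """Return the ids of the k documents most similar to the query."""
        q = Counter(text.lower().split())

        def cosine(a, b):
            dot = sum(a[t] * b[t] for t in a)
            norm = (math.sqrt(sum(v * v for v in a.values()))
                    * math.sqrt(sum(v * v for v in b.values())))
            return dot / norm if norm else 0.0

        ranked = sorted(self.docs, key=lambda d: cosine(q, self.docs[d]),
                        reverse=True)
        return ranked[:k]

index = SemanticIndex()
index.add("email_1", "mom flight lands at 3pm at SFO")
index.add("note_1", "grocery list milk eggs")
print(index.query("when is my mom's flight landing"))  # → ['email_1']
```

The point of indexing ahead of time is that a query like "when is my mom's flight landing?" can be answered by retrieving the relevant email and feeding it to the model as context, rather than scanning the device at query time.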

App Intents toolbox: A database of the capabilities for each app on the user’s device, so Siri can take an action with an app based on a query from the user — like sending an email through the Mail app, editing a photo through the Photos app or sending a calendar invite.
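The real App Intents framework is Swift-based; purely to illustrate the "database of capabilities" idea, here is a hypothetical Python sketch of a registry that maps intent names to app actions an assistant can invoke. None of these names come from Apple's API.

```python
# Hypothetical capability registry, loosely analogous in spirit to the
# App Intents toolbox described above (the real framework is Swift).
INTENTS = {}

def intent(name):
    """Register a callable as an action an assistant may invoke."""
    def register(fn):
        INTENTS[name] = fn
        return fn
    return register

@intent("mail.send")
def send_email(to, subject, body):
    return f"Email to {to}: {subject!r}"

@intent("calendar.create_event")
def create_event(title, when):
    return f"Event {title!r} at {when}"

def perform(name, **kwargs):
    """The assistant looks up a capability by name and invokes it."""
    return INTENTS[name](**kwargs)

print(perform("mail.send", to="mom@example.com",
              subject="Flight", body="See you soon"))
# → Email to mom@example.com: 'Flight'
```

In this framing, each app declares what it can do up front, so the assistant can plan an action ("send this draft via Mail") without knowing anything about the app's internals.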

Source: Apple

What does Apple Intelligence deliver?

Now that we’ve covered some of the nuts and bolts, let’s look at the implications of what Apple Intelligence promises to deliver.

Blazing fast response times with on-device models

One of Apple Intelligence’s standout features is the way it prioritizes on-device processing. By running AI models directly on devices like the iPhone 15 Pro, Apple ensures that tasks are completed with minimal latency.

To do this, Apple is leveraging two proprietary on-device models: a roughly 3B-parameter language model and a diffusion model for image generation. The language model has a time-to-first-token latency of about 0.6 milliseconds per prompt token. This means users will get lightning-fast responses, without their data ever leaving their device, for a subset of queries.
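To put that figure in perspective, here is the back-of-the-envelope arithmetic, using Apple's reported ~0.6 ms-per-prompt-token number (the function and constant names are mine, for illustration):

```python
MS_PER_PROMPT_TOKEN = 0.6  # Apple's reported on-device figure

def time_to_first_token_ms(prompt_tokens: int) -> float:
    """Approximate delay before the first output token appears."""
    return prompt_tokens * MS_PER_PROMPT_TOKEN

# A 500-token prompt (a long email thread, say) starts getting a
# response in roughly a third of a second:
print(time_to_first_token_ms(500))  # → 300.0
```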

While the model is limited in size and doesn't have access to off-device information, it stacks up well for its focus areas: text generation, summarization and app commands. Apple's own benchmarking found that human graders preferred the on-device model's responses to those of comparably sized open models, as shown in the results below.

Source: Apple

Apple’s move should encourage more vendors to run smaller models on devices, given their privacy and latency benefits and potential cost savings.

Privacy with Private Cloud Compute

For queries that require more than the on-device models, Apple has server models that will run on its new Private Cloud Compute (PCC) system.

Source: Apple

PCC is a cloud intelligence system designed for private AI processing, so complex AI tasks can be executed in the cloud without storing user data there. That means personal information remains secure and inaccessible, even to Apple.

The PCC system extends the privacy and security that Apple is known for into the cloud, ensuring that user data is used solely for executing requests and is not retained.

For now, only Apple’s server models will support this private approach. But it will be interesting to see if, over time, Apple partners with other model providers to make these technologies available in the same manner.

Enhanced intelligence with ChatGPT integration

In addition to its on-device and server models — which are largely focused on answering questions related to personal context — Apple Intelligence has an integration with OpenAI's ChatGPT. As Tim Cook explained in an interview with Marques Brownlee, Siri will be able to use ChatGPT to answer questions that require world knowledge.

Source: Apple

For those queries, the user can access ChatGPT for free, even if they aren't a subscriber — not surprising, perhaps, now that OpenAI's GPT-4o model is available for free anyway.

Notably, Apple isn’t paying OpenAI for this integration. Instead, they’re “paying” OpenAI by distributing ChatGPT to more than 1 billion Apple users. According to Bloomberg:

“Apple isn't paying OpenAI as part of the partnership, said the people, who asked not to be identified because the deal terms are private. Instead, Apple believes pushing OpenAI's brand and technology to hundreds of millions of its devices is of equal or greater value than monetary payments, these people said.”

In addition, OpenAI won't be able to train on any of the data, although it still receives the user's prompt, because OpenAI's systems are not yet running on Apple's aforementioned private cloud infrastructure.

But if the integration does drive ChatGPT Plus subscriptions for OpenAI, the deal could be a huge success for them.

Accelerated adoption with “smarter” agents

Apple Intelligence could signify a big step forward for AI agents, given how deeply integrated it is in the OS. It will work well out of the box both for retrieving context across a user's Apple apps (contacts, emails, photos, etc.) and for taking actions in those apps (calendar, email, photos, etc.).

Some of the examples Apple showcased in the launch were relatively straightforward. They included questions like:

● “When is my mom’s flight landing?”

● “What’s my plan for dinner?”

● “What’s the weather at X? Can you create an event for that?”

Source: Apple

But as more third-party developers open up the information in their apps to Apple Intelligence's semantic index, and enable Siri to take actions in their apps by adopting the App Intents framework (which has been available for a few years now), Siri could become a powerful personal assistant that simplifies and enhances everyday tasks and interactions.

It’ll be interesting to see if it catches on with developers, and indeed with end users, and whether it will be enough to breathe new life into Siri.

A promising start for Apple Intelligence

Overall, Apple Intelligence seems like a well-thought-out initial foray into AI from Apple that prioritizes privacy and ease of use for end users. It combines RAG across all user data, a suite of models ranging from on-device all the way up to GPT-4o, and the ability to take actions across apps.

I'll be watching to see whether it results in true agentic use cases, or remains limited to relatively simple tasks and queries, as Siri largely is today.

Since Apple Intelligence will only be available on iPhone 15 Pro and later devices, Apple seems to be betting on it shortening the lengthening replacement cycle for iPhones. Stock markets certainly believe it may.

If you're working in the AI space and would like more analysis and insights on AI-powered trends and technologies, make sure you sign up for my Substack newsletter.
