Siri will reportedly evolve into a full-fledged chatbot this fall


A new report from Bloomberg’s Mark Gurman details changes coming to Siri this fall, with the introduction of iOS 27, macOS 27, and the other annual updates. Siri, currently just a simple digital assistant, will undergo a transformation into a complete AI chatbot, with a new interface on iOS, iPadOS, and macOS.

But first, with the OS 26.4 update (likely to arrive within the next two to three months), Siri is expected to undergo a massive overhaul. While the interface will remain the same, the “brains” of Siri will be swapped out for what is known internally as “Apple Foundation Models version 10.” That model is based on Google’s Gemini technology and operates with 1.2 trillion parameters.

This version of Siri will be smarter and more capable, including the ability to get information from the web. It will also bring the capabilities Apple promised back in summer 2024—the ability to read and understand the user’s screen, take actions within apps, and build a profile of the user based on their interactions so it can operate with “personal context” that is relevant to the individual.

The next big update comes a couple of months later in the OS 27 updates with a version of Siri code-named Campos. This will be built on an even more capable foundation model, Apple Foundation Models 11, which Gurman claims is comparable to Gemini 3.

In the OS 27 updates, users will summon Siri the same way they do now, by saying the “Siri” wake word or holding the side button. But the interface will change from a simple listening assistant to a full-fledged chatbot. It will do what chatbots today do—answer questions, search the web, create text and images, and analyze uploaded files.

But it will also be able to recognize open windows on a Mac or screen contents on iOS or iPadOS in order to take actions or suggest commands. It will of course still be able to do things third-party chatbot services can’t today, like control smart home devices, set timers, and control device options and features (like enabling airplane mode or turning on the flashlight on your iPhone).

Siri will also be more deeply integrated into Apple’s built-in applications like Mail, Photos, Music, Podcasts, TV, and Xcode. Users will be able to issue commands such as finding a photo based on specific parameters and editing it in a specific way. “Take the photo I took at the park yesterday and crop it to show only my face” is the sort of request Apple appears to be aiming to enable.

The report says Apple is still deciding how much about the user the chatbot will be allowed to remember. Chatbots such as ChatGPT and Gemini can recall past interactions and conversations to better fulfill requests according to the user’s preferences and details, but Apple may severely limit this ability in the interest of protecting user privacy. That could be particularly important because Apple is also discussing whether to run cloud-based AI processing for this future Siri on Google’s own Gemini servers, which use the company’s powerful TPUs (Tensor Processing Units). The Siri update coming in OS 26.4 will run on Apple’s own Private Cloud Compute servers, powered by Apple silicon.

Finally, Bloomberg reports that Apple is designing Campos, the OS 27-era Siri, in a modular way so the underlying models can be swapped out over time. This will allow Apple to move away from Gemini-based models in the future in favor of its own, or to switch to another company’s model. It will also make it easier to adapt the feature for restricted regions where it may be required to operate with a foundation model from a specific company.

This report neatly builds on Enchanté, the internal Apple chatbot for testing and productivity that we exclusively reported on earlier today.
