Why Apple’s iOS 26.4 Siri Upgrade Will Be Bigger Than Originally Promised

In the iOS 26.4 update coming this spring, Apple will introduce a new version of Siri that will overhaul the way we interact with the personal assistant and what it can do.


The iOS 26.4 version of ‌Siri‌ won't work like ChatGPT or Claude, but it will rely on large language models (LLMs) and has been rebuilt from the ground up.

Upgraded architecture

The next generation version of ‌Siri‌ will use advanced large language models, similar to those used by ChatGPT, Claude, and Gemini. Apple isn't implementing full chatbot interactions, but the upgrade is a major improvement over what's available now, and it's long overdue.

Right now, ‌Siri‌ uses machine learning, but it lacks the reasoning capabilities that LLMs provide. ‌Siri‌ relies on a series of task-specific models, one for each step of a request: it has to determine the intent of the request, extract the relevant information (time, event, name, etc.), and then use APIs or apps to complete the request. It is not an all-in-one system.
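That task-specific pipeline can be sketched roughly as follows. This is a deliberately simplified illustration in Python, not Apple's actual implementation; every intent name and matching rule here is invented for the example:

```python
# Illustrative sketch of a task-specific assistant pipeline:
# intent classification -> slot extraction -> dispatch to a handler.
# All intent names and matching rules are invented for the example.

def classify_intent(request: str) -> str:
    """Step 1: decide what kind of request this is (keyword matching)."""
    text = request.lower()
    if "timer" in text:
        return "set_timer"
    if "text" in text or "message" in text:
        return "send_message"
    return "unknown"

def extract_slots(request: str, intent: str) -> dict:
    """Step 2: a separate extractor pulls out details (duration, name, ...)."""
    words = request.split()
    slots = {}
    if intent == "set_timer":
        # Look for a pattern like "10 minutes".
        for i, word in enumerate(words[:-1]):
            if word.isdigit():
                slots["duration"] = f"{word} {words[i + 1]}"
    elif intent == "send_message":
        # Assume the word after "to" names the recipient.
        if "to" in words:
            i = words.index("to")
            if i + 1 < len(words):
                slots["recipient"] = words[i + 1]
    return slots

def handle(request: str) -> str:
    """Step 3: hand off to the API or app that performs the task."""
    intent = classify_intent(request)
    slots = extract_slots(request, intent)
    if intent == "set_timer":
        return f"Setting a timer for {slots.get('duration', 'an unknown duration')}"
    if intent == "send_message":
        return f"Messaging {slots.get('recipient', 'someone')}"
    return "Sorry, I didn't understand that"
```

The brittleness is visible immediately: `handle("Set a timer for 10 minutes")` works, but a slight rephrasing like "Wake me in ten" falls through to the unknown case, which is exactly the kind of rigidity an LLM core is meant to remove.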

In iOS 26.4, ‌Siri‌ will have an LLM core on which everything else is built. Instead of just transcribing voice to text and listening for trigger keywords, ‌Siri‌ will actually understand the specifics of what the user is asking and apply reasoning to it.

How LLMs improve Siri

‌Siri‌ today is usually fine for simple tasks like setting a timer or alarm, sending a text message, turning a smart home device on or off, answering a simple question, or controlling a device function. But it falls apart with anything more complex: it can't complete multi-step tasks, it can't interpret wording that doesn't follow the expected structure, it has no personal context, and it can't handle follow-up requests.

Most of these problems should be solved by the LLM upgrade, because ‌Siri‌ will have something like a brain. LLMs can understand the nuances of a request, determine what someone actually wants, and take action to deliver that information or complete the desired task.

We already know some of what LLM ‌Siri‌ will be able to do, because Apple detailed the ‌Apple Intelligence‌ features it planned to implement when it introduced iOS 18.

Promised ‌Siri‌ Apple Intelligence features

Apple outlined three specific ways ‌Siri‌ will improve, including personal context, the ability to see what’s on the screen to know what the user is talking about, and the ability to do more within and between apps.

‌Siri‌ will understand pronouns, refer to content on screen and in apps, and have short-term memory for subsequent requests.

Personal context

With personal context, ‌Siri‌ will be able to draw on your emails, messages, files, photos, and more, learning about you to help you complete tasks and keep track of what you've been doing.

  • Show me the files Eric sent me last week.
  • Find the email where Eric mentioned skating.
  • Find books that Eric recommended to me.
  • Where is the recipe Eric sent me?
  • What is my passport number?

Onscreen awareness

Screen awareness will allow ‌Siri‌ to see what's on your screen and complete actions involving whatever you're looking at. For example, if someone texts you an address, you can tell ‌Siri‌ to add it to your contact card. Or if you're looking at a photo and want to send it to someone, you can ask ‌Siri‌ to do it for you.

Deeper app integration

Deeper app integration means ‌Siri‌ will be able to do more within and between apps, performing actions and tasks that aren’t possible with a personal assistant right now. We don’t have a full picture of what ‌Siri‌ will be capable of, but Apple has provided a few examples of what to expect.

  • Move files from one app to another.
  • Edit a photo and then send it to someone.
  • Get directions home and share your arrival time with Eric.
  • Send the email I wrote to Eric.
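Examples like "edit a photo and then send it to someone" amount to chaining app actions, with the output of one step feeding the next. A minimal sketch of that idea, using invented stand-in functions rather than any real Apple API:

```python
# Illustrative sketch of chaining app actions: the assistant turns a request
# into an ordered plan, and each step's result feeds into the next step.
# The action names and apps are hypothetical, not Apple's actual APIs.

def edit_photo(photo: str, adjustment: str) -> str:
    """Stand-in for a Photos-app action: returns the edited photo."""
    return f"{photo} ({adjustment})"

def send_to(item: str, recipient: str) -> str:
    """Stand-in for a Messages-app action."""
    return f"Sent {item} to {recipient}"

def run_plan(steps):
    """Run each step in order, passing the previous result forward."""
    result = None
    for step in steps:
        result = step(result)
    return result

# "Edit this photo and then send it to Eric" might become a two-step plan:
plan = [
    lambda _: edit_photo("IMG_0042.jpg", "brightened"),
    lambda photo: send_to(photo, "Eric"),
]
```

Here `run_plan(plan)` returns "Sent IMG_0042.jpg (brightened) to Eric": the second app action receives the first action's output, which is what "within and between apps" implies.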

An update bigger than promised

At an all-hands meeting in August 2025, Apple's head of software engineering, Craig Federighi, explained the ‌Siri‌ debacle to employees. Apple had tried to merge two separate systems, and the effort failed.

There was one system for processing actual commands and another based on large language models, and the hybrid approach did not work due to the limitations of the current ‌Siri‌ architecture. The only way forward was to upgrade to a second-generation architecture built on a large language model.

At the August meeting, Federighi said that Apple had successfully redesigned ‌Siri‌ and would be able to introduce a bigger upgrade than the one it promised in iOS 18.

“The work we did on this comprehensive rebuild of ‌Siri‌ gave us the results we needed,” Federighi told staff. “That put us in a position not just to deliver what we announced, but to deliver a much bigger upgrade than we imagined.”

Adoption of Google Gemini

Part of Apple's problem was that it relied on AI models built in-house that couldn't match the capabilities of its competitors. Apple began considering a third-party model for ‌Siri‌ and other future AI features shortly after the ‌Siri‌ features were shelved, and in January Apple announced a multi-year partnership with Google.

For the foreseeable future, Apple's AI features, including a more personalized version of ‌Siri‌, will use a custom model that Apple created in collaboration with Google's Gemini team. Apple plans to continue working on its own internal models, but for now it will rely on Gemini for many user-facing features.

‌Siri‌ in iOS 26.4 will be more like Google Gemini than ‌Siri‌ today, albeit without full chatbot capabilities. Apple plans to continue running some features on the device and use Private Cloud Compute to maintain privacy. Apple will store personal data on the device, anonymize requests and continue to allow deactivation of AI features.
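The privacy split described above (some requests handled on device, others anonymized before leaving the device, with an overall off switch) can be sketched like this. The intent names and field names are invented for illustration; this is not Apple's actual design:

```python
# Illustrative sketch of privacy-preserving request routing: handle simple
# requests on device, strip identifying fields before anything goes to the
# cloud, and respect a global AI off switch. Names here are invented.

ON_DEVICE_INTENTS = {"set_timer", "toggle_device"}
PRIVATE_FIELDS = {"user_id", "contacts"}

def route(intent: str, request: dict, ai_enabled: bool = True) -> dict:
    """Decide where a request runs and what data may leave the device."""
    if not ai_enabled:
        # User has turned AI features off entirely.
        return {"handled": False, "reason": "AI features turned off"}
    if intent in ON_DEVICE_INTENTS:
        # Simple requests never leave the device.
        return {"handled": True, "where": "on-device", "payload": request}
    # Cloud path: anonymize by dropping identifying fields first.
    anonymized = {k: v for k, v in request.items() if k not in PRIVATE_FIELDS}
    return {"handled": True, "where": "private-cloud", "payload": anonymized}
```

For example, a timer request stays on device with its payload intact, while a more complex request goes to the cloud only after fields like `user_id` are removed.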

What’s not coming in iOS 26.4

‌Siri‌ won’t work as a chatbot, so the updated version won’t have long-term memory or back-and-forth conversations, and Apple plans to use the same voice interface with limited typing functionality.

Embarrassing delay of Apple’s Siri

In what has become an infamous move, Apple went all out and showed off a smarter version of ‌Siri‌ powered by Apple Intelligence when it unveiled iOS 18 at the Worldwide Developers Conference in 2024. Apple said these features would come in an update to iOS 18, but right around the time the launch was expected, Apple admitted that ‌Siri‌ was not ready and would be delayed until spring 2026.

Apple's more personal Siri ad

Apple executives hit the press trail to explain ‌Siri‌'s shortcomings after WWDC 2025, promising bigger and better things for iOS 26 and explaining what went wrong. The ‌Apple Intelligence‌ ‌Siri‌ features shown at WWDC 2024 were actually implemented and not faked, but behind the scenes ‌Siri‌ didn't work as well as expected, and Apple was dealing with quality issues.

Because Apple advertised the new ‌Siri‌ features alongside the iPhone 16, some people who bought an iPhone for the promised functionality were upset about the delay and filed a lawsuit. Apple managed to quietly settle the case in December 2025, putting most of the ‌Siri‌ snafu behind it.

Internal restructuring

The misstep with ‌Siri‌'s debut and the failure of the hybrid architecture led Apple to restructure its entire AI team. Apple AI chief John Giannandrea was removed from leading the ‌Siri‌ team and replaced by Vision Pro chief Mike Rockwell.

Apple CEO Tim Cook no longer trusted Giannandrea's ability to oversee product development, and Giannandrea is set to retire in the spring of 2026. Rockwell reports to Federighi, who has told employees that the new leadership has "supercharged" ‌Siri‌ development. Federighi has apparently played a key role in changing Apple's approach to AI and is making the decisions aimed at helping the company catch up with rivals.

Apple has struggled to retain AI employees amid the ‌Siri‌ issues and aggressive recruiting from companies like Meta, which has wooed several key AI engineers from Apple with pay packages of up to $200 million. At Apple's all-hands meeting in August, Cook and Federighi tried to reassure employees that artificial intelligence is critically important to the company. “There is no project that people take more seriously,” Federighi said of ‌Siri‌.

Cook said Apple is “investing” to be a leader in AI.

iOS 26.4 Siri launch date

Apple has promised that a new version of ‌Siri‌ will arrive in the spring of 2026, which is when we expect iOS 26.4. Testing on iOS 26.4 is expected to begin in late February or early March, with the launch following around April.

LLM Siri compatibility

The new version of ‌Siri‌ will likely run on all devices that support ‌Apple Intelligence‌, although Apple did not specifically provide details. Some new ‌Siri‌ features may also come to older devices.

Chatbot upgrade to iOS 27

Apple plans to upgrade ‌Siri‌ even further in iOS 27, turning it into a full chatbot. ‌Siri‌ will work like Claude or ChatGPT, able to understand and engage in back-and-forth conversation.

Details about the ‌Siri‌ interface and how the chatbot version will work are still scarce, but iOS 26.4 will be a stepping stone on the way to a version of ‌Siri‌ that actually works like the products from Anthropic and OpenAI.
