
ZDNET’s key takeaways
- A new LLM Siri should launch sometime in March.
- This new Siri may act like an AI chatbot while also offering specific skills.
- This Siri also needs to be less error-prone.
There’s a scene from an episode of Larry David’s “Curb Your Enthusiasm” in which David is in his car trying to get driving directions from Siri. But each time he utters a command, Siri completely misinterprets what he says and serves up the wrong response. This goes on for a while. Finally, out of patience, David hurls some nasty expletives at Siri, which the voice assistant also misunderstands, leaving him with no directions and lots of frustration.
That scene strikes a familiar chord among those of us who’ve lived with Siri over the years. Despite plenty of promised improvements and fixes, Apple’s voice assistant is still as frustrating to use as ever. Reportedly, Apple will soon be releasing a revamped version of Siri designed to act more like an advanced AI chatbot, a la ChatGPT and Google Gemini. Will this new LLM Siri finally be the one we’ve always wanted? Let’s see what’s in store.
Also: Apple’s AI search engine could be driven by Google and help revive Siri, report says
Reports of a new LLM (large language model) Siri surfaced in late 2024. At the time, Apple watcher Mark Gurman said that this upcoming version was already being tested internally on iPhones, iPads, and Macs as a standalone app. The goal was to launch the new Siri sometime in the spring of 2026. And so far, that timeframe appears to be on track.
The latest reports noted by 9to5Mac and others say that LLM Siri will debut with iOS 26.4 in March. iOS 26.3 is currently in its first round of beta testing in advance of an expected release around the end of January. That means iOS 26.4 should pop up as a beta in February.
Apple did not immediately respond to a request for comment.
Depending on the launch date, and based on the timing of previous iOS x.4 releases, the new version should go live for everyone by the end of March. Some of the timing will also depend on how well the new Siri performs and whether any fine-tuning is required.
What to expect from a new Siri
Like ChatGPT, Gemini, and Microsoft Copilot, the new Siri would use advanced LLMs for more natural conversations. LLMs are trained on vast amounts of data to process and respond to human language, which makes them sound and act less robotic in their responses.
In Siri’s case, the assistant will reportedly process certain interactions on the device for both speed and privacy reasons. But more advanced tasks will require cloud-based processing via technology from the likes of OpenAI or Google.
Among the specific features, App Intents will allow Siri to work with Apple’s own apps and those of third parties. As examples, you should be able to ask Siri to open and edit a certain photo in the Photos app, check on flight information from the American Airlines app, or reorder a certain tea from the Amazon app. Developers will naturally need to integrate App Intents into their apps for this to work.
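To give a rough sense of what that integration involves today, here’s a minimal sketch of an App Intent built with Apple’s existing App Intents framework for Swift. The intent and parameter names below are invented for illustration (they aren’t Apple’s or any retailer’s actual code), and how the revamped Siri will surface such intents is based on the reports above rather than anything Apple has published.

```swift
import AppIntents

// Hypothetical example: a tea-reordering intent a shopping app might expose to Siri.
// "ReorderTeaIntent" and "productName" are illustrative names, not a real app's API.
struct ReorderTeaIntent: AppIntent {
    static var title: LocalizedStringResource = "Reorder Tea"
    static var description = IntentDescription("Reorders a previously purchased tea.")

    // A parameter Siri can fill in from the user's spoken request.
    @Parameter(title: "Product Name")
    var productName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, this is where the order would be placed via the app's own backend.
        return .result(dialog: "Reordering \(productName) for you.")
    }
}
```

Apps that already expose intents like this can be driven by Shortcuts and today’s Siri; the reported change is that the LLM-based Siri would be able to chain and invoke them from far more natural, conversational requests.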
Also: This handy Apple Intelligence feature saves me over $200 a year
Another skill called “personal context knowledge” will let Siri perform tasks based on its awareness of the data and preferences on your device. For example, you might ask it to find a specific text from your wife or locate the number of your electronic passport. Already available in dribs and drabs, this feature should arrive in full with iOS 26.4.
Next up is onscreen awareness. Here, Siri will be able to “see” and work with what’s on the screen based on your request. You could ask it to add the address currently on display to your contact list or summarize an article you’re viewing in the browser.
One more trick on the list is known as “World Knowledge Answers.” With this skill, Siri will function like a regular search engine as it scours the web to answer your question or request. Ask it who won the World Series in 1978 or to describe the events that led to World War I, and it should provide a direct answer rather than just display a link to a website.
Along with these new AI-powered features are hopes that the new Siri won’t be as error-prone as it is now. That means it should understand your requests and provide the right answers. No confusing names, places, or other details. No off-beat, totally incorrect responses. No giving up when it can’t come up with an answer.
Of course, Apple has made promises before. Released in 2024 with iOS 18, Apple Intelligence fell flat because Apple promised too much and delivered too little. Yes, Apple’s AI can help with certain tasks. But compared with an actual AI bot like ChatGPT or Gemini, it misses the mark. That’s true even today, with several interim updates since the initial release and a new version of iOS in 2025.
Now the onus is on Apple to see if it can rescue Siri from its sordid past. That means a true assistant that not only knows how to use AI but also won’t fall down on the job so often. Like many Apple users, I’m anxiously waiting to see what LLM Siri has in store. And I’m hoping the new version won’t be one that we’ll want to curse at because it can’t provide the right driving directions.