If software is judged by the gap between what it promises and what it is actually capable of, Siri is unquestionably the worst built-in iOS application. I cannot think of any other application preloaded on a new iPhone that so greatly underdelivers, and has for so long.

Siri is thirteen years old, and we all know the story: beneath the more natural language querying is a fairly standard command-and-control system. In those years, Apple has updated the scope of its knowledge and responses, but because the user interface is not primarily a visual one, its outer boundaries are fuzzy. It has limits, but a user cannot know what they are until they try something and it fails. Complaining about Siri is both trite and evergreen. Yes, Siri has sucked forever, but maybe this time will be different.

At WWDC this year, Apple announced Siri would get a whole new set of powers thanks to Apple Intelligence. Users could, Apple said, speak with more natural phrasing. It also said Siri would understand the user’s “personal context” — their unique set of apps, contacts, and communications. All of that sounds great, but I have been down this road before. Apple has often promised improvements to Siri that have not turned it into the compelling voice-activated digital assistant it is marketed to be.

I was not optimistic, and I am glad I was not, because Siri in iOS 18.1 is still pretty poor, with a couple of exceptions: its new visual presentation is fantastic, and type-to-Siri is nice. It is unclear exactly how Siri is enhanced with Apple Intelligence — more on this later — but this version is exactly as frustrating as those before it, in all the same ways.

As a reminder, Apple says users can ask Siri…

  • …to text a contact by using only their first name.

  • …for directions to locations using the place name.

  • …to play music by artist, album, or song.

  • …to start and stop timers.

  • …to convert from one set of units to another.

  • …to translate from one language to another.

  • …about Apple’s product features and documentation, new in iOS 18.1.

  • …all kinds of other stuff.

It continues to do none of these things reliably or predictably. Even Craig Federighi, when asked by Joanna Stern, spoke of his own pretty limited usage:

I’m opening my garage, I’m closing my garage, I’m turning on my lights.

All kinds of things, I’m sending messages, I’m setting timers.

I do not want to put too much weight on this single response, but these are weak examples. This is what he could think of off the top of his head? That is all? I get it; I do not use it for much, either. And, as Om Malik points out, even the global metrics Federighi cites in the same answer do not paint a picture of success.

So, a refresh, and I will start with something positive: its new visual interface. Instead of a floating orb, the entire display warps and colour-shifts before being surrounded by a glowing border, as though enveloped in a dense magical vapour. Depending on how you activate Siri, the glow will originate from a different spot: from the power button, if you press and hold it; or from the bottom of the display, if you say “Siri” or “Hey, Siri”.

You can also now invoke text-based Siri — perfect for times when you do not want to speak aloud — by double-tapping the home bar. There has long been an option to type to Siri, but it has not been surfaced this easily, and I like it.

That is kind of where the good news stops, at least in my testing. I have rarely had a problem with Siri’s ability to understand what I am saying — I have a flat, Canadian accent, and I can usually speak without pauses or repeated words. Other writers are better positioned to test whether it has improved for people with disabilities.

No, the things which Siri has flubbed have always been, for me, in its actions. Some of those should be new in iOS 18.1, or at least newly refined, but it is hard to know which. While Siri looks entirely different in this release, it is unclear what new capabilities it possesses. The full release notes say it can understand spoken queries better, and it has product documentation, but it seems anything else will be coming in future updates. I know a feature Apple calls “onscreen awareness”, which can interpret what is displayed, is one of those. I also know some personal context features will be released later — Apple says a user “could ask, ‘When is Mom’s flight landing?’ and Siri will find the flight details” no matter how they were sent. This is all coming later and, presumably, some of it requires third-party developer buy-in.

But who reads and remembers the release notes? What we all see is a brand-new Siri, and what we hear about is Apple Intelligence. Surely there must be some improvements beyond being able to ask the Apple assistant about the company’s own products, right? Well, if there are, I struggled to find them. Here are the actual interactions I have had in beta versions of iOS 18.1 for each thing in the list above:

  • I asked Siri to text Ellis — not their real name — a contact I text regularly. It began a message to a different Ellis I have in my contacts, to whom I have not spoken in over ten years.

    Similarly, I asked it to text someone I have messaged on an ongoing basis for fifteen years. Their thread is pinned to the top of Messages. Before it would let me text them, it asked whether I wanted to send the message to their phone number or their email address.

  • I was driving and asked for directions to Walmart. Its first suggestion was not the nearest location; it was farther away and in the opposite direction from the one I was already travelling.

  • I asked Siri to “play the new album from Better Lovers”, an artist I have in my library and an album that I recently listened to in Apple Music. No matter my enunciation, it responded by playing an album from the Backseat Lovers, a band I have never listened to.

    I asked Siri to play an album which contains a song of the same name. This is understandably ambiguous if I do not explicitly state “play the album” or “play the song”. However, instead of asking for clarification when there is a collision like this, it just rolls the dice. Sometimes it plays the album, sometimes the song. But I am an album listener more often than I am a song listener, and my interactions with Siri and Apple Music should reflect that.

  • Siri starts timers without issue. It is one of the few things which behaves reliably. But when I asked it to “stop the timer”, it asked me to clarify “which one?” between one active timer and two already-stopped timers. It should just stop the sole active timer; why would I ask it to stop a stopped timer?

  • I asked Siri “how much does a quarter cup of butter weigh?” and it converted that to litres or — because my device is set to U.S. localization for the purposes of testing Apple Intelligence — gallons. Those are volumetric measurements, not measures of weight. When I asked Siri “what is the weight of a quarter cup of butter?”, it searched the web. I had to explicitly say “convert one quarter cup of butter to grams”.

  • I asked Siri “what is puente in English?” and it informed me I needed to use the Translate app. Apparently, Siri can only translate from its own language — English, in this case — into another. Translating from a different language cannot be done with Siri alone.

  • I rarely see the Priority Messages feature in Mail, so I asked Siri about it. I tried different ways to phrase my question, like “what is the Priority Messages feature in Mail?”, but it would not return any documentation about this feature.

Maybe I am using Siri wrong, or expecting too much. Perhaps all of this is a beta problem. But, aside from the last bullet, these are the kinds of things Apple has said Siri can do for over a decade, and it does not do so predictably or reliably. I have had similar or identical problems with Siri in non-beta versions of iOS. Today, while using the released version of iOS 18.1, I asked it if a nearby deli was open. It gave me the hours for a deli in Spokane — hundreds of kilometres away, and in a different country.

This all feels like it may be a side effect of treating the iPhone as a whole widget with a governed set of software add-ons. The quality of the voice assistant is just one of many factors to consider when buying a smartphone, and a predictably poor Siri is probably not going to be a deciding factor for many.

But the whole widget has its advantages — you can find plenty of people discussing those, and Apple’s many marketing pieces will dutifully recite them. Since its debut in 2011, Apple has rarely put Siri front-and-centre in its iPhone advertising campaigns, but it is doing just that with the iPhone 16. It is showcasing features which rely on whole-device control — features that, admittedly, will not be shipping for many months. But the message is there: Siri has a deep understanding of your world and can present just the right information for you. Yet, as I continue to find out, it does not do that for me. It does not know who I text in the first-party Messages app or what music I listen to in Apple Music.

Would Siri be such a festering scab if it had competitors within iOS? Thanks to an extremely permissive legal environment around the world in which businesses scoop up vast amounts of behavioural data to make it slightly easier to market laundry detergent and dropshipped widgets, there is a risk to granting this kind of access to some third-party product. But if there were policies to make that less worrisome, and if Apple permitted it, there would be more intense pressure to improve Siri — and, for that matter, all voice assistants tied to specific devices.

The rollout of Apple Intelligence is uncharacteristically piecemeal and messy. Apple did not promise a big Siri overhaul in this version of iOS 18.1. But by giving it a new design, Apple communicates something is different. It is not — at least, not yet. Maybe it will be one day. Nothing about Siri’s state over the past decade-plus gives me hope that it will, however. I have barely noticed improvements in the things Apple says it should do better in iOS 18.1, like preserving context and changing my mind mid-dictation.

Siri remains software I distrust. Like Federighi, I would struggle to list my usage beyond a handful of simple commands — timers, reminders, and the occasional message. Anything else, and it remains easier and more reliable to wash my hands if I am kneading pizza dough, or to park the car if I am driving, and do things myself.