Updating the Record on iOS 26 Usage Share

I made a mistake on Friday: instead of waiting to polish a more comprehensive article, I effectively live-blogged my shifting understanding of how StatCounter was collecting its iOS version number data by way of updates and edits to existing posts. In my own defence, I did not know the rate of users updating to iOS 26 would become as much of a story unto itself as it has. So allow me to straighten this out.

Here is the background: StatCounter publishes market share data by country and user technology based on statistics it collects from its web analytics package which, it says, is used by over a million websites totalling around five billion page views monthly. I have not heard of many of the sites using its analytics, but it seems to be a large enough and generic enough sample that it should be indicative — more so than, say, visitors to my audience-specific website. Ed Hardy, over at Cult of Mac, used StatCounter’s figures to report, on January 8, that “only about 15% of iPhone users have some version of the new operating system installed”. Hardy compared this to historical StatCounter figures showing a 63% adoption rate of iOS 18 by the same time last year, 54% on iOS 17 the year prior, and 62% on iOS 16 the year before that. If true, this would represent a catastrophic reluctance for iPhone users to update.

If true.

I do not think the iOS 26 uptake rate is about 15%. I think it is lower than the 54–63% range in previous years, but not by nearly that much. I think StatCounter has been misinterpreting iOS 26’s user base since last year because of a change Apple made to Safari.

If the phrase “user agent” does not make you respond by tipping your head to the side like my dog did when I asked him if he knew what I meant by that, you can skip this paragraph. A user agent string is a way for software to identify itself when it makes an HTTP request on behalf of a user. A user agent string might describe the type and version of a web browser and the operating system, and include other information so that, in the old days, websites could check for compatibility. This leads to user agent strings that look a little silly:

Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 Edg/134.0.0.0

This does not represent a Firefox user, despite starting with “Mozilla”, nor does it represent a Safari or Chrome user, despite the mentions of “Safari”, “Chrome”, and “AppleWebKit”. It is a user agent string for Microsoft Edge, which is begging to be treated like its competitors.

This is a simplified explanation, but it is important for how StatCounter works. When someone browses a website containing its analytics code, that code reads the user agent string, and that is how StatCounter determines market share. The above user would be counted toward Edge market share (“Edg/134.0.0.0”) and Windows. Which version of Windows? Well, while “NT 10.0” suggests it is Windows 10, the same token is sent by Edge running on Windows 11 — that part of the user agent string has been frozen. The Chromium team did the same thing and reduced the amount of specific information in the user agent string. This removes a method of fingerprinting and is generally fine.
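To make that concrete, here is a minimal TypeScript sketch of how an analytics script might, in principle, attribute a browser and operating system from a user agent string. It is my own hypothetical illustration, not StatCounter’s actual code, and the particular token checks and their order are assumptions:

    // Hypothetical sketch of user agent attribution; not StatCounter's code.
    // Edge is checked before Chrome and Safari because its string contains
    // "Chrome/" and "Safari/" tokens as well.
    function attribute(ua: string): { browser: string; os: string } {
      let browser = "Unknown";
      if (ua.includes("Edg/")) browser = "Edge";
      else if (/CriOS\/|Chrome\//.test(ua)) browser = "Chrome";
      else if (ua.includes("Version/") && ua.includes("Safari/")) browser = "Safari";

      let os = "Unknown";
      const ios = ua.match(/iPhone OS (\d+)/);
      if (ua.includes("Windows NT 10.0")) os = "Windows 10 or 11"; // token is frozen
      else if (ios) os = `iOS ${ios[1]}`;

      return { browser, os };
    }

    // The Edge string above is attributed to Edge on Windows, but the frozen
    // "NT 10.0" token cannot distinguish Windows 10 from Windows 11.

Under this kind of attribution, whatever the operating system token says is what the visitor gets counted as.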

This movement was spearheaded by Apple in 2017, when Ricky Mondello announced that Safari Technology Preview 46 “freezes Safari’s user agent string. It will not change in the future”. But this remained a desktop-only change until September 2025, when Jen Simmons and others who work on WebKit announced that the version of Safari shipping in iOS 26 would have its user agent stuck on the previous version of iOS:

Also, now in Safari on iOS, iPadOS, and visionOS 26 the user agent string no longer lists the current version of the operating system. Safari 18.6 on iOS has a UA string of:

Mozilla/5.0 (iPhone; CPU iPhone OS 18_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.6 Mobile/15E148 Safari/604.1

And Safari 26.0 on iOS has a UA string of:

Mozilla/5.0 (iPhone; CPU iPhone OS 18_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/26.0 Mobile/15E148 Safari/604.1

Apple justified this change only by implication, writing “we highly recommend using feature detection instead of UA string detection when writing conditional code”. But, as Jeff Johnson points out, this change does not eliminate version detection entirely:

[…] because Safari is always inseparable from the OS, so it’s possible to derive the iOS version from the Safari version, which continues to be incremented in the User-Agent. On macOS, in contrast, the latest version of Safari typically supports the three latest major OS versions, so Safari 26 can be installed on macOS 15 Sequoia and macOS 14 Sonoma in addition to macOS 26 Tahoe, and therefore the User-Agent — which actually says “OS X 10_15_7”! — is a little more effective at obscuring the OS version.
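Johnson’s workaround can be sketched in a few lines. This is a hypothetical illustration under the assumption he describes — that Safari’s major version continues to track the iOS major version on iPhone — which Apple does not guarantee:

    // On iPhone, derive the iOS major version from Safari's "Version/" token,
    // since the "iPhone OS 18_6" token is now frozen. This assumes Safari's
    // major version keeps matching the iOS major version, per Johnson's point.
    // On macOS the same trick is weaker, because Safari 26 also runs on
    // macOS 14 and macOS 15.
    function iosMajorFromSafariUA(ua: string): number | null {
      if (!ua.includes("iPhone")) return null;   // only meaningful on iPhone
      const match = ua.match(/Version\/(\d+)/);  // e.g. "Version/26.0" yields 26
      return match ? Number(match[1]) : null;    // null for non-Safari browsers
    }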

I noticed this, too, and it led me to a mistaken first guess about why StatCounter was reporting some iOS 26 traffic, but not a lot. I thought StatCounter could have updated its analytics package to interpret this part of the user agent string instead, and that the change may not yet have rolled out to all of its users. I was wrong.

What actually appears to account for iOS 26’s seemingly pitiful adoption rate is that third-party browsers like Chrome and Brave produce a user agent string that looks like this, on my iPhone running iOS 26.3:

Mozilla/5.0 (iPhone; CPU iPhone OS 26_3_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/144.0.7559.53 Mobile/15E148 Safari/604.1

Safari, meanwhile, produces this user agent:

Mozilla/5.0 (iPhone; CPU iPhone OS 18_7 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/26.3 Mobile/15E148 Safari/604.1

“iPhone OS 26_3_0” in Chrome, but “iPhone OS 18_7” in Safari. And iOS 18.7 also exists; Safari on that version produces a nearly identical user agent string, albeit with “Version/18.7” in place of “Version/26.3”. The operating system token is the same in both: “18_7”. StatCounter’s iOS 26 data is not reflective of all iOS users — just those using third-party browsers that still include the current iOS version in their user agent string.
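Running a naive operating system check over those two strings shows the split. Again, a sketch rather than a description of StatCounter’s implementation:

    // Both strings came from the same iPhone running iOS 26.3, yet a parser
    // keyed to the "iPhone OS x_y" token splits them across two OS versions.
    const chromeUA = "Mozilla/5.0 (iPhone; CPU iPhone OS 26_3_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/144.0.7559.53 Mobile/15E148 Safari/604.1";
    const safariUA = "Mozilla/5.0 (iPhone; CPU iPhone OS 18_7 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/26.3 Mobile/15E148 Safari/604.1";

    const osToken = (ua: string) => ua.match(/iPhone OS (\d+_\d+)/)?.[1];
    console.log(osToken(chromeUA)); // "26_3", counted as iOS 26
    console.log(osToken(safariUA)); // "18_7", counted as iOS 18, even though "Version/26.3" is right there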

Even though third-party browsers are available on iOS, most users browse the web through Safari. And that means StatCounter is almost certainly counting the vast majority of people on iOS 26 as iOS 18.7 users. I retrieved those user agent strings using StatCounter’s detection utility, which is how it says you can validate the accuracy of its statistics. And it seems they are not. (I asked StatCounter to confirm this but have not heard back.)

The actual rate of iOS 26 adoption is difficult to know right now. Web traffic to generalist websites, of the kind StatCounter measures, seems to me like it would be a good proxy had its measurement kept up with changes to iOS. Other sources, like TelemetryDeck, indicate a far higher market share — 55% as I am writing this — but its own stats reported nearly 78% adoption of iOS 18 at this time last year, far higher than StatCounter’s 63%. TelemetryDeck’s numbers are based on aggregate data from its in-app analytics product, so they should be more accurate than user agent counts, but that also depends on which apps integrate TelemetryDeck and who uses them. What we can see, though, is the difference between last year and this year at the same point, around 23 percentage points. For comparison, in January 2024, TelemetryDeck reported around 74% had updated to iOS 17 — iOS 26 is 19 points lower.

If its reporting for this year is similarly representative, it likely indicates a slide of around 20 points in iOS 26 adoption. Not nearly as terrible as the misleading StatCounter dashboard suggests, but still a big drop compared to prior years. Apple will likely update its own figures in the coming weeks for a further point of comparison. But even though there are early indications iOS 26 is not as well-received as its predecessors, what we do not know is why that is. Fear not, however, for there are obvious conclusions to be drawn.

Hardy, later in the same Cult of Mac article:

It’s not that millions of iPhone users around the world have somehow overlooked the launch of iOS 26 followed by iOS 26.1 and iOS 26.2. They are holding off installing the upgrades because this is Apple’s most controversial new version in many years. The reason: Liquid Glass — a translucent and fluid new interface. Many elements of the UI go semi-transparent, while clever effects make it seem like users are looking through glass at objects shown on the screen behind the Control Center and pop-up windows.

David Price, of Macworld, made the same assumption based on Hardy’s story — twice:

It’s debatable whether the egregious design of last year’s OS updates falls under the category of arrogance or incompetence; perhaps it’s both. But the takeaway for Apple should be that customer loyalty is finite, and there are consequences when you consistently lower your quality-control standards. When your entire business is built on people liking you, it’s best not to take them for granted.

I have no particular affinity for Liquid Glass. I am not sure its goals are well-conceived, and I do not think it achieves those objectives.

Even so, I think the aversion to Liquid Glass is so strong among some commentators that erroneous stats are fine so long as they confirm their biases. Put it this way: if just 15% of users had, indeed, upgraded to iOS 26, and the reason so many people were remaining on previous versions was Liquid Glass, surely a corrected percentage — perhaps 55%, perhaps lower — should indicate that most people are not actually bothered by Liquid Glass, right?

Yes, there is a likely 20-point gap and, if that is due to Liquid Glass, it should be cause for worry at the highest levels of Apple. iOS is a mass-market operating system. The audience is not necessarily obsessed with information density or an adequate contrast ratio. If a redesign of iOS were exciting, people would race to update, just as they did when iOS 7 was launched. They appear hesitant. Maybe the reason is Liquid Glass, or maybe something else. Or maybe there are further measurement errors.

Whatever the case, I would avoid believing articles making sweeping conclusions based on a single data point. After all, if that number is shown to be incorrect, it destabilizes the whole argument.