Pixel Envy

Written by Nick Heer.

Fuzzy User Interfaces

John Gruber, in his iPhone 4S review from October 2011:

iOS is explicit and visual. Everything you can do in iOS is something you can see and touch on screen. The limits are visible and obvious. Siri, on the other hand, feels limitless. It’s fuzzy, and fuzzy on purpose. There’s no way to tell what will work and what won’t. You must explore.

More than anything else, what made Siri distinctly “Siri” was that it introduced an explorative, fuzzy layer to the operating system. Today’s iOS is a similar creature to the one that shipped with the iPhone 4S; you could hand someone in 2011 an iPhone from today and it would feel familiar. But there has been significant and noticeable growth in the aspects of the operating system that are fuzzy.

Take, for example, the ability for Siri to remind you about whatever is presently onscreen:

This works in a lot of apps — Messages, Phone, News, Maps, and others. It works with third-party apps that have adopted that search indexing functionality we talked about earlier. […]

But, like much of Siri’s functionality, it’s buried under a layer of guesswork and unexpected failures. You can’t, for example, ask Siri to remind you about anything in Photos or Music — say, if you found a playlist in For You that you wanted to listen to later.

Or consider the 3D Touch functionality on the iPhones 6S, which works in some — but not all — apps, from both first- and third-parties.

This isn’t exclusive to iOS, either. Here’s The Verge’s Dieter Bohn on Android’s new “Now On Tap” feature:

Now on Tap plugs into Google’s vast knowledge of the web, but it seems pretty stupid about Google’s vast knowledge about me. Contacts I talk to regularly don’t pop up in Now on Tap, for example, and the suite of apps I depend on aren’t always options.

For now, the feature is a little frustrating in exactly the same way that Siri was frustrating when I first used it. It’s hard to know what Now on Tap can and can’t do — and even if you do know, sometimes it gets it wrong. There are only so many chances a “guess what you need” feature can whiff before it trains you to think that you can’t rely on it.

The invisibility that makes these features so magical when they work is the same thing that makes them so frustrating when they don’t. The invisibility of this interface is also what allows us to take chances with it; it creates a sense of limitlessness until, of course, we stumble upon the limits of its application. If I ask Siri “will I need a coat today?”, I’m provided a weather report. If I ask “will I need a tuque today?”, I get results somewhere between nothing and bupkis.

As we gain more confidence in the abilities of these fuzzy interfaces, they get better: more capable, more accurate, and more helpful. Theoretically, our trust in them would grow at a similar rate to their improvement, but there’s a catch: they cannot improve without our interaction, because a team of people in Cupertino cannot realistically sample all accents, slang, timbres, and general mouth and throat sounds on their own. Add to that a plethora of possible background noises, the strong likelihood that we do not all enunciate as though we are Thomas Sheridan, and the tiny microphones in our devices, and it’s a wonder that they can understand us at all.

One of the biggest challenges this software must overcome in order to become better — where by “better” I mean usable with confidence that it will not confuse “two” and “too” in a dictated text message — is that we need to keep using it despite its immaturity. And that’s a big request when it does, indeed, keep confusing “two” and “too”. The number of times that Siri has butchered everything from text messages to reminders to even the simplest of web searches has noticeably eroded my trust in it.

I implore you not to misread this: it is not a condemnation of Siri, Google Now, or any other context-sensitive, “personal assistant”-type software. This software is far better than it has ever been. But it will take continued patience from us, and regular, noticeable improvements from the teams building it, for us to feel confident in its abilities.