A team of former U.S. government intelligence operatives working for the United Arab Emirates hacked into the iPhones of activists, diplomats and rival foreign leaders with the help of a sophisticated spying tool called Karma, in a campaign that shows how potent cyber-weapons are proliferating beyond the world’s superpowers and into the hands of smaller nations.
[…]
The ex-Raven operatives described Karma as a tool that could remotely grant access to iPhones simply by uploading phone numbers or email accounts into an automated targeting system. The tool has limits — it doesn’t work on Android devices and doesn’t intercept phone calls. But it was unusually potent because, unlike many exploits, Karma did not require a target to click on a link sent to an iPhone, they said.
In 2016 and 2017, Karma was used to obtain photos, emails, text messages and location information from targets’ iPhones. The technique also helped the hackers harvest saved passwords, which could be used for other intrusions.
It isn’t clear whether the Karma hack remains in use. The former operatives said that by the end of 2017, security updates to Apple Inc’s iPhone software had made Karma far less effective.
This story is just one part of a deeper investigation from Schectman and Bing into the United Arab Emirates’ surveillance of dissidents and activists, and it is worth reading in full. Remarkably, it even cites a named source.
The timing of this exploit’s capabilities coincides with the introduction of iMessage media previews. If I were looking to create a security hole in an iPhone without any user interaction, that’s the first place I’d look. Also, note that this report states that this exploit is now “far less effective”; it does not say that the vulnerabilities have been patched.
Any new post by Justin O’Beirne is an immediate must-read for me, and this latest one is no exception. In fact, it’s maybe the one I would most recommend because it’s an analysis of the first leg of a four-year project Apple unveiled earlier this year. Here’s what Matthew Panzarino wrote at the time for TechCrunch:
The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place. This is massively higher-resolution and easier to see, visually. And it’s synchronized with the “panoramic” images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a really holistic view of what’s going on around the car. But the ortho view can do even more for human viewers of the data by allowing them to “see” through brush or tree cover that would normally obscure roads, buildings and addresses.
Regardless of how Apple is creating all of its buildings and other shapes, Apple is filling its map with so many of them that Google now looks empty in comparison. […]
And all of these details create the impression that Apple hasn’t just closed the gap with Google — but has, in many ways, exceeded it…
[…]
But for all of the detail Apple has added, it still doesn’t have some of the businesses and places that Google has.
[…]
This suggests that Apple isn’t algorithmically extracting businesses and other places out of the imagery its vans are collecting.
Instead, all of the businesses shown on Apple’s Markleeville map seem to be coming from Yelp, Apple’s primary place data provider.
Rebuilding Maps in such a comprehensive way is going to take some time, so I read O’Beirne’s analysis as a progress report. But, even keeping that in mind, it’s a little disappointing that what has seemingly been prioritized so far in this Maps update is adding more detailed shapes for terrain and foliage, rather than fixing which places are mapped and where they’re located. It isn’t as though progress isn’t being made, or that it’s entirely misdirected — roads are now far more accurate, buildings are recognizable, and city parks increasingly look like city parks — but the thing that frustrates me most about Apple Maps in my use is that the places I want to go are either incorrectly placed, missing entirely, or have inaccurate information, like hours of operation.
As has become a bit of a tradition around here, I have a review of iOS 12 coming; however, it won’t be out today. Turns out trying to find an apartment in Calgary right now is difficult and time-consuming.
In the interim, please read Federico Viticci’s excellent deep dive into iOS 12. It’s far more detailed than mine will ever be and, as the iOS automation expert, he’s uniquely gifted in explaining this update’s improvements to Siri and the new Shortcuts app.
I’ve been using my iPhone X for nearly a week now and, while I have some thoughts about it, by no means am I interested in writing a full review. There seem to be more reviews of the iPhone X on the web than actual iPhone X models sold. Instead, here are some general observations about the features and functionality that I think are noteworthy.
The Hardware
The iPhone X is a product that feels like it shouldn’t really exist — at least, not in consumers’ hands. I know that there are millions of them in existence now, but mine feels like an incredibly well-made, one-off prototype, as I’m sure all of them do individually. It’s not just that the display feels futuristic — I’ll get to that in a bit — nor is it the speed of using it, or Face ID, or anything else that you might expect. It is all of those things, combined with how nice this product is.
I’ve written before that the magic of Apple’s products and their suppliers’ efforts is that they are mass-producing niceness at an unprecedented scale. This is something they’ve become better at with every single product they ship, and nothing demonstrates that progress better than the iPhone X.
It’s such a shame, then, that the out-of-warranty repair costs are correspondingly high, to the point where not buying AppleCare+ and a case seems downright irresponsible. Using the iPhone X without a case is a supreme experience, but I don’t trust myself enough to do so. And that’s a real pity, because it’s one of those rare mass-produced items that feels truly special.
The Display
This is the first iPhone to include an OLED display. It’s made by Samsung and uses a diamond subpixel arrangement, but Apple says that it’s entirely custom-designed. Samsung’s display division is being treated here much like its chip foundry was when it manufactured Apple’s A-series SoCs.
And it’s one hell of a display. It’s running at a true @3x resolution of 458 pixels per inch. During normal use, I can’t tell much of a difference between it and the 326 pixel-per-inch iPhone 6S that I upgraded from. But when I’m looking at smaller or denser text — in the status bar, for example, or in a long document — this iPhone’s display looks nothing less than perfect.
One of the reasons this display looks so good is because of Apple’s “True Tone” feature, which matches the white balance of the display to the environment. In a lot of indoor lighting conditions, that’s likely to mean that the display is yellower than you’re probably used to. Unlike Night Shift, though, which I dislike for being too heavy-handed, True Tone is much subtler. Combine all of this — the brightness of the display, its pixel density, its nearly edge-to-edge size, and True Tone — with many of iOS’ near-white interface components and it really is like a live sheet of paper in your hand.
Because it’s an OLED display that has the capability of switching on and off individual pixels, it’s only normal to consider using battery-saving techniques like choosing a black wallpaper or using Smart Invert Colours. I think this is nonsense. You probably will get better battery life by doing both of those things, but I’ve been using my iPhone X exactly the same as I have every previous phone I’ve owned and it gets terrific battery life. Unless you’re absolutely paranoid about your battery, I see no reason in day-to-day use to treat the iPhone X differently than you would any other phone.
I’m a total sucker for smaller devices. I’d love to see what an iPhone SE-sized device with an X-style display would be like.
Face ID
Face ID is, for my money, one of the best things Apple has done in years. It has worked nearly flawlessly for me, and I say that with no exaggeration or hyperbole. Compared to Touch ID, it almost always requires less effort and feels just as fast. This is particularly true for login forms on the web: where previously I’d see the Touch ID prompt and have to shuffle my thumb down to the home button, I now just continue staring at the screen and my username and password are just there.
I’m going to great pains to avoid the most obvious and clichéd expression for a feature like this, but it’s apt here: it feels like magic.
The only time Face ID seems to have trouble recognizing me is when I wake up, before I’ve put on my glasses. It could be because my eyes are still squinty at the time and it can’t detect that I’m looking at the screen, or maybe it’s just because I look like a deranged animal first thing in the morning. Note, though, that it has no trouble recognizing me without my glasses at any other time; however, I first set up Face ID while wearing my glasses and that’s almost always how I use it to unlock my phone. That’s how it recognizes me most accurately.
UI Differences
Last week, I wrote that I found that there was virtually no learning curve for me to feel comfortable using the home indicator, and I completely stand by that. If you’ve used an iPad running iOS 11, you’re probably going to feel right at home on an iPhone X. My favourite trick with the home indicator is that you can swipe left and right across it to slide between recently-used apps.
Arguably, the additional space offered by the taller display is not being radically reconsidered, since nearly everything is simply taller than it used to be. But this happens to work well for me because nearly everything I do on my iPhone is made better with a taller screen: reading, scrolling through Twitter or Instagram, or writing something.
The typing experience is, surprisingly, greatly improved through a simple change. The keyboard on an iPhone X is in a very similar place to where it is on a 4.7-inch iPhone, which means that there’s about half an inch of space below it. Apple has chosen to move the keyboard switching button and dictation control into that empty space from beside the spacebar, and this simple change has noticeably improved my typing accuracy.
In a welcome surprise, nearly all of the third-party apps I use on a regular basis were quickly updated to support the iPhone X’s display. The sole holdouts are Weather Line, NY Times, and Spotify.
I have two complaints about how the user interfaces in iOS work on the iPhone X. The first is that the system still seems like it is adapting its conventions to fit bigger displays. Yes, you can usually swipe right from the lefthand edge of the display to go back to a previous screen, but toolbars are still typically placed at the top and bottom of the screen. With a taller display, that means that there can be a little more shuffling of the device in your hand to hit buttons on opposite sides of the screen.
My other complaint is just how out of place Control Centre feels. Notification Centre retains its sheet-like appearance if it’s invoked from the left “ear” of the display, but Control Centre opens as a sort of panelled overlay with the status bar in the middle of the screen when it is invoked from the right “ear”. The lack of consistency between the two Centres doesn’t make sense to me, nor does the awkward splitting of functionality between the two upper corners of the phone. It’s almost as though it was an adjustment made late in the development cycle.
Update: One more weird Control Centre behaviour is that it displays the status bar in a different layout than the rest of the system. The status bar systemwide shows the time and location indicator on the left, and the cellular signal, WiFi indicator, and battery level on the right. The status bar within Control Centre is, left to right: cellular signal, carrier, WiFi indicator, various status icons for alarm and rotation lock, location services indicator, Bluetooth status, battery percentage, and battery icon. The location indicator, cellular strength, and WiFi signal all switch sides; I think they should stay consistent.
I don’t know what the ideal solution is for the iPhone X. Control Centre on the iPad is a part of the multitasking app switcher, and that seems like a reasonable way to display it on the iPhone, too. I’m curious as to why that wasn’t shipped.
Cameras and Animoji
This is the first dual-camera iPhone I’ve owned so, not only do I get to take advantage of technological progress in hardware, I also get to use features like Portrait Mode on a regular basis. Portrait Mode is very fun, and does a pretty alright job in many environments of separating a subject from its background. Portrait Lighting, new in the iPhone 8 Plus and iPhone X, takes this one step further and tries to replicate different lighting conditions on the subject. I found it to be much less reliable, with the two spotlight-style “stage lighting” modes being inconsistent in their subject detection abilities.
The two cameras in this phone are both excellent, and the sensor captures remarkable amounts of data, especially if you’re shooting RAW. Noise is well-controlled for such a small sensor and, in some lighting conditions, even has a somewhat filmic quality.
I really like having the secondary lens. Calling it a “telephoto” lens is, I think, a stretch, but its focal length creates some nice framing options. I used it to take a photo of my new shoes without having to get too close to the mirror in a department store.
Animoji are absurdly fun. The face tracking feels perfect — it’s better than motion capture work in some feature films I’ve seen. I’ve used Animoji more often as stickers than as video messages, and it’s almost like being able to create your own emoji that, more or less, reflects your actual face. I only have two reservations about Animoji: they’re only available as an iMessage app, and I worry that they won’t be updated regularly. The latter is something I think Apple needs to get way better at; imagine how cool it would be if new iMessage bubble effects were pushed to devices remotely every week or two, for example. It’s the same thing for Animoji: the available options are cute and wonderful, but when Snapchat and Instagram are pushing new effects constantly, it isn’t viable to have no updates by, say, this time next year.
AppleCare+
I mentioned above that I bought AppleCare+ for this iPhone. It’s the first time I’ve ever purchased AppleCare on a phone, and only the second time I’ve purchased it for any Apple product — the first was my MacBook Air because AppleCare also covered the Thunderbolt Display purchased around the same time. This time, it was not a good buying experience.
I started by opening up the Apple Store app, which quoted $249 for AppleCare+ for the iPhone X. I tapped on the “Buy Now” button in the app but received an error:
Some products in your bag require another product to be purchased. The required product was not found so the other products were removed.
As far as I can figure out, this means that I need to buy an iPhone X at the same time, which doesn’t make any sense as the Store page explicitly says that AppleCare+ can be bought within sixty days.
I somehow wound up on the check coverage page where I would actually be able to buy extended coverage. After entering my serial number and fumbling with the CAPTCHA, I clicked the link to buy AppleCare. At that point, I was quoted $299 — $50 more than the store listing. I couldn’t find any explanation for this discrepancy, so I phoned Apple’s customer service line. The representative told me that the $249 price was just an estimate, and the $299 price was the actual quote for my device, which seems absurd — there’s simply no mention that the advertised price is anything other than the absolute price for AppleCare coverage. I went ahead with my purchase, filling in all my information before arriving at a final confirmation page where the price had returned to $249, and that was what I was ultimately charged.
It’s not the $50 that troubles me in this circumstance, but the fact that there was a difference in pricing at all between pages on Apple’s website. I don’t know why I was ever shown a $299 price, nor do I understand why I’m unable to use the Apple Store app to purchase AppleCare+ for my iPhone X using my iPhone X.
David Zax, in a must-read article for Fast Company, describes the litigation initiated by Casper against several mattress review websites:
On April 29, 2016, Casper filed lawsuits against the owners of Mattress Nerd, Sleep Sherpa, and Sleepopolis (that is, Derek), alleging false advertising and deceptive practices.
Mattress Nerd and Sleep Sherpa quickly settled their cases, and suddenly their negative Casper reviews disappeared from their sites, in what many onlookers speculated was a condition of the settlements. But by the end of 2016, when I started closely studying the lawsuits, Derek’s Casper review remained, defiantly, up on Sleepopolis. He was soldiering on in his legal battle with the mattress giant. People who knew him called Derek a fighter; one of his nicknames was “Halestorm.”
Casper had another way of referring to him. Derek was “part of a surreptitious economy of affiliate scam operators who have become the online versions of the same commission-hungry mattress salesmen that online mattress shoppers have sought to avoid,” Casper’s lawsuit alleged. The company complained that Derek was not forthright enough about his affiliate relationships, noting his disclosures were buried in a remote corner of his site. This did violate recently issued FTC guidelines, and Derek updated his site to comply.
This is a deeply disturbing piece. Derek Hales, the founder of Sleepopolis, was doing some shady things that seemed to be driven by the value of affiliate links more than his honest opinion of the mattresses. But Casper’s practices are even more suspect, beginning with this correspondence between CEO Phillip Krim and Jack Mitcham of Mattress Nerd:
In January 2015, Krim wrote Mitcham that while he supported objective reviews, “it pains us to see you (or anyone) recommend a competitor over us.”
Krim went on: “As you know, we are much bigger than our newly formed competitors. I am confident we can offer you a much bigger commercial relationship because of that. How would you ideally want to structure the affiliate relationship? And also, what can we do to help to grow your business?”
[…]
Krim then upped his offer, promising to boost Mitcham’s payouts from $50 to $60 per sale, and offering his readers a $40 coupon. “I think that will move sales a little more in your direction,” replied Mitcham on March 25, 2015. In the months that followed, Mattress Nerd would become one of Casper’s leading reviews site partners. (The emails surfaced due to another mattress lawsuit, GhostBed v. Krim; if similar correspondence exists with Derek Hales, it has not become public.)
It certainly sounds like Krim was, behind the scenes, financially incentivizing reviewers to push the Casper mattress. You’ll want to read Zax’s full article for the kicker to the Sleepopolis saga. It’s atrocious.
Update: I’ve been racking my brain all day trying to think about what the end of Zax’s story reminds me of:
“Hello!” ran the text beside the headshot. “My name is Dan Scalco and I’d like to personally welcome you to the brand new version of Sleepopolis. Here’s what’s up… On July 25th, 2017 our company acquired Sleepopolis.com …. Derek Hales and Samantha Hales are no longer associated with Sleepopolis.”
An italicized note added:
“In July 2017, a subsidiary of JAKK Media LLC acquired Sleepopolis.com. Casper provided financial support to allow JAKK Media to acquire Sleepopolis.”
David Carr, writing in the New York Times in 2014:
Last week, I read an interesting article about how smart hardware can allow users to browse anonymously and thus foil snooping from governments. I found it on what looked like a nifty new technology site called SugarString.
Oddly enough, while the article mentioned the need for privacy for folks like Chinese dissidents, it didn’t address the fact that Americans might want the same kind of protection.
There’s a reason for that, although not a very savory one. At the bottom of the piece, there was a graphic saying “Presented by Verizon” followed by some teeny type that said “This article was written by an author contracted by Verizon.”
SugarString writers were apparently prohibited from writing stories about net neutrality or the NSA’s spying activity — remember, this was in 2014, when both of those topics were especially concerning. So if you were going to SugarString for your tech news, you were highly misinformed. Likewise, if you were to visit Sleepopolis — owned by Casper — do you think you’d be getting a fair review of mattress buying options?
The reason I’ve been puzzled all day about this is because I’m nearly certain that there was a similar marketing-spun publication that was created by — I think — a mining or oil and gas company. I don’t think I’m making this up or misremembering it, so if you have any idea what I might be thinking about, let me know.
I’ve got my balcony door wide open this evening and the breeze it’s creating simply isn’t making a difference — I feel like I’m melting into my couch. I should be used to this after a record-shattering summer, but I am not. I live in Canada, in a city where snowfall has been recorded in every month. I am exhausted. I’m holding in one hand a glass of The Hatch’s 2016 “Rhymes with Door Hinge” and, with the other, I am balancing my iPad perhaps a little too precariously on my leg.
I’m flipping through one of the Atlantic’s excellent weekly photo galleries and I see an amazing picture that I know a friend of mine will love. I put down my glass of wine to be able to perform a somewhat tricky routine of dragging the photo with one finger, dragging the URL with another, swiping from the right-hand part of the screen to float Messages over Safari with a third finger, then navigating to that friend’s chat thread and dropping both the image and URL into a message to send it off. I’m impressed, but also not quite used to these complex interactions. I still feel clumsy sometimes when I do them — a thought that was underscored moments later when I went to pick up my glass of wine only to spill it all over my coffee table.
iOS 11, then: it gives you all kinds of fun new powers, especially on an iPad, but it won’t save you if you’re already a klutz.
I’ve been using iOS 11 daily since it was announced at WWDC and, rather than go through each feature point-by-point like an extended changelog with commentary, I thought I’d explore a bit of how this update feels different with daily use. There’s a lot to unpack and, while I think the vast majority of this upgrade is excellent and demonstrates clear progress in areas previously ignored, I feel there are some things that are really and truly confused. Let me show you what I mean.
The Weird Stuff
Let’s start with the lock screen, because that’s where pretty much every iOS interaction will start. When you unlock the device, the lock screen now slides up as though it’s a cover overtop the rest of the system. In some places, like notification preferences, Apple even calls it the “Cover Screen”. But, while this animation suggests that the lock screen is now sitting in an invisible place above the top of the screen, you can’t swipe upwards to unlock a non-iPhone X device — that action will scroll notifications instead — nor can you pull down from the top to lock it.
Making matters even more confusing, if you do pull down from the top of an unlocked device, the screen looks like the lock screen, but doesn’t actually lock the device.
Here’s another example: the iPad and other devices that don’t have 3D Touch displays now support some 3D Touch functionality. If you touch and hold on a notification on the lock screen, for example, it looks like you’re doing the “peek” gesture. The new grid-based Control Centre requires 3D Touch interactions on the iPhone but, again, those gestures are replaced with touch-and-hold on the iPad. I guess these are fine adaptations, but it indicates to me that aspects of the system were designed in anticipation of a mix of devices that don’t yet exist and some — but not all — of the devices that do. It is inconsistent, though: while it’s possible to use 3D Touch interactions in Control Centre and on notifications in Notification Centre, similar “peek” interactions don’t work on home screen icons or within apps.
The differences in iOS 11, then, continue to balance new functionality with further complications. But this should be no surprise to those who have used Apple’s ecosystem of devices for several years; it is merely accelerating a trend of growing the features of iOS without forgetting its roots. iOS was, in many ways, a fresh start for the future of computing and each iteration of the OS has built upon that. Sometimes, as above, it feels as though these additions are moving a little too fast. I notice this most when additions or updates feel perhaps incomplete, or, at least, not wholly considered.
As an example, this iteration of Control Centre is the third major interpretation since iOS 7, released just four years ago. It no longer splits its controls across two pages which, I’m sure, ought to make some people very happy — I was never bothered by that. Its grid-like layout has been touted as being “customizable”, but that’s only true of the app launching and single-function icons across the bottom: you know, the buttons for Calculator, Camera, or the flashlight. You can now choose from over a dozen different apps and functions, including screen recording and a quick-access remote for the Apple TV, and you’re no longer limited to just four of these controls — if there are too many, Control Centre will scroll vertically.
You’d think, though, that turning Control Centre into a grid would make it possible to rearrange sections of it by what you use most, or hide controls you never use. That isn’t possible in this version. You might also think that adding a level of customizability would make it possible to assign third-party apps to certain Control Centre launching points — for example, launching PCalc instead of Calculator, or Manual instead of Camera. But that hasn’t happened either. It is also not possible to change which WiFi network you are connected to from Control Centre, despite the additional depth enabled by 3D Touch controls.
Here’s another example of where things feel a bit incomplete: Slide Over and Split View on the iPad. Previously, dragging an app into either multitasking mode required you to swipe from the right edge to expose a grey panel full of oddly-shaped rounded rectangles, each of which contained an app icon. Apart from looking ugly, which it was, this UI made absolutely no sense to me. What were the rounded rectangles representing? Why did they need to be so large? Why did such an obviously unscalable UI ship?
Thankfully, this interface is no more for iOS. iPad multitasking is now made somewhat easier by the new systemwide floating Dock. It works and looks a little bit like the Dock on MacOS, insomuch as it contains your favourite apps and can be accessed from within any app simply by swiping upwards from the bottom of the screen. If you want to get an app into Split View or Slide Over, all you need to do is drag its icon up from the Dock and let it expand into a multitasking view on either side of the open app.
But hang on just a minute: if you’re on the home screen, dragging an app icon up from the Dock will remove that app from the Dock. So, in one context, the action is destructive; in others, it’s constructive. That inconsistency feels bizarre in practice, to say the least.
And then there’s the process of getting an app into a multitasking view when it isn’t a Dock app. You can start from the home screen or Spotlight in Notification Centre by finding your app, then touching and holding on its icon until it starts to float. Then, either launch an app with another of your fingers (if you’re starting on the home screen) or press the home button to close Spotlight. Wait until the app icon expands in place, then drop it on either side of the screen to get it into multitasking. It took me a little while to figure out this gymnastics routine and, if I’m honest with myself, it doesn’t feel fully considered. The Dock is brilliant, but the trickiness of getting non-Dock apps into a multitasking view doesn’t yet feel obvious enough.
There is, however, a minor coda of relief: the Dock has space on the righthand side, past the very Mac-like divider, for “suggested” apps. This area tends to include non-Dock apps that you’ve recently used, apps from Handoff, or apps triggered when you connect headphones. But, as this Dock area relies upon technology that is “learning” user patterns rather than being directly user-controlled, the apps you’re expecting may not always be in that area of the Dock. When it works, it’s amazing; when it doesn’t, you still have to do the somewhat-complicated dance of launching apps from the home screen.
Finally, the Dock has more of that pseudo-3D Touch functionality. You can touch and hold on a supported app’s icon to display a kind of popover menu, which looks a lot like the 3D Touch widgets that display on iPhone apps. But they’re not the same thing; apps that have a widget on the iPhone will have to add a different kind of functionality to show a very similar feature in the iPad’s Dock.
So these things — the Dock and Control Centre — feel like they are hinting at newer and more exciting things, but don’t quite conclude those thoughts. They feel, simply, rushed.
In other ways, though, it can sometimes feel like an addition to iOS has taken longer than it should.
Drag and Drop, Keyboard Flicks, and Other iPad Improvements
That statement, naturally, leads me neatly to systemwide cross-application drag and drop, making its debut this year. There are apparently lots of reasons why drag and drop was not in iOS previously — for example, it seems as though APFS and its cloning and snapshot features help enable a faster and more efficient drag and drop experience. The new Dock, which allows for more efficient app switching, also seems to have played a role. But regardless of why it took so many years for such a natural interaction to debut on Apple’s touch devices, we should focus on the what of it. Is it good?
Oh, yes. Very.
I love many of the iPad enhancements in this release, but none has been as strong for me as the implementation of drag and drop. Not only can you drag stuff across apps, the drag interactions are separate from the apps themselves. They kind of live in a layer overtop the rest of the system, so you can move around and find just the app you’re looking for — whether you launch it from the Dock, app switcher, home screen, or Spotlight.
But my favourite thing about drag and drop on iOS, and the reason I’ve been so impressed by it, is that you can use all of your fingers to “hold” dragged items until you’re ready to drop them. You can also drag items from multiple sources and even multiple apps. It’s crazy good, to the point where dragging and dropping on a traditional computer using a mouse cursor feels like a kludge. In fact, drag and drop is one of the biggest reasons why I’ve chosen to use an iPad more in the past few months than I did for the preceding year.
Developers do have to add support for drag and drop in their apps, but some UI components — like text areas — will support drag and drop in any app without the developer needing to make adjustments.
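To give a sense of what adopting that support looks like, here’s a minimal sketch of the iOS 11 drop API; the view controller, its image view, and the image-only handling are my own illustrative choices, not from any particular app:

```swift
import UIKit

class PhotoDropViewController: UIViewController, UIDropInteractionDelegate {
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        view.addSubview(imageView)

        // Opt this view in to receiving drops. Text views and text fields
        // get equivalent behaviour with no code at all.
        view.addInteraction(UIDropInteraction(delegate: self))
    }

    // Only accept drag sessions that can deliver an image.
    func dropInteraction(_ interaction: UIDropInteraction, canHandle session: UIDropSession) -> Bool {
        return session.canLoadObjects(ofClass: UIImage.self)
    }

    // Show the badge for a copy operation while the drag is over this view.
    func dropInteraction(_ interaction: UIDropInteraction, sessionDidUpdate session: UIDropSession) -> UIDropProposal {
        return UIDropProposal(operation: .copy)
    }

    // Load the dropped image asynchronously and display it.
    func dropInteraction(_ interaction: UIDropInteraction, performDrop session: UIDropSession) {
        session.loadObjects(ofClass: UIImage.self) { items in
            if let image = items.first as? UIImage {
                self.imageView.image = image
            }
        }
    }
}
```

That delegate conformance is essentially the whole story for receiving drops; supplying draggable content is a similarly small amount of work on the source side with UIDragInteractionDelegate.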
The other really big enhancement that has completely transformed my iPad experience is the new app switcher. Swiping from the bottom of the screen reveals the new floating Dock, but a continued (or second) swipe will show the new app switcher. Instead of showing a single app at a time, six thumbnails now fit onto the screen of my 9.7-inch model at once, making for a much better use of the display’s space. I’m not sure how many app thumbnails fit into a 12.9-inch model’s screen; I hope for more.
Other than being vastly more efficient, which makes the Swiss half of me extremely happy, the app switcher also preserves app “spaces”. When I’m writing, I like to have Slack and Tweetbot open in split-screen, put Safari and Notes together, and keep Byword in its own space. Now, whenever I switch between these, those pairings are retained: if I tap on Tweetbot in the Dock, I’ll see Tweetbot and Slack, exactly as I left them. This makes it really easy to construct little task-specific parts of the system.
Another great enhancement to the system is the new keyboard. Instead of having to navigate between letters, numbers, and symbols with a modal key, you can now swipe down on individual keys to insert common characters. It takes some getting used to — with the way I type, I often insert a “0” where I mean to type a “p”, for instance. Unfortunately, this relatively common typing mistake isn’t caught by autocorrect. Maybe I’m just sloppy; I’m not sure. Even with my misplaced numerals, I appreciate this keyboard refinement. It makes typing so much faster, especially since I frequently have to type combinations of letters and numbers while writing Pixel Envy. I still think frequent patterns — say, postal codes, which in Canada alternate between letters and numbers — should be automatically formatted as you type, but this keyboard is definitely a great step up once you get used to it.
There are some lingering problems I have with the iPad’s keyboard in particular, however. I find that it occasionally registers keys tapped in fast succession as a two-finger tap, which invokes a text selection mode. I have begun to replace entire sentences without realizing it because of this. I wish the iPad’s keyboard could do a better job of understanding the difference between fast typing and invoking selection mode; the goal should be to make the virtual keyboard as close as possible to a physical keyboard in terms of user confidence and key registration accuracy. Also, I continue to have absolutely awful luck with autocorrect: it capitalizes words seemingly at random, changes word tense several typed words later — when I began typing the word “seemingly” just now, it changed “capitalizes” to “capitalized” — and is frequently a focus-disrupting nuisance. It can be turned off in Settings, but I find that the number of times autocorrect is actually useful just barely outweighs the times that it is frustrating. Enhancing autocorrect is something I believe should be a focus of every iOS release, major or minor.
But, even with all the attention lavished upon the iPad this year, there are still some ultra-frustrating limitations. With the exception of Safari, you can only open one instance of an app at a time. I cannot tell you how frequently I have two different windows from the same app open at the same time on my Mac, and it’s really irritating to not be able to do that on my iPad, especially with the far better support for multiple apps in iOS 11.
There are other things that have left me wanting on the iPad, too, like the stubbornly identical home screen. I’m not entirely sure it needs a complete rethink. Perhaps, somewhere down the line, we could get a first page home screen that acts a little more like MacOS, with recent files, suggested apps, widgets, and a lot more functionality. But even in the short term, it would make sense to be able to add more icons on each page, especially on the larger-sized models.
And, strangely, in terms of space utilization, the iPad fares slightly worse on iOS 11 than it did running iOS 10 because Notification Centre has reverted to a single-column layout. There may be a reason for this — maybe even a really good one — but any attempt to rationalize it is immediately rendered invalid because the iPhone actually gains a two-column Notification Centre layout in landscape on iOS 11. I do not understand either decision.
I also think that it’s unfortunate that Siri continues to take over the entire display whenever it is invoked. I hope a future iOS update will treat Siri on the iPad more like a floating window or perhaps something that only covers a third of the display — something closer to the MacOS implementation than a scaled-up iPhone display. I know it’s something that’s typically invoked only briefly and then disappears, but it seems enormously wasteful to use an entire display to show no greater information than what is shown on the iPhone.
Siri
Here’s a funny thing about that previous paragraph: using the word “Siri” to describe Apple’s voice-controlled virtual assistant is actually a bit antiquated. You may recall that, in iOS 10, the app suggestions widget was renamed “Siri App Suggestions”; in iOS 11, it has become clear that “Siri” is what Apple calls their layer of AI automation. That’s not necessarily super important to know in theory, but I think it’s an interesting decision; it’s one thing for a website to note that their search engine is “powered by Google”, but I’m not sure Siri has the reputation to build Apple’s AI efforts on. Then again, perhaps it’s an indication that these efforts are being taken more seriously.
In any case, the new stuff: the personal assistant front-end for Siri has a new voice. In many contexts, I’ve felt it sounds more natural, and that alone helps improve my trust in Siri. However, I’m not sure it’s truly more accurate, though I perceive a slight improvement.
This idea of Siri as a magical black box is something I’ve written about several times here. I will spare you my rehashing of it. Of course, this is the path that many new technologies are taking, from Google and Amazon’s smart speakers to the mysterious friend recommendations in Facebook and LinkedIn. It’s all unfathomable, at least to us laypeople. When it works, it’s magical; when it doesn’t, it’s frustrating, and we have no idea what to do about it, which only encourages our frustration. These technologies are like having a very drunk butler following you everywhere: kind of helpful, but completely unpredictable. You want to trust them, but you’re still wary.
Even with a new voice and perhaps slightly more attentive hearing, Siri is still oblivious to common requests. I am writing these words from a sandwich place near where I live called the Street Eatery. It was recommended to me by Siri after I asked it for lunch recommendations, which is great. However, when I followed up Siri’s recommendation by asking it to “open the Street Eatery’s website”, it opened a Trip Advisor page for a place called the Fifth Street Eatery in Colorado, instead of the restaurant located blocks away that it recommended me only moments before.
In iOS 11, Siri also powers a recommendation engine in News, and suggests search topics in Safari when you begin using the keyboard. For example, when I tapped on the location bar after reading this article about Ming-Chi Kuo’s predictions for the new iPhone, it correctly predicted in the QuickType bar that I may want to search more for “OLED”, “Apple Inc.”, or “iPhone”. But sometimes, Siri is still, well, Siri: when I tapped on the location bar after reading a review of an Indian restaurant that opened relatively recently, its suggestions were for Malaysian, Thai, and Indonesian cuisine — none of which were topics on that page. The restaurant is called “Calcutta Cricket Club”, and the post is tagged in WordPress with “Indian cuisine”, so I have no idea how it fathomed those suggestions. And there’s no easy way for me to tell Apple that they’re wrong; I would have to file a radar. See the above section on magical black boxes.
To improve its accuracy over time, Siri now syncs between different devices. Exactly what is synced over iCloud is a mystery — Apple hasn’t said. My hunch is that it’s information about your accent and speech patterns, along with data about the success and failure of different results. Unfortunately, even with synced data, Siri is still a decidedly per-device assistant; you cannot initiate a chain of commands on one device, and then pick it up on another. For example, I wouldn’t be able to ask my iPad to find me recommendations for dinner, then ask my iPhone to begin driving directions to the first result without explicitly stating the restaurant’s name. And, even then, it might pick a restaurant thousands of miles away — you just never know.
User Interface and Visual Design
At the outset of this review, I wrote that I wanted primarily to relay my experiences with the iOS 11 features I use most and that had the greatest impact on how I use these devices. I want to avoid the temptation of describing every change in this version, but I don’t think I can describe the ways I use my iPhone and iPad without also writing about the ways in which Apple has changed its visual design.
Every new major release of iOS gives Apple the chance to update and refine their design language, and iOS 11 is no exception. Last year, Apple debuted a new style of large, bold titles in News, Music, and the then-new Home app; this year, that design language has bled throughout the system. Any app defined by lists — including Mail, Phone, Contacts, Wallet, Messages, and even Settings — now has a gigantic billboard-esque title. It kind of reminds me of Windows Phone 7, only nicer. I like it a lot and, based on the screenshots I’ve seen so far, it appears to work well to define the upper area of the iPhone X.
In practice, though, this treatment means that the top quarter of the screen is used rather inefficiently in an app’s initial view. You launch Settings, for example, and the screen is dominated by a gigantic bold “Settings” label. You know you’re in Settings — you just launched it. A more cynical person might point to this as an indication that all post-iOS 7 apps look the same and, therefore, some gigantic text is needed to differentiate them. I do not believe that is the case — there is enough identifying information in each app, between its icon, layout, and contextually-relevant components.
And yet, despite the wastefulness of this large text, I still think it looks great. The very high resolution displays in every device compatible with iOS 11 and Apple’s now-iconic San Francisco typeface combine to give the system a feeling of precision, intention, and clarity. Of course, it’s worth asking why, if it’s so great, a similar large header is not shown as one navigates further into an app. I get the feeling that it would quickly become overbearing; that, once you’re deep within an app, it’s better to maximize efficiency — in magazine terms, the first page can be a cover, but subsequent levels down within the same app should be the body.
Fans of clarity and affordances in user interfaces will be delighted to know that buttons are back. Kind of. Back when iOS 7 debuted, I was among many who found the new text-only “buttons”, strewn throughout the system and advocated for in the HIG, contentious and confusing. Though I’ve gotten more used to them over the past several years, my opinion has not changed.
iOS 11 is part of what I’m convinced is a slow march towards once again having buttons that actually look like buttons. The shuffle and looping controls in Music, for instance, are set against a soft grey background. The App Store launcher in Messages is a button-looking button. But, lest you think that some wave of realization has come across the visual designers working on iOS, you should know that the HIG remains unchanged, as does the UIButton control.
There are some noteworthy icon changes in this update as well. I quite like the new Contacts icon and the higher-contrast icon for Settings, but I have no idea what Apple’s designers were thinking with the new Calculator icon. It’s grey; it has a glyph of a calculator on it in black and orange. And I reiterate: it is grey. The Reminders icon has been tweaked, while the new Maps icon features a stylized interpretation of Apple Park which, per tradition, is cartographically dubious. I don’t like the plain-looking Files icon; I remain less-than-enthusiastic about almost any icon that features a glyph over a white background, with the exceptions of Photos and the NY Times app.
The new App Store icon proved controversial when it launched, but I actually like it. The previous glyph was a carryover from MacOS and, while I don’t think that it was confusing anyone, I do think that this simplified interpretation feels more at home on iOS. The new iTunes Store icon is the less successful of the two redesigns, I feel. As Apple Music has taken over more of the tunes part of iTunes, it appears that the icon is an attempt to associate iTunes with movies and TV shows through the blending of the purple background colour and the star glyph — both attributes, though not identical, are used for the iMovie icon as well. But this only seems to highlight the disconnect between the “iTunes Store” name and its intended function.
Icons on tab bars throughout the system have also been updated. In some places, solid fills replace outlines; in others, heavier line weights replace thin strokes. I really like this new direction. It’s more legible, it feels more consistent, and it simply looks better. These are the kinds of refinements I have expected to see as the course correction that was iOS 7 matures. While it has taken a little longer than I had hoped, it’s welcome nevertheless.
And, for what it’s worth, the signal bars have returned to the status bar, replacing the circular signal dots. This reversion seems primarily driven by the iPhone X’s notched display, but every iPhone and iPad model gets the same status bar. I cannot figure out why the brand new Series 3 Apple Watch uses dots to display LTE signal strength.
To complement the static visual design enhancements, many of the system animations have been tweaked as well. When you lift an iPhone 6S or later, the screen now fades and un-blurs simultaneously; it’s very slick. The app launching animation has been updated, too, so that it now appears as though the app is expanding from its icon. It’s a small thing; I like it.
Assorted Notes and Observations
The App Store has been radically redesigned. I’m relegating it to this section because, while I applaud the efforts behind separating games from other kinds of apps and I think the News tab is a great way to help users find apps that might be buried by the hundreds of thousands of others, it has not changed the way I use the App Store. I’m pretty settled into a certain routine of apps, so I don’t regularly need to look for more. I didn’t ever really think, during my time testing it, to check the App Store for what is being featured or what collections have been created lately.
ARKit and Core ML are both very promising technologies that, I think, will need several more months in developers’ hands to bear fruit. Carrot Weather has a fun AR mode today, if you want to try it out.
There aren’t any new Live or Dynamic wallpapers in iOS 11. Live wallpapers were introduced two years ago; Dynamic wallpapers were introduced four years ago.
The new still wallpapers are a clear retro play. There are the familiar six-colour rainbow stripes, a Retina-quality version of the Earth photograph from the original iPhone, and — for the first time — Apple has included a plain black wallpaper.
Apple Music has gained some social networking features that, I think, might actually work well. After iTunes Ping and Connect, this is the third time Apple has really tried to push any kind of social functionality (Connect still exists in Apple Music, but I don’t know anybody who actually uses it). Apple Music’s new user profiles can automatically show your friends what you’re listening to, and you can display your playlists too. I expect the automatic sharing aspect — as opposed to requiring users to manually update their profiles — to be a primary factor if it continues to be as successful in general use as it has been for me in beta.
There’s also a new take on a shared party playlist. I sincerely doubt that many people go to house parties to control the playlist in a group setting. Maybe this will change with the launch of the HomePod but, like Apple’s previous attempts — Party Shuffle and iTunes DJ — I expect this feature to be largely forgotten.
As I mentioned last year, I think the Memories feature in Photos is one of the best things Apple has built in a long time. iOS 11 promises additional event types, like weddings and anniversaries, which provides more variety in the kinds of Memories that are generated. I love this kind of stuff.
The vast majority of system photo filters have been replaced with much more sensitive and realistic filters. I’ve used them several times. While they’re no replacement for my usual iPhone editing process, they work much better in a pinch than the ones that date back to iOS 7, simply because they’re less garish.
You can now set Live Photos to loop, “bounce” back and forth, or even convert them into long exposure photos. These are fine effects, but I wish the long exposure effect would do better at detecting faces or foreground objects and creating a blur in the background. This may be more sophisticated on iPhones equipped with dual cameras; I’m not sure.
There’s a new file format for video and images — the latter of which is probably the one that will cause the most unnecessary concern. Instead of JPG, photos are saved in the relatively new High-Efficiency Image Format, or HEIF. I have not noticed any compatibility issues, and you get smaller file sizes and fewer compression artifacts in return.
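If you do run into a recipient or service that can’t read HEIF yet, transcoding is trivial. Here’s a minimal sketch, where the helper function is hypothetical and assumes you already have the photo decoded as a UIImage:

```swift
import UIKit

// Hypothetical helper: re-encode a photo (for example, one loaded from a
// HEIC file) as JPEG data for recipients that can't read HEIF yet.
func jpegData(from image: UIImage, quality: CGFloat = 0.9) -> Data? {
    // UIImageJPEGRepresentation re-encodes the decoded bitmap regardless
    // of the container the image originally came from.
    return UIImageJPEGRepresentation(image, quality)
}
```

iOS does something much like this automatically when sharing to destinations that don’t declare HEIF support, which is presumably why compatibility issues are so rare.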
The new Files app ostensibly provides access to all of your files in iCloud Drive and supporting third-party apps. However, because the biggest enhancement here is third-party app support, my time with it while testing has been limited to what I have in iCloud, which makes the app function similarly to the iCloud Drive app it replaces. I look forward to using it as more third-party apps support it.
Maps now supports interior maps for effectively a handful of malls and airports. If you live in a very large city in the United States or China, this will likely be useful to you; for the rest of us, I guess they have to start somewhere.
Flyover has also been enhanced in Maps, turning it into a sort of Godzilla mode where you can walk around a city overhead from your living room. It is ridiculously cool. I couldn’t confirm whether this is built with ARKit.
There are two new full-screen effects in Messages: “Echo” and “Spotlight”. The former is easily the more interesting and fun of the two. Also, the app drawer has been redesigned so it’s way easier to use.
Messages will support peer-to-peer Apple Pay in the United States later this year — my understanding is that there is a regulatory delay holding it up. As of June, the iPhone 7 was available in about ninety other countries worldwide. There are probably legal requirements that need to be satisfied for it to roll out anywhere else but, as an end user, the reasoning matters little. All that matters to me about this feature is that it will not be available where I live, and that’s a huge bummer.
The 3D Touch shortcut to get into the app switcher has been removed in this version of iOS for reasons I can’t quite figure out. It took me a while to get used to its removal; I used it a lot in iOS 9 and 10.
Safari now takes steps to restrict ad tracking and retargeting cookies to twenty-four hours of data validity. The advertising industry’s biggest trade groups are furious about this. Their creepy selves can fuck straight off.
Final Thoughts
As I’ve been writing for a few years now in occasional posts here, it feels like Apple has been going through a simultaneous series of transitions. Their services business is growing dramatically, they’ve switched over to an SSD-and-high-resolution-display product lineup — for the most part — and have been demonstrating how nontraditional devices like the iPad and Apple Watch can supplant the Mac and iPhone in some use cases.
While this story obviously isn’t going to wrap up so long as technology and Apple keep pushing things forward, iOS 11 feels like it is starting to resolve some of the questions of past releases. Despite my complaints about the rushed-feeling Control Centre and multitasking implementations, I also think that Apple is doing a lot of things very right with this update. Drag and drop is awesome, Siri is getting better, there are visual design improvements throughout, and Apple Music’s social networking features are very fun.
There is a lot that I haven’t covered in this review. That’s deliberate — some features aren’t available where I live or on the devices I use, while other changes have been small enough that you may not notice them day-to-day. However, the cumulative effect of all of these changes is a more complete, well-rounded version of iOS. I do think that the action of putting apps into Slide Over or Split View needs a more considered approach, but I can’t let that spoil how much better the Dock is than the old scrolling list overlay.
The short version of this review is very simple: if you reach for one of your iOS devices instead of running to your Mac for an increasing number of tasks, as Apple is coaxing you to do with each update, you’ll love iOS 11. Even if you don’t, and your iOS devices remain a peripheral extension to your Mac, you’ll find much to love in this version. Make no mistake: this isn’t trying to bring the Mac to your iPhone or iPad; iOS 11 is all about building upon their capabilities in a very iOS-like way. I would expect nothing less and, despite my wishes throughout this review for more, I think iOS 11 feels more complete than any previous update. It’s one of those updates where there’s very little you can put your finger on, but there are a lot of small things that make the system better.
iOS 11 is available as a free update for 64-bit iOS devices only: the iPhone 5S or later, iPad Mini 2/iPad Air or later, and the sixth-generation iPod Touch.
What does international political corruption have to do with type design? Normally, nothing — but that’s little consolation for the former prime minister of Pakistan. When Nawaz Sharif and his family came under scrutiny earlier this year thanks to revelations in the Panama Papers, the smoking gun in the case was a font. The prime minister’s daughter, Maryam Sharif, provided an exculpatory document that had been typeset in Calibri — a Microsoft font that was only released for general distribution nearly a year after the document had allegedly been signed and dated.
A “Fontgate” raged. While Sharif’s supporters waged a Wikipedia war over the Calibri entry, type designer Thomas Phinney quietly dropped some history lessons about the typeface on Quora, and found himself caught in a maelstrom of global reporting. Phinney said that because Calibri has been in use for several years, people have forgotten that it’s a relatively new font. This has made Calibri a hot topic in document forgery as fakers fail to realize that this default Microsoft Word typeface will give itself away.
This wasn’t Phinney’s first forgery rodeo. He calls himself a font detective—an expert called upon in lawsuits and criminal cases to help determine documents’ authenticity based on forensic analysis of letterforms used, and sometimes the ways in which they appear on paper. Phinney even IDs each of his cases with a Sherlock-Holmesian title: The Dastardly Divorce, The Quarterback Conundrum, and The Presidential Plot.
This is such a great piece. Given how tedious it can be for even an expert like Phinney to ascertain a document’s authenticity, try to imagine the kind of forensic work that will be needed in the near future to try to identify whether a video of someone speaking is real.
Maciej Cegłowski, in an infinitely quotable transcript from a talk he gave at Republica Berlin:
The danger facing us is not Orwell, but Huxley. The combo of data collection and machine learning is too good at catering to human nature, seducing us and appealing to our worst instincts. We have to put controls on it. The algorithms are amoral; to make them behave morally will require active intervention.
The second thing we need is accountability. I don’t mean that I want Mark Zuckerberg’s head on a pike, though I certainly wouldn’t throw it out of my hotel room if I found it there. I mean some mechanism for people whose lives are being brought online to have a say in that process, and an honest debate about its tradeoffs.
Cegłowski points out, quite rightly, that the data-addicted tech industry is unlikely to effectively self-regulate to accommodate these two needs. They’re too deeply-invested in tracking and data collection, and their lack of ethics has worked too well from a financial perspective.
Cegłowski, again:
But real problems are messy. Tech culture prefers to solve harder, more abstract problems that haven’t been sullied by contact with reality. So they worry about how to give Mars an earth-like climate, rather than how to give Earth an earth-like climate. They debate how to make a morally benevolent God-like AI, rather than figuring out how to put ethical guard rails around the more pedestrian AI they are introducing into every area of people’s lives.
The tech industry enjoys tearing down flawed institutions, but refuses to put work into mending them. Their runaway apparatus of surveillance and manipulation earns them a fortune while damaging everything it touches. And all they can think about is the cool toys they’ll get to spend the profits on.
The message that’s not getting through to Silicon Valley is one that your mother taught you when you were two: you don’t get to play with the new toys until you clean up the mess you made.
I don’t see any advantage to having a regulated web. I do see advantages to having regulated web companies.
All of us need to start asking hard questions of ourselves — both as users, and as participants in this industry. I don’t think users are well-informed enough to be able to make decisions about how their data gets used. Even if they read through the privacy policies of every website they ever visited, I doubt they’d have enough information to be able to decide whether their data is being used safely, nor do I think they would have any idea about how to control that. I also don’t think many tech companies are forthcoming about how, exactly, users’ data is interpreted, shared, and protected.
Update: If you — understandably — prefer to watch Cegłowski speak, a video of this talk has been uploaded to YouTube. Thanks to Felix for sending me the link.
The short answer: it depends on who you ask, and for what reasons.
Nathan Heller, the New Yorker:
The American workplace is both a seat of national identity and a site of chronic upheaval and shame. The industry that drove America’s rise in the nineteenth century was often inhumane. The twentieth-century corrective—a corporate workplace of rules, hierarchies, collective bargaining, triplicate forms—brought its own unfairnesses. Gigging reflects the endlessly personalizable values of our own era, but its social effects, untried by time, remain uncertain.
Support for the new work model has come together swiftly, though, in surprising quarters. On the second day of the most recent Democratic National Convention, in July, members of a four-person panel suggested that gigging life was not only sustainable but the embodiment of today’s progressive values. “It’s all about democratizing capitalism,” Chris Lehane, a strategist in the Clinton Administration and now Airbnb’s head of global policy and public affairs, said during the proceedings, in Philadelphia. David Plouffe, who had managed Barack Obama’s 2008 campaign before he joined Uber, explained, “Politically, you’re seeing a large contingent of the Obama coalition demanding the sharing economy.” Instead of being pawns in the games of industry, the panelists thought, working Americans could thrive by hiring out skills as they wanted, and putting money in the pockets of peers who had done the same. The power to control one’s working life would return, grassroots style, to the people.
The basis for such confidence was largely demographic. Though statistics about gigging work are few, and general at best, a Pew study last year found that seventy-two per cent of American adults had used one of eleven sharing or on-demand services, and that a third of people under forty-five had used four or more. “To ‘speak millennial,’ you ought to be talking about the sharing economy, because it is core and central to their economic future,” Lehane declared, and many of his political kin have agreed. No other commercial field has lately drawn as deeply from the Democratic brain trust. Yet what does democratized capitalism actually promise a politically unsettled generation? Who are its beneficiaries? At a moment when the nation’s electoral future seems tied to the fate of its jobs, much more than next month’s paycheck depends on the answers.
This is a long article, but it’s worth spending some time with. Heller does a fantastic job of delving into the nuances of “gig economy” jobs, and how participants are frequently sold a myth. That’s not to say that these jobs can’t be good, but rather that the groups of people who benefit most are often as imbalanced as in the broader economy.
Joshua Ho and Brandon Chester subjected the iPhones 7 to the rigorous battery of tests unique to AnandTech, and it’s a screamer: insane performance jumps over the already-fast iPhones 6S met with big leaps in battery life. Yet:
As Apple has rapidly added new features, UI performance has taken a hit, and the nature of the performance problems is such that throwing more hardware at them won’t make them go away because they’re often due to circumstances where rendering is blocked or is being done in a manner such that even large improvements in performance would not bring things back to 60fps. While I’m not going to comb through the entire OS to find all the cases where this happens, it happens enough that it’s something I would describe as a significant pain point in my experience as a user.
It’s nowhere near as egregious as the performance hiccups on Android phones, but iOS is increasingly adding instances where animations aren’t as smooth as they should be. Activating Notification Centre, scrolling through widgets in the Today view, and pulling down to show Spotlight are all instances where it’s reliably easy to cause a suboptimal animation.
Back in June, I had this crazy idea that I was going to review iOS 10 and WatchOS 3 this year, both in my usual long-but-not-too-long style. I drafted an entry for each in MarsEdit, made my notes, and then — nothing. Some real life stuff got in the way, I procrastinated, and I ended up only being able to finish my annual iOS review. I’m okay with that, because I’m pretty happy with the way it turned out this year, but I wish I’d had the time to write up my thoughts on WatchOS 3 as well.
That being said, I think Matt Birchler has done an outstanding job with his review. He touches on all the main points, in a beautifully-designed review, to boot.
Let’s get something out of the way upfront: iOS 10 is a big release. It’s not big in an iOS 7 way, with a full-system redesign, nor does it introduce quite so many new APIs and features for developers as iOS 8 did. But it’s a terrific combination of those two sides of the spectrum, with a bunch of bug fixes tossed in for some zest.
For some perspective, there has been more time between the release of iOS 10 and the original iPhone than between the release of the first iMac and the first iPhone. It’s come a long way, baby, and it shows.
Installing iOS 10 is a straightforward affair, particularly with the enhancements to the software update process initiated in iOS 9. It requires less free space than its predecessors to upgrade, and you can ask iOS to update overnight. Nice.
iOS 10 is compatible with most of the devices that iOS 9 was, but it does drop support for some older devices. A5-generation devices and the third-generation iPad are all incompatible; the iPhone 5 is the oldest device that supports iOS 10.
There are additional limitations for the few 32-bit devices that remain supported: Memories and “rich” notifications are only supported on 64-bit devices. Raise to Wake is only supported on iPhones 6S and newer; it is not supported on any iPad or the iPod Touch. I find that a curious choice — surely Raise to Wake would be just as helpful, if not more so, on the iPad, given its much larger size. And it’s not like a lack of an M-class motion co-processor is an excuse, because both iPads Pro contain a derivative of the A9 processor in the iPhone 6S with the same embedded M9 co-processor.
Lock Screen, Widgets, and Notifications
Goodbye, Slide to Unlock
Back when I bought my first iPhone OS device in 2007 — a first-generation iPod Touch, as the iPhone wasn’t yet available in Canada — I was constantly asked to demo two features: slide to unlock, and pinch to zoom. Everyone I knew wanted to plunk their finger onto the little arrow nubby and slide it across the bar.
Once again proving that they give nary a shit about legacy or tradition, Apple is dropping “slide to unlock”. Like any major shift — the transition from the thirty-pin dock connector to Lightning, or, say, the removal of the headphone jack — there will be detractors. But I’m not one of them.
Let’s start from the beginning. Back in the days when our iPhones were made of wood and powered by diesel, it made sense to place an interactive barrier on the touch screen between switching the phone on and accessing its functions. It prevented accidental unlocks, and it provided a deliberate delineation between waking the phone and using it.
The true tipping point for “slide to unlock” was the introduction of Touch ID. Instead of requiring an onscreen interaction, it became easier to press the Home button and simply leave your thumb on the button for a little longer to unlock the device. iOS 10 formalizes the completion of the transition to Touch ID. The expectation is that you have a passcode set on your device and that you’re using Touch ID; iOS 10 supports just four devices that don’t have Touch ID home buttons.
But I happen to have one of those devices: an iPad Mini 2. Because it’s an iPad — and, therefore, much larger than an iPhone — I’m far more likely to use the home button to wake it from sleep than I am the sleep/wake button. It took me a while to lose the muscle memory developed over many years to slide the home screen to unlock my iPad. I’m used to interacting with the hardware first, and the onscreen controls immediately after; iOS 10 upends all of this by requiring me to press the home button twice, followed by typing my passcode onscreen. It’s only slightly different, but it sent my head for a bit of a trip for a month or so. I still, on occasion, try to slide to unlock, and curse myself for doing so.
The lock screen interaction feels much better on my iPhone 6S for two reasons. First, my iPhone has the benefit of having the best Touch ID sensor Apple has ever shipped, which means that pressing once on the home button and leaving my finger on the sensor for a bit longer unlocks my phone — almost exactly the same interaction as before, with no additional friction. That’s something that you’ll find across most of the devices compatible with iOS 10, as most of those devices have Touch ID.
The second reason for the vastly improved lock screen experience on my iPhone is that it supports the new Raise to Wake feature. The Windows 10 phone I used for a week earlier this year had a similar feature, and I loved it then; I’m thrilled to see it come to the iPhone. Raise to Wake allows you to pick up your iPhone or pull it out of your pocket to switch on the screen. Awaiting notifications appear with a subtle zoom effect, as though they’re bubbling onscreen from the ether. I suspect a lot of lessons learned from developing the wrist activation on the Apple Watch went into building Raise to Wake, and it shows: I’ve found it to be extremely reliable when pulling my phone out of its pocket, and only a little less so when lifting my phone off a desk.
Throughout this section, I’ve been using the word “unlock” to refer to the same action it’s always been used for: going from the lock screen to the home screen. But this isn’t quite correct any more because it’s now possible to wake and unlock an iOS device without moving away from the lock screen. This is useful for, say, viewing private data in widgets, but it leads to a complication of terminology — when I say that I unlocked my phone, did I go to the home screen or did I remain on the lock screen?
To clarify the terminology, Apple is now referring to the once-“unlocking” act of going to the home screen as “opening” an iOS device. That makes a lot of sense if you think of your iPhone as a door; as I don’t have a Plus model, I do not.
Widgets
No matter what iOS device you use, the lock screen is now even more powerful. The familiar notifications screen sits in the middle of a sort of lock screen sandwich, with widgets on the left and the camera to the right.
The widgets screen is actually just a copy of the Today view in Notification Centre; it’s also available to the left of the first home screen. That makes three places where widgets are available; yet, sadly, all three are identical. It seems to me that there are differences in the way one might use widgets in each location: on the lock screen, you may prefer widgets for the weather, your calendar, and the time of the next bus; in your Notification Centre, you may prefer to see your latest Pinboard bookmarks and what the next episode of your favourite TV show will be.
Widgets and notifications now share a similar frosted glass style, but older style widgets don’t suffer from a loss of contrast — if they haven’t been updated for iOS 10, they get a dark grey background instead. Widgets, notifications, the new Control Centre, and many UI components previously rendered as rounded rectangles are now drawn with a superellipse shape, similar to an expanded version of the shape of the icons on the home screen, or the iPhone itself. It’s a shape that’s simultaneously softer-looking and more precise, without the sharp transition between the rounded corner and the straight edge. I really liked this shape when it appeared on the home screen, and to see it used throughout apps and in widgets makes the whole system feel tied-together. It feels sophisticated, and very deliberately so.
In previous versions of iOS, the only place widgets would appear was the Today view and, if you have automatic app updates enabled, the only way to find out that your favourite app had a new widget available was to scroll to the bottom of Today and look for it. And, if you wanted to use a particular widget occasionally, but not always, you had to add and remove it from the Today view as you needed it.
In addition to the Today view in Notification Centre and on the home and lock screens, apps updated for iOS 10 also get to show their widgets in the 3D Touch menu that accompanies the app’s home screen icon. I think this is terrifically clever. It balances new functionality with the familiarity of the home screen that has remained effectively unchanged in its purpose and appearance in over nine years.
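For developers, these widgets are still ordinary Today extensions. As a rough sketch of what’s involved — the view controller name and sizes here are illustrative — an iOS 10 widget adopts the NCWidgetProviding protocol and can opt into the new expandable display mode:

```swift
import UIKit
import NotificationCenter

// A minimal Today extension, sketched for illustration. The same widget shows
// up in Notification Centre, to the left of the home and lock screens, and in
// the app icon's 3D Touch menu.
class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // New in iOS 10: allow the widget to offer a "Show More" expanded state.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    // Called when the user toggles between the compact and expanded states.
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        preferredContentSize = (activeDisplayMode == .compact)
            ? maxSize
            : CGSize(width: maxSize.width, height: 280)
    }

    // The system calls this periodically so the widget's content stays fresh.
    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        // Refresh your data here, then report whether anything changed.
        completionHandler(.newData)
    }
}
```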
Notifications
In iOS 10, the similarities in style between notifications and widgets are not coincidental: notifications have been rewritten from the ground up to allow for far more interactivity directly from the notification itself. Notifications can now show dynamic content and images, and they support live updates. Their additional functionality probably explains why they’re so huge, too: it serves as an indication that each notification is interactive. Even so, their size and heavy emphasis on structure make for a certain lack of elegance. They’re not ugly, but there’s something about the screenshot to the right that’s not particularly inviting, either.
Pressing on a notification from Messages, for instance, will display the past messages from that thread directly in the notification balloon; or, if the iPhone is unlocked, you can see several past messages. This is particularly nice as a way to reestablish context when someone has replied to an hours- or days-old thread. However, there’s no way to scroll back within a notification balloon — they’re not as fully interactive as they seem to be.
This year also marks the return of my least favourite bug from iOS 8: if you’re typing a quick reply and you tap outside of the keyboard or notification balloon, you lose everything you’ve typed. This bug was fixed in iOS 8.3, but has surfaced again in iOS 10. I’ve lost my fair share of texts due to a misplaced tap; I’m not sure why this remains an issue.
Apple also provides examples of rich data within an expanded notification balloon, like showing the position of a car on a map for a ride hailing app’s notification, or updating a sports score notification as more pucks are dunked in the goalpost. Or whatever. I didn’t have the opportunity to test those features, but I’m looking forward to developers making greater use of Notification Centre as a place to complete tasks without having to open the app.
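Under the hood, all of this rides on the new UserNotifications framework. A minimal sketch of a local notification with an attached image looks something like this; the title, score, file name, and identifiers are all made up for illustration:

```swift
import UserNotifications

// iOS 10 replaces the old UILocalNotification API with the UserNotifications
// framework, which supports media attachments on local and push notifications.
let content = UNMutableNotificationContent()
content.title = "Goal!"
content.body = "Flames 3, Oilers 2 with one period to go."

// Attach a thumbnail; the resource here is hypothetical and must be a local file.
if let imageURL = Bundle.main.url(forResource: "scoreboard", withExtension: "png"),
   let attachment = try? UNNotificationAttachment(identifier: "scoreboard",
                                                  url: imageURL,
                                                  options: nil) {
    content.attachments = [attachment]
}

// Schedule the notification to fire in five seconds.
let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
let request = UNNotificationRequest(identifier: "score-update",
                                    content: content,
                                    trigger: trigger)
UNUserNotificationCenter.current().add(request) { error in
    if let error = error { print("Scheduling failed: \(error)") }
}
```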
Notification Centre also borrows a trick from the Apple Watch: you can now press on the clear button in the upper-right to clear all notifications. It really is everything you could have wanted.
Springboard
Default Applications
After a seemingly-endless climb in the number of preinstalled applications on fresh copies of iOS, finally, a plateau — this year’s total is the same as last year’s, at 33. The Home app is new, while the Game Centre app has been removed, though its framework remains.
But 33 apps is still a lot, particularly when plenty of them will be squirrelled away by most users in a folder marked “Junk”, or the more-cleverly named “Crapple”. I’d make a handsome wager that a majority of iPhone users who have changed their home screen layout have placed the Stocks app in such a folder. Many others will do the same for Calculator, Clock, Contacts, Compass, and Voice Memos. People who don’t own an Apple Watch have no need for the Watch app, so they dump it in there, too.
We don’t really want this folder full of apps we never use on our phones. What we want is to remove them from our phones entirely, never to be seen again. And that’s kind of what you get in iOS 10: just tap and hold on any icon, and you’ll see delete buttons where you’ve never seen them before. Most of the apps you’d expect to be removable are; you can even remove some you might not expect, like Music, Maps, and Mail. As a result of this broad-reaching change, the on/off switch for the iCloud Drive app has been removed as well.
Want to restore an app? That’s pretty easy, too — just open the App Store and search for it.
There are, unfortunately, a couple of caveats that come with this new power. First, it’s important to know that the app isn’t being deleted from your iPhone — it’s simply being removed from the Home screen. This is in large part for security, according to Craig Federighi:
We’re not actually deleting the application binary, and the reason is really pretty two-fold. One, they’re small, but more significantly, the whole iOS security architecture around the system update is this one signed binary, where we can verify the integrity of that with every update.
That also means that even though the default apps appear in the App Store, they won’t get individual updates.
I see this as a limitation due to the way iOS has been built for the past decade, but I don’t necessarily see it always being this way. It would require a large effort to make these core apps independent of the system, but it’s not inconceivable that, one day, updates to these apps might be delivered via the App Store instead of rolling them into monolithic iOS versions.
So if the binary isn’t being removed, what is? Federighi, again:
[When] you remove an app, you’re removing it from the home screen, you’re removing all the user’s data associated from it, you’re [re]moving all of the hooks it has into other system services. Like, Siri will no longer try to use that when you talk and so forth.
In most cases, this works entirely smoothly. If you remove Calculator, for example, it will also be removed from Control Centre. Even if you remove Calendar, it won’t break your ability to add new events or open .ics calendar files.
But if you remove Mail, be prepared to be in for a world of hurt. Mail is the only app permitted to open mailto: links, and no other app can be set to handle those should Mail not be present. When you tap on an email address or a mailto: link, you’ll be prompted to restore Mail; and, because all of its settings are removed when the app is hidden, you’ll have to rebuild your entire email setup. If you have just one email account, you’ll probably be fine, but if you have several, it’s a pain in the ass.
In earlier betas, tapping on a mailto: link would result in a Safari error page. While the shipping solution is slightly better — insomuch as something actually happens when tapping an email link — I wouldn’t consider this resolved by any stretch. Either it should be impossible to remove Mail, or it ought to be possible to select a third-party app to handle mailto: links.
Wallpaper
Bad news, everyone: aside from the blue-green waterfall image we’ve seen in the betas, there are no new wallpapers in iOS 10. In fact, just fifteen still images are included: all but one of the “feather” images from iOS 9 have been removed, as have all but three of the ones added in iOSes 7 and 8. I think the wallpaper selection in iOS 10 might be at its most pitiful since the iPhone’s launch.
Luckily, we can set our own still images as wallpaper, but we have no way to add a custom dynamic wallpaper. And, for the third year in a row, there isn’t a single new dynamic wallpaper in iOS. I’m not sure if it’s something Apple forgot they added back in iOS 7, or if there are simply no new ideas beyond some bouncing circles. There are also no new live wallpapers.
Control Centre
Since its introduction in iOS 7, Control Centre has been a bit of a grab bag of quick-access shortcuts. To sort out all of its functionality, Apple created five groups of related items: toggles for system settings, a screen brightness slider, audio playback controls, AirDrop and AirPlay controls, and lightweight app shortcuts.
But having all of these controls on a single sheet is less than ideal. At a glance, there’s not quite enough space between disparate controls, which means that your thumb can easily tap the wrong thing when looking for a particular button. And that’s without adding new functionality, like a control for Night Shift — the kludgy-looking five-across row at the bottom is a clue that it doesn’t fit into the existing layout — or quick access controls for HomeKit.
Something clearly had to change, and Apple has addressed it in a rather dramatic fashion: a thorough redesign of Control Centre. It’s now split across two “sheets” — three, if you have connected HomeKit devices.
The initial response to the splitting of Control Centre, as I observed on Twitter and in industry press, was, at best, contentious. Adrian Kingsley-Hughes, in a widely-circulated ZDNet article published weeks after the first beta was released:
The iOS 10 Control Center ranks not only as one of the worst user interface designs by Apple, but as one of the worst by any major software developer.
That’s harsh — undeservedly so, I feel. In fact, I’d go so far as to say that the revised Control Centre is one of the smartest and best-considered user interfaces in iOS.
Let’s start with the actual act of splitting it up into multiple pages. As I noted earlier, there’s only so much Apple could do with the existing single-page layout. Since nobody would seriously propose that Control Centre should not gain any new functionality, there are only a few ways for it to be expanded while remaining on a single page: the controls could get smaller, Control Centre could get taller, or the available controls could be customizable.
Making the controls smaller is no good because thumbs aren’t getting any smaller. If anything, some controls — like the track position scrubber — are already far too small for my liking. Making Control Centre taller, meanwhile, isn’t good for usability either, because thumbs aren’t getting any longer.
As for customizing Control Centre, while I’ve heard rumours that it’s being worked on, it clearly hasn’t progressed to a public release yet. It’s a valid solution, but one that also has its own drawbacks and complexities — it could very quickly become a second-level home screen when the doors of customization are opened. That’s not to say it’s not a solvable problem; rather, that the solution hasn’t yet been finalized.
So: extending it over two panels makes sense. And, when you add to the mix the space requirements of several HomeKit devices, having a third page become available makes even more sense.
The beauty of this UI, though, is that it remembers which page you left it on. If you use the music playback controls as frequently as I do, that means you can turn Control Centre into an ever-present remote control for audio, with some additional controls available if, for some reason, you need to toggle WiFi.
Across the bottom of the first page of Control Centre sits a familiar array of quick actions: flashlight, timer, calculator, and camera. The icons in this array now support 3D Touch, so it’s even faster to set a timer, and you can set the flashlight to three different levels of intensity. Unfortunately, it isn’t possible to use 3D Touch on the top row of toggles. It would be helpful, for example, to be able to launch WiFi settings from its toggle, or to have the option to lock the screen in a horizontal orientation on the iPhone.
I think the large buttons for AirPlay and AirDrop are fairly nice. They look like buttons, provide the additional information required by both services in a fairly compact space, and are adequately thumb-sized. However, the gigantic Night Shift button leaves me perplexed. When I first saw it, I assumed that it would be split in half for a True Tone toggle. However, not only does the iPhone 7 not have a True Tone display, the only iOS device with one — the 9.7-inch iPad Pro — doesn’t feature this split toggle. This button is unnecessarily large, and I probably find it particularly grating because Night Shift makes my iPhone look like it has a diseased liver.
Music and News: Back to the Drawing Board
I don’t remember the last time Apple introduced an app in one version of their software, only to radically redesign it just a year later; I certainly can’t think of two instances where that’s happened. But it did, this year, with Music and News.
I’ve always had a funny relationship with the Music app on iOS. In many ways, it has long been one of the finest apps Apple has ever shipped with the platform, featuring prominently in the original iPhone demo and in plenty of ads; but, deep down, there are some baffling design and functionality choices. That imbalance reached something of a high point in iOS 8.4, when Apple Music was added to the mix. Because Apple Music, by design, blurs the delineation between music you own and music you stream, the UI decisions made to add that layer of functionality increased the complexity of Music.
News, meanwhile, was a fine app last year, but it wasn’t particularly imaginative. There was very little distinctive about it; it looked a bit generic, if anything.
Both of these apps have received a complete makeover this year. I’m bundling them together because both of them — and the new HomeKit front-end app called Home — share a common design language unlike anything else on the system. Their UIs are defined by very heavy weights of San Francisco, stark white backgrounds, and big imagery. I read an article just after WWDC — which, regrettably, I cannot find — that described these apps as having “editorial” interfaces, and I think that’s probably the most fitting adjective for this design language.
I’m struggling to understand why it’s being used in these three contexts, though — why in Music, News, and Home, but nowhere else? What do these three apps have in common? Music and News provide personalized recommendations and serve as windows into different media, but Home isn’t akin to either. Home and Music both provide direct control elements, but News doesn’t. If anyone can explain to me why these three apps get the same UI language that’s entirely different from any other app, I’d be happy to hear it.
Incongruity aside, I love the way Music and News look; Home is an app I’ve only seen in screenshots, because every time I try to launch it in my entirely HomeKit-free apartment, it just sits on this screen and spins away:
I’ve no idea what’s going on here. I don’t know if there’s simply no timeout, or maybe there is but it’s set to the year 2022, or maybe you’re just not supposed to be an idiot like me and launch Home if you don’t have any HomeKit devices. (This is also why I was unable to comment on the third, Home-centric page of Control Centre.)
That aside, I think this new design language is fantastic. It’s bold and full of character, but not in a way that feels gaudy or overbearing. They feel like glossy interactive magazines, at least on the surface. As you get deeper into each app, the big, bold titles are minimized — secondary and tertiary interfaces look pretty standard compared with the primary screens of each app.
I think it would be interesting if this design language made its way into more apps on iOS. I think Phone, Reminders, and even Mail could take to this style quite well. Of course, there’s the bigger question of how permanent this style is: it appears in one app that’s brand new, and two others that were redesigned within twelve months of their launch. That’s not to say it can’t or won’t last, but its currently limited application makes it perhaps more experimental than other design languages Apple has implemented throughout the system.
Animations
I’ve been an ardent supporter of Apple’s interface design direction over the past few years. Though some critics may bemoan a generally less expressive experience with the iconography and human interface components of many apps, I’ve found that expressiveness to surface through other means — primarily, as it turns out, through motion and animation. From the subtle parallax effects in Weather and Photos to the new super goofy iMessage effects — more on that later — animations have become as much a part of the iOS user interface as are buttons and icons.
Unfortunately, many of the Springboard animations added in iOS 7 felt like they slowed down actually using the system. While they looked great the first few times, waiting for a long and softly-eased animation to complete for every task became, rather quickly, an irritation more than a pleasant experience. This was exacerbated by the inability to cancel any of these animations: if you opened the wrong app or folder on your phone, you had to wait for the “opening” and “closing” animations to play before you could try again. In the grand scheme of things, not the worst UI crime imaginable, but a frustration nonetheless.
In iOS 10, animations have been tweaked throughout the system to feel far faster. In fact, I’d convinced myself that all of the animations actually were faster, until I compared them to an iPhone 5S running iOS 9 and found them to be virtually identical.
But there is one very subtle change that makes a world of difference: it’s now possible to cancel animations before they complete. Tapped on Mail rather than Messages in your dock? Just hit the home button and it instantly responds. It’s the same story for folders, too; but, sadly, not for multitasking or opening Notification Centre.
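It’s probably no coincidence that iOS 10 also hands third-party developers an interruptible animation primitive of their own: the new UIViewPropertyAnimator class. I can’t say whether Springboard itself uses it, but it enables exactly this kind of cancellable behaviour. A minimal sketch, with a stand-in view:

```swift
import UIKit

let someView = UIView() // Stand-in for whatever you're animating.

// UIViewPropertyAnimator, new in iOS 10, lets an in-flight animation be
// paused, reversed, or scrubbed instead of being played to completion.
let animator = UIViewPropertyAnimator(duration: 0.4, curve: .easeInOut) {
    someView.alpha = 0
}
animator.startAnimation()

// If the user changes their mind mid-animation, reverse it immediately:
animator.pauseAnimation()
animator.isReversed = true
animator.continueAnimation(withTimingParameters: nil, durationFactor: 1)
```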
Other animations still look and feel as slow as they were when iOS 7 debuted, including the icons flying in after unlocking. This animation has always grated on me. It takes about a full second to play; I wish it took about half that time because it makes the system feel much slower than it actually is.
Animations like these are most effective when they imply meaning — a sense of space, or an action. This has long been something that iOS does pretty well. For example, when you tap on a message in the Mail inbox, the whole UI slides to the left to show the message, as though it were laying just to the right of what the screen could contain. This animation is combined with the familiar right chevron (›) that’s placed in each cell, completing the spatial relationship between the inbox and each message.
In iOS 7, the rather confusing spatial relationship between Springboard elements was organized into a more straightforward hierarchy. However, some animations and interactions were not fully considered; as a result, this hierarchy did not maintain consistency. The folder animation, in particular, was confusing: tapping on a folder would hide all of the home screen icons and perform some kind of hyperspace zoom into the folder area.
This has been fixed in iOS 10. Folders now appear to expand and sit overtop the rest of the home screen which, naturally, blurs. This animation feels a lot faster and more logical, while preserving the order of depth established in iOS 7.
The Hidden UI
You may have noticed that many of the most exciting new features I’ve mentioned so far — like additional options in Control Centre, and expanding notifications — make heavy use of 3D Touch. Plenty more of the enhancements that I’ll chat about later do too. In iOS 10, 3D Touch has been upgraded from a curious optional extra to a functional aspect of the system, and there are some complexities that are inherent to such a shift.
Because 3D Touch adds depth to a system that is, by the nature of pixels on a piece of glass, flat, its functionality is not obvious unless you know it’s there first. Paradoxically, the expansion of 3D Touch ought to make it feel much more like an expectation than an option, but there remains a steep learning curve for users to understand that 3D Touch is not necessarily consistent between apps.
3D Touch is also a bit of an anomaly across the iOS lineup. Apple says that they have over a billion iOS devices in use around the world, but only the iPhones 6S and to-be-released 7 support it. They sold a little over 200 million iPhones in the year since the 6S was introduced, which means that a maximum of about 20% of the entire iOS base is able to use those features.
Without 3D Touch, the user experience of a feature like rich notifications really breaks down. Instead of pressing on the notification bubble, it’s necessary to swipe the notification to the left and tap the “View” button that appears to see its options. This is a compromise that will scarcely be a memory in a couple of years but, on launch day, about 80% of existing iOS device users will have a less-than-satisfactory experience.
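For what it’s worth, apps can at least detect whether 3D Touch is present and supply a fallback of their own. A sketch of the standard check, with illustrative method names:

```swift
import UIKit

// Check for 3D Touch at runtime and fall back to a long-press gesture on
// devices without it. (A sketch; the handler names are illustrative.)
class ViewController: UIViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        if traitCollection.forceTouchCapability == .available {
            // Peek and pop via 3D Touch.
            _ = registerForPreviewing(with: self, sourceView: view)
        } else {
            // A rough equivalent for the ~80% of devices without 3D Touch.
            let longPress = UILongPressGestureRecognizer(target: self,
                                                         action: #selector(showPreview(_:)))
            view.addGestureRecognizer(longPress)
        }
    }

    @objc func showPreview(_ sender: UILongPressGestureRecognizer) {
        // Present the same preview UI that a peek would show.
    }

    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        return nil // Return the preview view controller here.
    }

    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}
```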
Keyboard
Of all of the images of Steve Jobs onstage at an Apple event, there are few more instantly memorable than this moment at Macworld 2007:
You might remember Jobs explaining that the keyboards “fixed in plastic” are a core issue with these phones, and that changing to a touch screen would allow for optimized controls for each application.
But one thing he didn’t mention — at least, not explicitly — is that the keyboard itself would see significant changes over the next nine versions of the operating system. From international keyboards and dictation, to the Predictive bar and case switching on the keycaps, the keyboard has come a long way since 2007. But it has always primarily been an explicit, active means of user input.
In iOS 10, the keyboard becomes a little more passive and a lot smarter by way of the QuickType bar. Instead of merely predicting what word you should type next based on what you’ve been typing so far, it now suggests inputs based on contextual prompts.
For example, if a webpage has a field for your email address, QuickType will suggest two of your email addresses. Or, if a friend texts you asking “Where are you?”, the keyboard will prompt you to send your current location.
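On the developer side, iOS 10 lets apps opt into these suggestions by tagging text fields with a semantic content type. This much is in the SDK, though the field here is purely illustrative:

```swift
import UIKit

// Tagging a field with a semantic content type, new in iOS 10, lets the
// QuickType bar offer relevant suggestions; here, the user's email addresses.
let emailField = UITextField()
emailField.textContentType = .emailAddress
emailField.keyboardType = .emailAddress
emailField.autocapitalizationType = .none
```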
The QuickType bar keeps getting smarter, too: as you’re typing, it can also suggest an appropriate emoji. Type “love” and you’ll see a heart; type “ugh”, and you’ll be prompted to add a straight-faced emoji. Unfortunately, as Apple is a strenuously PG-rated company, typing “shit” will not suggest the “pile of poo” emoji — though “crap” will — and typing “penis” won’t suggest the eggplant.
There are also some improvements to autocorrect. For users who type in multiple languages or mix languages, iOS now ostensibly handles corrections and suggestions in those other languages automatically, on the fly. For the most part, I’m monolingual, but I know a few sentences in other languages. Even after adding those languages as keyboards in Settings, I wasn’t able to get iOS to autocorrect into them unless I manually selected those keyboards.
The only time I ever saw a language switch in the QuickType bar without manually selecting another keyboard is when my girlfriend sent me a text reading “Yup yup yup”. QuickType decided that I should reply in what appears to be Turkish. I’ve noticed that these reviews get harder to write when I’m able to explain less about how the system works.
I’m entirely the wrong person to be trying this out; that it didn’t work for me means nothing. Maybe read Ticci’s review — that guy knows what he’s talking about.
3D Touch support has also been enhanced in the keyboard. The trackpad gesture now works far more reliably, and pressing harder on the delete key will erase text at about twice the speed.
Differential Privacy
Apple has long prided itself on standing up for the privacy of its users. They’ve fought the FBI, and have long resisted taking the relatively easy route of uploading all of their users’ data to their own servers to diddle around with in any way they want.
But there comes a time when even they will agree that it’s in the best interests of their users to detect trends, for instance, or enhance certain machine learning qualities.
In iOS 10, Apple is using a fairly esoteric field of study to enhance their machine learning capabilities. It’s called “differential privacy”, and they’re using it, to begin with, only in the keyboard, to learn new words.
You’ve probably heard a million explanations of how differential privacy works, so here’s the elevator pitch version, for reference: the keyboard tracks the words that you enter and how Autocorrect responds, and blends all of that with a lot of statistical noise. The data from you and hundreds of millions of other iOS users gets combined and the noise is averaged out, leaving certain trending words behind when they’re used by a significant number of people.
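To make that elevator pitch concrete, here’s a toy model of randomized response, the textbook mechanism underlying differential privacy. To be clear, this is not Apple’s implementation; it only illustrates how noisy individual reports can still yield an accurate aggregate:

```swift
import Foundation

// Randomized response: half the time answer honestly, half the time flip a
// coin. No individual report can be trusted, but the aggregate still can be.
func randomizedResponse(didTypeWord truth: Bool) -> Bool {
    if Bool.random() {
        return truth          // Answer honestly.
    } else {
        return Bool.random()  // Answer with a coin flip.
    }
}

// Simulate a million users, 10% of whom actually typed the word.
let reports = (0..<1_000_000).map { _ in
    randomizedResponse(didTypeWord: Double.random(in: 0..<1) < 0.1)
}

// P(report true) = 0.5 * p + 0.25, so the server can recover p from the
// observed rate without ever learning any individual's true answer.
let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
let estimated = (observed - 0.25) / 0.5
print("Estimated fraction of users who typed the word: \(estimated)")
```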
This isn’t a technique invented by Apple, but they’re the first to deploy it at this kind of scale. Some people doubt that it will work, but there’s no way to tell whether it’s making a meaningful impact on our typing until iOS 10 reaches mass deployment.
Emoji
As part of the iOS 10 update, Apple has redesigned most of the characters in the “Smileys & People” category, along with a bunch of others in several more categories. The redesigned characters look a little more saturated to my eye, and a tiny bit softer. I really like them.
In addition to the redesigned characters, there are also a bunch of new and more diverse emoji that depict women in professions and activities previously represented only by men, as well as more variations for family characters. This is a good step forward — showing police officers, detectives, and swimmers as men while displaying women only as brides and princesses was clearly not representative of reality.
However, unlike on MacOS, there still isn’t a means to search for emoji in iOS 10. The keyboard may provide suggestions while typing, but it’s not the same as search: there’s only one suggestion, which necessitates a more precise guess to find the right emoji. I wish I could swipe down on the emoji keyboard to see a proper search field.
Siri
Before I jump into what’s new in Siri this year, I want to elaborate a little bit on where I see Siri today. To understand the current state of Siri is to understand why there are now APIs available to third parties.
The best place to start, I think, is with Steven Levy’s August profile of Apple’s artificial intelligence and machine learning technologies:
As far as the core [Siri] product is concerned, [Eddy] Cue cites four components of the product: speech recognition (to understand when you talk to it), natural language understanding (to grasp what you’re saying), execution (to fulfill a query or request), and response (to talk back to you). “Machine learning has impacted all of those in hugely significant ways,” he says.
I think it’s critical that we understand all four of these components: how they work on their own, in sequence, and how the unreliability of any component affects Siri as a whole.
So, let’s start with the first: speech recognition. One thing that has become consistently better with Siri’s ongoing development is its ability to clearly and accurately transcribe our speech. Even just a few years ago, it. was. necessary. to. speak. to. Siri. in. a. jolted. manner. William Shatner likely had few problems with Siri, but the rest of us found this frustrating.
In 2014, Apple transitioned Siri from a backend largely reliant upon third parties to one of their own design. The result was a noticeable and, perhaps, dramatic improvement in Siri’s speed and accuracy, to the extent that Apple felt confident enough to add real-time dictation with iOS 8.
But the quality of Siri’s transcription of homonyms and more esoteric words often leaves a lot to be desired, due in part to inconsistencies with the second component cited by Cue: the interpretation of what is being said. Here’s an easily reproducible example that you can try right now: tell Siri “remind me to sew my cardigan tomorrow at noon”. Siri doesn’t understand the context of the word “sew” nor its relationship to the word “cardigan”, so it always — or, at least, every time I’ve tried this — transcribes it as “so”.
Speech recognition and interpretation are, I would argue, two parts of a single “input” step in a given Siri interaction. The next two parts — execution and response — can also be combined into a single “output” step, and I think it has far deeper and more fundamental problems.
Nearly any frustration we have with any computer or any piece of software tends to boil down to a single truth: the output is not what we had expected, based on our input. Whether that’s because we open an app and it crashes, or our email doesn’t refresh on a timely basis, or perhaps because autocorrect inserts the wrong word every ducking time — these are regular irritations because they defy our expectations.
In many ways, Siri is truly amazing, typically answering our requests faster than we could ever type them out. But because Siri can do so much, we experiment, and rightfully expect that similar queries would behave similarly in their response.
Let’s start with a basic request — for instance, “hey Siri, how long would it take me to drive to work?” As expected, Siri will happily respond with information about the current traffic conditions and the amount of time it will take to get there. Now, change the word “drive” to “walk” in the exact same query, and witness an entirely different result:
These requests are nearly identical, but are treated vastly differently. The driving example works perfectly; the walking example doesn’t answer my question — I’m not looking for directions, I’m asking for a time estimate.
Worse still is when Siri fails to provide an answer to a specific request. Siri is akin to pushing the “I’m Feeling Lucky” button in Google: it ought to be the shortest, straightest line between asking something and getting an answer. If I ask Siri to “find me a recipe for banana bread”, I want a recipe, not a web search that gives me a choice of recipes. If I wanted options, I would have asked for them.
As Siri’s speech recognition and interpretation become more reliable, this becomes more of a problem. Based solely on anecdotal observations, I think users are more tolerant of an occasional mismatched result than they are of additional rounds of back-and-forth with Siri, so long as it remains fast and reliable.
With that, I’d like to propose a few guidelines for what a virtual assistant ought to be and do.
Speech recognition and transcription should prioritize context over a direct phonetic interpretation.
Similar commands should perform similarly.
Returning an absolute answer should be the highest priority. A web search should be seen as a last-ditch fallback effort, and every effort should be made to minimize its use. User interaction should, overall, be minimized.
These bullet points are, I’m sure, much more difficult to implement than I’ve made them out to be. Contextualizing a phrase to interpret which words are most likely to be spoken in relation to one another requires a great depth of machine learning, for example; however, I see these guidelines as a baseline for all virtual assistants to behave predictably.
SiriKit and Intents
While Apple is busy working on the fundamental components of Siri, they’ve opened up its capabilities to third-party developers who have been chomping at the bit since Siri was launched in 2011. Much like multitasking in iOS 4, the functionality of SiriKit is limited to unique scopes or domains:
VoIP
Messaging
Payments
Photo search
Workouts
CarPlay
Ride hailing
These individual scopes each have their own “Intents” and vocabulary, and these can be defined by developers. For example, Uber provides different levels of ride hailing service, and they can define those levels for Siri in their app’s metadata; or, a payment service could define different methods of payment. Developers can include shorthand and alternate variants of their app’s terminology within their app’s vocabulary metadata.
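Apple’s documentation splits this vocabulary into two places: user-specific terms are registered at runtime through the INVocabulary API, while app-wide static terms, like a ride service’s tier names, are declared in an AppIntentVocabulary.plist file. A runtime sketch, with hypothetical workout names:

```swift
import Intents

// A sketch of registering user-specific terms with Siri. A workouts app could
// register the names of a user's custom routines so Siri can match, say,
// "start my Leg Day Destroyer workout". The names here are hypothetical;
// static, app-wide terms belong in AppIntentVocabulary.plist instead.
let workoutNames = NSOrderedSet(array: ["Leg Day Destroyer", "Easy Morning Spin"])
INVocabulary.shared().setVocabularyStrings(workoutNames, of: .workoutActivityName)
```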
All of this stuff sounds like it’s going to be a great way to expand the capabilities of Siri without Apple having to chase down individual partnerships. Unfortunately, these precise app categories tend to be dominated by big players who wouldn’t care to let me test their new apps. I’m looking forward to seeing what I can do with these apps once they’re released into the wild, though, because I have lots of questions.
Maps
The first thing you’ll notice about Maps in iOS 10 is that it’s received a complete makeover. With its bold card-based layout, floating controls, and Proactive suggestions, it now looks like the app Apple has wanted it to be since they dropped Google and went their own way back in iOS 6. It has taken on some of the design cues established in Music and News, though not to the same degree. I find it even easier to use than the old version, though it does retain some of its — uh — charms.
The bigger news in Maps doesn’t come from Apple, though: third-party developers can now integrate their apps directly into Maps’ UI using Intents, the same mechanism that powers SiriKit. Only two kinds of Intents are available for Maps integration: ride hailing and restaurant reservations. Third-party restaurant reservation integration is only supported in Maps; Siri has supported OpenTable integration since iOS 6. It’s not a particularly glamorous integration, but it is a useful one, and it could be taken one step further by adding an Intent for delivery services as well.
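The ride hailing side reuses the same Intents extension a developer would build for Siri; Maps simply becomes another front end for it. A skeletal handler, with the response details left as illustrative comments:

```swift
import Intents

// A skeletal ride-request handler. The same Intents extension serves both
// Siri and the new Maps integration; only the presentation differs.
class RideRequestHandler: NSObject, INRequestRideIntentHandling {
    func handle(intent: INRequestRideIntent,
                completion: @escaping (INRequestRideIntentResponse) -> Void) {
        let response = INRequestRideIntentResponse(code: .success, userActivity: nil)
        // In a real app, populate response.rideStatus with the vehicle,
        // driver, pickup estimate, and so on.
        completion(response)
    }
}
```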
I couldn’t test any of the ride hailing stuff because Uber threw a hissy-fit over Calgary’s requirements that drivers carry proper licensing and that vehicles are inspected, so they don’t offer ride sharing here.
Photos
About a year ago, Benedict Evans posed an intriguing question: roughly how many photos are being taken today? Given that there are a couple of billion smartphones in the world, it’s probably a lot:
How many more were taken and not shared? Again, there’s no solid data for this (though Apple and Google probably have some). Some image sharing is probably 1:1 for taken:shared (Snapchat, perhaps) but other people on other services will take hundreds and share only a few. So it could be double the number of photos shared or it could be 10x. Meanwhile, estimates of the total number of photos ever taken on film range from 2.5-3.5 trillion. That in turn would suggest that more photos will be taken this year than were taken on film in the entire history of the analogue camera business.
That was last year; this year, there will no doubt be a far greater number of photos taken due to the continued proliferation of smartphones worldwide. We all know this, and we all know how difficult it has become to manage those photos.
A few months before Evans wrote that article on photos, Google tried to combat this problem by introducing Photos, to much critical and public acclaim. Instead of worrying about storing those photos on your device — a worry that will be present so long as companies like Apple continue to include inadequate local storage in their smartphone lineups — Google reasoned that it would make more sense to allow users to stash their photos in a cloud storage system. Not only does this free up local space on the device, it allows photos to benefit from the ridiculous redundancy built into Google’s cloud storage facilities.
To sweeten the deal, Google built software that would analyze the photos as they’re stored in Google Photos. It could identify objects and people within photos, which means that finding that one photo of your dog licking a burger became as quick and easy as a Google search.
By all accounts, Google Photos has been a rousing success; it became quite clear in the intervening year that these kinds of improvements were expected from Apple, too. But this intelligence has long been presumed to require a sacrifice on user privacy — a sacrifice that has seemed unlikely for Apple to make. Om Malik wrote what is perhaps the most cogent explanation of this assumed contradiction for the New Yorker in June 2015:
The battle between Google and Apple has shifted from devices, operating systems, and apps to a new, amorphous idea called “contextual computing.” We have become data-spewing factories, and the only way to make sense of it all is through context. Google’s approach to context is using billions of data points in its cloud and matching them to our personal usage of the Google-powered Web; Apple’s approach is to string together personal streams of data on devices, without trying to own any of it. If Google is taking an Internet approach to personal context, then Apple’s way is like an intranet.
From the surface, Google’s approach seems superior. Understanding context is all about data, and the company is collecting a lot more of it. Apple has your phone; Google has access to almost everything. […]
And one day, I wouldn’t be surprised to see an executive from Apple come onstage at the Moscone Center, take a page from its rival, and say that they’re doing the same things with your data that Google is.
That day came, kind of, about a year later, on June 13, 2016. An Apple executive — Craig Federighi, naturally — took the stage at the Bill Graham Auditorium to explain that they’re not doing the same things with your data that Google is. Apple claimed that they were able to do the same kind of facial and scene recognition on your photos entirely locally.
That sounds pretty compelling: a marriage of privacy and capabilities. All of the power, yet none of the drawbacks. So: has it worked?
Well, there are lots of criteria one could use to judge that. At its core, it’s a simple question: can you search for objects and see photos you’ve taken of them? The answer is “yes, probably”. But it would be disingenuous and irresponsible of me to view Photos in a vacuum.
While this won’t be a full Apple Photos vs. Google Photos comparison, it seems appropriate to have at least some idea of a benchmark. With that in mind, I uploaded about 1,400 photos that I’d taken through June and July to Google Photos; those same photos also live in my iCloud Photo Library. But, before we get to that, let’s see what Photos has to offer on its own terms.
Upon updating to iOS 10, your existing photo library will be analyzed while your iPhone or iPad is plugged in and locked. How long this will take obviously depends on how many photos you have — my library of about 22,000 photos took a few days of overnight analysis to complete. However, new photos taken on an iPhone are analyzed as they make their way into the Camera Roll. Apple says that they make eleven billion calculations on each photo to determine whether there’s a horse in it. For real:
In fact, we do 11 billion computations per photo to be able to detect things like there’s a horse, there’s water, there’s a mountain.
And those calculations have determined that there are, in fact, horses in some of my photos:
There are lots more searches that are possible, too — an article from earlier this year by Kay Yin pegs the total number of scenes and objects that Photos will detect at 4,432. Yin told me in an email that they acquired the list through an analysis of Apple’s private PhotoAnalysis.framework. It includes everything from the obvious — food, museums, and musical instruments, to name a few — to the peculiar and surprising: ungulates, marine museums, and tympans all make an appearance on the list.
Weirdly, though, some searches still return zero results in Photos. You can’t search for photos by type — like screenshots, panoramas, or Live Photos — nor can you search by camera brand or model. This information is in the metadata of pretty much any photo, but it isn’t indexed by Photos for reasons not entirely clear to me. Perhaps very few people will search for photos taken on their Canon DSLR, but it doesn’t make much sense to me to disallow that; it feels like an artificial limitation. The only way to find Live Photos within your library on your iPhone is still to thumb through each photo individually until you see some sense of movement.
For the myriad keywords Photos does support, however, there’s plenty of good news. After it has finished analyzing and indexing the photo library, searches are fast and respectably accurate, but it’s not perfect. In that “horse” screenshot above, you can see a photo of a dragonfly, for instance. A search of my library for “receipts” shows plenty of receipts that were indexed, but also some recipes, a photo of a railway timetable, and a photo of my wristband from when I was in the hospital a couple of years ago. In general, it seems to err in favour of showing too many photos — those that might be, say, a 70-80% match — rather than being too fine-grained and excluding potential matches.
Perhaps my biggest complaint with Photos’ search is that it isn’t available in the media picker. That wasn’t as big a deal in previous versions of iOS, but with the depth and quality of indexing in iOS 10, it would be really nice to be able to search within Messages or in an image picking sheet for a specific photo to send.
Apple’s facial recognition is also quite good, generally speaking. It’s reasonably adept at identifying photos of the same person when the face is somewhat square with the camera, but differences in hair length, glasses, mediocre lighting, and photos with a more sideways profile-like perspective tend to trip it up.
If you’re a completionist about this sort of thing, you’ll likely become frustrated with the most obvious mechanism for dealing with photos misidentified as being from different people. It’s not that it’s hard; it is, however, extremely tedious. To get to it, tap on the Albums tab within Photos, then tap People, then tap Add People. You’ll be presented with a grid of all of the faces identified in your photos.
The thumbnails are sorted in descending order of the number of photos found per face detected. The first few screens of these thumbnails will look fine — 21 instances of a face here, 40-odd there — but as you scroll, the number of photos per face drops precipitously. I got about a quarter of the way through my thumbnails before I started seeing instances of a single photo per detected face. You can add each to your People album, and assign a set of photos to a contact. If you’ve already started collecting photos with a specific contact in them, it will offer to merge any new photos you add to that contact.
Tapping more than one thumbnail in the Add People view will activate the Merge button in the lower-left corner. This allows you to select multiple photos featuring the same face and teach Photos that they are the same person. Unfortunately, it’s still quite laborious to sort through photos one-by-one, in some cases. To make matters worse, thumbnails will sometimes feature the faces of two people, making it difficult to determine which of them is being detected in this instance.
This is a time-consuming way of handling multiple photos from a single person. Despite its utility, I find this view to be frustrating.
Happily, there’s an easier method of teaching Photos which faces belong to which contact. If you tap on one of the faces you’ve already taught Photos about and scroll to the bottom of the screen, past the Details view — more on that later — you’ll see a Confirm Additional Photos option. Tap on it, and you’ll get a well-designed “yes” or “no” way of confirming additional photos of that person. There’s even some really pleasant audio feedback, making it feel a little bit like a game.
Unlike object detection, which seems to err on the side of including too many photos so as to miss as few potential matches as possible, facial detection errs on the side of caution. It may be much pickier about which faces belong to the same person, but I haven’t seen a single false positive. If you do get one, the process for disassociating a photo from a person is a bit bizarre: the Not This Person button is hidden in the Share sheet.
But is all of this stuff as good as Google Photos? I’m not adequately prepared to fully answer that question, but here’s the short version: it seems real close.
Some praise applies to both: each successfully identified obvious objects within photos most of the time. Both also had the occasional miss, either identifying an object incorrectly or not identifying it at all. And both struggle with plurals in searches: a search for “mushroom” in both apps returns photos I took of a cluster of mushrooms at the base of a tree, but searching “mushrooms” does not.
I found that both apps were similarly successful at recognizing faces, with a slight edge for Google. However, I’m not sure the pool of photos I uploaded to Google was comprehensive enough for me to figure out how good it is at recognizing a lot of different faces; my iCloud Photo Library has far more images in it, with lots more faces. I’d love it if someone uploaded an identical batch of tens of thousands of photos to both, and did a more thorough investigation.
My main concern with Apple’s attempt at photo recognition and categorization was that it wouldn’t be anywhere near competitive with Google’s offering. My (admittedly brief) comparison indicates that this simply isn’t the case. Apple’s offering is properly good.
But, perhaps because it’s doing all of the object and facial recognition locally, on the device, it doesn’t sync any of this stuff through iCloud Photo Library. I hope you enjoyed the tedium of assigning names to faces and confirming which photos contain each of your friends, because you’re going to have to do that for every device that you own. Have fun!
There’s also a rather nice Details view for each photo. You can tap on the Details button in the upper right or, in a completely non-obvious manoeuvre, you can scroll the photo vertically. There, you’ll see a map of where the photo was taken, any people identified within the image, and related Memories.
And I haven’t even mentioned my favourite new feature.
Teddy told me that in Greek, “nostalgia” literally means “the pain from an old wound.” It’s a twinge in your heart far more powerful than memory alone. This device isn’t a spaceship, it’s a time machine. It goes backwards, and forwards… it takes us to a place where we ache to go again.
You’ve probably read that quote in a dozen other reviews and articles about photo-related things, but there’s a good reason for that: photographs are a near-flawless conduit from your eyeballs to your heart. Trite as that excerpt may be, I couldn’t think of a better summary of Memories in iOS 10.
See, the photo analysis that iOS 10 does is not “dumb”; it doesn’t simply leave the data that it collects sitting there for you to comb through. Photos actually tries to do something with the information embedded in all of those images and videos: locations, dates, and — now, in photos — people and objects. It assesses that data looking for anything that might tie certain sets of images together, like those taken within a certain timeframe, or a set of photos from a trip abroad. It automatically groups those photos and videos together into small albums, and creates a short slideshow video from them. That, in a nutshell, is Memories.
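There’s no public API for Memories itself in iOS 10, but PhotoKit’s “moments” expose the same kind of time-and-place clustering, which gives a sense of the raw material Memories builds on. A quick sketch:

```swift
import Photos

// Moments are the system's automatic groupings of assets by time and
// location; Memories appears to build on clusters like these.
let moments = PHAssetCollection.fetchMoments(with: nil)
moments.enumerateObjects({ collection, _, _ in
    let title = collection.localizedTitle ?? "Untitled"
    let start = collection.startDate.map(String.init(describing:)) ?? "unknown"
    print("\(title): \(collection.estimatedAssetCount) assets, starting \(start)")
})
```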
I usually take pictures of buildings and empty fields, so my first set of Memories were not particularly inspiring. Don’t get me wrong — the albums were fine, but none were moving or emotional.
And then, one day, Photos surprised me by generating an album of photos of me and my girlfriend over the course of the past year. I guess it figured out that there are a few photos on my phone of us together, and a lot of her, and it put together an album and a video.
Despite all that I know about how automated and mechanized this stuff is, I was and remain deeply touched by the effect.
I’m trying not to sound too sappy here, but, in a way, I want to be extra sappy — I want you to know how powerful this feature can be. Sure, it’s all made by shuffling some bits around and associating whatever data it can find, but when you wake up to find a slideshow of you and your significant other over the past year, it really is pretty powerful.
I can’t confirm this, but Memories seems to prefer edited and liked photos, which makes sense — those are probably the best ones in a given set. It also incorporates Live Photos and video in a really nice way.
If you don’t like the auto-generated video, you can customize it. A theme is assigned by default, but you can pick your own, too, from options ranging from “sentimental” and “dreamy” to “epic” and “extreme”, with music and title styles to match. If you don’t like the soundtrack, just tap the adjustment button in the lower-right corner and you can pick from nearly one hundred provided songs, plus all of the ones in your library. You can also adjust nearly all attributes of the video, including title style and the precise photo selection. But I’ve found that the auto-generated Memories are, generally speaking, just fine.
The nature of this feature is such that most of the ones that it made for me are quite personal in nature — more on that in a minute. Thankfully, I do take enough photos of buildings and whatnot that it has produced a couple that I feel comfortable sharing. First up is one that was generated entirely automatically:
Here, as they say, is one I made earlier, with a few modifications:
You can imagine that if these were videos of your family or your significant other, they would be much more meaningful. I hope these examples provide you with a sense of what’s possible.
There’s something else unique about Memories, compared to — say — Timehop, or the “On This Day” posts that Facebook dredges up from years past. Because apps like these tend to use primarily public posts, they’re pre-selected based on the kind of image we project of ourselves. But we take far more photos that never get posted for all kinds of reasons.
I have a series of photos from mid-August of a trio of ducks fighting over a fish. I didn’t post them publicly because they’re of little to no interest to anyone, I presume, but they remind me of watching those ducks duke it out on the river. That’s a memory particular to me, and it’s the kind of thing that will one day be served up by Memories.
I will say that I’ve seen it have some problems with facial recognition when cropping portrait-oriented photos to fit within a 16:9 video frame. More than once, the people in the photos have had their heads cut off. Sometimes, it’s only my head visible, instead of the faces of those I’m with; that seems to be the inverse of the most appropriate way to crop an image — who wants to look at themselves?
Regardless, Memories is probably my favourite new feature in iOS 10’s Photos app, and maybe in the entirety of iOS 10. It’s a beautifully-executed and completely effortless high-test nostalgia delivery system.
RAW and Live Photos
iOS 10 unlocks new capabilities for developers as well. Third-party applications can now shoot Live Photos, and encode and decode RAW images. The former capability is fine — I’m happy to have it for those times I want to have a little more control over a Live Photo than the default camera app can provide.
The latter capability, though: oh my. The default camera app doesn’t encode RAW images, but the newest versions of Obscura and Manual can, and they’re totally worth buying just to try RAW shooting on your phone. It’s clear that a lot of detail is obliterated when the photo is processed and compressed as a JPEG; a RAW image is three-to-four times the file size of the same JPEG, and it’s completely lossless. The easiest way to demonstrate the difference is with a simple, unedited comparison of two photos I shot one after another:
In the image shot as a JPEG, the trees become a blocky, gestural mess. The fine lines on the outside of the green building on the left are incomplete and chunky. The whole thing looks a little more like an oil painting than a photograph.
In order to process the RAW for web use, I simply applied Photoshop’s “auto” Camera Raw setting; it may have flattened out the shadows, which is why the roof of the castle-looking building looks darker in the JPEG. But, even with that minimal processing, you can clearly see individual tree branches instead of a blocky mess. The train tracks on the overpass are clearly distinct. You can almost make out the windows on the sandstone school in the distance, in the middle of this crop. Every detail is captured far better.
Of course, the JPEG variant looks far better at the typical size of a photo viewed on Facebook, for example, where these photos typically end up. And, naturally, the lack of any processing means that a full spectrum of noise is also captured; it’s not quite fine enough to be considered pleasantly grainy. But for those of us who want more control over individual attributes, the capability of shooting RAW is extremely exciting. It offers far more flexibility, provided third-party photo editing apps jump on the bandwagon. Snapseed already handles RAW in post, and the developers of several other apps have confirmed that they will soon support RAW editing, too.
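If you’re wondering how apps like Obscura and Manual do this, the hook is iOS 10’s new AVCapturePhotoOutput, which can deliver a Bayer RAW sample buffer and convert it to DNG data. A rough sketch of the flow, with session setup and error handling omitted, and the Swift 3 delegate signature treated as illustrative:

```swift
import AVFoundation

class RAWCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    // Assumed to be attached to a configured, running AVCaptureSession.
    let photoOutput = AVCapturePhotoOutput()

    func captureRAW() {
        // Not every device supports RAW; the output advertises the
        // Bayer pixel formats it can produce.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else { return }
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat.uint32Value)
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingRawPhotoSampleBuffer rawSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        guard let buffer = rawSampleBuffer,
              let dng = AVCapturePhotoOutput.dngPhotoDataRepresentation(
                  forRawSampleBuffer: buffer,
                  previewPhotoSampleBuffer: previewPhotoSampleBuffer) else { return }
        // dng is the lossless DNG file, ready for disk or the photo library.
        print("Captured \(dng.count) bytes of DNG")
    }
}
```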
For an utterly unfair comparison, I shot a similar photo on my DSLR, a Canon XSi with a 12-megapixel sensor — the same rating as the one in my iPhone. Of course, the APS-C sensor in it is far larger, and the lens I have on it — the cheap and cheerful 40mm pancake — is much nicer, with a completely different field of view from my iPhone’s. Even so:
There’s a long way for the iPhone’s camera to go to become comparable to a professional DSLR — in fact, I’m not sure it ever can compete on that level. But, with RAW shooting capabilities, I see this as one of the single biggest jumps in image quality in the history of the iPhone. It is properly, exceedingly, brilliantly good.
Messages
Like most of you, I can think of few apps I use more on my iPhone than Messages — Safari, Mail, and Tweetbot are the only contenders that come to mind. Its popularity is a direct result of its simplicity and versatility; few apps make text-based conversation as straightforward.
Perhaps because of that simplicity, Messages has seen few updates in its lifetime. iPhone OS 3 brought MMS support, iOS 5 introduced iMessage, and iOS 8 added more message types and a better Details view. But the app’s ubiquity and flexibility meant that those improvements effected some of the most significant changes to the utility of any app on iOS. While Apple hasn’t released monthly active user counts for iMessage, I bet it’s one of the most popular messaging standards in the world.
But, while you’ve always been able to send text, pictures, and video through iMessage, the experience has always been rather static. Until now.
In iOS 10, you can now send handwritten and Digital Touch messages through iMessage on your iOS device. What was once a niche feature for Apple Watch owners takes very kindly to the larger displays of the iPhone and iPad, allowing you to send Snapchat-like sketches through iMessage. The heartbeat option is even available if an Apple Watch is paired, and you can mark up photos and videos right from the media picker. In some ways, it seems that Apple is still chasing a bit of Snapchat’s unique style of photo-based messaging.
The media picker has, by the way, been completely redesigned. There’s now a tiny camera preview right in the picker, alongside a double row of recent photos. Swiping right on the picker will show buttons to open the camera or show the entire Camera Roll.
This redesign is simultaneously brilliant and confusing. I love the camera preview, and I think the recent photo picker is fine. But the hidden buttons are far too hidden for my liking, and it’s somewhat easy to miss the small arrow that provides a visual clue. Once you find them, they’re easy; but I have, on more than one occasion, forgotten where the button to access the full Camera Roll picker now resides.
But what if you want to communicate in a more textual way? Well, iOS 10 has plenty of new features there. After you type out a message, you can tap on the keyboard switcher icon to replace any words in your message with emoji. Relevant words or phrases will be highlighted in orange, and tapping on them will either suggest emoji replacements or, if only one character fits the phrase, simply swap the words out. Yet, despite the extent to which I already communicate through emoji, I never really got the hang of this feature. The QuickType bar already provides good-enough emoji suggestions throughout the OS, so the Messages-only routine of typing a word and then tapping the emoji icon to replace it simply doesn’t match the way I think when I bash out a text message. Your mileage may vary.
And then there’s the stuff I got a little obsessed with while testing iOS 10 this summer. Apple has added a whole host of weird and wonderful effects for when you send an iMessage. Press on the Send button, and a full-screen sheet will appear with a bunch of available effects. Some message effects display inline, while others will take over the entire screen the first time the message is read. Some are interactive: “Invisible Ink” requires the recipient to touch over the message to reveal it. An effect like “Lasers” turns the whole display into a nightclub, replete with bangin’ choons. What’s more, sending some messages — like “Happy Birthday” or “Congrats!” — will automatically fill the recipient’s screen with balloons.
I make no bones about how much I love these effects. I’ve only been screwing around with them for the past few months with a handful of people, but they bring so much excitement and joy to any conversation that they’re easy to over-use, potentially to the chagrin of anyone else you’re talking to.
If you hate fun, you’ll probably be disappointed that there’s no way to opt out of receiving them, with the exception of switching on the “Reduce Motion” option in Accessibility settings — but that has all sorts of other side effects, too.
I’ve also noticed that these effects don’t fall back gracefully. Users on devices running older versions of iOS or OS X will see the message followed by a second message reading “(sent with Loud Effect)”, or whatever the effect might be.
Messages has also learned some lessons from Slack. Links to webpages now show inline previews if the message was sent from a device running iOS 10 or MacOS Sierra. These previews can be pretty clever, too: a pasted Twitter link will show the whole tweet, the user it’s from, and any attached media; and, for YouTube links, you can actually play the video inline (but, curiously, not for Vimeo links). You can also react to individual messages with one of six different emotions by tapping and holding on a message bubble, a feature Apple calls “Tapback”, or with stickers from apps — more on that in a moment. Messages containing just emoji, up to three, will display much larger. All of these relatively small tweaks combine to produce some of the most welcome improvements to an app we use dozens of times a day.
Curiously enough, Messages in iOS 10 actually loses some functionality as well. In iOS 8, Apple attempted their own take on Snapchat: tapping and sliding on the camera icon would immediately send a disappearing photo or video. There’s no longer a way to do that in iOS 10. As I noted at the time, that feature was often more frustrating than helpful, and I don’t know anyone who used it; I suspect few will notice its removal.
But I think that everyone will notice that developers can now add to Messages in a really big way.
iMessage Apps and Stickers
For the past few releases of iOS, Apple has rapidly been opening up their first-party apps to third-party developers. From sharing sheets to Safari, extension points now exist throughout iOS to make the system vastly more capable, efficient, and personalized. And now, they’re giving developers perhaps one of the biggest opportunities in years: apps and stickers in Messages.
Stickers are probably the easiest to understand because they sound exactly like what they are: packs of images — still or animated — that you can stick to messages in a conversation. If the success of stickers in every other chat app is any indication, they’re going to be one of the hottest new features for users and developers alike.
Actually, even saying “developers” is a misnomer here. Creating a sticker pack does not require writing a single line of code. The only things anyone needs to build a sticker pack are Xcode, correctly-sized artwork for the stickers in common image file formats, and an icon in different sizes, which means that virtually any idiot can make one. And I can tell you that because this idiot, right here, made a sticker pack in about ten minutes, excluding the amount of time I spent fighting with Xcode. It could scarcely be simpler: drag your sticker image assets into one tab of Xcode, drag a bunch of icon sizes into the other, and build. Unfortunately, you do have to subscribe to Apple’s Developer Program in order to test the app on your device; you can’t use a free Apple ID to build a sticker pack just for yourself.
As a result of this simplicity, I think a lot of artists and designers are going to have a field day making all kinds of sticker packs and selling them. Aside from free stickers — plenty of which will be from brands half-assing their marketing efforts — I’m guessing that the one-dollar price point will be the sweet spot for a typical pack.
From a user’s perspective, these stickers will be a fun addition to pretty much any conversation. They can be dropped — with a slick animation — on top of any message, or they can be sent as plain images in the chat. Some users may get frustrated that stickers typically overlap a message, which can make it hard to read. You can tap and hold on any message bubble to temporarily hide stickers and get more information about what stickers were used.
Stickers are a hoot for users and developers alike. But, of course, if you want more functionality, you’re going to have to write some code and put together an app for Messages. Apple says that developers can create all sorts of interactive environments, optimized for short-duration usage: think back-and-forth games, peer-to-peer payments, and the like.
It’s telling that they call these “iMessage Apps”, and not “Apps for Messages” or some variant thereof. While apps that confine themselves to sending just images or links will work fine over SMS, any of the truly cool interactive apps won’t work.
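For developers wondering what’s under the hood: an iMessage app is built around MSMessagesAppViewController in the new Messages framework. Here’s a toy sketch of a turn-based game inserting its state into a conversation; the URL and image are hypothetical stand-ins:

```swift
import Messages
import UIKit

class MessagesViewController: MSMessagesAppViewController {

    // Hypothetical: send the current state of a back-and-forth game
    // as an interactive message bubble.
    func sendTurn(boardImage: UIImage, turn: Int) {
        let layout = MSMessageTemplateLayout()
        layout.image = boardImage
        layout.caption = "Your move (turn \(turn))"

        // Reusing the session lets successive moves update one bubble
        // in place instead of littering the thread with new messages.
        let session = activeConversation?.selectedMessage?.session ?? MSSession()
        let message = MSMessage(session: session)
        message.layout = layout
        // App-defined state travels in the URL; over SMS, the recipient
        // just sees a link, which is why interactive apps need iMessage.
        message.url = URL(string: "https://example.com/game?turn=\(turn)")

        activeConversation?.insert(message) { error in
            if let error = error { print(error) }
        }
    }
}
```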
Apple ships two examples with iOS 10: Music and “#images”. The former, of course, lets you share your most recently-played tracks with friends. Instead of having to switch to Music from a conversation and tapping on the Share button, the track is served to you from within the thread. When combined with rich previews for Apple Music links, the app provides a totally seamless experience.
The “#images” app — I will continue to use quotation marks because I cannot stand that name — is a much-needed enhancement for those of us who like to spice up any conversation with various movie and T.V. references. It appears to use the same Bing-powered image search engine as Siri on the Mac, except perhaps more tailored for Messages. That is to say, it seems more GIF-oriented, and it appears to suggest images based on the conversation. There are even two buttons across the top that are pre-populated with likely search terms. Like any Messages app or sticker pack, you can tap on the arrow in the lower-right corner to expand its view, but in “#images” you can also press on any image’s thumbnail to see a full preview.
“#images” has been the bane of my friends’ discussions with me for the past few months. GIFs are way better than emoji, of course, and any opportunity to reply to a message with Homer Simpson setting a bowl of corn flakes on fire really is a tremendous ability. If I’m completely honest, though, I don’t really need every movie reference on the planet; I only need clips from the Simpsons. I do hope a Frinkiac app is on the way.
Unlike other app extensions, apps running in Messages are entirely independent, and don’t require a container app; however, developers can use their existing and new iOS apps, if they so choose.
And, like pretty much every other extension point on the system, there’s no indication upon installing an app that it includes a Messages extension. Unlike every other extension point, though, there’s a switch that allows you to automatically activate any new Messages apps. I think a similar option should be available for other extension types, like keyboards and share sheet items, as the current method of determining whether an app has installed a new extension is, at best, trial and error.
Apps and sticker packs are both installed and managed within Messages in a pseudo-Springboard sheet that appears in place of the keyboard. It behaves like Springboard, too: you can tap and hold on an icon to change the order of the apps, or tap the x in the corner to remove an extension. There’s even a row of page indicator dots across the bottom; if you install a lot of apps, it doesn’t scale particularly gracefully.
I managed to run up this tally just by downloading all of the iOS 10 updates issued to apps already on my phone. Nearly every app that I updated today included a Messages extension. Imagine what it’s going to be like if you really dive deep into the iMessage App Store.
I’m sure that these apps are going to be insanely popular. Consider, for comparison, the popularity of emoji keyboards like Bitmoji or Kimoji. Perhaps a handful of apps will take over, but I anticipate plenty of users overrunning the page dot capacity. I’m surprised that this is not handled more gracefully.
Music
I wrote at length earlier about the interface design changes in Music and News; here, I want to spend a little more time on how those updates affect the usability of the app.
I want to start with the five tabs across the bottom. To me, their relatively subtle change has radically improved how I use Music. Previously, the tabs in Music were, from left to right: For You, What’s New, Radio, Connect, and Library.
In iOS 10, they are: Library, For You, Browse, Radio, and Search. This subtle change is critical to the app’s overall usability: it promotes Library from the lowest-priority position to the highest, where it belongs.
The most significant usability improvement gained from the adjusted tab bar, though, is arguably the promotion of Search. After all, when you’re looking for something — whether in Apple Music or your local library — you probably use search. Its previous placement, in the toolbar across the top, was awkward, not least because results ended up in the New tab, for reasons I can’t quite explain. Simply giving search its own tab makes Music far easier to use than it used to be.
Even the rather vaguely-named Browse tab is a boon. The old New tab indicated that you’d only find new releases within; Browse, while more generic, allows Apple to add sub-categories for Curated Playlists, Top Charts, Genres, and the previously-buried Videos feature.
Meanwhile, the Connect features have been moved to the For You tab, and individual artist pages. I don’t know if that will improve its popularity among artists or users; I suspect not.
Within the Library tab, Music loses the weird drop-down picker that previously allowed you to browse by artists, genres, and so forth. It has been replaced by a simple, straightforward list, and the app is much better for it. There’s very little hunting around in this version of the Music app; most everything is pretty much where you’d expect it.
But, while Apple resolved most of the usability issues of the old app, they created a few new ones as well. “Loving” tracks and playlists — a critical component of the Apple Music experience and the only way to train the For You selection — is now a multi-step process. There is no longer a heart button on the lock screen, nor is there one on the playback screen. Instead, you need to unlock your device and tap the ellipsis icon on the playback screen, or beside the item in a list. It’s a little odd to see so much emphasis placed on the ellipsis icon; it’s a metaphor that’s more frequently used on Android.
The playback screen is, overall, probably the least-successful element of the redesigned Music app, from a usability perspective. It took me a few days with it before I realized that it was possible to scroll the screen vertically, exposing the shuffle and repeat buttons, adjustable playback queue, and lyrics, when available. There’s simply no visual indicator that it’s possible to scroll this screen. My bug report on this was marked as a duplicate, so I suppose I’m not the only person who feels this way.
There are some holes in other parts of the app as well. There’s still no option to sort albums from an artist by year, otherwise known as “the only acceptable way to sort albums by a single artist”. There’s still no way to filter or search for music by year.
If you want a list of songs from a particular artist, you’ll want to use the Songs menu item to get a giant list of all songs, sorted by artist. There’s no way to do this from within the Artists menu item, which makes no sense to me. If I’m looking for songs by an artist, I’m going to start by looking in Artists; I bet you’d probably do the same.
Aside from the occasional usability bafflement, I’m certain that this version of Music is a much more successful organization of its myriad features. I’ve said many times that my ideal streaming service would feel like a massively extended version of my local library, and Music in iOS 10 comes closest to accomplishing that, even without enabling iCloud Music Library.
So what about some of the new features in the app, like lyrics support and new recommendations in Apple Music? Well, while lyrics are ostensibly supported, I had a hell of a time finding a song where that’s the case. After trying a bunch of different tracks from lots of different genres, I found that lyrics were shown for tracks from Drake’s “Views” album and Kanye West’s “The Life of Pablo”.
Lyrics only display for Apple Music songs, and I do mean only. My purchased-from-iTunes copy of “Views” doesn’t have lyrics, but if I stream the same song from that album on Apple Music, it does.
However, with the notable exception of Kim Mitchell’s truly terrible “Patio Lanterns”, just being able to read the lyrics doesn’t usually communicate the intent or meaning of a song. For that, you need something like Genius — not to be confused with the iTunes feature of the same name. I think it would be more useful if there were some substance behind displaying the lyrics.
While there’s no indication that adjustments have been made to the recommendation algorithms that power For You, there are two playlists that are served up on a weekly basis: the Favourites Mix, refreshed every Wednesday, and the New Releases Mix, refreshed every Friday. Unlike most of the pre-made playlists on Apple Music, these are algorithmically generated, but I’ve found them to be pretty good.
The first New Releases Mix I got was a decent sampler plate of new music that I generally enjoyed. Of the 25 tracks in the first mix, I’d say that only two or three were misses. From my experience with both Apple Music and Spotify, that success rate compares favourably to the latter’s Discover Weekly mix. Apple’s mix is, however, focused entirely on new releases a user might like; there doesn’t appear to be an automatically-generated playlist in the vein of Discover Weekly, which draws on the back catalogue as well.
All told, I think this iteration of Music is markedly more successful than the outgoing one, which grated on me more and more as the year wore on. I haven’t felt that with this version. Though it’s not yet perfect, it’s far better than its predecessor.
Continuity
Universal Clipboard
After launching with a robust set of initial features in iOS 8, the overarching concept of Continuity has been updated to support a frequently-requested feature: a universal clipboard.
The idea is simple: copy a piece of text, or an image, or a URL, or whatever on any device you own and have the ability to paste it on a completely different device. Apps like Copied, CloudClip, and Command-C filled in the gap left by the lack of official support for this functionality.
But, now, there is official support for clipboard sync, and it’s pretty good for my very basic uses. Apple says that, like Handoff, the clipboard is encrypted and synced entirely locally over WiFi and Bluetooth; your iCloud account is used only to verify that it’s you copying and pasting on both devices.
As I said, my use-case for this is extraordinarily simple. Sometimes, I’ll have read something on my iPhone and want to link to it within a post. I can either open a new Safari tab on my iPad or Mac and wade through my iCloud Tabs until I find the right one, or I can just copy it on my iPhone and paste it on my other device. Or, sometimes, I’ll have something copied in a Mac-only app like TextMate that I can paste into an email message on my iPad. It’s pretty cool.
Unfortunately, there’s no visual indication of when an item is available to paste from a different device. I haven’t yet run into an instance where I’ve pasted in the entirely wrong thing from a different device, and the lack of a visual indicator strikes me as very deliberate: Universal Clipboard isn’t something you should have to think about — it “just works”.
Universal Clipboard lacks some of the more power-friendly options of the third-party apps mentioned earlier, like clipboard history and saved snippets, but it does a perfectly cromulent job fulfilling a basic use case for clipboard syncing. It works pretty well for me.
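Developers get a small amount of control over this behaviour, too: iOS 10 adds pasteboard options for keeping an item off other devices, or expiring it. A minimal sketch; the exact Swift key names have shifted between SDK releases, so treat these as illustrative:

```swift
import UIKit
import MobileCoreServices

// Copy a sensitive string that should never travel through Universal
// Clipboard, and that clears itself after five minutes.
let secret = "hunter2"
UIPasteboard.general.setItems(
    [[kUTTypeUTF8PlainText as String: secret]],
    options: [
        .localOnly: true,                                // never synced to other devices
        .expirationDate: Date(timeIntervalSinceNow: 300) // wiped after five minutes
    ]
)
```

Password managers are the obvious customers for both options.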
Apple Pay
Apple Pay was only introduced in Canada this June, but I’ve already become accustomed to paying for all kinds of stuff with it. Most payment terminals have supported tap-to-pay for a long time, but Apple Pay is more secure and, from my experience, faster and more reliable.
That it’s come to the web is a good thing; that I no longer have to use PayPal or submit my credit card details to an online store is a very good thing.
None of the places I typically order online from have yet added Apple Pay to their checkout options, so I tried using Stripe’s Apple Pay demo and it seemed to work pretty well.
I’ve dumped this feature into the Continuity section because Apple Pay is also supported in Safari on MacOS Sierra. You just start the purchase on your Mac, and authenticate on your iPhone. Strangely, though, this same cross-device functionality isn’t supported to authenticate an iPad purchase using an iPhone.
iPad
After several years of minor adjustments tailored for the iPad, iOS 9 brought serious systemwide improvements: proper multitasking, keyboard shortcuts, ⌘-Tab application switching, and lots more. iOS 9 was the significant boost the iPad needed, particularly since there are now two iPads named “Pro”. I, perhaps naïvely, thought that this was a renaissance for the iPad — a wakeup call for a platform that should feel like its own experience.
I was wrong. iOS 10 brings very few changes specifically designed for the iPad, and a whole lot of changes that feel like they were scaled-up from the iPhone.
Notification Centre’s scaled-up Today view pulls off an amazing visual trick: it manages to look both cramped and inefficient in its use of the iPad’s larger display:
Control Centre also looks a bit odd on the iPad’s larger display, featuring gigantic buttons for AirDrop, AirPlay Mirroring, and Night Shift:
Half the space in the second Control Centre tile is occupied by a playback output destination list:
Instead of a list of output devices — something which I doubt most users will be adjusting with enough frequency to merit its equal priority to playback controls — why not show the “What’s Next” queue or additional Apple Music controls?
There are plenty of instances where the iPad simply doesn’t utilize the available screen space effectively. While not every pixel should be filled, shouldn’t playlist descriptions in Apple Music expand to fill the available space?
Shouldn’t I see more than this in my library?
Shouldn’t the timer feel a little more deliberate?
Then there are the aspects of the iPad’s interface and features that remain, inexplicably, unchanged. The 12.9-inch iPad Pro retains the 5 × 4 (plus dock) home screen layout of the original iPad. The Slide Over drawer still shows the same large rounded cells around each icon, and its lack of scalability has become more apparent as more apps have added Slide Over support.
That’s not to say that no new iPad features debuted this year. You can now run two instances of Safari side-by-side on iPads that support multitasking; however, it is the only app where this is possible.
The limitations created by the iPad’s form factor — a finger-based touch screen with a bare minimum of hardware buttons — have required ingenious design solutions for common tasks. Windowing, complex toolbars, and other UI components taken for granted on the desktop were, and are, either impossible or impractical on the iPad. Similar problems were solved when the iPhone was developed. But, while there’s a good argument for retaining some consistency with the iPhone, the iPad is its own experience, and it should be treated as such.
There’s a glimmer of hope for iPad users: Federico Viticci has heard that more iPad-specific features are “in the pipeline”, presumably for an iOS 10.x release. Their absence from the 10.0 release is, however, noteworthy.
Grab Bag
As ever, in addition to the big headlining updates to iOS, there are a bunch of smaller updates to all sorts of apps. This year, though, there’s a deep-level system update as well.
System
Of all of the enhancements rumoured to be coming to iOS, not one revolved around a new file system. Yet, that’s one of the things that’s coming to iOS 10. It’s not finished yet, and it is projected to arrive as part of a system update next year, but it sounds like a thoroughly modern, fast, and future-friendly file system. I’m nowhere near intelligent enough to fully understand APFS, as it’s called, but Lee Hutchinson of Ars Technica wrote a very good early look at it back in June that you should read.
Phone and Contacts
It’s telling that I’ve buried what is ostensibly the core functionality of a smartphone — that is, being a telephone — all the way down here. We don’t really think of our iPhone as a telephone; it’s more of an always-connected internet box in our pants. But that doesn’t mean that its phone functions can’t be improved.
For those of you who use alternative voice or video calling apps, there’s a new API that allows those apps to present a similar ringing screen as the default phone app. And there’s another API that allows third-party apps to flag incoming phone calls as spam and scams. I don’t get that many unsolicited calls, blessedly, but I hope that apps like these can help get rid of the telemarketing industry once and for all.
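That spam-flagging API is a Call Directory app extension, built on the new CallKit framework. Rather than inspecting calls as they ring, the extension hands iOS a pre-sorted list of numbers and labels that the system caches. A bare-bones sketch, with made-up numbers:

```swift
import CallKit

// The extension's entry point. iOS loads this in the background and
// caches the entries; numbers must be added in ascending order.
class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Hypothetical numbers, in E.164 form without the "+".
        let telemarketers: [CXCallDirectoryPhoneNumber] = [1_403_555_0100, 1_403_555_0199]

        for number in telemarketers {
            context.addIdentificationEntry(withNextSequentialPhoneNumber: number,
                                           label: "Suspected telemarketer")
        }
        context.completeRequest()
    }
}
```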
The phone app also promises to transcribe voicemails using Siri’s speech-to-text engine. My cellular provider doesn’t support visual voicemail, so I wasn’t able to test this feature.
In addition, Apple says that you can set third-party apps as the primary means of contact for different people.
Mail
Mail has an entirely new look within a message thread, with a conversational view very similar to that of Mail on the Mac. This makes a long conversation much easier to follow, and allows you to take action on individual messages by sliding them to either side.
Additionally, there’s a new button in the bottom-left of message lists to filter which messages are shown. After tapping the filter button, you can tap the “filtered by” text that appears in the middle of the bottom toolbar to select filtering criteria; the default is unread messages across all inboxes.
This filter is similar to the Unread inbox introduced in iOS 9; but, with the ability to define much more stringent criteria, it’s far more powerful. I’ve been using it for the past couple of months to try to tame my unruly inbox with an unread count that keeps spiralling out of control.
Mail also offers to unsubscribe you when it detects a message was sent from a mailing list. That can save a great deal of time hunting through the email to find the unsubscribe link and then, inevitably, being asked to fill out a survey or getting caught in some other UI dark pattern. I used it on a couple of newsletters and it seems to have worked with just a tap.
Safari
Safari now supports “unlimited” tabs, up from 36 in iOS 8, and 24 prior to that. I checked this claim out, and got to 260 open tabs before I got bored. Of course, not all those tabs will be in memory, but they’ll be open for your tab hoarding pleasure. In addition, a long-press on the tab button in the lower-right lets you close all 260 of those tabs at once, should A&E show up with a film crew.
Sharing
Ever since iOS 8 allowed third-party developers to add actions from their apps to the Share sheet, I’ve wanted to see this feature enabled systemwide for pretty much anything I could conceivably share. As you can imagine, I save a lot of links to Pinboard and Instapaper. I also subscribe to a bunch of great newsletters, like NextDraft and CNN’s excellent Reliable Sources. But, while third-party email apps have long allowed you to share the contents of emails using the system Share sheet, the default Mail client hasn’t.
It’s a similar story in Safari: you’ve been limited to sharing just the frontmost tab’s URL using the Share sheet, and no other links on the page.
Previously, touching and holding on a link would pull up a series of options, one of which was to send the link to your Reading List. Now, for those of us who don’t use Safari’s Reading List, there’s a far better option available: touching and holding on any link will display a “Share…” option, which launches the system Share sheet. It’s terrific — truly, one of my favourite details in iOS 10.
App Store
As previewed in the week prior to WWDC, this year’s round of major updates brings with it some changes to the way the App Store works. Most of these changes have trickled out in a limited way this summer, including faster review times, and a beta test of ads in the American App Store. I’m Canadian, so I’m still not seeing ads, and that’s fine with me.
One thing that wasn’t clarified initially was the handling of the new Share feature for every third-party app. At the time, I wrote:
I sincerely hope that’s not just an additional item in every third-party app’s 3D Touch menu, because that will get pretty gross pretty fast.
Well, guess what?
That’s exactly how that feature works.
It isn’t as bad as I was expecting it to be. The Share menu item is always the farthest from the app icon in the 3D Touch list, and it means that every icon on the home screen is 3D Touch-able, even if the app hasn’t been updated in ages.
For TestFlight apps, the Share item becomes a “Send Beta Feedback” item, which is a truly helpful reminder to do that.
Improvements for Apple Watch
While I won’t be writing a WatchOS 3 review — at least, not for today — there are a couple of noteworthy changes for Apple Watch owners on the iPhone.
There’s a new tab along the bottom of the Watch app for a “Face Gallery”. In this tab, Apple showcases different ways to use each of the built-in faces and how they look with a multitude of options and Complications set. I’m not one to speculate too much, but this appears to set the groundwork for many more faces coming to the Watch. I don’t think just any developer will be able to create faces any time soon, but I wouldn’t be surprised to see more partnerships with fashion and fitness brands on unique faces.
In addition, the Apple Watch has been added to the Find My iPhone app — and, yes, it’s still called “Find My iPhone”, despite finding iPhones being literally one-quarter of its functionality. Your guess is as good as mine.
Settings
With all sorts of systemwide adjustments comes the annual reshuffling of the Settings app. This year, the longstanding combined “Mail, Contacts, Calendars” settings screen has become the separate “Mail”, “Contacts”, and “Calendars” settings screens, as it’s now possible to individually delete any of those apps.
Additionally, Siri has been promoted from being buried in “General” to a top-level item, complete with that totally gorgeous new icon. It doesn’t really go with the rest of the icons in Settings, to be fair, but it is awfully pretty.
As Game Centre no longer has a front-end interface, its options have been scaled back to the point of near-pointlessness. There is no longer an option to allow invitations from friends, nor can you enable friend recommendations from Contacts or Facebook. The only option under “Friends Management” is to remove all friends in Game Centre. There is no longer a way to find a list of your Game Centre friends anywhere on iOS or MacOS. Yet, for some reason, the framework lives on. Given these ill-considered omissions, if I were a developer, I wouldn’t necessarily build a new app that’s dependent on it. Just a hunch.
There are a bunch of little tweaks throughout Settings as well. It now warns you if you connect to an insecure WiFi network, and — for some reason — the option to bypass password authentication for free apps has been removed.
Sounds
There may not be any new wallpapers in iOS this year, but a few of the system sounds have been refreshed. Instead of the noise of a padlock clicking shut, the revised lock sound is more reminiscent of a door closing. Perhaps it’s my affinity for the old lock sound, but the new one hasn’t grown on me. It feels comparatively light and thin — more like a screen door than a bank vault.
The new keyboard clicks, however, sound good enough that I kept them on for most of the beta period, and I really hate keyboard noises on smartphones. There’s a subtle difference in the noise between a letter key and a function key — such as shift or the return key — which should help those with reduced vision and those of us who type while walking.
I should say, however, that my dislike of keyboard sounds eventually caught up with me and I switched them back off. It’s a smartphone, not a typewriter.
Conclusion
iOS 10 is a fascinating update to me. Previous versions of iOS have each had a single defining feature, from the App Store in iPhone OS 2 and multitasking in iOS 4, to the iOS 7 redesign, iOS 8’s inter-app interoperability, and iOS 9’s iPad focus.
iOS 10 seems to buck this trend with its sheer quantity of updates. Developers have been asking for a Siri API for years, and it’s here, albeit in a limited form. The number of developers using the deep integrations in Messages and Maps is already higher than I had anticipated at this stage of iOS 10’s release, and I’m writing this the night before it launches.
Then there are the little things sprinkled throughout the system that I didn’t have time to cover in this review: breaking news notifications and subscriptions in individual News channels, a redesigned back button, CarPlay integrations, and so much more.
I may regularly bemoan individual parts of iOS. There are certain places where I wish Apple had made more progress than they did, but there are also aspects of the system that have been greatly enhanced in ways I’d never have expected. Saying that iOS 10 is the best release of iOS yet is a bit trite — you’d kind of hope the latest version would be, right?
But there’s so much that has gone into this version of iOS that I deeply appreciate. The experience of using it, from when I wake up in the morning to when I go to bed at night — oh, yeah, there’s this great bedtime alarm thing built into the Clock app — is such that I can’t imagine going back to a previous version of iOS, or to a different platform. It feels like a unified, integrated system across all of my devices. Some people may call this sort of thing “lock-in”, but I like to think of it as a form of customer appreciation.
Whatever the case, I highly recommend updating to iOS 10 as soon as you can. I hope it surprises and delights you the way it did for me the first time someone sent me an iMessage with a goofy effect, or the last time it made a Memories slide show for me. These are little things, and they’re mechanized and automated like crazy, but they feel alive, in a sense. iOS 10 isn’t just the best version of iOS to date; it’s the most human release.
A big thank you to Sam Gross for proof-reading this review, and to Janik Baumgartner for assisting with some iPad verification. Thanks must also go to all the developers who provided beta versions of their apps.
I worked at Gawker for four years, walking the tightrope. The immediacy of publishing encouraged me to be extremely sure of arguments and facts and to write things I truly believed, since I had nobody to fall back on but myself. And, in order to find an audience, I had to be entertaining and provocative. At the site’s best, these two often conflicting impulses encouraged writing with a spontaneity, humor, and self-assuredness that wasn’t like anything else on the Internet. At its worst, it led to gratuitous meanness and a bad lack of self-awareness. I know I’m talking in generalities, but looking back on one’s old writing is rarely a fruitful prospect, even when it was produced under the most considered circumstances. There are plenty of posts that I’m proud of, and others that make me cringe to think about. Regardless, I can’t imagine having had a better place to develop as a journalist than Gawker.
These tables contain a lot of numbers and are not made responsive very easily. The best way I could think of was to make them scroll horizontally. Please let me know if you run into any weirdness with that.
For the past few years, tech companies have been publicly releasing the diversity statistics of their employees. Over the same amount of time, I’ve compared their numbers to United States national statistics, via the Bureau of Labor Statistics’ releases — you can see that in the 2015 and 2014 editions.
This year, it’s more of the same, in more ways than one: I’ll be comparing those stats side-by-side in the same categories as defined by the EEOC’s EEO-1 form — which limits the available racial and gender identity information — followed by some brief analysis. New this year, I’m also noting the year-over-year percentage-point difference. Please be aware that rounding errors and other factors may make the differences from last year’s figures imperfect; even so, they’re worthwhile guidance.
One more note: last year, LinkedIn and Yahoo released their stats at the beginning of June and July, respectively, while Amazon and Twitter released theirs later in August. A Yahoo spokesperson told me that their diversity report will be available in September, while a LinkedIn spokesperson is tracking down their report internally. I will update this article should their figures become available.
Gender Diversity
Gender stats are reported by all companies on a global level; ethnic diversity is typically reported at a U.S.-only level. In the past, I’ve compared both sets of stats against U.S. figures; this year, I’m adding worldwide labour participation rates for genders, for a more complete set of stats. The World Bank only reports female labour force participation for their worldwide stats; the male labour force participation has been inferred based on the binary gender system currently used for these reports.
As Google says in their report, “ethnicity refers to the EEO-1 categories which we know are imperfect categorizations of race and ethnicity, but reflect the US government reporting requirements”. Please keep that in mind.
The “U.S.A. Workforce” row uses data provided by the Bureau of Labor Statistics (PDF). Their demographics information (on page 9) is kind of a pain in the ass, though: the unemployed column is a percentage of the labour force, but the employed column is a percentage of the total population. I’ve done the math, and the results are what’s shown below. In addition, the BLS does not separate out those of Hispanic descent because “[p]eople whose ethnicity is identified as Hispanic or Latino may be of any race.” As such, the row will not add to 100%, but the percentage of Hispanics in the workforce has been noted per the table on page 10.
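The math, roughly, with invented numbers: if a group makes up 50% of the population and its employed column reads 60%, it contributes 0.50 × 0.60 = 30 points’ worth of employed people; doing the same for every group and dividing each result by the sum across groups produces comparable shares of the employed workforce.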
Similarly, the “U.S.A. Overall” row uses data from the CIA World Factbook, and they, too, do not note those of Hispanic descent separately. This row will also not add to 100%.
This year, I’ve added a row for the U.S.A. tech workforce as a whole, for comparison. It uses the “computer and mathematical operations” data row from the BLS report. Amazon does not separate tech and non-tech employees.
Ethnic Diversity in Leadership/Executive Positions
The “U.S.A.” row uses the “management, business, and financial operations” data from the BLS report, as a rough and imperfect approximation of the broad US national trend.
Let’s get something out of the way: I’m a white twenty-something Canadian who graduated from art college. Analysis of statistics of racial and gender diversity at American tech companies is not exactly my strongest suit. But, hey, you’ve made it this far. I want to be as fair as possible to everyone represented in these stats and at these companies. If there’s a problem, please let me know.
Apple notes this year that they achieved pay equity for all U.S. employees.
Apple also says that they reduced the number of employees who chose not to declare their race or ethnicity compared to previous years. The majority of those identified as white.
Microsoft was a real mixed bag this year, becoming whiter and more male in a few areas — and, in some, significantly so.
Facebook made a relatively large 8 percentage-point shift in favour of women in leadership roles. No other company reported as large of a gain in any demographic.
Facebook also became the first company to highlight their LGBTQ community, with 7% of their staff identifying as such.
However, a disproportionately low presence of black employees continues at Facebook, Google, and Microsoft. All three companies have released products with flaws experienced by black and darker-skinned users — issues that, if those companies had a greater proportion of black employees, would likely have been found and corrected.
I will reiterate that one of the excuses most frequently cited by tech companies for their lack of diversity is a small selection of underrepresented prospective employees coming out of colleges and universities in the United States. This is false.
Across the board, most gains are on the order of one or two percentage points, or even less. This is similar to last year’s incremental improvements.
Even though half the companies I survey annually have yet to release their latest data, I don’t anticipate much difference from last year. As I said at the top, however, I will update this should those figures become available.
Something that, unfortunately, comes with reporting any stats on gender and ethnicity is that angry white men use it to try to support their thesis that the white male is oppressed. These people can quietly fuck themselves.
Update Aug 15: A LinkedIn spokesperson has told me that their stats will be out by the beginning of October, but noted that their numbers are “looking strong”. We shall see.
Update Oct 19: LinkedIn’s figures are now current for 2016. LinkedIn reported some of the most positive gains overall, especially for women at the company. LinkedIn remains one of the few companies where the non-tech category has more women than men. Even so, an 80/20 split for tech employees puts them in the middle of a pack led by Amazon and Apple.
Update Oct 31: Yahoo’s data is now current for 2016. Their non-tech staff actually became whiter and more male overall, while leadership staff also became more male. There are some minor indications of improvements, but this year’s report from Yahoo generally shows a regressing trend — completely the opposite of the claims of a recent lawsuit against Yahoo.
Just a taste of Anna Wiener’s brilliant essay for N+1 magazine:
An old high school friend emails out of the blue to introduce me to his college buddy: a developer, new to the city, “always a great time!” The developer and I agree to meet for drinks. It’s not clear whether we’re meeting for a date or networking. Not that there’s always a difference: I have one friend who found a job by swiping right and know countless others who go to industry conferences just to fuck — nothing gets them hard like a nonsmoking room charged to the company AmEx. The developer is very handsome and stiltedly sweet. He seems like someone who has opinions about fonts, and he does. It’s clear from the start that we’re there to talk shop. We go to a tiny cocktail bar in the Tenderloin with textured wallpaper and a scrawny bouncer. Photographs are forbidden, which means the place is designed for social media. This city is changing, and I am disgusted by my own complicity.
“There’s no menu, so you can’t just order, you know, a martini,” the developer says, as if I would ever. “You tell the bartender three adjectives, and he’ll customize a drink for you accordingly. It’s great. It’s creative! I’ve been thinking about my adjectives all day.”
Last month, Kickstarter hired Mark Harris to investigate the circumstances around the failure of the most-funded European project in their history: a drone called Zano to be built by a company called Torquing. It’s the evergreen story of a lack of understanding conflicting with the ambition and creeping scope of the product:
On 18 November, the axe fell. Torquing announced via a Kickstarter update that it was entering a creditor’s voluntary liquidation, the UK equivalent roughly of an American “Chapter 7” bankruptcy filing. It appointed a liquidator who would bring its business operations to a close and attempt to sell the company’s remaining assets to pay its outstanding bills. Legal documents show that Torquing had not only burned through the £2.5m from its Kickstarter campaign, it had run up another £1m in debt. It was Kickstarter’s most spectacular flame-out to date.
No more Zanos would be made or sent out. Staff were sent home, and Torquing’s supercomputer was switched off and would be sold for parts. Because the Zano drone checks in over the internet with Torquing’s servers each time it powers up to retrieve calibration data and updates, the few drones in the wild were instantly and permanently grounded, like a dastardly robot army foiled in the last reel of a bad sci-fi film. After an abrupt final post on Kickstarter, Zano’s creators retreated offline and refused to engage with either backers or Kickstarter itself, contrary to the platform’s policies for failed campaigns.
It’s long — Medium estimates a 53 minute read time — but it’s worth spending some time with. Terrific reporting and storytelling make for a compelling autopsy.
At around 9:00 at night, the temperature in Magelang finally drops to a more hospitable 28°C from the 37° or so that it’s been hovering at. My girlfriend and I are here, just outside Yogyakarta, for this leg of the trip and we’ve stopped at a warung for dinner — think of a small trailer that can be pulled behind a bicycle serving ridiculously tasty food. This warung is known for several noodle dishes, but we’ve asked for mie godog — literally, “boiled noodles”. The broth from this cart is made with candlenut and it’s cooking overtop some hot coals in a wok with spring onions, garlic, some mustard greens, and the aforementioned egg noodles. Every few seconds, someone on a scooter or motorbike putters past, inches from the trio of younger men sitting and smoking on the stoop of the karaoke bar next door.
I’ve taken a couple of Live Photos of the scene and, playing them back, I realize they’ve captured the sights and sounds well enough to show my friends and parents back in Canada. But something’s missing: the smell of this place. It’s a distinct blend of engine fumes, clove cigarette smoke, burning wood, and this incredible food. This, to me, says worlds about the sense of place Live Photos convey, as well as their limitations. They are a welcome step closer to capturing a moment in time, but the technology isn’t quite good enough yet for this moment.
I’ve been using an iPhone 6S since launch day — “Space Grey”, 128 GB, non-Plus — and I’ve read all the reviews that matter. But when I boarded a plane on October 24 from Calgary to Surabaya, I was unprepared for the way that this product would impact my travels, and how my travelling would impact my understanding of mobile technology.
We begin this story during a stopover at Vancouver International Airport. As this is an upgrade from an iPhone 5S, I’m still getting used to the size of the 4.7-inch 6S. After just the short hop from Calgary, I’ve noticed that my 6S feels less comfortable in my front-right jeans pocket, to the point where it becomes an obligation to remove it upon sitting down in a tight airplane seat.
This issue is exacerbated by the addition of a case. I don’t normally use one, but I felt it would make my shiny new phone last a little longer in the rougher conditions I’d be experiencing at some points of my trip. My Peel case didn’t show up in time — something about a fulfillment issue — so I settled for Apple’s mid-brown leather model. It’s nice, but even after just a couple of days, I’m already seeing staining on the edge of the case, where it wraps around the display.
At least it gets rid of that damn camera bump.
My girlfriend and I kill some time by hopping on the moving walkways and checking out some of the kitschy tourist shops that dot the halls. I pull out my phone and take a few short videos across the different available video quality settings. I’m sure 4K looks better, but I don’t have a display that can take advantage of that resolution; 60fps looks great, but also looks too much like a home movie. I kind of wish Apple would add a 24fps mode, for a more cinematic feel. I settle on 30fps 1080p: it’s not exotic or technologically advanced these days, but it works pretty much everywhere and looks gorgeous. Even without the optical image stabilization of the 6S Plus, I’m still impressed by how well the camera cancels out shakiness.
After a couple of hours, we board the twelve-plus-hour flight to Taipei. I pull my phone out, sit down, and notice that the Airbus seats lack power outlets. I check my phone: there’s 50-odd percent left. In airplane mode, that should be fine for listening to music for much of the flight and taking the odd photo and video. Maybe, once the battery runs low, I’ll even get some sleep.
We land in Taipei bright and early, and steer immediately to the complimentary showers to freshen up. My iPhone is on the last drips of power in low-power mode, but the shower room has an outlet to top it up. We have an eight-hour layover here, which we’ll be spending entirely in the airport — thankfully, with free and reasonably speedy WiFi.
I review the photos I took earlier as we circled the city, and I’m pleasantly surprised by their quality in the dim twilight and smog.
In a little over two hours, we’ve seen most of the airport which, as with every other, consists of duty free shops only occasionally separated by small cafés and restaurants. There are plenty of tech-related shops selling iPhones, MacBooks, and iPads, all at prices much greater than the exchange rate would suggest. Even outside of the airport, Apple products, in particular, are expensive on this side of the world, especially for those in middle- or lower-income brackets.
I try to log into Tumblr, an account on which I’ve enabled two-factor authentication via text message. I realize that I cannot receive the confirmation message as I’ve turned off all data on my phone to avoid exorbitant roaming charges. Damn.
After another few hours spent walking throughout the airport in a fruitless search for a basic and inexpensive shirt, it’s finally time to board the flight to Surabaya via Singapore.
Despite taking the same plane and the same seats for the second half of this flight, it’s necessary — for some reason — to leave the aircraft and turn around, passing through a security check again. This irritates me, as my pockets and bag are full of crap that I’ve accumulated in entirely “secure” zones, yet cannot be brought back onto the flight.
To make matters worse, the WiFi at Singapore’s airport requires text message authentication, which I cannot get, cf. my troubles logging into Tumblr. It’s also usually possible to get a code from an attendant, but none are present because it’s late at night, of course.
Thanks to the extra memory in the A9 SoC, I still have plenty of Safari tabs cached so I don’t necessarily need a live internet connection. Unfortunately, it hasn’t preserved all of them — the camera still takes as much memory as it can. My pet theory is that Apple could put desktop levels of RAM in the iPhone and the camera would still purge Safari tabs from the cache.
It’s 11-something at night by the time we land in Surabaya. My phone retains a decent charge despite none of the planes including seat-back power outlets. We exit the airport into the overwhelming Indonesian humidity and heat, and hop into a small van to take us to our hotel.
As we wind through the city, I try recording with my phone pressed against the window. If you’ve ever filmed anything at night in even a moderately well-lit city, you know how difficult this is; in Surabaya, with its extremely dim lighting, almost nothing is visible. I put my phone on the seat and watch the city scroll by.
In the morning, we head over to the mall to pick up a SIM card for my time here. On Telkomsel, 4 GB of 3G data plus plenty of messages and call time costs me just 250,000 Rupiah, or about $25 Canadian. I later learn that it should have cost about half that, but I’m a tourist. Whatever the case, that’s a remarkable deal; at home, I pay $55 per month for 1 GB of data.
I’ve never previously tried swapping my SIM while iMessage is active, or adding a phone number to an existing iMessage account. I have to power-cycle my phone so that Telkomsel can activate the SIM, then power-cycle it again, after re-enabling cellular data, to get it working with iMessage.
But it doesn’t quite work correctly. I’m presented with a prompt to “update” my Apple ID password, and I can’t figure out whether I need to set a new password or simply type it in again. I try the latter and find that the WiFi hotspot I’m connected to is too slow to reach the Apple ID servers. I try a few times, try a third power cycle, pop in my Apple ID password again, and iMessage is finally activated.
I try messaging a friend in Calgary. To my surprise, it fails. I realize that I must add the country code; despite several years of prior correspondence while I was in Canada, iMessage does not resolve this automatically. My friend reports that he’s receiving messages from both my new Indonesian number and my iCloud email address. I chalk this up as another instance where iMessage doesn’t understand that we typically want to message people, not numbers or addresses.
I get a tap on the wrist: my Apple Watch notifies me that it is using iMessage with the same email addresses that I’ve been using for years. Sigh.
After two days in Surabaya, we board a plane for Bali. Destination: Ubud, near the middle of the island. After checking into our hotel, I grab my “proper” camera and head off on a short walking tour of the area.
I’ve opted to bring my seven-year-old Canon XSi — coincidentally sporting the same 12-megapixel count as the iPhone 6S — and Canon’s cheap and cheerful 40mm portrait lens, plus a polarizer, on this vacation (those are affiliate links). It’s not the latest gear, but it’s versatile enough when accompanied by my phone.
Ubud is a fascinating little town. It isn’t coastal, so we don’t get any beach time, but it’s an artistic and vibrant place. It happens to be extremely hot during the early afternoon, which makes spending any extended time under the sun uncomfortable and pushes our exploring later into the day. Due to Bali’s proximity to the Equator, the sun sets somewhere between 5:30 and 6:00, and “magic hour” seems to last the entirety of late afternoon. That’s great news for my vacation photos.
In spite of the heat, we take a walk one afternoon in search of some sandals; the ones provided by the hotel are fine for the room, but not for walking around the city. We duck into a small restaurant for lunch, and my girlfriend orders sate. It’s served in a miniature clay grill overtop hot coals, and I realize that this is the kind of moment the Live Photo feature was built for.
Other reviews have pointed out that it’s sometimes hard to remember that the video component continues to record after taking the still photo. I find it difficult to remember that it begins to record video before tapping the shutter button, so I must remember to wait a couple of seconds between tapping to focus and snapping the still; I must also remember to keep my phone raised after taking the picture. It takes me a few tries to get the hang of it, but I’m pleased by the result. Yet, I cannot share it with anyone — a month after the 6S’ release, it seems that none of the popular services that I use support Live Photos.
The next night, we explore the city later in the afternoon, when it’s a tiny bit cooler. I’ve left my DSLR at the hotel, as we only plan on going for dinner and poking around some boutiques.
We spot a sign directing passers-by to a rice field “50 metres” away, and decide to take a look. After a walk of probably double that distance along a very sketchy path, with sheer drops on one side, we arrive at one of the most breathtaking landscapes I’ve ever seen. Rice paddy fields stretch from both sides of the single-track lane, framed by coconut trees. A rooster crows in the distance. The sun is low in the sky behind a bit of smog, so it’s a perfect golden hue.
It’s so beautiful that it takes me a few minutes to remember to pull out my phone and, low-ish battery be damned, begin snapping. I snap plenty of landscapes on either side, take the requisite panorama, and even a few Live Photos. Later at the hotel, I review these photos and realize that I can’t remember which ones are “Live”, and which ones are not. I look in vain for a Live Photos album; despite every other “special” photo and video format available on the iPhone being filtered into their own album, it simply doesn’t exist for Live Photos. I try searching “live”, or looking for an icon in the thumbnail view — neither works.
I finally stumble across them as I swipe through the photos I shot on the rice fields that day and notice a slight amount of motion. This is apparently the only indicator of a Live Photo, and the only way to find one. Not easy.
But, as I take a look at the few I’ve shot so far, I see great value in the feature. Live Photos can’t capture everything, but they greatly enhance an otherwise static scene. The sound and video snippet add context and a better sense of place: the rooster crowing, the crickets, and even the steam and smoke curling up from that sate the previous day. I have some perfectly fine still photos, too, but their context is entirely imagined; every Live Photo I’ve taken so far does a better job bringing the memory back. It’s too bad that the heat and smell of the place can’t yet be captured as well.
In any case, I don’t think Live Photos are the gimmick some see them as. They’re a little bit cute, but they work remarkably well.
We spend a day travelling from Ubud to Seminyak and seeing the sights there. Our driver, Sandi, has in his car — among the television screens, faux fur on the dash, and short shag roof liner — a USB outlet for passengers to charge their phones. But, he tells me as I plug mine in, most people just use their power banks. I tell him that I’ve noticed a lot of portable batteries around, and he says that some people carry two or more, just in case. He says that this is completely normal.
I’m starting to question the power consumption of my own phone. I’ve used it for long enough in Calgary that I know that I can get a full day’s use out of it, from 7:00 in the morning until late at night. Here, I’m not getting even half that. I check the battery statistics and see that all of my usual web-connected apps have a “low signal” notice.
Not only is 3G service somewhat slower than you might expect in this region, its coverage is also patchier. That eats battery life at a much more significant rate, particularly if you have background services polling for data regularly. iOS is supposed to compensate for this, but if you have email inboxes set to refresh on a timed schedule, it seems to obey that schedule regardless of signal strength.
The low-power mode in iOS 9 does a good job of substantially increasing battery life when cellular coverage is less than ideal. I find it indispensable: coverage is too poor for my inboxes or Tweetbot to refresh regularly, and I don’t really want to check my email much while on holiday anyway.
After dropping our bags at the hotel, we head to Uluwatu for the world-famous kecak dance, performed at sunset on a cliff above the sea. I am so captivated by the dance that I all but forget to take photos until the climax, where the dancer playing the monkey is encircled by fire.
We hang around following the dance to take photos with some of the performers. There are a couple of floodlights illuminating the stage area, but it’s still pretty dark. We get our turn to take a selfie with the monkey performer, and I turn on the new front-facing flash. The photo comes out well — great white balance, well-exposed, and not too grainy — but we look sweaty and tired; I shall spare you that sight.
The next day, we head to the beach. Our hotel is just two blocks away, but even that feels sweltering; the cool waters of the Indian Ocean are beckoning. I shoot with both my iPhone and DSLR here. Normally, I’d be very cautious about stepping into the waves for some more immersive shots with my iPhone pocketed, but the increased water resistance of the 6S gives me more confidence that a few light splashes won’t be an issue, especially with a case.
When we get back to the chairs by the side of the beach, I notice that some lint from my pocket has accumulated around the edges of the case. I pop my phone out to dust it off and am reminded of how nice the bare phone feels. It is not, to my eyes, the best-looking iPhone industrial design — that would be the 5S, followed by the original model — but it is the best-built by far, and it feels fantastic in the hand, despite the size. I’m looking forward to using it regularly without a case again at home.
We weave through Seminyak, then on to Yogyakarta, Magelang, and Rembang. Dinner in the latter two cities is often eaten at warungs — they serve some of the best food you can have anywhere, provided you know which ones are clean.
Our last dinner in Rembang is in a particularly interesting warung. The proprietor is known for his interpretation of nasi tahu — literally translated as rice and tofu. He sets up his preparation table surrounded on three sides by small, low benches, each of which can seat no more than three or four people. Tarps are draped overtop to protect against the possibility of rain — ’tis the season, after all.
We’ve squeezed ourselves onto the bench directly opposite the cook, currently mashing together peanuts, garlic, lime, and some broth into a paste while frying fist-sized lumps of tofu. It’s crowded and, with a bubbling wok of oil behind the cook, it’s hot, but the combination of every sensation makes the scene unforgettable. I want to show people at home, so I take a few photos on my iPhone of the cook at work, trying also to capture the close quarters of the space.
It occurs to me that taking photographs in this environment would be unnatural and strained were it not for a camera as compact and unassuming as my iPhone. Even my DSLR equipped with a pancake-style portrait lens — which I’ve specifically chosen to be less imposing — would be too obtrusive in this moment.
The final few days of our vacation are spent at a home in Surabaya that doesn’t have WiFi. That’s fine in terms of my data consumption, but the slower 3G connection tends to choke on any modern media-heavy site. Every unnecessary tracking script and every bloated cover image brings my web browsing to a crawl.
Then, I run into an issue where my connection refuses to complete. My iPhone shows me a dialog box informing me that there has been a “PDP authentication failure”. I do not know what PDP is, why it must authenticate, or why its failure means I can’t load anything online. I reset my network settings and that seems to work for a while, only for PDP to be unauthenticated again, or whatever.
I reset and search the great IT help desk that is Google for an answer. The top result is a Reddit thread, so I tap on it, only for it to fail to load. I page back and try an Apple Support thread link and it works fine; though, of course, it has no answers. Reddit, specifically, will not load on my 3G connection.
I get sidetracked from my PDP issue and do a little digging. It turns out that Indonesian pornography laws prohibit both porn itself and any conduit for it. Though Indonesia does not have a nationwide firewall à la China, the national government has pressured the major ISPs and cellular networks to block major sites that allow access to porn.
Later in the day, we get carried away at Historica Coffee and forget to grab dinner. There’s not much open at midnight on a Wednesday, particularly if you’re not that interested in what I had been warned was maybe-it’s-food from sketchier vendors.
I swipe to the right on my home screen expecting to see shortcuts to late night food, but that feature isn’t enabled here.
I open Yelp. “Yelp is not available in your country.”
We opt for a nearby late night Chinese food place, and it’s pretty damn good.
On the long series of flights home, I get a chance to review photos from both my DSLR and iPhone while triaging my Instapaper queue. I have more than a few articles saved that proclaim that the camera in an iPhone today is good enough to be considered a camera, not just a smartphone camera. These articles began to percolate around the time of the iPhone 4S, and they are a perennial curiosity for me, especially as I glance at my screen of crisp photos taken on my DSLR.
There’s no question that the cameras in the 6S and 6S Plus are the best ever fitted to an iPhone, with the latter edging out the former due to its optical image stabilization. iPhones — and smartphones in general — have taken very good quality photos for the past few years, and I would not hesitate to print or frame any of them. In fact, every photo in this travelogue is unedited, and I think they all look pretty good.
But I’m looking now at photos from that paddy field back in Ubud, and there is an inescapable muddiness to the trees in the background. I didn’t bring my DSLR on that walk to compare, but I’ve no doubt it would render a vastly clearer and more detailed image.
Similarly, I have photos taken on both cameras from atop a cliff near Uluwatu of surfers paddling in the waves. The wide-angle lens of my iPhone provides a better idea of the scope of the scene, but the surfers are reduced to dark blobs. The images captured on my “real” camera show the clarity in the water, and the surfers are clearly human beings.
This is, of course, a completely unfair comparison: the APS-C sensor in my XSi has roughly twenty times the area of the iPhone’s, and it’s paired with a much bigger lens that allows more light in. But it does illustrate just how different the quality of image is from each device.
There are all kinds of tricks that are easier with a DSLR, too, like panning to track a moving subject. For example, I will look through the windshield of a moving car for potentially interesting roadside scenes. Upon spotting one, I’ll grab focus on something at a similar focal distance to the objects in the scene, then move my camera in the opposite direction of travel at a similar speed. This is much easier on highways, where speeds are constant, so I’m able to develop a rhythm of sorts. With my DSLR, this is tricky, but something I can do reliably; I have never succeeded with this technique on my iPhone. It might be the rolling shutter, or something I’m not doing quite right, but I haven’t heard of anyone else managing it either.
I offer this not as a complaint about the iPhone’s camera, but as clarification that there is still great value in having a camera with a big-ass sensor and a great lens. I’m under no illusions; I am an optimistic hobbyist photographer, at best, but I can’t shake the feeling that I made the right decision in bringing my DSLR as well. It’s bulky, cumbersome, old, has “hot” pixels on the sensor, and creates gigantic RAW files that occupy a lot of space on my MacBook Air.1 However, it creates beautiful images to last a lifetime, and that’s what counts most for me.
I’ve spent an hour or so in an “e-library” in Taipei’s international airport wrapping up this travelogue. Nobody seems to use the e-libraries here, so they function as a pseudo-private lounge, and a pretty great place to find a power outlet. It’s quite nice.
There were some things I expected about bringing my iPhone to Indonesia. I anticipated that I’d use it to keep in touch with a few people at home, look up addresses and directions, and be able to take great-quality photos anywhere, any time. But I learned a lot about the phone, too: Live Photos showed their brilliance, and I was able to extend my battery life despite none of the aircraft including seatback power. I found out just how well the camera works for capturing intimate moments that would feel artificial or posed if I were to use my DSLR, and figured out some new iMessage limitations.
What I learned most, though, isn’t about the iPhone 6S directly; it’s about the role of technology and the internet in a developing nation.
In most developing nations, the proliferation of technology is limited by policy and economics; Indonesia is no exception to this. But, while I was there, I saw people regularly carry two, three, or more smartphones: usually an inexpensive Android phone — something like a OnePlus or a Xiaomi — plus either an iPhone or a BlackBerry. Apple’s products are still very much a luxury: an iPhone is about a third more expensive in Indonesia than it is in the United States, while the median income is half that of the U.S.2
The Jakarta Post reports that only about 29% of Indonesians are connected to the internet, and the internet they’re connected to is different from the one you and I are used to. But they’re making the most of what they’ve got, and have established their own rules and customs — it isn’t rude to use your phone at the dinner table, for instance, and Path remains alive (remember Path?). Not all the services and products you and I have come to expect have made their way there, and if you think search in Apple Maps is poor where you live, you’ve seen nothing yet.
I escaped to Indonesia for a relaxing vacation in a part of the world I’d never visited, and I vowed to get off the beaten path and out of my cushy boutique hotel. In doing so, I left with a hint — but only a hint — of what life is like for hundreds of millions of Indonesians, and I learned a little about how they use technology: their smartphone is often their only computer and their only connection to the internet.
There is something further to consider here: we — designers, developers, and product people — spend a lot of time worrying about how our new product looks and works in U.S. English, on an LTE connection, for the tastes of an American (or, at least, Eurocentric) audience. We spend little time asking how it will function for people who fall outside those parameters — parameters which, by the way, describe an ever-shrinking share of users as more people come online. My quip about Path earlier is indicative of this: we assume Path is dead because we don’t use it; yet it has, as far as I can work out, a respectable user base in Southeast Asia, and that market grows every day.
I’m not pretending to be fully educated about the country after spending just three weeks there, but I am certain I understand it better than I did three weeks ago. Indonesia is beautiful, breathtaking, delicious, and full of the nicest and most accommodating people I’ve ever met, and I’m Canadian. You should go. Bring a camera.
Not to mention post-processing in Photos on OS X, which remains an app that is hard to love. My workflow for a trip like this is to shoot hundreds of images, import them all into one album for the trip, and then pick my selects from that album.
In Aperture, I’d give five-star ratings to the images I was certain about, four-star ratings to those that might have potential, and no stars to images I wouldn’t bother using. (The digital packrat in me doesn’t delete them — just in case, I suppose.) Then, I could simply filter to four-star-or-better images and edit within that set, upgrading some to five-stars if I deemed them worthy. Exporting was as simple as selecting the already-filtered set within the album.
Photos doesn’t have this level of granularity: you either “like” a photo, or you do not. That keeps things a lot simpler, and I don’t mind that. What I do mind is that there appears to be no way to find only photos I’ve liked within an album. My workaround has been to create a smart album with that filtering criteria, but that seems like a workaround, not a solution. ↥︎
This has other effects, too: a couple of years ago, I guessed that data problems and inconsistencies in Apple Maps would be less frequent in places with more iPhone users, and I think that’s true. With less penetration in Indonesia, Apple Maps often lacked significant local points-of-interest. ↥︎
Software defines the Apple Watch as much as, if not more than, the hardware which embodies it. But as well-designed and integrated as the first iteration of the Apple Watch’s software (aptly named, though questionably capitalized, watchOS) was, it wasn’t perfect. Some might argue it wasn’t even acceptable. Third-party apps were underpowered and massively handicapped by long load times; when disconnected from its iPhone, the Watch was rendered useless for almost anything except telling the time; and with no Activation Lock, stolen Watches could be reset and resold with ease.
Enter, watchOS 2.
One thing I didn’t notice in Guyot’s review is a comment on overall battery life. When I was running watchOS 1, I’d get home with probably 15-20% life at the end of the day; with watchOS 2, I’m seeing 40-60% left at the end of the day, and I am certainly not using it less.
Don’t miss the footnotes in this review; some of his best observations are buried therein.
Apple has long been a company of measured progress, but for two years in a row, they’ve delivered a substantial leap in the abilities and extensibility of iOS. iOS 7 brought a complete top-to-bottom redesign, while iOS 8 allowed developers to interact with the OS and other apps more than they ever could previously.
But leaps and bounds of supersized progress don’t come for free; indeed, both releases felt like they lacked the kind of software quality we’ve come to expect from Apple. While I can’t think of a single product they’ve made that didn’t have any bugs, the severity and regularity of quality-assurance gaps made the past two years of iOS feel rushed and a little incomplete.
Apple has faced this before; Mac OS X Leopard, released in 2007, was riddled with far more bugs and quality issues than its predecessor. Its 2009 successor, dubbed Snow Leopard, was famously billed as having “no new features” — that wasn’t entirely accurate, but it was the thrust of the release: bug fixes, bug fixes, and bug fixes (and Exchange support).
iOS 9 is an attempt to strike the balance between these extremes. There are two sets of promises: it’s supposed to fix a lot of bugs and improve overall performance and stability, but it’s also supposed to be a fully-featured release. So, is it?
I’ve been using iOS 9 since early June on my iPhone 5S and iPad Mini 2. This is what I’ve learned.
Impressively, iOS 9 runs on all the same devices as iOS 8 did. If you have an iPhone 4S or iPad 2 or newer, or an iPod Touch with a 4-inch display, you can update to the latest and greatest operating system. Four years is a long time in the smartphone world, but Apple remains just as committed as ever to older hardware.
But, though no devices have been dropped from the compatibility list with this release, there’s also an array of caveats and limitations, as usual. A5 hardware does not support Proactive features, for example. Similarly, only A7-and-newer iPads support picture-in-picture or Slide Over, while only the iPad Air 2 currently supports split-screen multitasking.
But, though the newest features are limited to the most recent hardware, older models should perform far better under iOS 9 than under 8. The iPad 2 is four and a half years old, yet Apple is still dedicating engineering resources to making sure it continues to feel like new hardware. That’s pretty impressive, and it seems to acknowledge a longer upgrade cycle more typical of a computer than, say, a phone.
Installation
Installing iOS 9 is as simple as you’d expect: it’s available both over-the-air and as a full standalone download. Depending on what device you’ve got, the complete package size will vary from about 1.4 GB for an iPad 2 to over 2.1 GB for an iPhone 6 Plus.
Over-the-air is another story. While there is no single reason why iOS 8’s upgrade rate was noticeably slower than iOS 7’s, the large amount of required free space — four or five gigabytes — is likely a significant contributing factor, especially considering the number of 16 GB devices still in use (and, inexplicably, being sold). Apple promises that devices only require about a gigabyte of free space to upgrade to iOS 9.
Updates get even better once you’re running iOS 9, in terms of both timing and space requirements. Let’s start with timing by taking a peek at the over-the-air update feed (large file warning). When you run an OTA update, the system checks which version of iOS you’re currently on and which is the latest available, then downloads just the files your device requires. Here’s what an entry looks like, in part, for a version of iOS 8 (an illustrative fragment; the exact keys and values vary by device and build):
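```xml
<!-- Illustrative entry: keys, builds, and values vary by device -->
<dict>
	<key>Build</key>
	<string>12H321</string>
	<key>OSVersion</key>
	<string>8.4.1</string>
	<key>PrerequisiteBuild</key>
	<string>12H143</string>
	<key>PrerequisiteOSVersion</key>
	<string>8.4</string>
	<key>SUInstallTonightEnabled</key>
	<true/>
	<key>SupportedDevices</key>
	<array>
		<string>iPhone6,1</string>
	</array>
</dict>
```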
Did you catch that? The SUInstallTonightEnabled key is a clue that iOS will now allow you to run system updates overnight or during periods of lower usage. In an increasingly rare moment, the post-PC iOS is learning from the Mac.
If you don’t have enough space on your device for that update, iOS will now offer to delete apps to run the update, then automatically restore them after the update has completed. It’s a nice touch, but it also seems to acknowledge that 16 GB iPhones are both tiny and, apparently, here to stay, at least for a while.
Under the Hood
Bitcode, Slicing, and On-Demand Resources, Oh My
Since the release of the iPod Touch in 2007, there have been several distinct builds of iOS, each with assets created for specific hardware configurations. An iPhone doesn’t need an iPad’s graphical assets, and the WiFi-only iPod Touch doesn’t need a bunch of cellular-specific code or apps. This is evidenced by the vast chasm between the smallest and largest sizes of iOS 9.
Unfortunately, third-party developers haven’t had a similar ability. After the introduction of the iPad — and, with it, the creation of the universal iOS app format — app sizes grew significantly; this recurred with each introduction of higher-resolution and larger-screened iPhones and iPads. Until now, the only alternative for developers has been to create separate iPad and iPhone versions of their app, which, due to pressure from both Apple and consumers, is commercially suicidal.
But now there’s a way. Not for supporting developers’ livelihoods, sadly, but for them to provide just what’s needed to different devices, at just the right time. Apple calls it “App Thinning”, and it comprises three complementary aspects: bitcode, slicing, and on-demand resources.
Bitcode is kind of cool. Instead of uploading a fully-compiled app, developers can now upload partially-compiled code (LLVM bitcode, an intermediate representation). Apple can then optimize the app build with their latest — and, presumably, most efficient — compiler, and the app gets built on the fly.
App slicing requires a little to a lot more work from developers, but probably strikes the best balance between space savings and effort required. Developers can now “tag” resources within their app — images, sounds, graphics code, and other data — based on the kind of device they target. For example, neither my iPhone 5S nor my iPad Mini requires “@3x” images, which are only used on the iPhone 6(S) Plus, but apps that I download these days often include them, and they can take up a lot of space. After a developer correctly tags these files, they will no longer be included in apps that I download to my devices. The same goes for OpenGL vs. Metal, different qualities of audio or video, and different device architectures.
What isn’t made entirely clear is how sliced apps behave when downloaded through iTunes. It’s easy to figure out how the slicing and building decisions are made when downloading over-the-air on the device itself, and Apple has built support for slicing into enterprise tools. But the only reference to iTunes comes in the overview document, and there are no specifics:
For iOS apps, sliced apps are supported on the latest iTunes and on devices running iOS 9.0 and later; otherwise, the App Store delivers universal apps to customers.
I haven’t found another reference to how iTunes handles app slicing; for example, when there’s one or multiple devices on a user’s account. Apple PR doesn’t talk to me. (But do they talk to anyone?) I simply don’t know.
Finally, there are on-demand resources. Depending on the app, this might require substantially more work from developers, but it could also provide some of the most significant space savings for users. By storing on the device only the assets that the app needs at the moment and for the foreseeable future, developers can potentially create vastly more depth in their apps without worrying quite as much about a significantly ballooning file size. Furthermore, developers who have already added more depth and size to their apps can strategically make reductions dynamically.
For instance, consider the apps that you’ve downloaded that provide some kind of tour or walkthrough at first launch. As a user, you probably won’t need that again, so the developer can mark those assets as purgeable after use. But the app itself won’t delete them — on-demand assets are entirely managed by the system. That means that one app from one developer could request space currently occupied by another app from a completely different developer. The system is supposed to do the right thing: if there’s an app that you haven’t opened in a long time, it can clear out unneeded assets from that app first, because — let’s face it — you probably won’t notice.
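The API surface for this is pleasantly small. Here’s a minimal sketch, assuming assets were tagged “first-launch-tour” in Xcode; the tag name is hypothetical, but NSBundleResourceRequest is the class that manages these requests:

```swift
import Foundation

// A sketch of on-demand resources, assuming assets were tagged
// "first-launch-tour" in Xcode. The tag name is hypothetical.
final class TourLoader {
    private let request = NSBundleResourceRequest(tags: ["first-launch-tour"])

    func showTourIfNeeded() {
        request.beginAccessingResources { [weak self] error in
            if let error = error {
                print("Tour assets unavailable: \(error)")
                return
            }
            // The tagged assets are now on disk; load them through the
            // usual Bundle APIs as though they shipped with the app.
            self?.presentTour()
        }
    }

    private func presentTour() {
        // ...present the walkthrough, then mark its assets as purgeable.
        // Until this call, the system considers the assets in use and
        // won't delete them to satisfy another app's request.
        request.endAccessingResources()
    }
}
```

The important bit is that last call: until access is ended, the system treats the assets as in use and won’t purge them.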
These three things — bitcode, slicing, and on-demand resources — comprise app thinning. Apple’s goals for developers have so far been working in opposite directions: they want developers to build universal apps with Retina-ready graphics and the latest architectures, but they also want to keep the starting capacity of their devices at 16 GB and the free iCloud storage at just 5 GB. Thinned apps work to try to address all of these goals.
I won’t dive into whether this comes close to justifying the continued low-capacity devices and services Apple sells by default (it doesn’t), but it certainly reduces the pain of having such a device. If the space savings are as pronounced as Apple says they are — mileage may vary — it will mean a much better user experience for everyone, and less of a squeeze for those at the base level of device offerings.
Performance
Most years, the newer your iOS device, the less interesting this section will be to you. That perennial rule also usually comes with the unfortunate implication that users with older devices have to deal with increasing obsolescence.
But that’s not the case this year — iOS 9 comes with performance improvements for both new and old devices. Devices that support Metal — Apple’s near-hardware graphics layer — can take advantage of its significantly improved performance in most of the default apps.
For older devices — particularly those of the A5 generation — Apple promises significantly better performance. I don’t have one of those devices anymore, so I wasn’t able to test this.
Battery Life
Another major focus of the under-hood improvements in iOS 9 is battery life. Apple says that an extra hour of typical battery life can be expected in daily use.
So how did the marketing fare in the real world? I don’t have an adequate rig set up to test battery life in some kind of semi-scientific fashion, but in day-to-day use, it has fluctuated greatly from beta to beta, as can be expected. But that made me a little worried when faced with the promise of significant battery life gains. The third and fourth betas exhibited almost impossibly good battery life; if anything, I would have guessed that the promised additional hour was an understatement. The fifth beta, on the other hand, appeared to completely ruin my iPhone’s battery life. It went from reliably lasting all day long to being nearly completely depleted by lunchtime. Around this time, though, I began noticing some other strange battery-related behaviour: accelerated depletion between the 20% and 10% warnings, for instance, only to plug it in and find that it was at 19%. I suspect one of the cells in my phone’s battery has died.
Therefore, it would be imprudent for me to judge the battery life of the gold master I’ve been using for the past week. Some days, it’s phenomenal; others, it’s poor, and — given the far better battery life of a few of the earlier betas — I think my phone is partly to blame.
One significant factor appears to be music playback, which I’m consistently seeing at the top of the list of power-hungry apps in Settings. Even after switching off Apple Music and choosing only local items for playback, I noticed occasional network activity from within the app. But Apple Music was included with the third beta, and its battery life was just fine. I have no logical explanation for this.
If you can’t get to an outlet, you’ll be delighted to hear that the low-power mode from the Apple Watch has made its way onto the iPhone. This mode disables all kinds of background processes, places the battery percentage in the status bar, and apparently disables some visual effects. After enabling low-power mode, though, I didn’t notice the omission of blurring, translucency, or any of the usual suspects on my iPhone — the only things disabled, as far as I can figure out, are Dynamic wallpapers and parallax effects. Apple says that an additional three hours of battery life can be expected after enabling low-power mode, in addition to the extra promised hour. That’s not as significant a gain as low-power mode provides on the Apple Watch, but you’ll still have network connectivity and all your apps on your iPhone. It’s a tradeoff.
In my testing, the low-power mode delivered, though not as I expected. The description is a little misleading: if you turn on low-power mode when you’re first prompted — at the 20% warning — you won’t eke out three additional hours; you’ll probably add an extra hour of life, at best. But if you go all day long in low-power mode, you’ll probably get the full three hours.
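Incidentally, this state is exposed to third-party apps too, so software can shed discretionary work when power is scarce. A minimal sketch using the ProcessInfo additions that are new in iOS 9:

```swift
import Foundation

// Check the current state before starting discretionary work, like
// prefetching images or refreshing feeds.
if ProcessInfo.processInfo.isLowPowerModeEnabled {
    // Skip or defer anything the user didn't explicitly ask for.
}

// React when the user toggles the mode in Settings.
NotificationCenter.default.addObserver(
    forName: .NSProcessInfoPowerStateDidChange,
    object: nil,
    queue: .main
) { _ in
    let enabled = ProcessInfo.processInfo.isLowPowerModeEnabled
    print("Low Power Mode is now \(enabled ? "on" : "off")")
}
```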
The battery life on my iPad has been exceptional, but isn’t it always? I’ve been using it regularly for the past few months, especially, and I haven’t noticed a decrease in battery life (though I haven’t noticed any improvement, either). In case you were wondering, low-power mode isn’t available on the iPad.
UI
After a major interface design overhaul that practically rewrote the HIG just two versions ago, it comes as no surprise that the UI of iOS 9 has, well, very few surprises. The corners of some things are a little more rounded and there’s a lot more space within action sheet items. But there are changes, indeed, and they’re systemwide.
San Francisco
Helvetica has been the signature typeface of iOS since its launch, which, for myriad reasons, makes replacing it no small feat. From a developer perspective, apps have only ever been tested against it as the system font, so every label and every text cell fits just right. From a user perspective, it’s one of the things that makes iOS look and feel the way iOS does. For all it has been criticised as generic, there’s an intricacy and geometric precision to its forms.
While I personally enjoy Helvetica, its shortcomings in user interfaces are obvious, even to me. Because its letterforms are so similar to one another, they tend to blur together and become ambiguous at the smaller sizes of most UI labels, buttons, and controls.
In my eyes, the days of Helvetica on iOS and OS X were numbered from the introduction of the Apple Watch, which brought with it the typeface family known as San Francisco (or, for the retro users among you, San Francisco Neue).
Unlike OS X’s switch from Lucida Grande to Helvetica Neue last year, though, the switch to San Francisco has gone largely unpublicized by Apple. They didn’t mention it during the WWDC keynote, and neither their OS X nor their iOS promotional pages make note of it. For comparison, here’s what Apple wrote on the page devoted to Yosemite’s design:
For some, the choice of a font may not be a big deal. But to us, it’s an integral part of the interface. In OS X Yosemite, fonts have been refined systemwide to be more legible and consistent across the Mac experience. You’ll notice a fresh, new typeface in app windows, menu bars, and throughout the system. The type looks great on any Mac, and even more stunning on a Mac with a Retina display.
There isn’t an equivalent section for either iOS 9’s or El Capitan’s use of San Francisco. I kind of understand this; the change of system font in OS X received an outsized amount of attention, given that most users might not even notice it. But San Francisco is an in-house typeface that’s completely unique to Apple’s operating systems, and it plays to the “only Apple devices work together so seamlessly” trope in their marketing in a very distinctive way. Like Segoe — Microsoft’s in-house UI font — but unlike Roboto — Google’s — San Francisco is only available on its proprietor’s platforms. That is, while it was once possible to make a website look iOS-ey on other platforms by embedding Helvetica (Neue) as a web font — and while it is possible to make a website that looks like it comes from Google by doing the same with Roboto — it is not legally possible to make your website or app look like iOS 9 or OS X 10.11 on any platform other than Apple’s. San Francisco is not available as a web font, and it’s not available in in-app font pickers. It is only to be used as a UI font, or where Apple deems appropriate. They control its use.
So what about the typeface family itself? San Francisco has been written about and dissected since its launch by me, among many others, so I’ll try not to rehash too much. In a nutshell: I think San Francisco is extremely well-proportioned, looks fantastic, and performs very well as a UI font.
That’s a good thing, because it is now the one true UI typeface across all of Apple’s operating systems, which means the designers at the company think it works well and is legible on displays ranging from the poky-ass Apple Watch to a gigantic iMac, and everything in between. To make this more manageable, Apple has created two major versions of San Francisco: SF UI and SF Compact — née San Francisco, as shipped with WatchKit — that each contain Display and Text families, for a total of 42 font files. The Compact version is used exclusively on the Apple Watch, while the UI version is used on iOS and OS X.
But that’s not the whole story, because there are separate cuts of the UI version, specifically hinted and tuned for various purposes in both operating systems in which it’s used. There are four different grades of the “regular” weight, presumably for better Dynamic Type support. These grades are not exposed to the end user, and they’re not available to designers.
It all adds up to a fabulous family of typefaces that works well at pretty much all of the sizes at which it’s used. The lighter weight looks gorgeous displaying the date at the top of Notification Centre, while the regular weight is crisp and clear below the icons on the home screen. San Francisco shares the precise, metallic feeling of the DIN family — it replaces DIN in the Camera app — but is a little rounder, which makes it feel friendlier.
One of the main complaints I noticed about prior attempts to port San Francisco to iOS or OS X was that it was too narrow, making it difficult to read. We now know this was the Compact version, with a much more appropriate UI version waiting in the wings. By opening up the characters compared to what we’re used to with the Compact set used in watchOS, it’s more legible at the sizes and with the amount of information it’s expected to display in iOS. Plain text emails, for example, are displayed in San Francisco; the Compact version is noticeably less readable when set in paragraphs of text.
As you might expect, a systemwide typographic update does throw off the occasional third-party app. Most apps specify either the system font or a custom font, and these apps will look just fine. Apps that specify both the system font and Helvetica (Neue), on the other hand, look a little janky, largely because the two typefaces are close enough that they clash, like really poor denim-on-denim. These issues are minor and will, of course, be resolved in due time. Most importantly, I have found precious few instances of text strings that overrun their intended area or are truncated significantly differently than they would be in iOS 8.
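Part of the reason the transition is so smooth is that, for most apps, adopting San Francisco required no work at all: the system font is requested as an abstraction, not by typeface name. A quick sketch of the distinction:

```swift
import UIKit

let label = UILabel()

// Requesting the system font is an abstraction: this same line produced
// Helvetica Neue on iOS 8 and San Francisco on iOS 9, with no code change.
label.font = UIFont.systemFont(ofSize: 17)

// Hardcoding the family name opts out of the transition: this stays
// Helvetica Neue on iOS 9, and will clash with surrounding system UI.
label.font = UIFont(name: "HelveticaNeue", size: 17)
```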
As transitions go, it’s subtler on the surface than, say, the one from Lucida Grande to Helvetica on OS X, but it’s extremely effective. It’s a much better UI font than Helvetica, and — dare I say — simply nicer in every application. Helvetica is a classic; San Francisco is clearer and more delightful. It’s possibly my favourite change in iOS 9, largely because it’s one that’s carried through the entire system with ease. It adds an additional layer of sophistication that Helvetica just can’t muster; it doesn’t have that quality.
Back Button
Somewhat more noticeable is the introduction of a back button in iOS. This has been part-and-parcel of every other mobile OS for a long time, so what’s different about Apple’s implementation? For a baseline, here’s how Android’s developer documentation describes its back stack:
In almost all situations, the system maintains a back stack of activities while the user navigates your application. This allows the system to properly navigate backward when the user presses the Back button. However, there are a few cases in which your app should manually specify the Back behavior in order to provide the best user experience.
Most of the time, the system controls back button behaviour on Android, but developers can intervene:
For example, when a notification takes the user to an activity deep in your app hierarchy, you should add activities into your task’s back stack so that pressing Back navigates up the app hierarchy instead of exiting the app.
For example, tapping on a new text message notification and then tapping the back button will — by design — not return you to the last thing you were doing, but send you to the message list. Windows Phone’s back button is similarly slippery; its documentation describes the behaviour like this:
The first thing the Back button does, as you might expect, is take you back one screen from where you are. Your phone remembers all the apps and websites you’ve visited since the last time your screen was locked, and will take you back one page each time you press Back, until you get to the Start screen.
[…]
There’s one exception to the rule: if you’re in Internet Explorer and press Back, you’ll return to the previous webpage you visited, rather than to the previous app. But once you’re out of it, Back goes back to taking you, well, back again.
Not exactly consistent in either case. Tapping the back button in an app might take you to the previous screen within the same app, might take you to a different app, or it might take you “back” in a way you may not expect. There are some who get used to this; there are others who want them to die.
And now, nine major releases in, iOS has one.
Happily, it’s implemented in a much more consistent way than on any other major platform (and, I admit, it’s a stretch to call Windows Phone a “major platform”). It’s really very simple: if an app sends you into another app, you’ll see a back button on the lefthand side of the status bar. It replaces the assorted networking indicators, happens automatically, and requires no intervention from developers.
For the first month of using iOS 9, I rarely used the back button, simply because the multitasking shortcut is so ingrained in my muscle memory. But since I began reminding myself that it exists and is much more convenient than the app switcher, I’ve become increasingly reliant upon it.
However, its placement in the upper-left corner seems to fly in the face of Apple’s newer giant and gianter phones, relatively speaking. When Phil Schiller introduced the iPhones 6 last year, he addressed concerns that the much larger phones would be harder to use and hold (lightly edited for clarity — Schiller wasn’t exactly on-point during this presentation):
One of the things the team has worked on is to not only help [these phones] feel great in your hands […] but to make [them] easier to use one-handed. With iOS 7 last year, we introduced a new gesture: a side swipe gesture, thinking ahead to these phones and knowing that you’d want to use [that gesture] here.
The back button in most iPhone apps — or, at least, the good ones — is in the top-left of the navigation bar, but the inter-app back button in iOS 9 is in the status bar above it. It’s true that this will likely be used less frequently than the back button within an app. It’s also true that the app switcher can always be accessed with the home button, and with a 3D Touch from the edge on a 6S. But most devices running iOS 9 — at least, for a while — won’t have 3D Touch, and many of them will be larger. That makes the back button a little less than accessible, especially one-handed.
On my 5S, though, I think it would feel strange without it now.
Hierarchy and Order
With hundreds of millions of active users, any change Apple makes to the user interface components of iOS is necessarily going to cause a big ripple. Obviously, the biggest so far was the rollout of iOS 7, bringing with it equal amounts of praise and condemnation. Though I was generally positive towards it, there’s one thing iOS 7 did, design-wise, better than almost anything else.
Sheets Stacked on Light
Jog your memory with a look at the mess that was the hierarchy of iOS 6. If you think of different components that make up the iOS user interface — the wallpaper, icons, toolbars, and so forth — the Springboard lay somewhere in the middle of the UI stack. Dark linen — Apple’s background texture du jour — was probably the worst offender, appearing in elements that logically resided underneath the wallpaper — like folders and the multitasking tray — and in Notification Centre, which appeared overtop everything else.
In the video that accompanied the introduction of iOS 7, Jony Ive explained very clearly how the individual components of the operating system became organized:
Distinct, functional layers help establish hierarchy and order, and the use of translucency gives you a sense of your context. These planes — combined with new approaches to animation and motion — create a sense of depth and vitality.
The wallpaper became the base; nothing could be underneath it. Everything — icons, apps, folders, and the multitasking switcher — would now be placed overtop the wallpaper. Finally, auxiliary, temporary “sheets” — Control Centre and Notification Centre — would appear over everything else. iOS 7 brought a clear order to the stacking of user interface components.
Multitasking
In that vein, the multitasking UI was revamped to shrink applications to tiles, floating overtop the background image, while the home screen icons appeared to float in their own tile, complete with a frosted background. This brought a sense of physicality to the OS, reinforcing the layering of the interface components.
But one thing has remained consistent since the introduction of multitasking in iOS 4: the most recent applications have always appeared on the lefthand side, while older apps are stacked to the right. It has been this way for five years: left to right, new to old.
In iOS 9, multitasking gets a visual makeover, causing a distinct functional change. When you double-press the home button, the current app shrinks and flies nearly offscreen to the right. Apps are now stacked in a kind of vertical Rolodex, right to left, with the second most-recent app taking the centre position, and all others receding into the distance on the left. It takes some getting used to, especially if your muscle memory, like mine, is predisposed to scrolling towards the right to find the app you’re looking for, but it’s not bad.
The impetus for this change remained unclear to me, however. I racked my brain for the past few months trying to understand why this is a better multitasking UI than the one it replaces, and I could only think of a single reason: its implementation of apps suggested by Proactive and Handoff. Previously, this feature felt shoehorned in, with the suggested app placed to the left of the home screen. That doesn’t make sense contextually, inasmuch as that app is from a completely different device. Now, the suggestion is visible as a banner at the bottom of the screen, similar to the mini-player introduced in the Music app in iOS 8.4. It’s a more conceptually robust approach, as if the app resides outside the boundaries of the device and can be pulled into view.
Now that I see 3D Touch working on the iPhones 6S, though, it’s clear why Apple chose to change the multitasking UI: it is now consistent with the back-paging swipe gesture. It hid under my nose this entire time.
A Metaphysical Interface
In iOS 7, Apple chose to render many of the user interface layers with a background of blurred smears of translucent white. I think there was a significant conceptual rationale behind this decision — much greater than the aesthetics.
Your device’s display is a small-ish, very bright light with translucent colours suspended in differing amounts on top. User interface elements are literally blocking part of this light on their way to your eye. You can approximate the real-life effect iOS 7 is simulating by holding two sheets of paper in front of a lamp at varying distances from each other. In a sense, iOS went from being skeuomorphic — in the sense of attempting to replicate real-world analog materials — to skeuomorphic in the sense that it’s an almost literal interpretation of how an LCD works.
When placed in that context, the lack of shadows makes sense. Why would something entirely backlit draw shadows in any direction other than forward, straight into your face? Sure, there are exceptions: iOS 7 also, somewhat hypocritically, introduced the NSTextEffectLetterpressStyle attribute to Text Kit, which no developer I can think of has actually used because nobody is doing letterpress-style text on otherwise “flat” user interfaces. By and large, though, iOS 7 eschews typical shadowing because there are no shadows in this environment.
Except when the system is viewed in ambient lighting, that is. And that’s pretty much all the time, indoors or out. While the backlight remains the primary source of light, the foreground lighting would cast shadows onscreen. And if an element were overtop another in the virtual space, it should cast a shadow, right? Insist otherwise as hard as you want, but skeuomorphism isn’t dead on iOS; it has evolved and become more sophisticated.
In iOS 9, popovers, action sheets, and multitasked apps now draw a drop shadow behind them. It’s large and diffuse, and the kind of thing you only really notice in terms of shading rather than shadowing. The impression is that these popovers are significantly — nearly impossibly — closer to the user’s eye than anything behind them, kind of like the exaggerated virtual depth of folders and animations. It’s nice to have some sense of depth-by-lighting back.
Springboard
Another year begets another shuffle of applications installed by default. This year, Find My Friends and Find My iPhone receive the privilege of being preinstalled with iOS 9 and cannot be removed. I question the decision to preinstall the former, as much of its functionality is available in the Messages “detail” view anyway. I don’t know a lot of people who use Find My Friends; I suspect that many people will toss it into the same folder where Stocks, Tips, and Game Centre presently collect dust.
Happily, these are the sole additions to the preinstalled apps. While there is a new app by way of News, it basically replaces Newsstand, which is converted into a basic folder in iOS 9. Any apps or publications that resided within it are freed and displayed as regular apps. Passbook has also been renamed Wallet to more accurately reflect what it’s supposed to be, and Apple Watch is now, simply, Watch.
I’ll get to the specifics of Wallet and News later, but I wanted to comment briefly on their icons. Both represent something of an evolution of Apple’s established iOS icon aesthetic, and they look so similar that I wouldn’t be surprised if they were created by the same designer. The Wallet icon is more muted than the icon it replaces, while the News icon is simply nicer than the outgoing Newsstand icon. Rather than being a series of differently-coloured rectangles all crammed onto a white background, it’s a greatly-simplified newspaper on a pink background. There’s even some shadowing in both of these icons. I think they look great.
And the Kitchen Sink
As of iOS 9, there are at least 33 applications installed by default, plus News if you live in a supported region and Activity if you have a paired Apple Watch. And, if you switch it on in Settings, there’s also an iCloud Drive app, but it’s hidden by default.
This certainly isn’t the first complaint you’ve read of this nature — and I’m sure it won’t be the last — but that’s a lot of default applications. It makes sense for Apple to try to cater to as much of the vast and heterogeneous iOS user base as it can, but there is a necessary tradeoff. There’s more clutter and, therefore, less disk space afforded to a user’s own data. The latter is particularly noticeable when the company continues to insist that 16 GB is a reasonable entry-level capacity. Most baffling of all, however, is that the way Apple has configured default apps means that their updates are tied to system updates. The delta updates Apple introduced in iOS 5 and their much faster-paced update schedule lately have made this less of a burden, but it’s silly that a security flaw in, say, Find My Friends will necessitate a system update.
There are plenty of apps that we regularly ignore on our iPhones, and they seem pretty constant for most people. Who, for example, uses Compass regularly enough to keep it on their first home screen? It’s hard to make the argument that these apps represent the core functionality of iOS devices. Yet, Apple has already created a solution to this. With the introduction of the iPhones 6 last year, Apple began bundling the iWork suite of apps on the higher capacity models, but you could remove them. And, if it tickled your fancy, you could download them again later. There are probably ten apps that come preinstalled with iOS 9 that could easily fit into the category of nonessential apps that should be deletable. Fine, prevent me from deleting Messages or Safari, but why do I need to keep Stocks on here?
Wallpaper
iOS 9 shuffles the wallpaper selection, losing many of the legacy wallpapers that have shipped since iOS 7. Most of the gradients, coloured patterns, and so forth have been removed, along with many of the photos. Even the blue/green wave photo that was used in the earlier iOS 9 betas doesn’t ship with the release version.
Instead, we’re treated to some gorgeous macro photos of bird feathers that really highlight the precision of the Retina display. Only two non-Retina products run iOS 9, and Apple currently does not sell a single new non-Retina iOS device, so it makes sense to take advantage of the quality of these displays.
There are also some gorgeous new photos shot on a black background, which look especially nice on devices that have a black bezel around the display. It gives them the same kind of edge-less look as the Apple Watch display.
Unfortunately, there are no new Dynamic wallpapers included with the OS, only those that shipped with iOS 7, and there’s still no developer API for creating them. There are some motion wallpapers that will ship with the iPhones 6S, but those are more like videos as opposed to the existing programmatic Dynamic wallpapers, and require interaction before they will come alive. The seven colours of bokeh have, as far as I’m concerned, become stale. I’d love to see some new Dynamic wallpapers in both design and colour, but it doesn’t feel like it’s a priority.
iPad
If iOS 7 was the “look and feel” release and iOS 8 the “Christmas for developers” edition, iOS 9 is the iPad update. See, for all of the big changes in iOS 7 and 8, the iPad seemed like an afterthought; or, at the very least, the enhancements to the iPad during this era felt like they were only developed in parallel to any iPhone improvements. If you’re wont to radically oversimplify the iPad’s software and UI, this isn’t much of a surprise. Since its introduction, the most frequent criticism of the iPad has been that it’s “just a big iPhone”. This is, in some ways, good — its software is recognizable, approachable, and easy-to-use. Apple played this criticism as a strength in early ads, promoting that “you already know how to use it”.
But it does have its drawbacks when it comes time to multitask, for instance: with only one app capable of being onscreen at a time, it makes researching and writing at the same time more cumbersome than it is on a computer running a desktop operating system. From the iPad’s perspective, this represents multiple tasks; from the human perspective, these two aspects define a single task. There are enough little instances of issues like this that make the iPad feel less capable than it ought to be. Even if you’re like Federico Viticci and have managed to work within these limits — even making the most of the situation and treating them as an advantage — any improvement to the iPad’s multitasking capabilities certainly bolsters its post-PC credentials.
And bolster its credentials Apple has. It’s now possible to have two apps onscreen simultaneously, more easily toggle between two or more apps, and watch videos at the same time as other apps are open. Of course, having multiple apps visible at the same time isn’t necessarily the tricky bit; the hard part is making all of this additional capability feel easy to use when all there is to work with is a flat sheet of glass.
Slide Over and Split View
Since iOS 5, swiping from the top of the screen has brought down Notification Centre; since iOS 7, swiping from the left has gone back, and pulling up from the bottom has revealed Control Centre. Now, with iOS 9, Apple has completed their monopolization of edge-of-display gestures. Swiping from the right side of an A7-generation-or-later iPad will slide a different app overtop the currently-active one. It’s like proper multitasking; or, at the very least, bi-tasking.
Out of the box, most of Apple’s apps work in Slide Over, and most third-party apps are expected to adopt it. The only exceptions that they envision are apps that require the entire screen, like a game, or a dedicated camera app. Everything else should adapt or die — my words, not theirs.
To switch between apps, there’s a “handle” at the top of the Slide Over area, similar to the one at the bottom of the lock screen that invokes Control Centre, or the control within a notification that expands it. Dragging down on this handle reduces the current app to a thumbnail and displays icons for available apps in a vertical strip of tiles.
Even with just the default apps, this feels like it will not scale well. Only four app icons are visible at a time in portrait orientation, and just three are shown in landscape. When most of the 60 third-party apps I have installed on my iPad support Slide Over, I imagine it will become unwieldy and inefficient. This is almost entirely due to each app icon being wrapped in a large tile-like shape that is slightly less opaque than its surroundings. I’ve been attempting to understand the justification for this since the WWDC keynote, and I still don’t get it. Perhaps it’s intended as a subtle indication that the app will not open full-screen, or its intent will become clearer with the iPad Pro in hand, but it comes across to me as unnecessary chrome that significantly increases the amount of scanning and searching one needs to do to find a specific app.
Both this gesture and the gesture used to invoke Slide Over feel somewhat non-obvious to me, much in the same way that Control Centre and Notification Centre felt hidden when they were first introduced. As Slide Over uses the same symbolism as both Centres, it doesn’t take too long to figure it out, provided you know about it in some way prior to using it. The monopolizing of the righthand edge of the display also means that the forward-paging gesture in Safari no longer works, but perhaps that’s one way to figure out that this new multitasking option exists. Unlike either Centre, pressing the home button while the Slide Over view is active will not close the view; it will return the user to the home screen.
I wouldn’t dedicate this much discussion to the UI if I didn’t consider it one of the most important qualities of multitasking on the iPad. I wouldn’t be surprised if one of the biggest reasons the iPad didn’t have this feature until now is because of the challenge of trying to fit everything expected or required for a multitasking environment into the existing iOS UI language. Notably, there are things missing: there’s no close button or a way to “quit” the secondary app, for example, and it doesn’t appear in the app switcher. You just have to trust that the OS will do the right thing; no manual app management is required.
Apps in Slide Over are the same 320-point width as the iPhones 4 and 5, and they look the same as their iPhone counterparts. But even if an app is universal, it won’t appear in the Slide Over tray unless it meets the Apple-stipulated requirements:
The app needs to have a Storyboard-based launch screen, not an image.
The app must support all four device orientations.
Most importantly, it needs to be linked to the iOS 9 SDK.
I’m not a developer, but I wanted to find out how easy this was. I downloaded a couple of older sample projects from Apple’s developer centre, followed the steps above and, within a few minutes, had them running in the Slide Over view. It seems fairly easy to implement, but I’m sure it will vary from app to app. The advantage will go to developers who have followed Apple’s suggestions and hints over the past few years, and who already have universal apps.
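For the curious, the first two requirements live in an app’s Info.plist. Here’s a minimal sketch, assuming a launch storyboard named “LaunchScreen”; linking against the iOS 9 SDK is a build setting in Xcode rather than a plist entry:

<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>
<key>UISupportedInterfaceOrientations</key>
<array>
    <string>UIInterfaceOrientationPortrait</string>
    <string>UIInterfaceOrientationPortraitUpsideDown</string>
    <string>UIInterfaceOrientationLandscapeLeft</string>
    <string>UIInterfaceOrientationLandscapeRight</string>
</array>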
So how is it in use? Quite simply, the addition of Slide Over to the iPad has significantly changed the ways in which — and how often — I use my iPad Mini. While doing research for this review, I was able to have developer documentation open in Safari and pull in Notes any time I wanted to jot something down. Similarly, many nights this summer have been spent on my balcony reading articles with Instapaper or Safari, and pulling in Messages to chat with friends. These are simple things that have always been possible, but it’s the increase in elegance and intuitiveness that really sets Slide Over apart.
Unfortunately, neither Slide Over nor Split View support simultaneously displaying two instances of the same app. A typical multitasking case for me is to have two documents open in Byword, or two webpages visible side-by-side. This is a surprising shortcoming for an otherwise-fantastic improvement.
One complaint I’ve long had with the iPad, ever since I bought my first — the iPad 2 — is that it does not ship with enough RAM. I’m not some kind of spec fiend, but when a web browser struggles to keep two tabs in memory at once, there’s a problem, both with the average size of web pages today and with the amount of memory available to handle them. My experience with Slide Over tells me that this remains the case, though significantly less so than ever before. It’s clear that Apple’s engineers have done a lot to improve the performance and memory consumption of iOS — tabs remain in memory longer with more open, and backgrounded apps seem to be dumped from memory less frequently. But it’s not quite as smooth as it could be.
Put it this way: the original iPhone, the 3G, and the first generation iPod Touch used effectively the same processor and memory combination. Do you remember how switching apps felt on those devices? Every time you pressed the home button, it felt like you were quitting the foreground app, and every time you tapped on an icon, it felt like the app was launching. It wasn’t painfully slow by any stretch of the imagination — certainly not for its time — but there was a perceptible amount of hesitance. Now recall how doing the same on a recent iPhone, like a 5S or newer, feels like you’re merely hopping between two apps (if those apps are built well).
Using the multitasking features on my iPad Mini in combination with often-heavy webpages open in Safari doesn’t feel like it lies at either end of these extremes. That’s good, in the sense that it doesn’t feel like I’m toggling in a kind of modal way, but it’s not as great when it doesn’t entirely feel like popping another app in for a peek, and then getting back to what I was doing. It’s a slightly hesitant, nervous feeling that I get from the OS, as though it wants to tell me just how much work it’s doing. Obviously, my iPad Mini is not of the most recent generation, but its hardware is representative of over half of the iPads currently running iOS 9’s multitasking features.
Split View
Split View is the next level of multitasking on the iPad, allowing you to run two apps side-by-side. Unfortunately, it requires kind of a lot of power (and RAM) to do this, so its availability is limited to only the newest iPads: Air 2, Mini 4, or Pro, naturally. I do not have an iPad Air 2, so I was unable to test Split View; Federico Viticci does, though, and did test it, so you should go check out his review for that and many, many other things.
I have played around with it in the iPad Simulator, though, so I feel somewhat qualified to chat a little about its UI and a few other related things.
The good news here is that if you’ve figured out the UI for Slide Over, you’re 95% of the way towards activating Split View. After pulling in a Slide Over view, you’ll see a small drag indicator on its lefthand side. Tap on it and the active app will shrink horizontally; you’re now running two apps side by side.
In the simulated ways in which I’ve been able to test this — and what has been corroborated by others who have had hands-on experience — the two apps behave as though neither knows the other exists. You cannot, for instance, drag a link from Safari on the left into Messages on the right. The lack of drag-and-drop support — or any significant interaction, really — between the left and right apps is a surprising omission, especially since this is an Apple product, and Apple is kind of known for drag-and-drop. Though it’s not a standard interaction method in iOS, it would make a lot of sense in Split View.
Even without that, I can see this being a huge productivity boon. It’s only supported in the iPad Mini 4, iPad Air 2, and iPad Pro at the moment — of which only the first two are currently shipping — but it’s the kind of enhancement that defines the iPad as, in Tim Cook’s words, the “clearest expression of [Apple’s] vision of the future of personal computing”.
Picture-in-Picture
There’s another layer of multitasking available on top of the Slide Over and Split Views. iOS 9 allows you to pop video out of its app and float it overtop everything else on the screen. Also, when you press the home button while a fullscreen video is playing, the video will continue to play in an overlay. The video can be resized by pinching, and can be moved to any corner or even offscreen temporarily.
The video overlay is pretty clever too. If it’s positioned in the lower corners and you return to the home screen, the video will bump up to be just above the dock. The video will also typically avoid navigation bars.
Picture-in-picture functionality doesn’t work by default in all media views within apps; developers will need to make changes to allow users to take advantage of this. By default, videos played in Apple’s built-in apps — including Safari — will work with picture-in-picture. That means that any video you can stream on the web can be made to float above everything else, in any other app. That’s a lot more powerful than it sounds like — there are lots of universities that make recordings of lectures available for students, for example. Floating the lecture above Pages or OneNote is golden for students.
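For developers using AVKit, opting in looks fairly lightweight. Here’s a minimal sketch, assuming an existing AVPlayerLayer; the function name is mine:

import AVFoundation
import AVKit

// A sketch, not Apple's reference implementation: wiring picture-in-picture
// support into a custom AVPlayerLayer-based video player.
func makePictureInPictureController(for playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
    // Older hardware doesn't support picture-in-picture at all.
    guard AVPictureInPictureController.isPictureInPictureSupported() else {
        return nil
    }
    // The app also needs its audio session configured for playback and the
    // Audio background mode enabled, or the floating video may not keep playing.
    return AVPictureInPictureController(playerLayer: playerLayer)
}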
This is not a feature I have used extensively, and I don’t think it’s the kind of feature most people will use extensively. But the times that I have used it I’ve been thankful for its inclusion. I never observed it lagging or dropping frames, even when scrolling through complex web pages or my gigantic photo library behind it.
Command-Tab
There’s one more really cool multitasking feature that will be familiar to anyone who uses OS X: a task switcher. It looks pretty much the same as the one in OS X, and it’s invoked the same way, too: with Command-Tab. It’s the kind of thing that’s optimized for the Smart Keyboard that will be available for the iPad Pro.
The iPad Release
All told, the additional iPad-centric features in iOS 9 give a vital and impressive lift to a platform that has felt a little under-served for a while. The multitasking capabilities are impressive and genuinely useful — as I wrote earlier, they have increased the amount of time that I spend with my iPad, and the features that are coming to newer iPads make it much more likely that I’ll upgrade sooner rather than later.
Keyboard
Precisely five hundred and fifty-five days ago, Apple decided to inflict confusion on iOS users everywhere by redesigning the shift key on the keyboard. As I said last year, changes to the keyboard must be approached carefully — it is, of course, one of the most-used interface elements, and any change will be noticed. The decision to change the shift key in the way they did was not a good one.
In iOS 9, Apple is not taking chances. Not only has the design of the shift key been refreshed to make the active and inactive states much more distinct, the keycaps now change from uppercase to lowercase when the shift key is not engaged. As this has always been possible in theory, it’s reasonable to question why it took eight years for it to make the cut. One look at the lowercase keyboard explains why.
From a typographic and layout perspective, uppercase letters are super nice to deal with. In most typefaces, they’re all the same height, which means it’s fairly simple to vertically align them in a way that looks consistent and balanced. Lowercase letters, on the other hand, have ascenders — like the vertical stroke in a d — and descenders — as with the tail of a j. That makes the letters difficult to align vertically. They’re all aligned to a baseline — that is, the bottoms of the letterforms all touch the same line — and the x-height — literally the height of the letter x — is vertically centred within the keycaps. But because ascenders and descenders extend beyond the x-height and baseline, half of the letters of the alphabet look misaligned within their keycaps.
No matter how wonky this looks, I’ll bet this is huge for users with less-than-perfect eyesight, and others in the accessibility community. If you find it typographically disturbing, it can be disabled in Settings under Accessibility → Keyboard.
The keyboard has more tricks up its proverbial sleeve. On iPads, tapping on the keyboard with two fingers allows it to enter a sort of pseudo-trackpad state, where you can move the text insertion cursor around. It’s pretty great, especially in long documents.
Apple has also added formatting shortcuts to the top of the iPad and iPhone 6(S) Plus keyboards. I type pretty much everything in plain text, but for someone who uses productivity apps regularly, I’m sure it’s great. There’s no way Apple would ship it, but I’d love a Markdown version.
Searching Harder
Proactive
The machines are rising. We have told them so much about ourselves and, as we are creatures of habit, they have learned so much about us. And now they want to help us; that is, with a little bit of help from some Californian software engineers, too. Google has Now, which attempts to predict what you want to look at based on the time, your location, and what’s on your device. Apple would like to do something similar with an addition to Siri that they call Proactive.
Proactive can be accessed with both Spotlight — which Apple now bills simply as “Search” in most places, but not in Settings — and Siri. In iOS 9, Spotlight can be accessed by swiping down on any home screen, or — making its jubilant comeback — swiping to the left of the first home screen. These modes are similar, but not identical: swiping downwards only shows Proactive app suggestions, while swiping to the left displays suggestions for contacts, apps, nearby businesses, and news.
When Proactive learns about you and works for you, it is clever and often helpful. Take my routine, for instance. My day typically begins at around 6:30 AM. I wake up and, over the next hour or so, check my Instagram and catch up on the news. At about 8:15, I plug in my headphones and walk to the train station for my commute. Day after day, this pattern doesn’t really waver.
All through this, Proactive is learning certain cues. When I wake up in the morning, it suggests Music, Instagram, and NYT Now. If there’s an event invitation in one of my emails, it suggests that I add it to my calendar at the top of the message. When I plug in my headphones, I see the last song or podcast I listened to on the lock screen. In fact, as I’ve been writing this review every evening over the past few weeks, I’ve seen Notes — where I’ve been keeping a bunch of my observations and ideas for this review — suggested by Proactive at around 9:00 or 10:00 each night. It’s a really nice touch.
Unlike Google Now, all of this predictive work is done on the device. Instead of a remote server analyzing the goings-on of your device, the OS itself figures it out.
The exception to this is the recommendations for nearby businesses, which naturally require pinging a server with your location. These are grouped into categories like Fast Food, Cinema, Nightlife, and Gas Stations, based on the current time of day. (I’m writing this review at night — can you tell?) Tapping on one of these circular icons launches a Maps search for that category. Unfortunately, it doesn’t automatically filter these results by those which are currently open, but it does sort by distance. This feature is only available in China and the US at launch, unfortunately; during the betas, it was functional in Canada and I really liked using it. I hope it comes back soon.
But, ever since the WWDC keynote, I’ve been perplexed as to why so much of this tentpole feature has been tucked away in Spotlight, given how much it can change the way customers use their phones. When I unlock my phone, I see my first home screen. On there are the apps I use most frequently; I suspect most others’ phones are set up in a similar way. It’s unlikely that I’m going to make the effort to swipe one screen away when the suggestions are typically four or eight apps that are already on my home screen. Occasionally, as with the Notes suggestion above, I will see apps that are buried deep inside a folder, and that’s genuinely useful. I get the feeling that this is the first step in preparation for a more major push to modify the way we think of the home screen, but right now, it’s not radically changing the way I use my iPhone or my iPad.
I also don’t understand the logic by which apps and contacts are suggested. This is, of course, by design — it’s more magical when you don’t know what’s inside the hat, so to speak. Of the eight contacts suggested on my phone right now, seven are people I’ve recently texted, emailed, or called, which makes sense. Six of those people are on my favourites list in the phone app. The last of the eight suggested contacts is someone who is also on my favourites list, but who I have not called nor messaged in the past several months. Maybe that’s Proactive reminding me to get back in touch; I’m not sure. I’ve also seen a couple of instances of suggestions for contacts whose birthdays are coming up, but not always, and not with any particular rhyme or reason.
But contacts who aren’t favourites, and with whom I have recently been in regular contact, are often not suggested. Lately, there are a couple of people I’ve been speaking with by phone, text, and email, and not one of them has appeared in this suggested contacts area. I’ve since removed all but two of my favourite contacts in the hopes that I’ll receive a more relevant mix. Perhaps this works better for people in corporate settings, or for others who talk to a few dozen people every day.
Even if these suggestions become really great, I’m back to my lack of understanding of why this functionality isn’t front and centre, or part of the typical iOS workflow. When I want to message a friend, I open Messages, regardless of how much or how little I contact them; when I want to email someone, I fire up Mail. Providing suggestions is genuinely helpful, but when they’re presented in a way that doesn’t fit my learned usage patterns, I’m unlikely to use them regularly. The surprising exception to this, for me, was last year’s introduction of favourite and recent contacts to the app switcher. Apple has removed it this year, but I regularly used that feature. Perhaps I will learn to integrate Spotlight into my typical usage pattern after all, but it hasn’t come naturally yet.
There are instances where Proactive goodness could conceivably appear in the places you would expect. For example, when you tap the search box in Maps, you’ll see the same higher-level versions of the business categories that Proactive suggests — instead of Fast Food or Nightlife, you’ll see Food or Fun. Tap on one of these broader categories and you’ll see all of the sub-categories: for Food, that would include Restaurants, Supermarkets (which displays as “Supermark…” on my phone, but never mind that), and, yes, Fast Food. Though these are not in any way predictive — it doesn’t use the current time to suggest nearby businesses — I find this integration far more natural, and it’s something I’ve regularly used over the past several months.
And then there are Proactive’s app suggestions. For a period of a few weeks, I had several parcels arriving, so I was regularly opening and checking Deliveries. But now, all of my parcels have arrived — less one; long story — and I don’t have much of a need for Deliveries at the moment. Though I haven’t opened the app for a while, Proactive continues to suggest it. It doesn’t seem to take much for Proactive to learn something, but it does seem to take a long time for it to forget stuff.
Event invitations represent what is probably the cleverest Proactive functionality. It is, as far as I can tell, the only instance of one app being aware of the contents of another app without user intervention. Calendar will detect events included in received emails and schedule them as placeholders. This doesn’t confirm the event with the inviting party, but it does allow you to schedule your time, especially if you’re far more popular than I am and go to a lot of cool parties.
In practice, I’ve found it difficult to replicate this functionality with any reliability. A confirmation email from Hotels.com, for example, was detected perfectly, with the exception of detecting what time zone I booked the hotel in. But a plain text email like this produced unexpected results:
Hi Nick,
We’re having a party!
When: Friday, September 11 at 8:00 PM until midnight
Where: Jarvis Hall Fine Art
Hope to see you there.
(I should note that this email is fake, but Jarvis Hall really does run a great gallery in Calgary.)
In this case, Mail will detect the event in the email and display a banner across the top, similar to the one in iOS 8 for adding a new contact. However, it will not detect the location of the event, nor use anything in the email — not even the subject line — as an event title. Calendar also won’t create a placeholder event, and minor changes to the formatting of that message can prevent the banner from appearing in Mail at all.
Perhaps most surprising is that receiving an ICS file, the standard format for events in all calendaring applications, doesn’t display the banner in Mail, nor does it create a placeholder event in Calendar. Colour me perplexed.
As far as I can tell, Proactive-enhanced calendaring is a temperamental animal. It does not behave consistently, which is frustrating, but when it does work, it’s magic. Like most seemingly-magical things — Siri, natural language search engines, automatic whatever — its limitations feel undefined and murky, and they’ll only be found by experimentation.
There’s one final bit of Proactive goodness, and that’s related to audio. When you plug in your headphones or connect to a Bluetooth system, the last-playing song will appear. This I like very, very much.
Spotlight
So what about enhancements to Spotlight that aren’t predictive? Well, iOS 9 has those, too, in a big way.
Remember the halcyon days of rumours that Apple would compete with Google by introducing a web search engine? Well, they’ve kind of done that, only it’s a much more comprehensive interpretation that searches content on both the web and on your phone, even within apps. And the only way you can access it is if you have one of Apple’s devices; there’s no webpage with a search box.
To make this functionality possible, Apple has been not-so-secretly crawling the web for the past several months. I first started noticing references to an Apple web crawler in January or February in my server logs, and references to AppleBot — as it came to be known — showed up at the end of March.
17.142.152.145 - - [31/Mar/2015:21:28:41 -0400] "GET /?p=15974 HTTP/1.0" 301 20 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Applebot/0.1; +http://www.apple.com/go/applebot)"
The AppleBot uses a familiar array of signals to index pages, including Schema.org markup, Facebook’s Open Graph, and HTML structure, and has so far indexed a bizarre assortment of sites. I’ve spot-checked many of the sites that I typically visit, and I haven’t really noticed a pattern. Very high-traffic sites are in there, as you might expect, but sites with even moderate traffic seem to be chosen based on criteria I haven’t figured out. Pixel Envy is in there as a suggested website, but Stephen Hackett’s excellent 512 Pixels is not, and he’s way more popular than I am.
Perplexing indexing anomalies aside, the fact of the matter is that this represents a radical shift in Apple’s approach to search. Previous versions of iOS have been entirely dependent upon Microsoft’s Bing search for their web results. Now, Bing results are demoted, but still visible, in both Spotlight and Safari.
Also in Spotlight in iOS 9 are results from Twitter, web videos (like Vevo and YouTube), and pretty much all of Apple’s built-in apps (including their content). And you can do conversions and other basic kinds of math, or check the weather.
Starting in iOS 9, developers can set up the content of their apps for indexing. For example, Television Time uses this API so that you can search for a show from Spotlight, and each result is linked to that specific part of the app (in this case, a television show). Spark continues its quest to be a total replacement email app by indexing all your mail, and Pinner indexes your Pinboard bookmarks. I’m glad to see Apple opening up this kind of extensibility in their OS; it’s so much better for it.
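The API in question is the new Core Spotlight framework, and the basics look straightforward. Here’s a minimal sketch of indexing a single item; the identifiers and titles are made up, not taken from any of the apps above:

import CoreSpotlight
import MobileCoreServices

// Describe the item's searchable attributes.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Example Show"
attributes.contentDescription = "Season 1, Episode 1"

// Wrap it in a searchable item with a stable identifier for deep linking.
let item = CSSearchableItem(uniqueIdentifier: "show-123",
                            domainIdentifier: "shows",
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}

When a user taps the result in Spotlight, the app is launched with an activity containing that unique identifier, which it can use to deep-link to the right screen.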
Siri
There are some enhancements and changes to Siri in this release, too. As one may reasonably have guessed, Siri in iOS 9 looks a lot like Siri on the Apple Watch, complete with an RGB histogram that represents audio waveforms. I’m not entirely sold on this logic, or lack thereof, but I quite like it — I think it’s the best Siri has ever looked.
Also inherited from the Apple Watch is a lack of the familiar “ding ding” chime when activating Siri. If you’re on an iPhone, you’ll only feel a couple of quick pulses of haptic feedback. It’s subtle, but it feels like a much richer, more personal experience. It’s hard to explain, but it’s the difference between a version of Siri that feels like cold software, and a version that’s more immersive and connected.
The rest of Siri’s audio feedback, like reading answers aloud, can now be set to be dependent on the position of the ring/silent switch. This combination of things can render Siri mostly silent, most of the time, which has given Apple the confidence to reduce the amount of time the home button must be held before Siri is activated. It feels more fluid and somehow closer, though I have noticed an increase in the amount of accidental activations I make. It’s genuinely not a big deal, though: Siri is faster than it has ever been, and there’s no chime to embarrass you in a meeting or class.
If you’re outputting sound from your iPhone via Bluetooth or the headphone jack, or you’re using an iPad or, presumably, an iPod Touch — both of which lack a vibrating motor — you’ll hear the activation chime, but no de-activation sound.
Changes to sound effects aside, one thing I’ve noticed in iOS 9 is a marked increase in Siri’s accuracy. I’m never sure whether this stuff is my imagination or not, but Siri’s reliability seems to have improved significantly over the past few months I’ve been using this new incarnation. I’m guessing that the Watch and the forthcoming Apple TV helped fine-tune this, and the result is that I’m far more confident using Siri than I ever have been. Yes, it still — inexplicably — can’t edit reminders, but it’s better at understanding you when you do utter a command that it can obey. It’s a relatively small amount of progress, but the payoff is worth it.
Something new and Proactive-related is that you can now ask Siri to remind you about “this”. What’s “this”? Well, it’s whatever’s on the screen at the moment. Got a Mail message that’s open in the foreground but you don’t want to deal with it right away? Tell Siri to remind you about it, and you’ll get a reminder with a deep link to the message itself in Mail. This works in a lot of apps — Messages, Phone, News, Maps, and others. It works with third-party apps that have adopted that search indexing functionality we talked about earlier. You could even try it now, if you’ve already upgraded to iOS 9: just activate Siri, and ask her to remind you about this in an hour. There: I just created an intermission for you. You’re welcome.
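As far as I can tell, the hook for third-party apps is NSUserActivity: an app advertises what’s currently on screen, and Siri builds the reminder’s deep link from it. A hypothetical sketch, with made-up identifiers:

import UIKit

class ArticleViewController: UIViewController {
    // Advertise the on-screen article so that Siri's "remind me about this"
    // has something to deep-link back to.
    func publishCurrentArticle(id: String, title: String) {
        let activity = NSUserActivity(activityType: "com.example.reader.article")
        activity.title = title
        activity.userInfo = ["articleID": id]
        activity.isEligibleForSearch = true // also surfaces it in Spotlight
        userActivity = activity             // retained via the responder chain
        activity.becomeCurrent()
    }
}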
But, like much of Siri’s functionality, it’s buried under a layer of guesswork and unexpected failures. You can’t, for example, ask Siri to remind you about anything in Photos or Music — say, if you found a playlist in For You that you wanted to listen to later. It also doesn’t work with some of the cues you might expect. For example, invoking Siri on a Safari page on your iPhone and asking it to remind you to check the page out on your iPad will create the reminder, but it won’t be triggered when you next switch devices.
More Intelligent, but Not Brilliant
Apple has clearly put a lot of work into all aspects of search. Proactive is one of their most ambitious efforts yet in the space, and it’s clear that Siri works better than ever, and can be tuned to your voice. The addition of third-party indexing to Spotlight is totally killer. All of these things come together to make iOS more personal and more user-friendly.
But no matter how much work has gone into all of these features, there remains the nagging feeling that Apple could have done more if they didn’t have such a hardline stance on user privacy. The flip side of the coin is, as Jack Wellborn quipped, “The reverse statement sounds even worse: ‘Google Now takes functionality seriously, but privacy is sacrificed.’”
I can’t be alone in wanting the best of both worlds: the privacy of Siri with the features of Google Now. But I’m not desperate, and it’s far easier for me to be patient with what iOS has now than to try to make Google Now as privacy-conscious as possible. It’s far from perfect, but at least I don’t feel like I’m turning tricks.
Safari
Safari View Controller
So far, iOS has come with two major APIs available to developers for including web content in their apps: UIWebView and — new in iOS 8 — WKWebView. The latter performs better, but they’re basically the same sandboxed, dumb web browser in an app, completely distinct from Safari.
iOS 9 includes an incredible new API for developers called Safari View Controller. Federico Viticci has a great explanation of this new API but, in a nutshell, it’s almost like running Safari directly within an app. There are shared cookies, which means your logins from Safari will transfer over, autofill is available, and it’s much safer for users as all browsing activity is isolated to the view controller.
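Adoption looks nearly trivial. A minimal sketch, with a placeholder URL in the usage comment:

import SafariServices
import UIKit

extension UIViewController {
    // Present web content in Safari View Controller instead of a bare web view.
    func openInSafariViewController(_ url: URL) {
        let safari = SFSafariViewController(url: url)
        present(safari, animated: true)
    }
}

// Usage, from within any view controller:
// if let url = URL(string: "https://example.com") {
//     openInSafariViewController(url)
// }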
I’ve been testing a few apps that have implemented SVC, and I can’t tell you how nice it feels. In most cases, it’s a subtle change, but the ways in which it improves on its predecessors add up to far more consistent web browsing across apps.
Content Blockers
“A widescreen iPod with touch controls” got big applause, while “a revolutionary mobile phone” got a massive reception. But “a breakthrough internet communications device” was merely greeted with tepid and polite clapping. With hindsight, we recognize the raucous applause opportunity we missed at Macworld 2007; the iPhone truly revolutionized the way we communicate on the web.
Before the iPhone, the mobile web was considered an anomaly. Very few sites considered how they would be rendered on the crappy WAP browsers included with most phones, from cheap flip phones to what was then considered a “smartphone”. Very few advertisers wasted their time or money trying to cater to these users.
But, as the mobile web grew after the introduction of the iPhone, everyone began to take notice. Companies large and small wanted their web properties to work well on phones, so developers began to create techniques to support them. Responsive web design was born of the desire to have identical content available for every visitor, no matter their screen size; after all, the mobile web is just the web. Browsers on both desktop and mobile adapted to these increasing demands. But as soon as performance improved, some developers would take advantage of it.
Also taking advantage of the increased JavaScript performance in web browsers were ad exchanges, analytics scripts, and other bullshit. Developers could use far more advanced techniques to measure everything from time spent on a page, to every move of the user’s mouse, to a user’s path within a site and around the web. Not only is this performance-intensive, it’s also invasive.
Because many websites went responsive and implemented a bunch of ad and tracking scripts, mobile users have been receiving the same bloated webpages as desktop users for years. A 10 MB text-only article — more common than you’d think — is inexcusable on the desktop, but at least it barely makes a dent in most broadband data caps. On mobile, though, the data caps are far more restrictive and the price per megabyte is way, way higher.
Here’s the sticky problem: these crappy scripts and performance-intensive ads are how many websites make money these days. In order to pay for the — pardon me — content that they consume, readers have to put up with a bunch of stuff around it that they don’t want. That much is understandable. But when a page’s weight is 10% content (and supporting code) and 90% advertising, that’s irresponsible and wrong. Something’s gotta give.
With iOS 9, Apple is bringing content blockers to Safari and Safari View Controllers. They’re only available on 64-bit devices for performance reasons, and they’re more complex than they seem on the surface. Yet, as far as users will be concerned, they provide for a far better browsing experience in a painless and unobtrusive way. Perhaps you’ve heard about this news.
So why are these extensions receiving significant media coverage? Well, let’s begin with the technical stuff. Most browser extension formats that you’re used to simply inject code upon loading a page, and they do so recursively — that is, the code also gets injected into each framed page within the parent page. This code may hide content on the page based on a stylesheet, or do something based on an additional script.
Content blockers are different. Instead of being run during page load, a content blocker’s JSON-formatted list of rules is pre-compiled by the browser. This makes it way more efficient: the extension doesn’t have to load for each frame within a page, nor does it have to reload each time a new page is visited.
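To get a sense of the format, here’s a small hand-written example of a rules list; the domain and selector are placeholders of my own, with one rule that blocks requests outright and another that merely hides an element:

[
  {
    "trigger": { "url-filter": "ads\\.example\\.com" },
    "action": { "type": "block" }
  },
  {
    "trigger": { "url-filter": ".*" },
    "action": { "type": "css-display-none", "selector": ".sharing-toolbar" }
  }
]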
What kind of content can you block? Well, that really depends on what kind of blockers third-party developers create — contrary to the misconception in plenty of media reports, Apple is not preloading an ad-blocker, nor are they implying that this is the intended use of content blockers. Indeed, plenty of third-party developers are utilizing this extension in different ways, from blocking adult content to silencing the internet peanut gallery.
But plenty of the content blockers available from third-party developers will be ad and tracker blockers. Unfortunately, like most iOS extensions, activating a content blocker is a bit buried. After downloading one or more of your choice from the App Store, open Settings, tap Safari, then tap Content Blockers and switch your newly-acquired blocker on.
I’ve been using Crystal for the past month or so and the difference it makes to loading times is, ahem, crystal clear. Page load times are reduced, often significantly, and I can have more pages open at the same time without triggering a refresh when switching tabs. It’s night and day.
Unfortunately, as has been pointed out, there are some ethical dilemmas that are inescapable. Blocking ads makes it harder for a website to earn money. But, while we’re not entitled to a website’s content, we are typically there for anything but the ads. If this were like television — where the balance between advertising and substance is typically tilted towards substance — the ethical argument would hold more weight. But it isn’t; the scales are tilted too far in the direction of stuff that isn’t substantial. Furthermore, there are plenty of very heavy scripts that load in the background that track you within a site, or even across the web. I didn’t ask for an omnipresent Facebook “like” button that tracks me everywhere, but it’s out there. Everywhere.
There’s one more issue I have with content blockers: the name. Ads are not content; comments are not content. It’s a bit of a misnomer, but I suppose “crap blocker” didn’t fly in Apple’s marketing department. Oh well.
If content blockers take off on iOS — and I think they will — there’s going to be a big shakeup. Things will necessarily change, and that’s why there’s been so much media coverage. But we will hopefully be left with a better web that’s less scummy, more respectful of privacy, and nicer to mobile data caps.
Other Enhancements
There are a couple of other changes to Safari this year that, while not as headline-grabbing as content blockers, are important in their own right.
File uploading is no longer limited to images. Now, when using a standard HTML file uploader, you can select from iCloud Drive, OS X Server, and third-party apps — like Dropbox or Transmit — in addition to the photo library. That’s impressive, especially for iPad users who would like to use it as their only computer.
It’s also now possible to select from all available username and password combinations stored in iCloud Keychain when autofilling a form. There are lots of sites on which I have multiple username and password combinations, and the crapshoot autofill of iOS 7 and 8 bothered me on these sites. Sadly, it still isn’t possible to generate a password when registering for an account, like you can on OS X.
And, naturally, these enhancements are available in Safari View Controllers, too.
Finally, the Find In Page and Request Desktop Site options have been moved from the address bar to the sharing sheet. The latter is also available by holding the reload button, as is an option to reload the page without content blockers. This makes a lot more sense than burying any of these things at the bottom of the address suggestions.
News
With iOS 5, Apple added a new icon to the home screen: Newsstand. It was a curious beast, with the container-like behaviour of a folder blended with the dedicated multitasking icon of an app. But the intent was pretty clear: instead of having a bunch of news apps all cluttering up your home screen, why not have one place for publications of all sorts, indie and professional? These apps would look like the news, too, with icons that were masked and skewed to look like physical magazines and newspapers lying on wooden shelves.
While they were at it, Apple gave these apps some special permissions. Unlike any other app, Newsstand apps could change their icon with each issue, because magazines and newspapers change their front page with every issue. These apps could also change the app description at will, without having to submit a new build to the review team. Perhaps the most significant difference between a Newsstand app and a generic app is that they were allowed to refresh in the background, something otherwise disallowed with the APIs available at the time.
Even with all of these additional capabilities and hundreds — perhaps thousands — of mainstream publications supporting it, Newsstand never really took off. Perhaps users didn’t like the idea of opening multiple apps — sorry, publications — every day to read what’s new in each, or maybe they didn’t want to pay for several individual subscriptions. The redesigned Newsstand icon in iOS 7 certainly didn’t help, as it no longer indicated to the user whether the publications inside had been updated. Meanwhile, the background refresh capabilities previously exclusive to Newsstand apps were opened up to every app.
News is a different animal altogether. Instead of being a folder of separate apps, it’s just one app with all publications within it. Publishers also don’t have to create custom apps because News relies on RSS, which means all publications — large and small — can be included.
For publishers, there are two ways of getting your publication included in News. The first way is to head to iCloud.com, sign in with your account, and launch the News Publisher app. You can fill in your details, submit your RSS feed, and upload a logo. You can even submit multiple RSS feeds for different categories or sections. For example, if I were so inclined, I could create separate feeds for linked items and full-length articles, and they would appear as different sections within News. Apple recommends using only full-content RSS feeds, as opposed to truncated summary-style feeds.
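If you’re wondering what a full-content feed entails, it’s plain RSS with the complete article body embedded. A bare-bones sketch, in which everything is a placeholder, using the common content:encoded extension:

<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Example Publication</title>
    <link>https://example.com/</link>
    <description>Full-length articles, not summaries.</description>
    <item>
      <title>An Example Article</title>
      <link>https://example.com/article</link>
      <content:encoded><![CDATA[<p>The entire article body goes here.</p>]]></content:encoded>
    </item>
  </channel>
</rss>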
The obvious advantage of submitting your feed through News Publisher is that it will be included in the searchable News directory. Feeds are automatically crawled for keywords and topics; publishers do not have to fill in a lengthy metadata or description form which, let’s face it, would likely be abused anyway. Being in Apple’s directory, though, requires a review from Apple, and I know of perfectly legitimate publications that have been rejected with the same kind of vague rationale as some App Store rejections.
Fortunately, if a website has an RSS feed, there’s another way to add it to News. In Safari, just tap the Share icon and scroll across the bottom row of actions until you find Add To News. Tap this icon and the sharing sheet will disappear. There will be no visual confirmation of any kind, but the site will be added to News.
When you launch News for the first time, you’ll be asked to subscribe to several publications and topics that you’re interested in. Apple refers to both publications and topics as “channels”, and they have quite the variety of both. You can also subscribe to topics, for when there’s something in the news you’d like to keep an eye on from any publication. As this is Apple, you won’t find Playboy or Penthouse in News — not even just the articles. But it has built-in topic detection for all items, and it does not censor topics that Apple might otherwise consider “mature”.
By my guess, there are thousands of publications in News available at launch — including, I feel compelled to point out, the one you’re reading right now — but you might notice that some of these publications look a little more polished than others. Some mainstream publishers — such as the New York Times, Vanity Fair, and Cosmopolitan — appear to have been given early access to the Apple News Format (ANF). This format promises to offer far more control over article appearance, like custom typography, custom layouts, and the option of a parallax effect on the top “cover” photo. It will also offer readership statistics to publishers. Apple hasn’t released the specifications for their format yet, and my attempts to reverse-engineer it were not fruitful. From what I can tell, though, it looks like it’s at least partially formatted with Markdown.
The biggest difference between RSS and ANF is that the latter supports in-story ads and monetization, while the former does not. Publishers using ANF can opt to use their own advertising network and keep all of their revenue, or they can use iAd, from which Apple receives a 30% cut of revenues. RSS feeds do not support this kind of advertising, but News will not block or otherwise undermine sponsorship posts.
Additionally, there’s no way for publishers to set up a paywall or require authentication to read their articles. This seems like a logical next step for the platform, perhaps even repurposing the name Newsstand to allow publishers to offer paid subscriptions.
News, then, starts to feel kind of like a glorified RSS reader with a custom format soon to be available, and a monetization layer on top. But News comes with the same kind of personalization features as Apple Music. Tapping the heart icon in the bottom toolbar while reading a story tells News that you’d like to read more articles like it in the For You tab.
As with Music, this works better in theory than in practice. Because it isn’t an automated process, generating a profile of your particular interests and things you enjoy requires you to manually intervene regularly, and I’m not sure a tiny heart icon in the lower toolbar is a convincing way of doing that.
Apple says that News also uses the stories you read to help establish trends for the For You section. My expectation would be that reading and liking stories within News would surface other articles related to those, but from publications I do not typically follow. However, I haven’t really noticed this in practice — every story in my For You stream right now is either from a publication that I already follow, or is from a publication writing about a topic I already follow.
News articles can be shared with other apps or saved for later. When sharing a link to a story, it points to an apple.news URL which redirects to the original source on non-iOS 9 platforms. Saved articles are perplexingly not synced with Safari’s Reading List, but they are saved offline.
Having written all of this, you may be expecting my feelings towards News to be rather muted; yet, I’ve opened the app nearly every day since it became available. As a basic RSS reader, it’s actually pretty good, with offline article support, a clean layout, and a nice gestural way to flick between stories — just swipe left and right. Additionally, scrolling “past” the bottom of an RSS article — but not an ANF article — will pull up a web view of the RSS link. This is typically the original post, but for feeds that use a Daring Fireball-style link post feature, it will be the linked article.
News is only being made available in three places at launch: the United States, the United Kingdom, and Australia. Apple is also only accepting English-language feeds at launch. This seems particularly restrictive for an app based on displaying content from the web using an open standard, but what do I know?
Notes
On pretty much any platform since the dawn of time, the built-in note-taking app has largely seemed like an afterthought. It’s almost always just a text box that allows for user input, with few options or features. That’s not necessarily a bad thing; there are plenty of people who love a no-frills spot to dump their thoughts, ideas, notes, lists, or whatever.
But there is a case to be made for a text editor that offers some additional options while keeping them out of the way for those who want the minimalist experience. Evernote, Vesper, Yojimbo, Byword, and plenty of others have all offered variations on this theme. While each allows you to keep things simple with text in a box, they all offer some level of formatting flexibility, whether it’s adding images, dividing by section headers, or creating lists.
The Notes app on iOS has received the same treatment. No longer just a text box with bold/italic/underline options and a syncing backend, Notes now allows for a plethora of formatting options. There are the usual suspects — different kinds of lists, headers, and titles — and you can insert photos.
But there are some other, much more clever options available. You can now create to-do lists within a note, which I’ve found extremely useful. When I write a lengthy review like this, I typically keep some notes of my findings in a paper notebook; sometimes, I’ll write some notes in Vesper, too. I prefer paper because I can also make a list of things that I need to look into further.
For my review this year, I made a commitment: all my notes for the review would be taken in the new Notes app. No exceptions. While that meant entrusting my precious notes to iCloud, I felt it was my duty to take the plunge.
I also made another decision. I write almost all of the articles on Pixel Envy in Byword, because it means I can write in Markdown, keep everything synced with my different devices so I can write wherever, and know that everything is backed up. But this year, I have opted to write this section of this review in Notes. (And, I should point out, I’ve switched to MarsEdit on the desktop because it’s just nicer.)
I could almost have written the entire initial draft of this review in Notes, if I were so inclined. It’s really nice to be able to keep everything together, and simply back out of this note and open up my list of iOS 9 observations for consultation. That list is critical to this review: everything I’ve noticed that I find interesting is in that note. I’ve also created a list of items that require further inquiry in my iOS 9 observations note, made possible by the new checklist feature.
It’s also possible to add a drawing to a note, and there’s even a ruler so your lines don’t get all wonky. I’m not much for drawing on my phone, but I’ve played around with it and it’s quite nice, albeit limited: there’s no way to choose a custom colour, for example, so you’re limited to the 24 presets. There are also just three drawing instruments available — pen, pencil, and felt tip — but each can have a unique colour, and they do look distinct (unlike in some other drawing apps).
For some reason, Notes remains saddled with a fake-looking paper texture and debossed text. Why this is the case I’ve no idea; it is completely incongruous with the rest of iOS, and yellow on grey is not a particularly flattering colour combination.
On the backend, Apple has changed the syncing technology from IMAP to CloudKit. There was a time when I wasn’t sure whether this would count as an upgrade; now, I’m much more convinced it is. That Apple is making Notes a headlining feature and eating its own dog food on iCloud sync displays much more confidence in iCloud’s abilities.
So: is Notes my new go-to note-taking app? I’m not really sure. I rely upon Vesper’s tagging abilities and I vastly prefer its design, but Notes is now a much more viable candidate. It no longer feels like the note-taking app its developer is obliged to include, and that’s a big step up.
Maps
The big news in Maps this year is the introduction of transit directions in a small selection of cities. If you live in a reasonably big city in China (ostensibly — more on that soon), or you’re a resident of the metro areas of Baltimore, Berlin, Chicago, London, Mexico City, New York City, Philadelphia, San Francisco, Toronto, or Washington DC, this will be incredibly relevant to you. I do not; therefore, it is not. But I went and checked out transit directions anyway in some places that I know pretty well.
The thing Apple seems most proud of is that they’ve mapped the physical station structures, whether below or above ground, and that makes a difference in some of the giant underground stations you’ll find in London or New York. For example, the Bank station that one arrives at on the DLR line in London sits midway between the Bank station for the other lines and Monument station. They are all connected via a massive network of pedestrian tunnels, and it’s notoriously easy to get lost in this interchange, especially if you’re a tourist. I’m not entirely certain whether you’ll get a cell signal in the main tunnel that connects Bank and Monument, though.
Even for smaller stations, it’s nice to know exactly how the station is laid out, especially somewhere unfamiliar. Hounslow West is not a big station — just one line runs through it — but it would have been nice to know its layout in advance of visiting a couple of years ago.
As I don’t live in any of these cities, I’m in no position to judge the accuracy of these listings in any real-time way. I compared a selection of listings against a couple of other transit apps and the local transit authorities’ official listings, and they seem fine. There’s nothing here that stands out to me UI-wise, either, which is a really good thing: this looks like a very competent transit integration, and I suspect a lot of people living in the covered cities will use it.
While Apple says that over three hundred cities in China can take advantage of transit directions, I tried a few that were listed on the WWDC keynote slide and was notified in all cases that no transit options were available. The transit icon didn’t appear on any cities in China, either, and no Chinese cities are listed in the Maps data providers document. I’m not sure if the deals didn’t get finalized in time, or perhaps it’s something that only appears if you’re in China, but if it was, indeed, omitted, that’s pretty significant, even if only in purely numerical terms.
Update: It appears that transit data in China is only available if you’re browsing from within China. Thanks to Wil Turner for his help in confirming this.
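One developer-facing note: MapKit gains a transit mode among its launch options in iOS 9, so third-party apps can hand users off to these new directions. Here’s a minimal sketch; the coordinate and label are mine, purely for illustration:

```swift
import MapKit
import CoreLocation

// Hand off to the Maps app for transit directions to a destination.
// The coordinate (roughly Bank station in London) is illustrative.
let placemark = MKPlacemark(
    coordinate: CLLocationCoordinate2D(latitude: 51.5133, longitude: -0.0886),
    addressDictionary: nil
)
let destination = MKMapItem(placemark: placemark)
destination.name = "Bank Station"

// MKLaunchOptionsDirectionsModeTransit is new in iOS 9; Maps opens with
// transit directions from the user's current location to the destination.
destination.openInMaps(launchOptions: [
    MKLaunchOptionsDirectionsModeKey: MKLaunchOptionsDirectionsModeTransit
])
```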
Data Quality
It has now been three full years since Apple launched their mapping product. That’s a long time for most software to mature, but nowhere near enough time for a cartography product to become great all around the world. Apple has been steadily making improvements, but for each enhancement they make, there are others that still feel lacking.
The other day, I came across Calgary’s 10th Street LRT station on Apple Maps — a station that closed shortly after Apple Maps launched. I reported it as closed at the time, but it took a second report a couple of weeks ago for it to be resolved.
Search is still messed up, too. My annual search for “wine market” — Kensington Wine Market is near my apartment — continues to return a list headed by a place in Baltimore, and a series of others that are located nowhere near where I live.
I’m sure that Apple continues to make improvements in mapping data, and I’m sure that it’s going to keep getting better in more and more places. But for some people in some places, it can be a long wait. In the same way that users with content blockers probably aren’t going to disable them to see if ad exchanges have gotten any better, users who have switched to Google or another provider probably don’t check back in with Apple Maps from time to time. And that’s a little sad because, as you can see in the London Underground screenshots above, Apple’s data is sometimes better than Google’s.
Grab Bag
Bugs and Stability
It’s kind of funny that I’m burying one of the stated goals of this release all the way down here, in the knick-knacks section. In my experience with it, iOS 9 puts Apple back on the road of higher-quality software releases. I encountered far fewer serious bugs, and I can’t think of any significant frustrations in the release version. That’s not to say that there aren’t any, only that I haven’t seen anything wildly broken on either of my devices. It’s good news; you should smile.
Health
One of the more glaring indications that Apple’s corporate staff is overwhelmingly male is that Health in iOS 8 omitted female and reproductive health factors from tracking. In iOS 9, Apple has added a reproductive health category to Health and HealthKit, including spotting, basal body temperature, and menstruation. As Health is primarily a secure data store and junction point for third-party apps, it probably won’t replace a dedicated period tracker. But it’s now possible for different apps to securely share reproductive health data; for example, if you’re trying to conceive, you may want to use a couple of apps to keep track of all of the relevant data.
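The new data types are visible in the HealthKit API, too. Here’s a minimal sketch of how a cycle-tracking app might request access to a few of them; the type identifiers are Apple’s, while the surrounding flow is my own illustration:

```swift
import HealthKit

let healthStore = HKHealthStore()

// A few of the reproductive health types added in iOS 9: menstruation,
// spotting (intermenstrual bleeding), and basal body temperature.
let reproductiveTypes: Set<HKSampleType> = [
    HKObjectType.categoryType(forIdentifier: .menstrualFlow)!,
    HKObjectType.categoryType(forIdentifier: .intermenstrualBleeding)!,
    HKObjectType.quantityType(forIdentifier: .basalBodyTemperature)!
]

// Each app sees only what the user explicitly grants, which is what makes
// HealthKit a junction point rather than a shared database.
healthStore.requestAuthorization(toShare: reproductiveTypes,
                                 read: reproductiveTypes) { success, error in
    // Note: `success` means the request completed, not that access was
    // granted; HealthKit deliberately doesn't reveal read denials.
    guard success else {
        print("HealthKit authorization failed: \(String(describing: error))")
        return
    }
    // Safe to attempt reading and writing the requested types from here.
}
```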
Photos
After years of cries from developers and users alike, iOS now includes a dedicated automatic album for screenshots. Unfortunately, screenshots still litter the All Photos view, and there remains no way to view only your photographs without them.
There’s also a new “Selfies” album for photos taken with the front-facing camera. It doesn’t perform any face recognition or anything, so if you’ve taken any photos with the front-facing camera, they’re going to be in here.
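Both new albums are exposed to developers as smart-album subtypes in the Photos framework, too. A minimal sketch — the fetch-and-count is just my illustration, and it assumes photo library authorization has already been granted:

```swift
import Photos

// Screenshots and Selfies are new smart album subtypes in iOS 9.
let screenshots = PHAssetCollection.fetchAssetCollections(
    with: .smartAlbum, subtype: .smartAlbumScreenshots, options: nil)

if let album = screenshots.firstObject {
    // Count the screenshots the system has filed into the album.
    let assets = PHAsset.fetchAssets(in: album, options: nil)
    print("\(assets.count) screenshots")
}

// The front-facing-camera album works the same way:
let selfies = PHAssetCollection.fetchAssetCollections(
    with: .smartAlbum, subtype: .smartAlbumSelfPortraits, options: nil)
```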
Notification Centre
I never really understood why notifications were sorted by app in previous versions of iOS. When you miss a notification, you probably don’t want to go hunting through a list of apps to figure out which one it came from. In iOS 9, missed notifications are now — correctly, in my opinion — sorted by time, newest first.
Developers are now able to add a text entry field to notifications, like the inline reply field in Messages.
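To give a sense of the mechanics, here’s a minimal sketch using the iOS 9-era user notification API; the identifiers are my own placeholders:

```swift
import UIKit

// An action with the new .textInput behaviour gets an inline reply field,
// the way Messages does.
let reply = UIMutableUserNotificationAction()
reply.identifier = "reply"          // placeholder identifier
reply.title = "Reply"
reply.behavior = .textInput         // new in iOS 9
reply.activationMode = .background

let category = UIMutableUserNotificationCategory()
category.identifier = "message"     // placeholder category
category.setActions([reply], for: .default)

let settings = UIUserNotificationSettings(types: [.alert, .sound],
                                          categories: [category])
UIApplication.shared.registerUserNotificationSettings(settings)

// The typed text arrives in the app delegate's response info for the
// action, under UIUserNotificationActionResponseTypedTextKey.
```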
Widgets
There are a couple of new Notification Centre widgets in iOS 9 provided by the system: one for Find My Friends, and another for battery status.
The Find My Friends widget is perplexing to me. If you’ve used the app, you know that it takes a little while for it to refresh your friends’ locations. Placing this necessarily slow response in the context of a Notification Centre widget is a recipe for something that probably won’t be used; why not just open the app instead?
The Battery widget is a little more useful. It allows you to keep tabs on the battery status of your iOS device and any paired Bluetooth gadgets. My Jambox Mini, for example, doesn’t have a visual battery indicator; displaying it on my phone makes a lot of sense.
Settings
Our prayers have been answered: it’s now possible to search Settings. Unfortunately, the search isn’t particularly smart: while it handles all the basics you’d expect, slightly fuzzier queries don’t match. For example, say you wanted to change the wallpaper but didn’t know that it was called that; searching for “background” will not find the Wallpaper settings panel. The same limitation applies to the more esoteric settings nestled deep in the hierarchy: you have to know Apple’s name for a setting to find it. It’s not the best search engine in the world, but it gets the job done most of the time, provided you know what to look for and simply don’t know where to find it.
Security has also been improved. Saved passwords and credit cards in Safari now sit behind Touch ID prompts, which should provide at least a little more security than a passcode, and it’s possible to search within the saved password list. iOS will also default to a six-digit simple passcode for the system, up from four digits. The math is straightforward: six digits gives a million possible combinations instead of ten thousand, so your jackass roommate or the NSA — depending on who you’ve annoyed most — will have a more difficult time trying to guess it.
Recommendations and Conclusions
If there’s anything I learned while using it, it’s that iOS 9 is a big release, far more than just a patch-and-refine deal. It’s clear that Apple learned a lot from building iOS 7 and 8, and the results are solid. It’s not perfect, but it remains the best mobile operating system for me. I wouldn’t have any reservations about recommending that you upgrade as soon as you can.
Thank you for devoting so much time to reading my thoughts on iOS 9. If you’ve spotted anything that you feel is unclear or incorrect, please get in touch. If you liked what you read, please let others know. I really appreciate your time.
Presumably to reflect how much Apple wishes for the Apple Watch to be to watches what the iPad is to tablets.
If you haven’t been reading Joshuah Bearman and Tomer Hanuka’s excellent reporting of the Silk Road saga, you’re missing a jaw-dropping story. The second part has just been posted, and it’s riveting.