
Drew Magary, Deadspin:

I remember hosting the Deadspin Awards in New York the night of December 5th and then heading over to a karaoke bar for a staff after-party, where I ate some pizza, drank a beer, sang one song (Tom Petty’s “You Got Lucky,” which would soon prove either fitting or ironic, depending upon your perspective), and that’s it. After that comes a great void. I don’t remember inexplicably collapsing in a hallway, fracturing my skull because I had no way to brace myself for the impact. I don’t remember sitting up after that, my co-workers alarmed at the sight of blood trickling out of the back of my head. I don’t remember puking all over Barry Petchesky’s pants, vomit being one of many fun side effects of your brain exploding, as he held my head upright to keep me from choking on my own barf. I don’t remember Kiran Chitanvis quickly calling 911 to get me help. I don’t remember getting into an ambulance with Victor Jeffreys and riding to an uptown hospital, with Victor begging me for the passcode to my phone so that he could call my wife. He says I made an honest effort to help, but my circuits had already shorted out and I ended up giving him sequences of four digits that had NOTHING to do with the code. Flustered, he asked me for my wife’s phone number outright. Instead, I unwittingly gave him a series of 10 digits unrelated to the number he sought.

I don’t remember that. I don’t remember bosswoman Megan Greenwell trailing behind the ambulance in a cab with her husband and staying at the hospital ALL NIGHT to plead with them to give me a closer look (at first, the staff thought I was simply inebriated; my injury had left me incoherent enough to pass as loaded) because she suspected, rightly, that something was very wrong with me. I don’t remember doctors finally determining that I had suffered a subdural hematoma, or a severe brain bleed: A pool of blood had collected in my brain and was pressing against my brain stem. I was then rushed to another hospital for surgery, where doctors removed a piece of my skull, drained the rogue blood, implanted a small galaxy in my brain to make sure my opinions remain suitably vast, put the hunk of skull back in, and also drilled a hole in the TOP of my head to relieve the pressure. They also pried my eyes open and peeled the contact lenses off my eyeballs. They then put me into a medically-induced coma (SO METAL) so that my brain could rest and heal without Awake Drew barging in and fucking everything up.

I don’t remember any of that. I told you I wouldn’t be a very reliable narrator.

This is many things. It is gutting, inspiring, saddening, frustrating, at times very funny because Drew Magary wrote it so of course it is, illuminating, and moving. But, as a piece of writing, it’s perfect. Put this on your reading list for the weekend, or read it now. I don’t care which; it’s worth your time.

This essay by Paul Ford, published in Wired, is magnificent. I’ve been letting it stew all day, re-reading it a couple of times here and there. It’s beautiful, haunting, gutting, and romantic. Two excerpts from a dozen or more I could have picked to share here. First:

I keep meeting people out in the world who want to get into this industry. Some have even gone to coding boot camp. They did all the exercises. They tell me about their React apps and their Rails APIs and their page design skills. They’ve spent their money and time to gain access to the global economy in short order, and often it hasn’t worked.

I offer my card, promise to answer their emails. It is my responsibility. We need to get more people into this industry.

But I also see them asking, with their eyes, “Why not me?”

And here I squirm and twist. Because— because we have judged you and found you wanting. Because you do not speak with a confident cadence, because you cannot show us how to balance a binary tree on a whiteboard, because you overlabored the difference between UI and UX, because you do not light up in the way that we light up when hearing about some obscure bug, some bad button, the latest bit of outrageousness on Hacker News. Because the things you learned are already, six months later, not exactly what we need. Because the industry is still overlorded by people like me, who were lucky enough to have learned the etiquette early, to even know there was an etiquette.

Tech is, of course, not the sole industry with an insular and specific culture, but it is something that can be changed by readers of websites like this one, or Wired. Technology has been commoditized to the point that you see people of every age, race, gender, and personality walking around with a smartphone or a DSLR or a smartwatch or wireless headphones, but the creation of these things hasn’t diversified at the same rate.

The second excerpt:

I have no desire to retreat to the woods and hear the bark of the fox. I like selling, hustling, and making new digital things. I like ordering hard drives in the mail. But I also increasingly enjoy the regular old networks: school, PTA, the neighbors who gave us their kids’ old bikes. The bikes represent a global supply chain; when I touch them, I can feel the hum of enterprise resource planning software, millions of lines of logistics code executed on a global scale, bringing the handlebars together with the brakes and the saddle onto its post. Then two kids ride in circles in the supermarket parking lot, yawping in delight. I have no desire to disrupt these platforms. I owe my neighbors a nice bottle of wine for the bikes. My children don’t seem to love computers as I do, and I doubt they will in the same way, because computers are everywhere, and nearly free. They will ride on different waves. Software has eaten the world, and yet the world remains.

This sounds dour and miserable but it isn’t all that — I promise. As much as Ford examines the failings of the industry in this essay, there’s an undercurrent of optimism.

In some ways, Ford’s piece reminds me of Frank Chimero’s 2018 essay about how web development is increasingly like building software instead of just writing a document. I remember when I learned that I could view the source of a webpage, and that’s how I began to learn how to build stuff for the web. That foundation drove my career and a passion for learning how things are made. Things are different now, of course: common toolchains generate gnarly HTML and indecipherable CSS, and the web is less elegant and human-driven. But I’m not sure that different and harder are necessarily worse.

Thinking more comprehensively about Ford’s essay, perhaps there’s a new perspective that can be brought only by those new to tech. They grew up with the stratospheric rise of the industry and have seen how it has strained; maybe that context will inform how they read this piece.

Joel Schectman and Christopher Bing, Reuters:

A team of former U.S. government intelligence operatives working for the United Arab Emirates hacked into the iPhones of activists, diplomats and rival foreign leaders with the help of a sophisticated spying tool called Karma, in a campaign that shows how potent cyber-weapons are proliferating beyond the world’s superpowers and into the hands of smaller nations.

[…]

The ex-Raven operatives described Karma as a tool that could remotely grant access to iPhones simply by uploading phone numbers or email accounts into an automated targeting system. The tool has limits — it doesn’t work on Android devices and doesn’t intercept phone calls. But it was unusually potent because, unlike many exploits, Karma did not require a target to click on a link sent to an iPhone, they said.

In 2016 and 2017, Karma was used to obtain photos, emails, text messages and location information from targets’ iPhones. The technique also helped the hackers harvest saved passwords, which could be used for other intrusions.

It isn’t clear whether the Karma hack remains in use. The former operatives said that by the end of 2017, security updates to Apple Inc’s iPhone software had made Karma far less effective.

This story is just one part of a deeper investigation from Schectman and Bing into surveillance activities by the United Arab Emirates on dissidents and activists, which is worth reading. Remarkably, it even cites a named source.

The timing of this exploit’s capabilities coincides with the introduction of iMessage media previews. If I were looking to create a security hole in an iPhone without any user interaction, that’s the first place I’d look. Also, note that this report states that this exploit is now “far less effective”; it does not say that the vulnerabilities have been patched.

Any new post by Justin O’Beirne is an immediate must-read for me, and this latest one is no exception. In fact, it’s maybe the one I would most recommend, because it’s an analysis of the first leg of a four-year project Apple unveiled earlier this year. Here’s what Matthew Panzarino wrote at the time for TechCrunch:

The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place. This is massively higher-resolution and easier to see, visually. And it’s synchronized with the “panoramic” images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a really holistic view of what’s going on around the car. But the ortho view can do even more for human viewers of the data by allowing them to “see” through brush or tree cover that would normally obscure roads, buildings and addresses.

O’Beirne:

Regardless of how Apple is creating all of its buildings and other shapes, Apple is filling its map with so many of them that Google now looks empty in comparison. […]

And all of these details create the impression that Apple hasn’t just closed the gap with Google — but has, in many ways, exceeded it…

[…]

But for all of the detail Apple has added, it still doesn’t have some of the businesses and places that Google has.

[…]

This suggests that Apple isn’t algorithmically extracting businesses and other places out of the imagery its vans are collecting.

Instead, all of the businesses shown on Apple’s Markleeville map seem to be coming from Yelp, Apple’s primary place data provider.

Rebuilding Maps in such a comprehensive way is going to take some time, so I read O’Beirne’s analysis as a progress report. But, even keeping that in mind, it’s a little disappointing that this Maps update has seemingly prioritized more detailed shapes for terrain and foliage over fixing which places are mapped and where they’re located. It isn’t as though progress isn’t being made, or that it’s entirely misdirected — roads are now far more accurate, buildings are recognizable, and city parks increasingly look like city parks — but the thing that frustrates me most about Apple Maps in my use is that the places I want to go are either incorrectly placed, not there at all, or have inaccurate information like hours of operation.

As has become a bit of a tradition around here, I have a review of iOS 12 coming; however, it won’t be out today. It turns out that trying to find an apartment in Calgary right now is difficult and time-consuming.

In the interim, please read Federico Viticci’s excellent deep dive into iOS 12. It’s far more detailed than mine will ever be and, as the iOS automation expert, he’s uniquely gifted in explaining this update’s improvements to Siri and the new Shortcuts app.

I’ve been using my iPhone X for nearly a week now and, while I have some thoughts about it, by no means am I interested in writing a full review. There seem to be more reviews of the iPhone X on the web than actual iPhone X models sold. Instead, here are some general observations about the features and functionality that I think are noteworthy.

The Hardware

The iPhone X is a product that feels like it shouldn’t really exist — at least, not in consumers’ hands. I know that there are millions of them in existence now, but mine feels like an incredibly well-made, one-off prototype, as I’m sure all of them do individually. It’s not just that the display feels futuristic — I’ll get to that in a bit — nor is it the speed of using it, or Face ID, or anything else that you might expect. It is all of those things, combined with how nice this product is.

I’ve written before that the magic of Apple’s products and their suppliers’ efforts is that they are mass-producing niceness at an unprecedented scale. This is something they’ve become better at with every single product they ship, and nothing demonstrates that progress better than the iPhone X.

It’s such a shame, then, that the out-of-warranty repair costs are appropriately high, to the point where not buying AppleCare+ and a case seems downright irresponsible. Using the iPhone X without a case is a supreme experience, but I don’t trust myself enough to do so. And that’s a real pity, because it’s one of those rare mass-produced items that feels truly special.

The Display

This is the first iPhone to include an OLED display. It’s made by Samsung and uses a diamond subpixel arrangement, but Apple says that it’s entirely custom-designed. Samsung’s display division is being treated here much like its chip foundry was when it made Apple’s A-series SoCs.

And it’s one hell of a display. It’s running at a true @3x resolution of 458 pixels per inch. During normal use, I can’t tell much of a difference between it and the 326 pixel-per-inch iPhone 6S that I upgraded from. But when I’m looking at smaller or denser text — in the status bar, for example, or in a long document — this iPhone’s display looks nothing less than perfect.
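As an aside, that 458 figure is easy to sanity-check. Here’s a quick back-of-the-envelope calculation, assuming the commonly reported panel specs of 1125 × 2436 pixels across a 5.85-inch diagonal:

```swift
import Foundation

// Assumed panel specs: 1125 x 2436 pixels on a 5.85-inch diagonal.
let widthPixels = 1125.0
let heightPixels = 2436.0
let diagonalInches = 5.85

// Pixel density is the diagonal length in pixels divided by the
// diagonal length in inches.
let diagonalPixels = (widthPixels * widthPixels + heightPixels * heightPixels).squareRoot()
let pixelsPerInch = diagonalPixels / diagonalInches

print(pixelsPerInch) // about 458.7, matching Apple's quoted 458 ppi
```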

One of the reasons this display looks so good is because of Apple’s “True Tone” feature, which matches the white balance of the display to the environment. In a lot of indoor lighting conditions, that’s likely to mean that the display is yellower than you’re probably used to. Unlike Night Shift, though, which I dislike for being too heavy-handed, True Tone is much subtler. Combine all of this — the brightness of the display, its pixel density, its nearly edge-to-edge size, and True Tone — with many of iOS’ near-white interface components and it really is like a live sheet of paper in your hand.

Because it’s an OLED display that has the capability of switching on and off individual pixels, it’s only normal to consider using battery-saving techniques like choosing a black wallpaper or using Smart Invert Colours. I think this is nonsense. You probably will get better battery life by doing both of those things, but I’ve been using my iPhone X exactly the same as I have every previous phone I’ve owned and it gets terrific battery life. Unless you’re absolutely paranoid about your battery, I see no reason in day-to-day use to treat the iPhone X differently than you would any other phone.

I’m a total sucker for smaller devices. I’d love to see what an iPhone SE-sized device with an X-style display would be like.

Face ID

Face ID is, for my money, one of the best things Apple has done in years. It has worked nearly flawlessly for me, and I say that with no exaggeration or hyperbole. Compared to Touch ID, it almost always requires less effort and feels just as fast. This is particularly true for login forms on the web: where previously I’d see the Touch ID prompt and have to shuffle my thumb down to the home button, I now just keep staring at the screen and my username and password are simply there.

I’ve taken great pains to avoid the most obvious and clichéd expression for a feature like this, but it’s apt here: it feels like magic.

The only time Face ID seems to have trouble recognizing me is when I wake up, before I’ve put on my glasses. It could be because my eyes are still squinty at the time and it can’t detect that I’m looking at the screen, or maybe it’s just because I look like a deranged animal first thing in the morning. Note, though, that it has no trouble recognizing me without my glasses at any other time; however, I first set up Face ID while wearing my glasses and that’s almost always how I use it to unlock my phone. That’s how it recognizes me most accurately.

UI Differences

Last week, I wrote that I found that there was virtually no learning curve for me to feel comfortable using the home indicator, and I completely stand by that. If you’ve used an iPad running iOS 11, you’re probably going to feel right at home on an iPhone X. My favourite trick with the home indicator is that you can swipe left and right across it to slide between recently-used apps.

Arguably, the additional space offered by the taller display is not being radically reconsidered, since nearly everything is simply taller than it used to be. But this happens to work well for me because nearly everything I do on my iPhone is made better with a taller screen: reading, scrolling through Twitter or Instagram, or writing something.

The typing experience is, surprisingly, greatly improved through a simple change. The keyboard on an iPhone X is in a very similar place to where it is on a 4.7-inch iPhone, which means that there’s about half an inch of space below it. Apple has chosen to move the keyboard switching button and dictation control into that empty space from beside the spacebar, and this simple change has noticeably improved my typing accuracy.

In a welcome surprise, nearly all of the third-party apps I use on a regular basis were quickly updated to support the iPhone X’s display. The sole holdouts are Weather Line, NY Times, and Spotify.

I have two complaints about how iOS user interfaces work on the iPhone X. The first is that the system still seems like it is adapting its conventions to fit bigger displays. Yes, you can usually swipe right from the left-hand edge of the display to go back to a previous screen, but toolbars are still typically placed at the top and bottom of the screen. With a taller display, that means there can be a little more shuffling of the device in your hand to hit buttons on opposite sides of the screen.

My other complaint is just how out of place Control Centre feels. Notification Centre retains its sheet-like appearance if it’s invoked from the left “ear” of the display, but Control Centre opens as a sort of panelled overlay with the status bar in the middle of the screen when it is invoked from the right “ear”. The lack of consistency between the two Centres doesn’t make sense to me, nor does the awkward splitting of functionality between the two upper corners of the phone. It’s almost as though it was an adjustment made late in the development cycle.

Update: One more weird Control Centre behaviour is that it displays the status bar but in a different layout than the system usually does. The status bar systemwide shows the time and location indicator on the left, and the cellular signal, WiFi indicator, and battery level on the right. The status bar within Control Centre is, left to right: cellular signal, carrier, WiFi indicator, various status icons for alarm and rotation lock, location services indicator, Bluetooth status, battery percentage, and battery icon. The location indicator, cellular strength, and WiFi signal all switch sides; I think they should stay consistent.

I don’t know what the ideal solution is for the iPhone X. Control Centre on the iPad is a part of the multitasking app switcher, and that seems like a reasonable way to display it on the iPhone, too. I’m curious as to why that wasn’t shipped.

Cameras and Animoji

This is the first dual-camera iPhone I’ve owned so, not only do I get to take advantage of technological progress in hardware, I also get to use features like Portrait Mode on a regular basis. Portrait Mode is very fun, and does a pretty alright job in many environments of separating a subject from its background. Portrait Lighting, new in the iPhone 8 and iPhone X, takes this one step further and tries to replicate different lighting conditions on the subject. I found it much less reliable, with the two spotlight-style “stage lighting” modes proving inconsistent in their subject detection.

The two cameras in this phone are both excellent, and the sensor captures remarkable amounts of data, especially if you’re shooting RAW. Noise is well-controlled for such a small sensor and, in some lighting conditions, even has a somewhat filmic quality.

I really like having the secondary lens. Calling it a “telephoto” lens is, I think, a stretch, but its focal length creates some nice framing options. I used it to take a photo of my new shoes without having to get too close to the mirror in a department store.

Animoji are absurdly fun. The face tracking feels perfect — it’s better than the motion capture work in some feature films I’ve seen. I’ve used Animoji more often as stickers than as video messages, and it’s almost like being able to create your own emoji that, more or less, reflects your actual face. I only have two reservations about Animoji: they’re only available as an iMessage app, and I worry that they won’t be updated regularly. The latter is something I think Apple needs to get way better at; imagine how cool it would be if new iMessage bubble effects were pushed to devices remotely every week or two, for example. It’s the same with Animoji: the available options are cute and wonderful, but when Snapchat and Instagram are pushing new effects constantly, it isn’t viable for the selection to remain unchanged by, say, this time next year.

AppleCare+

I mentioned above that I bought AppleCare+ for this iPhone. It’s the first time I’ve ever purchased AppleCare on a phone, and only the second time I’ve purchased it for any Apple product — the first was my MacBook Air because AppleCare also covered the Thunderbolt Display purchased around the same time. This time, it was not a good buying experience.

I started by opening up the Apple Store app, which quoted $249 for AppleCare+ for the iPhone X. I tapped on the “Buy Now” button in the app but received an error:

Some products in your bag require another product to be purchased. The required product was not found so the other products were removed.

As far as I can figure out, this means that I need to buy an iPhone X at the same time, which doesn’t make any sense as the Store page explicitly says that AppleCare+ can be bought within sixty days.

I somehow wound up on the check coverage page where I would actually be able to buy extended coverage. After entering my serial number and fumbling with the CAPTCHA, I clicked the link to buy AppleCare. At that point, I was quoted $299 — $50 more than the store listing. I couldn’t find any explanation for this discrepancy, so I phoned Apple’s customer service line. The representative told me that the $249 price was just an estimate, and the $299 price was the actual quote for my device, which seems absurd — there’s simply no mention that the advertised price is anything other than the absolute price for AppleCare coverage. I went ahead with my purchase, filling in all my information before arriving at a final confirmation page where the price had returned to $249, and that was what I was ultimately charged.

It’s not the $50 that troubles me in this circumstance, but the fact that there was a difference in pricing at all between pages on Apple’s website. I don’t know why I was ever shown a $299 price, nor do I understand why I’m unable to use the Apple Store app to purchase AppleCare+ for my iPhone X using my iPhone X.

David Zax, in a must-read article for Fast Company, describes the litigation initiated by Casper against several mattress review websites:

On April 29, 2016, Casper filed lawsuits against the owners of Mattress Nerd, Sleep Sherpa, and Sleepopolis (that is, Derek), alleging false advertising and deceptive practices.

Mattress Nerd and Sleep Sherpa quickly settled their cases, and suddenly their negative Casper reviews disappeared from their sites, in what many onlookers speculated was a condition of the settlements. But by the end of 2016, when I started closely studying the lawsuits, Derek’s Casper review remained, defiantly, up on Sleepopolis. He was soldiering on in his legal battle with the mattress giant. People who knew him called Derek a fighter; one of his nicknames was “Halestorm.”

Casper had another way of referring to him. Derek was “part of a surreptitious economy of affiliate scam operators who have become the online versions of the same commission-hungry mattress salesmen that online mattress shoppers have sought to avoid,” Casper’s lawsuit alleged. The company complained that Derek was not forthright enough about his affiliate relationships, noting his disclosures were buried in a remote corner of his site. This did violate recently issued FTC guidelines, and Derek updated his site to comply.

This is a deeply disturbing piece. Derek Hales, the founder of Sleepopolis, was doing some shady things that seemed to be driven by the value of affiliate links more than his honest opinion of the mattresses. But Casper’s practices are even more suspect, beginning with this correspondence between CEO Phillip Krim and Jack Mitcham of Mattress Nerd:

In January 2015, Krim wrote Mitcham that while he supported objective reviews, “it pains us to see you (or anyone) recommend a competitor over us.”

Krim went on: “As you know, we are much bigger than our newly formed competitors. I am confident we can offer you a much bigger commercial relationship because of that. How would you ideally want to structure the affiliate relationship? And also, what can we do to help to grow your business?”

[…]

Krim then upped his offer, promising to boost Mitcham’s payouts from $50 to $60 per sale, and offering his readers a $40 coupon. “I think that will move sales a little more in your direction,” replied Mitcham on March 25, 2015. In the months that followed, Mattress Nerd would become one of Casper’s leading reviews site partners. (The emails surfaced due to another mattress lawsuit, GhostBed v. Krim; if similar correspondence exists with Derek Hales, it has not become public.)

It certainly sounds like Krim was, behind the scenes, financially incentivizing reviewers to push the Casper mattress. You’ll want to read Zax’s full article for the kicker to the Sleepopolis saga. It’s atrocious.

Update: I’ve been racking my brain all day trying to think about what the end of Zax’s story reminds me of:

“Hello!” ran the text beside the headshot. “My name is Dan Scalco and I’d like to personally welcome you to the brand new version of Sleepopolis. Here’s what’s up… On July 25th, 2017 our company acquired Sleepopolis.com …. Derek Hales and Samantha Hales are no longer associated with Sleepopolis.”

An italicized note added:

“In July 2017, a subsidiary of JAKK Media LLC acquired Sleepopolis.com. Casper provided financial support to allow JAKK Media to acquire Sleepopolis.”

David Carr, writing in the New York Times in 2014:

Last week, I read an interesting article about how smart hardware can allow users to browse anonymously and thus foil snooping from governments. I found it on what looked like a nifty new technology site called SugarString.

Oddly enough, while the article mentioned the need for privacy for folks like Chinese dissidents, it didn’t address the fact that Americans might want the same kind of protection.

There’s a reason for that, although not a very savory one. At the bottom of the piece, there was a graphic saying “Presented by Verizon” followed by some teeny type that said “This article was written by an author contracted by Verizon.”

SugarString writers were apparently prohibited from writing stories about net neutrality or the NSA’s spying activity — remember, this was in 2014, when both of those topics were especially concerning. So if you were going to SugarString for your tech news, you were highly misinformed. Likewise, if you were to visit Sleepopolis — owned by Casper — do you think you’d be getting a fair review of mattress buying options?

The reason I’ve been puzzled all day about this is that I’m nearly certain there was a similar marketing-spun publication created by — I think — a mining or oil and gas company. I don’t think I’m making this up or misremembering it, so if you have any idea what I might be thinking of, let me know.

I’ve got my balcony door wide open this evening and the breeze it’s creating simply isn’t making a difference — I feel like I’m melting into my couch. I should be used to this after a record-shattering summer, but I am not. I live in Canada, in a city where snowfall has been recorded in every month. I am exhausted. I’m holding in one hand a glass of The Hatch’s 2016 “Rhymes with Door Hinge” and, with the other, I am balancing my iPad perhaps a little too precariously on my leg.

I’m flipping through one of the Atlantic’s excellent weekly photo galleries and I see an amazing picture that I know a friend of mine will love. I put down my glass of wine to be able to perform a somewhat tricky routine of dragging the photo with one finger, dragging the URL with another, swiping from the right-hand part of the screen to float Messages over Safari with a third finger, then navigating to that friend’s chat thread and dropping both the image and URL into a message to send it off. I’m impressed, but also not quite used to these complex interactions. I still feel clumsy sometimes when I do them — a thought that was underscored moments later when I went to pick up my glass of wine only to spill it all over my coffee table.

iOS 11, then: it gives you all kinds of fun new powers, especially on an iPad, but it won’t save you if you’re already a klutz.

iOS 11 Review

I’ve been using iOS 11 daily since it was announced at WWDC and, rather than go through each feature point-by-point like an extended changelog with commentary, I thought I’d explore a bit of how this update feels different with daily use. There’s a lot to unpack and, while I think the vast majority of this upgrade is excellent and demonstrates clear progress in areas previously ignored, I feel there are some things that are really and truly confused. Let me show you what I mean.

The Weird Stuff

Let’s start with the lock screen, because that’s where pretty much every iOS interaction will start. When you unlock the device, the lock screen now slides up as though it’s a cover overtop the rest of the system. In some places, like notification preferences, Apple even calls it the “Cover Screen”. But, while this animation suggests that the lock screen is now sitting in an invisible place above the top of the screen, you can’t swipe upwards to unlock a non-iPhone X device — that action will scroll notifications instead — nor can you pull down from the top to lock it.

[Images: the Lock Screen and Notification Centre in iOS 11.]

Making matters even more confusing, if you do pull down from the top of an unlocked device, the screen looks like the lock screen, but doesn’t actually lock the device.

[Image: Control Centre on the iPad. It now supports 3D Touch-like gestures, but no iPad today has 3D Touch.]

Here’s another example: the iPad and other devices that don’t have 3D Touch displays now support some 3D Touch functionality. If you touch and hold on a notification on the lock screen, for example, it looks like you’re doing the “peek” gesture. The new grid-based Control Centre requires 3D Touch interactions on the iPhone but, again, those gestures have been substituted with touch-and-hold on the iPad. I guess these are fine adaptations, but it indicates to me that aspects of the system were designed in anticipation of a mix of devices that don’t yet exist and some — but not all — of the devices that do. It is inconsistent, though: while it’s possible to use 3D Touch interactions in Control Centre and on notifications in Notification Centre, similar “peek” interactions don’t work on home screen icons or within apps.
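To illustrate the kind of substitution at play, here’s a minimal sketch — not Apple’s implementation — of how an app can branch on hardware capability and fall back to touch-and-hold when the display isn’t pressure-sensitive. The class and method names are mine:

```swift
import UIKit

// A sketch of capability-based branching: use 3D Touch where the
// hardware supports it, and substitute a long press everywhere else.
final class PreviewableView: UIView {
    func installPreviewGesture() {
        if traitCollection.forceTouchCapability == .available {
            // Pressure-sensitive display: a real app would register for
            // peek-and-pop here via registerForPreviewing(with:sourceView:).
        } else {
            // No 3D Touch (e.g. any current iPad): fall back to a long
            // press, much as iOS 11 does for Control Centre and for
            // lock screen notifications.
            let press = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(showPreview))
            addGestureRecognizer(press)
        }
    }

    @objc private func showPreview() {
        // Present the same preview UI a force press would trigger.
    }
}
```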

The differences in iOS 11, then, continue to balance new functionality with further complications. But this should be no surprise to those who have used Apple’s ecosystem of devices for several years; it is merely accelerating a trend of growing the features of iOS without forgetting its roots. iOS was, in many ways, a fresh start for the future of computing and each iteration of the OS has built upon that. Sometimes, as above, it feels as though these additions are moving a little too fast. I notice this most when additions or updates feel perhaps incomplete, or, at least, not wholly considered.

[Image: Control Centre options. These can all be added to Control Centre, if you’d like.]
As an example, this iteration of Control Centre is the third major interpretation since iOS 7, released just four years ago. It no longer splits its controls across two pages which, I’m sure, ought to make some people very happy — I was never bothered by that. Its grid-like layout has been touted as being “customizable”, but that’s only true of the app launching and single-function icons across the bottom: you know, the buttons for Calculator, Camera, or the flashlight. You can now choose from over a dozen different apps and functions, including screen recording and a quick-access remote for the Apple TV, and you’re no longer limited to just four of these controls — if there are too many, Control Centre will scroll vertically.

You’d think, though, that turning Control Centre into a grid would make it possible to rearrange its sections by what you use most, or to hide controls you never use. That isn’t possible in this version. You might also think that adding a level of customizability would make it possible to assign third-party apps to certain Control Centre launching points — for example, launching PCalc instead of Calculator, or Manual instead of Camera. But that hasn’t happened either. It is also not possible to change which WiFi network you are connected to from Control Centre, despite the additional depth enabled by 3D Touch controls.

Here’s another example of where things feel a bit incomplete: Slide Over and Split View on the iPad. Previously, dragging an app into either multitasking mode required you to swipe from the right edge to expose a grey panel full of oddly-shaped rounded rectangles, each of which contained an app icon. Apart from looking ugly, which it was, this UI made absolutely no sense to me. What were the rounded rectangles representing? Why did they need to be so large? Why did such an obviously unscalable UI ship?

[Image: the old iPad multitasking UI in iOS 9 and 10.]

Thankfully, this interface is no more for iOS. iPad multitasking is now made somewhat easier by the new systemwide floating Dock. It works and looks a little bit like the Dock on MacOS, inasmuch as it contains your favourite apps and can be accessed from within any app simply by swiping upwards from the bottom of the screen. If you want to get an app into Split View or Slide Over, all you need to do is drag its icon up from the Dock and let it expand into a multitasking view on either side of the open app.

But hang on just a minute: if you’re on the home screen, dragging an app icon up from the Dock will remove that app from the Dock. So, in one context, the action is destructive; in others, it’s constructive. That inconsistency feels bizarre in practice, to say the least.

And then there’s the process of getting an app into a multitasking view when it isn’t a Dock app. You can start from the home screen or Spotlight in Notification Centre by finding your app, then touch and hold on the icon until it starts to float. Then, either launch an app with another of your fingers (if you’re starting on the home screen) or press the home button to close Spotlight. Wait until the app icon expands in place, then drop it on either side of the screen to get it into multitasking. It took me a little while to figure out this gymnastics routine and, if I’m honest with myself, it doesn’t feel fully considered. The Dock is brilliant, but the trickiness of getting non-Dock apps into a multitasking view doesn’t yet feel obvious enough.

There is, however, a minor coda of relief: the Dock has space on the righthand side, past the very Mac-like divider, for “suggested” apps. This area tends to include non-Dock apps that you’ve recently used, apps from Handoff, or apps triggered when you connect headphones. But, as this Dock area relies upon technology that is “learning” user patterns rather than being directly user-controlled, the apps you’re expecting may not always be in that area of the Dock. When it works, it’s amazing; when it doesn’t, you still have to do the somewhat-complicated dance of launching apps from the home screen.

[Image: Dock popovers in iOS 11.]

Finally, the Dock has more of that pseudo-3D Touch functionality. You can touch and hold on a supported app’s icon to display a kind of popover menu, which looks a lot like the 3D Touch widgets that display on iPhone apps. But they’re not the same thing; apps that have a widget on the iPhone will have to add a different kind of functionality to show a very similar feature in the iPad’s Dock.

So these things — the Dock and Control Centre — feel like they are hinting at newer and more exciting things, but don’t quite conclude those thoughts. They feel, simply, rushed.

In other ways, though, it can sometimes feel like an addition to iOS has taken longer than it should.

Drag and Drop, Keyboard Flicks, and Other iPad Improvements

That statement, naturally, leads me neatly to systemwide cross-application drag and drop, making its debut this year. There are apparently lots of reasons why drag and drop was not in iOS previously — for example, it seems as though APFS and its cloning and snapshot features help enable a faster and more efficient drag and drop experience. The new Dock, which allows for more efficient app switching, also seems to have played a role. But regardless of why it took so many years for such a natural interaction to debut on Apple’s touch devices, we should focus on the what of it. Is it good?

Oh, yes. Very.

I love many of the iPad enhancements in this release, but none has been as strong for me as the implementation of drag and drop. Not only can you drag stuff across apps, the drag interactions are separate from the apps themselves. They kind of live in a layer overtop the rest of the system, so you can move around and find just the app you’re looking for — whether you launch it from the Dock, app switcher, home screen, or Spotlight.

[Image: drag and drop in iOS 11. You can pick up multiple items from multiple apps and drop them into any of several others, taking full advantage of the iPad’s multitouch display.]

But my favourite thing about drag and drop on iOS, and the reason I’ve been so impressed by it, is that you can use all of your fingers to “hold” dragged items until you’re ready to drop them. You can also drag items from multiple sources and even multiple apps. It’s crazy good, to the point where dragging and dropping on a traditional computer with a mouse cursor feels like a kludge. In fact, drag and drop is one of the biggest reasons why I’ve used an iPad more in the past few months than I did for the preceding year.

Developers do have to add support for drag and drop in their apps, but some UI components — like text areas — will support drag and drop in any app without the developer needing to make adjustments.
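For a sense of what that opt-in looks like, here’s a minimal sketch of the drag side of the iOS 11 API. The view controller and its image view are illustrative; the UIDragInteraction delegate method is the real hook:

```swift
import UIKit

// A sketch of opting into iOS 11 drag and drop: attach a
// UIDragInteraction to a view and vend its content as UIDragItems.
final class PhotoViewController: UIViewController, UIDragInteractionDelegate {
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.isUserInteractionEnabled = true
        imageView.addInteraction(UIDragInteraction(delegate: self))
    }

    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = imageView.image else { return [] }
        // NSItemProvider handles the cross-app data transfer.
        return [UIDragItem(itemProvider: NSItemProvider(object: image))]
    }
}
```

Dropping is handled by a similarly small UIDropInteractionDelegate and, as noted above, standard text views get both behaviours without any of this.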

The other really big enhancement that has completely transformed my iPad experience is the new app switcher. Swiping from the bottom of the screen reveals the new floating Dock, but a continued (or second) swipe will show the new app switcher. Instead of showing a single app at a time, it now fits six thumbnails onto the screen of my 9.7-inch model at once, making much better use of the display’s space. I’m not sure how many app thumbnails fit onto a 12.9-inch model’s screen; I hope for more.

[Image: the iOS 11 app switcher.]

Other than being vastly more efficient, which makes the Swiss half of me extremely happy, the app switcher also preserves app “spaces”. When I’m writing, I like to have Slack and Tweetbot open in split-screen, put Safari and Notes together, and keep Byword in its own space. Now, whenever I switch between these, those pairings are retained: if I tap on Tweetbot in the Dock, I’ll see Tweetbot and Slack, exactly as I left them. This makes it really easy to construct little task-specific parts of the system.

Another great enhancement to the system is the new keyboard. Instead of having to switch between letters, numbers, and symbols with a modal key, you can now swipe down on individual keys to insert common characters. It takes some getting used to — I often insert a “0” where I mean to type a “p”, for instance, and, unfortunately, this relatively common typing mistake isn’t caught by autocorrect. Maybe I’m just sloppy; I’m not sure. Even with my misplaced numerals, I appreciate this keyboard refinement. It makes typing so much faster, especially since I frequently have to type combinations of letters and numbers while writing Pixel Envy. I still think frequent patterns — say, Canadian postal codes, which alternate between letters and numbers — should be automatically formatted as you type, but this keyboard is definitely a great step up once you get used to it.

There are some lingering problems I have with the iPad’s keyboard in particular, however. I find that it occasionally registers keys tapped in fast succession as a two-finger tap, which invokes text selection mode; I have replaced entire sentences without realizing it because of this. I wish the iPad’s keyboard could do a better job of distinguishing fast typing from an intentional invocation of selection mode. The goal should be to make the virtual keyboard as close as possible to a physical keyboard in terms of user confidence and key registration accuracy. Also, I continue to have absolutely awful luck with autocorrect: it capitalizes words seemingly at random, changes word tense several typed words later — when I began typing the word “seemingly” just now, it changed “capitalizes” to “capitalized” — and is frequently a focus-disrupting nuisance. It can be turned off in Settings, but I find that the number of times autocorrect is actually useful just barely outweighs the times that it is frustrating. Improving autocorrect should be a focus of every iOS release, major or minor.

But, even with all the attention lavished upon the iPad this year, there are still some ultra-frustrating limitations. With the exception of Safari, you can only open one instance of an app at a time. I cannot tell you how frequently I have two different windows from the same app open at the same time on my Mac, and it’s really irritating to not be able to do that on my iPad, especially with the far better support for multiple apps in iOS 11.

There are other things that have left me wanting on the iPad, too, like the stubbornly identical home screen. I’m not entirely sure it needs a complete rethink. Perhaps, somewhere down the line, we could get a first page home screen that acts a little more like MacOS, with recent files, suggested apps, widgets, and a lot more functionality. But even in the short term, it would make sense to be able to add more icons on each page, especially on the larger-sized models.

And, strangely, in terms of space utilization, the iPad fares slightly worse on iOS 11 than it did running iOS 10 because Notification Centre has reverted to a single-column layout. There may be a reason for this — maybe even a really good one — but any attempt to rationalize it is immediately rendered invalid because the iPhone actually gains a two-column Notification Centre layout in landscape on iOS 11. I do not understand either decision.

[Image: Notification Centre on the iPhone. iOS 11 replaces the two-column Notification Centre with a single column on the iPad, but adds a second column on the iPhone, even on my non-Plus model.]

I also think that it’s unfortunate that Siri continues to take over the entire display whenever it is invoked. I hope a future iOS update will treat Siri on the iPad more like a floating window, or perhaps something that covers only a third of the display — something closer to the MacOS implementation than a scaled-up iPhone display. I know it’s something that’s typically invoked only briefly and then disappears, but it seems enormously wasteful to use an entire display to show no more information than the iPhone does.

Siri

Here’s a funny thing about that previous paragraph: using the word “Siri” to describe Apple’s voice-controlled virtual assistant is actually a bit antiquated. You may recall that, in iOS 10, the app suggestions widget was renamed “Siri App Suggestions”; in iOS 11, it has become clear that “Siri” is what Apple calls their entire layer of AI automation. That’s not necessarily super important to know, but I think it’s an interesting decision; it’s one thing for a website to note that its search engine is “powered by Google”, but I’m not sure Siri has the reputation to build Apple’s AI efforts on. Then again, perhaps it’s an indication that these efforts are being taken more seriously.

In any case, the new stuff: the personal assistant front-end for Siri has a new voice. In many contexts, it sounds more natural, and that alone helps improve my trust in Siri. I’m not sure it’s truly more accurate, though I perceive a slight improvement.

This idea of Siri as a magical black box is something I’ve written about several times here. I will spare you my rehashing of it. Of course, this is the path that many new technologies are taking, from Google and Amazon’s smart speakers to the mysterious friend recommendations in Facebook and LinkedIn. It’s all unfathomable, at least to us laypeople. When it works, it’s magical; when it doesn’t, it’s frustrating, and we have no idea what to do about it, which only encourages our frustration. These technologies are like having a very drunk butler following you everywhere: kind of helpful, but completely unpredictable. You want to trust them, but you’re still wary.

Even with a new voice and perhaps slightly more attentive hearing, Siri is still oblivious to common requests. I am writing these words from a sandwich place near where I live called the Street Eatery. It was recommended to me by Siri after I asked it for lunch recommendations, which is great. However, when I followed up Siri’s recommendation by asking it to “open the Street Eatery’s website”, it opened a TripAdvisor page for a place called the Fifth Street Eatery in Colorado, instead of the restaurant located blocks away that it had recommended to me only moments before.

In iOS 11, Siri also powers a recommendation engine in News, and suggests search topics in Safari when you begin using the keyboard. For example, when I tapped on the location bar after reading this article about Ming-Chi Kuo’s predictions for the new iPhone, it correctly predicted in the QuickType bar that I might want to search for “OLED”, “Apple Inc.”, or “iPhone”. But sometimes, Siri is still, well, Siri: when I tapped on the location bar after reading a review of an Indian restaurant that opened relatively recently, its suggestions were for Malaysian, Thai, and Indonesian cuisine — none of which were topics on that page. The restaurant is called “Calcutta Cricket Club”, and the post is tagged in WordPress with “Indian cuisine”, so I have no idea how it fathomed those suggestions. And there’s no easy way for me to tell Apple that they’re wrong; I would have to file a radar. See the above section on magical black boxes.

To improve its accuracy over time, Siri now syncs between different devices. Exactly what is synced over iCloud is a mystery — Apple hasn’t said. My hunch is that it’s information about your accent and speech patterns, along with data about the success and failure of different results. Unfortunately, even with synced data, Siri is still a decidedly per-device assistant; you cannot initiate a chain of commands on one device, and then pick it up on another. For example, I wouldn’t be able to ask my iPad to find me recommendations for dinner, then ask my iPhone to begin driving directions to the first result without explicitly stating the restaurant’s name. And, even then, it might pick a restaurant thousands of miles away — you just never know.

User Interface and Visual Design

At the outset of this review, I wrote that I wanted primarily to relay my experiences with the iOS 11 features I use most and that had the greatest impact on how I use these devices. I want to avoid the temptation of describing every change in this version, but I don’t think I can describe the ways I use my iPhone and iPad without also writing about the ways in which Apple has changed its visual design.

Every new major release of iOS gives Apple the chance to update and refine their design language, and iOS 11 is no exception. Last year, Apple debuted a new style of large, bold titles in News, Music, and the then-new Home app; this year, that design language has bled throughout the system. Any app defined by lists — including Mail, Phone, Contacts, Wallet, Messages, and even Settings — now has a gigantic billboard-esque title. It kind of reminds me of Windows Phone 7, only nicer. I like it a lot and, based on the screenshots I’ve seen so far, it appears to work well to define the upper area of the iPhone X.

[Image: Settings in iOS 11. The big title style looks nice, but I’m not sure writing “Settings” in gigantic bold letters really affects how I use the app or the system overall.]
In practice, though, this treatment means that the top quarter of the screen is used rather inefficiently in an app’s initial view. You launch Settings, for example, and the screen is dominated by a gigantic bold “Settings” label. You know you’re in Settings — you just launched it. A more cynical person might point to this as an indication that all post-iOS 7 apps look the same and, therefore, some gigantic text is needed to differentiate them. I do not believe that is the case — there is enough identifying information in each app, between its icon, layout, and contextually relevant components.

And yet, despite the wastefulness of this large text, I still think it looks great. The very high resolution displays in every device compatible with iOS 11 and Apple’s now-iconic San Francisco typeface combine to give the system a feeling of precision, intention, and clarity. Of course, it’s worth asking why, if it’s so great, a similar large header is not shown as one navigates deeper into an app. I get the feeling that it would quickly become overbearing; that, once you’re deep within an app, it’s better to maximize efficiency — in magazine terms, the first page can be a cover, but subsequent levels down within the same app should be the body.
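Incidentally, that cover-then-body pattern maps directly onto the API. Here’s a minimal sketch, where the one-line opt-in and per-screen opt-out are real iOS 11 UIKit and the controllers themselves are illustrative:

```swift
import UIKit

// The root list opts into the large, bold title; deeper screens decline it.
final class RootListController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        title = "Settings"
        // One flag on the navigation bar enables the billboard-style titles.
        navigationController?.navigationBar.prefersLargeTitles = true
        navigationItem.largeTitleDisplayMode = .always
    }
}

final class DetailController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // The "body" pages keep the compact, centred title.
        navigationItem.largeTitleDisplayMode = .never
    }
}
```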

Fans of clarity and affordances in user interfaces will be delighted to know that buttons are back. Kind of. Back when iOS 7 debuted, I was among many who found the new text-only “buttons”, strewn throughout the system and advocated for in the HIG, to be contentious and confusing. Though I’ve gotten more used to them over the past several years, my opinion has not changed.

iOS 11 is part of what I’m convinced is a slow march towards once again having buttons that actually look like buttons. The shuffle and looping controls in Music, for instance, are set against a soft grey background. The App Store launcher in Messages is a button-looking button. But, lest you think that some wave of realization has come across the visual designers working on iOS, you should know that the HIG remains unchanged, as does the UIButton control.

[Image: new app icons in iOS 11.]

There are some noteworthy icon changes in this update as well. I quite like the new Contacts icon and the higher-contrast icon for Settings, but I have no idea what Apple’s designers were thinking with the new Calculator icon. It’s grey; it has a glyph of a calculator on it in black and orange. And I reiterate: it is grey. The Reminders icon has been tweaked, while the new Maps icon features a stylized interpretation of Apple Park which, per tradition, is cartographically dubious. I don’t like the plain-looking Files icon; I remain less-than-enthusiastic about almost any icon that features a glyph over a white background, with the exceptions of Photos and the NY Times app.

The new App Store icon proved controversial when it launched, but I actually like it. The previous glyph was a carryover from MacOS and, while I don’t think that it was confusing anyone, I do think that this simplified interpretation feels more at home on iOS. The new iTunes Store icon is the less successful of the two redesigns, I feel. As Apple Music has taken over more of the tunes part of iTunes, it appears that the icon is an attempt to associate iTunes with movies and TV shows through the blending of the purple background colour and the star glyph — both attributes, though not identical, are used for the iMovie icon as well. But this only seems to highlight the disconnect between the “iTunes Store” name and its intended function.

Icons on tab bars throughout the system have also been updated. In some places, solid fills replace outlines; in others, heavier line weights replace thin strokes. I really like this new direction. It’s more legible, it feels more consistent, and it simply looks better. These are the kinds of refinements I have expected to see as the course correction that was iOS 7 matures. While it has taken a little longer than I had hoped, it’s welcome nevertheless.

And, for what it’s worth, the signal bars have returned to the status bar, replacing the circular signal dots. This reversion seems primarily driven by the iPhone X’s notched display, but every iPhone and iPad model gets the same status bar. I cannot figure out why the brand new Series 3 Apple Watch uses dots to display LTE signal strength.

To complement the static visual design enhancements, many of the system animations have been tweaked as well. When you lift an iPhone 6S or later, the screen now fades and un-blurs simultaneously; it’s very slick. The app launching animation has been updated, too, so that it now appears as though the app is expanding from its icon. It’s a small thing; I like it.

Assorted Notes and Observations

  • The App Store has been radically redesigned. I’m dumping it down in this section because, while I applaud the effort to separate games from other kinds of apps and I think the editorial Today tab is a great way to help users find apps that might be buried by the hundreds of thousands of others, it has not changed the way I use the App Store. I’m pretty settled into a certain routine of apps, so I don’t regularly need to look for more. During my testing, I never really thought to check the App Store for what’s being featured or what collections have been created lately.

  • ARKit and Core ML are both very promising technologies that, I think, will need several more months in developers’ hands to bear fruit. Carrot Weather has a fun AR mode today, if you want to try it out.

  • There aren’t any new Live or Dynamic wallpapers in iOS 11. Live wallpapers were introduced two years ago; Dynamic wallpapers were introduced four years ago.

  • The new still wallpapers are a clear retro play. There are familiar six-colour rainbow stripes, a Retina-quality version of the Earth photograph from the original iPhone, and — for the first time — Apple has included a plain black wallpaper.

  • Apple Music has gained some social networking features that, I think, might actually work well. After iTunes Ping and Connect, this is the third time Apple has really tried to push any kind of social functionality (Connect still exists in Apple Music, but I don’t know anybody who actually uses it). Apple Music’s new user profiles can automatically show your friends what you’re listening to, and you can display your playlists too. I expect the automatic sharing — as opposed to requiring users to update their profiles manually — to be a primary factor in whether it is as successful in general use as it has been for me in beta.

  • There’s also a new take on a shared party playlist. I sincerely doubt that many people go to house parties to control the playlist in a group setting. Maybe this will change with the launch of the HomePod but, like Apple’s previous attempts — Party Shuffle and iTunes DJ — I expect this feature to be largely forgotten.

  • As I mentioned last year, I think the Memories feature in Photos is one of the best things Apple has built in a long time. iOS 11 promises additional event types, like weddings and anniversaries, which should provide more variety in the kinds of Memories that are generated. I love this kind of stuff.

  • The vast majority of system photo filters have been replaced with much subtler, more realistic filters. I’ve used them several times. While they’re no replacement for my usual iPhone editing process, they work much better in a pinch than the ones that date back to iOS 7, simply because they’re less garish.

  • You can now set Live Photos to loop, “bounce” back and forth, or even convert them into long exposure photos. These are fine effects, but I wish the long exposure effect were better at detecting faces or foreground objects and blurring only the background. This may be more sophisticated on iPhones equipped with dual cameras; I’m not sure.

  • There’s a new file format for video and images — the latter of which is probably the one that will cause the most unnecessary concern. Instead of JPEG, photos are saved in the relatively new High Efficiency Image File Format, or HEIF. I have not noticed any compatibility issues, and you get smaller file sizes and fewer compression artifacts in return. (A quick conversion sketch follows this list.)

  • The new Files app ostensibly provides access to all of your files in iCloud Drive and supporting third-party apps. However, because its biggest enhancement is third-party app support, my time with it while testing was limited to what I have in iCloud, which makes the app function much like the iCloud Drive app it replaces. I look forward to using it as more third-party apps support it.

  • Maps now supports interior maps for what is effectively a handful of malls and airports. If you live in a very large city in the United States or China, this will likely be useful to you; for the rest of us, I guess they have to start somewhere.

  • Flyover has also been enhanced in Maps, turning it into a sort of Godzilla mode where you can walk around a city overhead from your living room. It is ridiculously cool. I couldn’t confirm whether this is built with ARKit.

  • There are two new full-screen effects in Messages: “Echo” and “Spotlight”. The former is easily the more interesting and fun of the two. Also, the app drawer has been redesigned so it’s way easier to use.

  • Messages will support peer-to-peer Apple Pay in the United States later this year — my understanding is that there is a regulatory delay holding it up. As of June, the iPhone 7 was available in about ninety other countries worldwide. There are probably legal requirements that need to be satisfied for it to roll out anywhere else but, as an end user, the reasoning matters little. All that matters to me about this feature is that it will not be available where I live, and that’s a huge bummer.

  • The 3D Touch shortcut to get into the app switcher has been removed in this version of iOS for reasons I can’t quite figure out. It took me a while to get used to its removal; I used it a lot in iOS 9 and 10.

  • Safari now takes steps to restrict ad tracking and retargeting cookies to twenty-four hours of data validity. The advertising industry’s biggest trade groups are furious about this. Their creepy selves can fuck straight off.
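
One more note on HEIF, as promised above: if a photo does need to go somewhere that cannot read the new format, re-encoding it on the device is trivial once you have the image in memory. This is a minimal sketch, not Apple’s transcoding pipeline; as I understand it, iOS generally converts automatically when sharing to destinations that require JPEG:

    import UIKit

    // Re-encode a UIImage (HEIF-backed or otherwise) as JPEG data for
    // apps and services that do not understand HEIF yet. The 0.9
    // quality value is an arbitrary choice for this sketch.
    func jpegData(from image: UIImage) -> Data? {
        return UIImageJPEGRepresentation(image, 0.9)
    }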

Final Thoughts

As I’ve been writing for a few years now in occasional posts here, it feels like Apple has been going through a simultaneous series of transitions. Their services business is growing dramatically, they’ve switched over to an SSD-and-high-resolution-display product lineup — for the most part — and have been demonstrating how nontraditional devices like the iPad and Apple Watch can supplant the Mac and iPhone in some use cases.

While this story obviously isn’t going to wrap up so long as technology and Apple keep pushing things forward, iOS 11 feels like it is starting to resolve some of the questions of past releases. Despite my complaints about the rushed-feeling Control Centre and multitasking implementations, I also think that Apple is doing a lot of things very right with this update. Drag and drop is awesome, Siri is getting better, there are visual design improvements throughout, and Apple Music’s social networking features are very fun.

There is a lot that I haven’t covered in this review. That’s deliberate — some features aren’t available where I live or on the devices I use, while other changes have been small enough that you may not notice them day-to-day. However, the cumulative effect of all of these changes is a more complete, well-rounded version of iOS. I do think that the action of putting apps into Slide Over or Split View needs a more considered approach, but I can’t let that spoil how much better the Dock is than the old scrolling list overlay.

The short version of this review is very simple: if you reach for one of your iOS devices instead of running to your Mac for an increasing number of tasks, as Apple is coaxing you to do with each update, you’ll love iOS 11. Even if you don’t, and your iOS devices remain a peripheral extension to your Mac, you’ll find much to love in this version. Make no mistake: this isn’t trying to bring the Mac to your iPhone or iPad; iOS 11 is all about building upon their capabilities in a very iOS-like way. I would expect nothing less and, despite my wishes throughout this review for more, iOS 11 feels more complete to me than any previous update. It’s one of those updates where there’s very little you can put your finger on, but a lot of small things make the system better.

iOS 11 is available as a free update for 64-bit iOS devices only: the iPhone 5S or later, iPad Mini 2/iPad Air or later, and the sixth-generation iPod Touch.

Glenn Fleishman, writing for Wired’s Backchannel:

What does international political corruption have to do with type design? Normally, nothing — but that’s little consolation for the former prime minister of Pakistan. When Nawaz Sharif and his family came under scrutiny earlier this year thanks to revelations in the Panama Papers, the smoking gun in the case was a font. The prime minister’s daughter, Maryam Sharif, provided an exculpatory document that had been typeset in Calibri — a Microsoft font that was only released for general distribution nearly a year after the document had allegedly been signed and dated.

A “Fontgate” raged. While Sharif’s supporters waged a Wikipedia war over the Calibri entry, type designer Thomas Phinney quietly dropped some history lessons about the typeface on Quora, and found himself caught in a maelstrom of global reporting. Phinney said that because Calibri has been in use for several years, people have forgotten that it’s a relatively new font. This has made Calibri a hot topic in document forgery as fakers fail to realize that this default Microsoft Word typeface will give itself away.

This wasn’t Phinney’s first forgery rodeo. He calls himself a font detective—an expert called upon in lawsuits and criminal cases to help determine documents’ authenticity based on forensic analysis of letterforms used, and sometimes the ways in which they appear on paper. Phinney even IDs each of his cases with a Sherlock-Holmesian title: The Dastardly Divorce, The Quarterback Conundrum, and The Presidential Plot.

This is such a great piece. Given how tedious it can be for even an expert like Phinney to ascertain a document’s authenticity, imagine the kind of forensic work that will be needed in the near future to identify whether a video of someone speaking is real.

Maciej Cegłowski, in an infinitely quotable transcript from a talk he gave at re:publica in Berlin:

The danger facing us is not Orwell, but Huxley. The combo of data collection and machine learning is too good at catering to human nature, seducing us and appealing to our worst instincts. We have to put controls on it. The algorithms are amoral; to make them behave morally will require active intervention.

The second thing we need is accountability. I don’t mean that I want Mark Zuckerberg’s head on a pike, though I certainly wouldn’t throw it out of my hotel room if I found it there. I mean some mechanism for people whose lives are being brought online to have a say in that process, and an honest debate about its tradeoffs.

Cegłowski points out, quite rightly, that the data-addicted tech industry is unlikely to effectively self-regulate to accommodate these two needs. They’re too deeply invested in tracking and data collection, and their lack of ethics has worked too well from a financial perspective.

Cegłowski, again:

But real problems are messy. Tech culture prefers to solve harder, more abstract problems that haven’t been sullied by contact with reality. So they worry about how to give Mars an earth-like climate, rather than how to give Earth an earth-like climate. They debate how to make a morally benevolent God-like AI, rather than figuring out how to put ethical guard rails around the more pedestrian AI they are introducing into every area of people’s lives.

The tech industry enjoys tearing down flawed institutions, but refuses to put work into mending them. Their runaway apparatus of surveillance and manipulation earns them a fortune while damaging everything it touches. And all they can think about is the cool toys they’ll get to spend the profits on.

The message that’s not getting through to Silicon Valley is one that your mother taught you when you were two: you don’t get to play with the new toys until you clean up the mess you made.

I don’t see any advantage to having a regulated web. I do see advantages to having regulated web companies.

All of us need to start asking hard questions of ourselves — both as users, and as participants in this industry. I don’t think users are well-informed enough to be able to make decisions about how their data gets used. Even if they read through the privacy policies of every website they ever visited, I doubt they’d have enough information to be able to decide whether their data is being used safely, nor do I think they would have any idea about how to control that. I also don’t think many tech companies are forthcoming about how, exactly, users’ data is interpreted, shared, and protected.

Update: If you — understandably — prefer to watch Cegłowski speak, a video of this talk has been uploaded to YouTube. Thanks to Felix for sending me the link.

The short answer: it depends on who you ask, and for what reasons.

Nathan Heller, the New Yorker:

The American workplace is both a seat of national identity and a site of chronic upheaval and shame. The industry that drove America’s rise in the nineteenth century was often inhumane. The twentieth-century corrective—a corporate workplace of rules, hierarchies, collective bargaining, triplicate forms—brought its own unfairnesses. Gigging reflects the endlessly personalizable values of our own era, but its social effects, untried by time, remain uncertain.

Support for the new work model has come together swiftly, though, in surprising quarters. On the second day of the most recent Democratic National Convention, in July, members of a four-person panel suggested that gigging life was not only sustainable but the embodiment of today’s progressive values. “It’s all about democratizing capitalism,” Chris Lehane, a strategist in the Clinton Administration and now Airbnb’s head of global policy and public affairs, said during the proceedings, in Philadelphia. David Plouffe, who had managed Barack Obama’s 2008 campaign before he joined Uber, explained, “Politically, you’re seeing a large contingent of the Obama coalition demanding the sharing economy.” Instead of being pawns in the games of industry, the panelists thought, working Americans could thrive by hiring out skills as they wanted, and putting money in the pockets of peers who had done the same. The power to control one’s working life would return, grassroots style, to the people.

The basis for such confidence was largely demographic. Though statistics about gigging work are few, and general at best, a Pew study last year found that seventy-two per cent of American adults had used one of eleven sharing or on-demand services, and that a third of people under forty-five had used four or more. “To ‘speak millennial,’ you ought to be talking about the sharing economy, because it is core and central to their economic future,” Lehane declared, and many of his political kin have agreed. No other commercial field has lately drawn as deeply from the Democratic brain trust. Yet what does democratized capitalism actually promise a politically unsettled generation? Who are its beneficiaries? At a moment when the nation’s electoral future seems tied to the fate of its jobs, much more than next month’s paycheck depends on the answers.

This is a long article, but it’s worth spending some time with. Heller does a fantastic job of delving into the nuances of “gig economy” jobs, and how participants are frequently sold a myth. That’s not to say that these jobs can’t be good, but rather that the groups of people who benefit most are often as imbalanced as in the broader economy.

Joshua Ho and Brandon Chester subjected the iPhones 7 to the rigorous battery of tests unique to AnandTech, and it’s a screamer: insane performance jumps over the already-fast iPhones 6S paired with big leaps in battery life. Yet:

As Apple has rapidly added new features, UI performance has taken a hit, and the nature of the performance problems is such that throwing more hardware at them won’t make them go away because they’re often due to circumstances where rendering is blocked or is being done in a manner such that even large improvements in performance would not bring things back to 60fps. While I’m not going to comb through the entire OS to find all the cases where this happens, it happens enough that it’s something I would describe as a significant pain point in my experience as a user.

It’s nowhere near as egregious as the performance hiccups on Android phones, but iOS has a growing number of instances where animations aren’t as smooth as they should be. Activating Notification Centre, scrolling through widgets in the Today view, and pulling down to show Spotlight are all places where it’s reliably easy to trigger a suboptimal animation.
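
If you want to see this for yourself, a crude way to spot missed frames is to watch the gaps between screen refresh callbacks. This is a rough diagnostic sketch, not a rigorous profiler:

    import UIKit

    // CADisplayLink fires once per screen refresh; a gap much longer
    // than 1/60th of a second means at least one frame was dropped.
    class FrameDropObserver: NSObject {
        private var link: CADisplayLink?
        private var lastTimestamp: CFTimeInterval = 0

        func start() {
            link = CADisplayLink(target: self, selector: #selector(tick(_:)))
            link?.add(to: .main, forMode: .common)
        }

        @objc private func tick(_ link: CADisplayLink) {
            if lastTimestamp > 0, link.timestamp - lastTimestamp > 1.5 / 60.0 {
                print("Dropped frame(s): \(link.timestamp - lastTimestamp)s gap")
            }
            lastTimestamp = link.timestamp
        }
    }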

Back in June, I had this crazy idea that I was going to review iOS 10 and WatchOS 3 this year, both in my usual long-but-not-too-long style. I drafted an entry for each in MarsEdit, made my notes, and then — nothing. Some real life stuff got in the way, I procrastinated, and I ended up only being able to finish my annual iOS review. I’m okay with that, because I’m pretty happy with the way it turned out this year, but I wish I’d had the time to write up my thoughts on WatchOS 3 as well.

That being said, I think Matt Birchler has done an outstanding job with his review. He touches on all the main points, in a beautifully designed review, to boot.

The in-depth Pixel Envy review of iOS 10

Let’s get something out of the way upfront: iOS 10 is a big release. It’s not big in an iOS 7 way, with a full-system redesign, nor does it introduce quite so many new APIs and features for developers as iOS 8 did. But it’s a terrific combination of those two sides of the spectrum, with a bunch of bug fixes tossed in for some zest.

For some perspective, there has been more time between the release of the original iPhone and iOS 10 than between the release of the first iMac and the first iPhone: the iPhone arrived in June 2007, not quite nine years after the August 1998 iMac, while iOS 10 ships a little over nine years after the iPhone. It’s come a long way, baby, and it shows.

Contents

  1. Installation
  2. Lock Screen, Widgets, and Notifications
    1. Goodbye, Slide to Unlock
    2. Widgets
    3. Notifications
  3. Springboard
    1. Default Applications
    2. Wallpaper
    3. Control Centre
    4. Music and News: Back to the Drawing Board
    5. Animations
    6. The Hidden UI
  4. Keyboard
    1. Differential Privacy
    2. Emoji
  5. Siri
    1. SiriKit and Intents
  6. Maps
  7. Photos
    1. Memories
    2. RAW and Live Photos
  8. Messages
    1. iMessage Apps and Stickers
  9. Music
  10. Continuity
    1. Universal Clipboard
    2. Apple Pay
  11. iPad
  12. Grab Bag
    1. System
    2. Phone and Contacts
    3. Mail
    4. Safari
    5. Sharing
    6. App Store
    7. Improvements for Apple Watch
    8. Settings
    9. Sounds
  13. Conclusion

Installation

Installing iOS 10 is a straightforward affair, particularly with the enhancements to the software update process initiated in iOS 9. It requires less free space than its predecessors to upgrade, and you can ask iOS to update overnight. Nice.

iOS 10 is compatible with most of the devices that iOS 9 was, but it does drop support for some older devices. A5-generation devices and the third-generation iPad are all incompatible; the iPhone 5 is the oldest device that supports iOS 10.

There are additional limitations for the few 32-bit devices that remain supported: Memories and “rich” notifications are only supported on 64-bit devices. Raise to Wake is only supported on iPhones 6S and newer; it is not supported on any iPad or the iPod Touch. I find that a curious choice — surely Raise to Wake would be just as helpful, if not more so, on the iPad, given its much larger size. And it’s not like a lack of an M-class motion co-processor is an excuse, because both iPads Pro contain a derivative of the A9 processor in the iPhone 6S with the same embedded M9 co-processor.

Lock Screen, Widgets, and Notifications

Goodbye, Slide to Unlock

Back when I bought my first iPhone OS device in 2007 — a first-generation iPod Touch, as the iPhone wasn’t yet available in Canada — I was constantly asked to demo two features: slide to unlock, and pinch to zoom. Everyone I knew wanted to plunk their finger onto the little arrow nubby and slide it across the bar.

Once again proving that they give nary a shit about legacy or tradition, Apple is dropping “slide to unlock”. Like any major shift — the transition from the thirty-pin dock connector to Lightning, or, say, the removal of the headphone jack — there will be detractors. But I’m not one of them.

Let’s start from the beginning. Back in the days when our iPhones were made of wood and powered by diesel, it made sense to place an interactive barrier on the touch screen between switching the phone on and accessing its functions. It prevented accidental unlocks, and it provided a deliberate delineation between waking the phone and using it.

The true tipping point for “slide to unlock” was the introduction of Touch ID. Instead of requiring an onscreen interaction, it became easier to press the Home button and simply leave your thumb on the button for a little longer to unlock the device. iOS 10 formalizes the completion of the transition to Touch ID. The expectation is that you have a passcode set on your device and that you’re using Touch ID; iOS 10 supports just four devices that don’t have Touch ID home buttons.

But I happen to have one of those devices: an iPad Mini 2. Because it’s an iPad — and, therefore, much larger than an iPhone — I’m far more likely to use the home button to wake it from sleep than I am the sleep/wake button. It took me a while to lose the muscle memory, developed over many years, of sliding to unlock my iPad. I’m used to interacting with the hardware first, and the onscreen controls immediately after; iOS 10 upends all of this by requiring me to press the home button twice, followed by typing my passcode onscreen. It’s only slightly different, but it sent my head for a bit of a trip for a month or so. I still, on occasion, try to slide to unlock, and curse myself for doing so.

The lock screen interaction feels much better on my iPhone 6S for two reasons. First, my iPhone has the benefit of having the best Touch ID sensor Apple has ever shipped, which means that pressing once on the home button and leaving my finger on the sensor for a bit longer unlocks my phone — almost exactly the same interaction as before, with no additional friction. That’s something that you’ll find across most of the devices compatible with iOS 10, as most of those devices have Touch ID.

Raise to Wake iPhone
The second reason for the vastly improved lock screen experience on my iPhone is that it supports the new Raise to Wake feature. The Windows 10 phone I used for a week earlier this year had a similar feature, and I loved it then; I’m thrilled to see it come to the iPhone. Raise to Wake allows you to pick up your iPhone or pull it out of your pocket to switch on the screen. Awaiting notifications appear with a subtle zoom effect, as though they’re bubbling onscreen from the ether. I suspect a lot of lessons learned from developing the wrist activation on the Apple Watch went into building Raise to Wake, and it shows: I’ve found it to be extremely reliable when pulling my phone out of its pocket, and only a little less so when lifting my phone off a desk.

Throughout this section, I’ve been using the word “unlock” to refer to the same action it’s always been used for: going from the lock screen to the home screen. But this isn’t quite correct any more because it’s now possible to wake and unlock an iOS device without moving away from the lock screen. This is useful for, say, viewing private data in widgets, but it leads to a complication of terminology — when I say that I unlocked my phone, did I go to the home screen or did I remain on the lock screen?

To clarify the terminology, Apple is now referring to the once-“unlocking” act of going to the home screen as “opening” an iOS device. That makes a lot of sense if you think of your iPhone as a door; as I don’t have a Plus model, I do not.

Widgets

No matter what iOS device you use, the lock screen is now even more powerful. The familiar notifications screen sits in the middle of a sort of lock screen sandwich, with widgets on the left, and the camera to the right.

The widgets screen is actually just a copy of the Today view in Notification Centre; it’s also available to the left of the first home screen. That makes three places where widgets are available; yet, sadly, all three are identical. It seems to me that there are differences in the way one might use widgets in each location: on the lock screen, you may prefer widgets for the weather, your calendar, and the time of the next bus; in your Notification Centre, you may prefer to see your latest Pinboard bookmarks and what the next episode of your favourite TV show will be.

Widgets and notifications now share a similar frosted glass style, but older style widgets don’t suffer from a loss of contrast — if they haven’t been updated for iOS 10, they get a dark grey background instead. Widgets, notifications, the new Control Centre, and many UI components previously rendered as rounded rectangles are now drawn with a superellipse shape, similar to an expanded version of the shape of the icons on the home screen, or the iPhone itself. It’s a shape that’s simultaneously softer-looking and more precise, without the sharp transition between the rounded corner and the straight edge. I really liked this shape when it appeared on the home screen, and to see it used throughout apps and in widgets makes the whole system feel tied together. It feels sophisticated, and very deliberately so.
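
For the curious, the superellipse family is defined by the equation |x/a|^n + |y/b|^n = 1. A standard rounded rectangle switches abruptly between straight edges and circular arcs; a superellipse with an exponent around n = 5 (the value commonly cited for Apple’s icon shape) curves continuously, which is why the corners feel seamless.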

In previous versions of iOS, the only place widgets would appear was the Today view and, if you had automatic app updates enabled, the only way to find out whether your favourite app had a new widget available was to scroll to the bottom of Today and look. And, if you wanted to use a particular widget occasionally, but not always, you had to add and remove it from the Today view as you needed it.

The Activity widget, displayed when pressing on its home screen icon.
Activity widget
In addition to the Today view in Notification Centre and on the home and lock screens, apps updated for iOS 10 also get to show their widgets in the 3D Touch menu that accompanies the app’s home screen icon. I think this is terrifically clever. It balances new functionality with the familiarity of the home screen that has remained effectively unchanged in its purpose and appearance in over nine years.
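
For developers, nothing about this requires extra work: the same Today extension drives the lock screen, Notification Centre, the home screen page, and the 3D Touch menu. Here’s a minimal sketch of the method most Today extensions implement, assuming the standard extension template:

    import UIKit
    import NotificationCenter

    class TodayViewController: UIViewController, NCWidgetProviding {
        // The system calls this when the widget should refresh; the same
        // view controller is rendered wherever the widget appears.
        func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
            // Update your labels and views here, then report the outcome:
            // .newData, .noData, or .failed.
            completionHandler(.newData)
        }
    }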

Notifications

Notification Centre in iOS 10
In iOS 10, the similarities in style between notifications and widgets are not coincidental: notifications have been rewritten from the ground up to allow for far more interactivity directly from the notification itself. Notifications can now show dynamic content and images, and they support live updates. Their additional functionality probably explains why they’re so huge, too: it serves as an indication that each notification is interactive. Even so, their size and heavy emphasis on structure make for a certain lack of elegance. They’re not ugly, but there’s something about the screen to the right that’s not particularly inviting, either.

Pressing on a notification from Messages, for instance, will display the past messages from that thread directly in the notification balloon; or, if the iPhone is unlocked, you can see several past messages. This is particularly nice as a way to reestablish context when someone has replied to an hours- or days-old thread. However, there’s no way to scroll back within a notification balloon — they’re not as fully interactive as they seem to be.

This year also marks the return of my least favourite bug from iOS 8: if you’re typing a quick reply and you tap outside of the keyboard or notification balloon, you lose everything you’ve typed. This bug was fixed in iOS 8.3, but has surfaced again in iOS 10. I’ve lost my fair share of texts due to a misplaced tap; I’m not sure why this remains an issue.

Apple also provides examples of rich data within an expanded notification balloon, like showing the position of a car on a map for a ride hailing app’s notification, or updating a sports score notification as more pucks are dunked in the goalpost. Or whatever. I didn’t have the opportunity to test those features, but I’m looking forward to developers making greater use of Notification Centre as a place to complete tasks without having to open the app.
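
The images, at least, are easy to illustrate. Here’s a hedged sketch of what attaching one looks like with the new UserNotifications framework; the resource name is a made-up placeholder:

    import UserNotifications

    let content = UNMutableNotificationContent()
    content.title = "Final score"
    content.body = "Flames 4, Oilers 1"

    // Attach an image from the app bundle. "scoreboard.png" is a
    // hypothetical resource for this sketch.
    if let url = Bundle.main.url(forResource: "scoreboard", withExtension: "png"),
        let attachment = try? UNNotificationAttachment(identifier: "scoreboard", url: url, options: nil) {
        content.attachments = [attachment]
    }

    // Deliver it five seconds from now.
    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
    let request = UNNotificationRequest(identifier: "score", content: content, trigger: trigger)
    UNUserNotificationCenter.current().add(request, withCompletionHandler: nil)

The live-updating examples, like the moving car, need a notification content extension on top of this; that’s where the real interactivity work lives, as I understand it.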

Notification Centre also borrows a trick from the Apple Watch: you can now press on the clear button in the upper-right to clear all notifications. It really is everything you could have wanted.

Springboard

Default Applications

After a seemingly endless climb in the number of preinstalled applications on fresh copies of iOS, finally, a plateau — this year’s total is the same as last year’s, at 33. The Home app is new, but the Game Centre app has been removed, though its framework remains.

But 33 apps is still a lot, particularly when plenty of them will be squirrelled away by most users in a folder marked “Junk”, or the more-cleverly named “Crapple”. I’d make a handsome wager that a majority of iPhone users who have changed their home screen layout have placed the Stocks app in such a folder. Many others will do the same for Calculator, Clock, Contacts, Compass, and Voice Memos. People who don’t own an Apple Watch have no need for the Watch app, so they dump it in there, too.

Go away.
Deleting Tips app on iOS
We don’t really want this folder full of apps we never use on our phones. What we want is to remove them from our phones entirely, never to be seen again. And that’s kind of what you get in iOS 10: just tap and hold on any icon, and you’ll see delete buttons where you’ve never seen them before. Most of the apps you’d expect to be removable are; you can even remove some you might not expect, like Music, Maps, and Mail. As a result of this broad-reaching change, the on/off switch for the iCloud Drive app has been removed as well.

Want to restore an app? That’s pretty easy, too — just open the App Store and search for it.

There are, unfortunately, a couple of caveats that come with this new power. First, it’s important to know that the app isn’t being deleted from your iPhone — it’s simply being removed from the Home screen. This is in large part for security, according to Craig Federighi:

We’re not actually deleting the application binary, and the reason is really pretty two-fold. One, they’re small, but more significantly, the whole iOS security architecture around the system update is this one signed binary, where we can verify the integrity of that with every update.

That also means that even though the default apps appear in the App Store, they won’t get individual updates.

I see this as a limitation due to the way iOS has been built for the past decade, but I don’t necessarily see it always being this way. It would require a large effort to make these core apps independent of the system, but it’s not inconceivable that, one day, updates to these apps might be delivered via the App Store instead of being rolled into monolithic iOS versions.

So if the binary isn’t being removed, what is? Federighi, again:

[When] you remove an app, you’re removing it from the home screen, you’re removing all the user’s data associated from it, you’re moving all of the hooks it has into other system services. Like, Siri will no longer try to use that when you talk and so forth.

In most cases, this works entirely smoothly. If you remove Calculator, for example, it will also be removed from Control Centre. Even if you remove Calendar, it won’t break your ability to add new events or open .ics calendar files.

But if you remove Mail, be prepared for a world of hurt. Mail is the only app permitted to open mailto: links, and no other app can be set to handle those should Mail not be present. When you tap on an email address or a mailto: link, you’ll be prompted to restore Mail; and, because all of its settings are removed when the app is hidden, you’ll have to rebuild your entire email setup. If you have just one email account, you’ll probably be fine, but if you have several, it’s a pain in the ass.

Mailto link handling on iOS when Mail is removed

In earlier betas, tapping on a mailto: link would result in a Safari error page. While the shipping solution is slightly better — insomuch as something actually happens when tapping an email link — I wouldn’t consider this resolved by any stretch. Either it should be impossible to remove Mail, or it ought to be possible to select a third-party app to handle mailto: links.
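
For context, this is roughly all the code an app has for handing off an email address; the address here is a placeholder. The app doesn’t get a say in what opens next; iOS does, which is exactly why there’s no way to route around a hidden Mail:

    import UIKit

    // Hand a mailto: URL to the system. In iOS 10, only Mail may claim
    // it, hence the restore prompt when Mail has been removed.
    if let url = URL(string: "mailto:someone@example.com") {
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }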

Wallpaper

Bad news, everyone: aside from the blue-green waterfall image we’ve seen in the betas, there are no new wallpapers in iOS 10. In fact, with just fifteen still images included, and the removal of all but one of the “feather” images from iOS 9, and the loss of all but three of the ones added in iOSes 7 and 8, I think the wallpaper selection in iOS 10 might be at its most pitiful since the iPhone’s launch.

Luckily, we can set our own still images as wallpaper, but we have no way to add a custom dynamic wallpaper. And, for the third year in a row, there isn’t a single new dynamic wallpaper in iOS. I’m not sure if it’s something Apple forgot they added back in iOS 7, or if there are simply no new ideas beyond some bouncing circles. There are also no new live wallpapers.

Control Centre

Since its introduction in iOS 7, Control Centre has been a bit of a grab bag of quick-access shortcuts. To sort out all of its functionality, Apple created five groups of related items: toggles for system settings, a screen brightness slider, audio playback controls, AirDrop and AirPlay controls, and lightweight app shortcuts.

But having all of these controls on a single sheet is less than ideal. At a glance, there’s not quite enough space between disparate controls, which means that your thumb can easily tap the wrong thing when looking for a particular button. And that’s without adding new functionality, like a control for Night Shift — the kludgy-looking five-across row at the bottom is a clue that it doesn’t fit into the existing layout — or quick access controls for HomeKit.

Something clearly had to change, and Apple has addressed it in a rather dramatic fashion: a thorough redesign of Control Centre. It’s now split across two “sheets” — three, if you have connected HomeKit devices.

iOS 9 Control Centre
iOS 10 Control Centre page 1
iOS 10 Control Centre page 2

The initial response to the splitting of Control Centre, as I observed on Twitter and in industry press, was, at best, contentious. Adrian Kingsley-Hughes, in a widely-circulated ZDNet article published weeks after the first beta was released:

The iOS 10 Control Center ranks not only as one of the worst user interface designs by Apple, but as one of the worst by any major software developer.

That’s harsh — undeservedly so, I feel. In fact, I’d go so far as to say that the revised Control Centre is one of the smartest and best-considered user interfaces in iOS.

Let’s start with the actual act of splitting it up into multiple pages. As I noted earlier, there’s only so much Apple could do with the existing single-page layout. Since nobody would seriously propose that Control Centre should not gain any new functionality, there are only a few ways for it to be expanded while remaining on a single page: the controls could get smaller, Control Centre could get taller, or the available controls could be customizable.

Making the controls smaller is no good because thumbs aren’t getting any smaller. If anything, some controls — like the track position scrubber — are already far too small for my liking. Making Control Centre taller, meanwhile, isn’t good for usability either, because thumbs aren’t getting any longer.

As for customizing Control Centre, while I’ve heard rumours that it’s being worked on, it clearly hasn’t progressed to a public release yet. It’s a valid solution, but one that also has its own drawbacks and complexities — it could very quickly become a second-level home screen when the doors of customization are opened. That’s not to say it’s not a solvable problem; rather, that the solution hasn’t yet been finalized.

So: extending it over two panels makes sense. And, when you add to the mix the space requirements of several HomeKit devices, having a third page become available makes even more sense.

The beauty of this UI, though, is that it remembers which page you left it on. If you use the music playback controls as frequently as I do, that means you can turn Control Centre into an ever-present remote control for audio, with some additional controls available if, for some reason, you need to toggle WiFi.

Across the bottom of the first page of Control Centre sits a familiar array of quick actions: flashlight, timer, calculator, and camera. The icons in this array now support 3D Touch, so it’s even faster to set a timer, and you can set the flashlight to three different levels of intensity. Unfortunately, it isn’t possible to use 3D Touch on the top row of toggles. It would be helpful, for example, to be able to launch WiFi settings from its toggle, or to have the option to lock the screen in a horizontal orientation on the iPhone.

I think the large buttons for AirPlay and AirDrop are fairly nice. They look like buttons, they provide the additional information required by both services in a fairly compact space, and they’re adequately thumb-sized. However, the gigantic Night Shift button leaves me perplexed. When I first saw it, I assumed that it would be split in half for a True Tone toggle. However, not only does the iPhone 7 not have a True Tone display, the only iOS device with one — the 9.7-inch iPad Pro — doesn’t feature this split toggle. This button is unnecessarily large, and I probably find it particularly grating because Night Shift makes my iPhone look like it has a diseased liver.

Music and News: Back to the Drawing Board

I don’t remember the last time Apple introduced an app in one version of their software, only to radically redesign it just a year later; I certainly can’t think of two instances where that’s happened. But it did, this year, with Music and News.

I’ve always had a funny relationship with the Music app on iOS. In many ways, it has long been one of the finest apps Apple has ever shipped with the platform, featuring prominently in the original iPhone demo and in plenty of ads; but, deep down, there are some baffling design and functionality choices. That imbalance reached something of a high point in iOS 8.4, when Apple Music was added to the mix. Because Apple Music, by design, blurs the delineation between music you own and music you stream, the UI decisions made to add that layer of functionality increased the complexity of Music.

News, meanwhile, was a fine app last year, but it wasn’t particularly imaginative. There was very little distinctive about it; it looked a bit generic, if anything.

Music app on the iPad

Both of these apps have received a complete makeover this year. I’m bundling them together because both of them — and the new HomeKit front-end app called Home — share a common design language unlike anything else on the system. Their UIs are defined by very heavy weights of San Francisco, stark white backgrounds, and big imagery. I read an article just after WWDC — which, regrettably, I cannot find — that described these apps as having “editorial” interfaces, and I think that’s probably the most fitting adjective for this design language.

I’m struggling to understand why it’s being used in these three contexts, though — why in Music, News, and Home, but nowhere else? What do these three apps have in common? Music and News provide personalized recommendations and serve as windows into different media, but Home isn’t akin to either. Home and Music both provide direct control elements, but News doesn’t. If anyone can explain to me why these three apps get the same UI language that’s entirely different from any other app, I’d be happy to hear it.

Incongruity aside, I love the way Music and News look; Home is an app I’ve only seen in screenshots, because every time I try to launch it in my entirely HomeKit-free apartment, it just sits on this screen and spins away:

¯\_(ツ)_/¯
Loading accessories and scenes

I’ve no idea what’s going on here. I don’t know if there’s simply no timeout, or maybe there is but it’s set to the year 2022, or maybe you’re just not supposed to be an idiot like me and launch Home if you don’t have any HomeKit devices. (This is also why I was unable to comment on the third, Home-centric page of Control Centre.)

That aside, I think this new design language is fantastic. It’s bold and full of character, but not in a way that feels gaudy or overbearing. They feel like glossy interactive magazines, at least on the surface. As you get deeper into each app, the big, bold titles are minimized — secondary and tertiary interfaces look pretty standard compared with the primary screens of each app.

I think it would be interesting if this design language made its way into more apps on iOS. I think Phone, Reminders, and even Mail could take to this style quite well. Of course, there’s the bigger question of how permanent this style is: it appears in one app that’s brand new, and two others that were redesigned within twelve months of their launch. That’s not to say it can’t or won’t last, but its currently limited application makes it perhaps more experimental than other design languages Apple has implemented throughout the system.

Animations

I’ve been an ardent supporter of Apple’s interface design direction over the past few years. Though some critics may bemoan a generally less expressive experience with the iconography and human interface components of many apps, I’ve found that expressiveness surfaces through other means — primarily, as it turns out, through motion and animation. From the subtle parallax effects in Weather and Photos to the new super goofy iMessage effects — more on that later — animations have become as much a part of the iOS user interface as buttons and icons are.

Unfortunately, many of the Springboard animations added in iOS 7 felt like they slowed down actually using the system. While they looked great the first few times, waiting for a long and softly-eased animation to complete for every task became, rather quickly, an irritation more than a pleasant experience. This was exacerbated by the inability to cancel any of these animations: if you opened the wrong app or folder on your phone, you had to wait for the “opening” and “closing” animations to play before you could try again. In the grand scheme of things, not the worst UI crime imaginable, but a frustration nonetheless.

In iOS 10, animations have been tweaked throughout the system to feel far faster. In fact, I’d convinced myself that all of the animations actually were faster, until I compared them to an iPhone 5S running iOS 9 and found them to be virtually identical.

But there is one very subtle change that makes a world of difference: it’s now possible to cancel animations before they complete. Tapped on Mail rather than Messages in your dock? Just hit the home button and it instantly responds. It’s the same story for folders, too; but, sadly, not for multitasking or opening Notification Centre.

Other animations still look and feel as slow as they were when iOS 7 debuted, including the icons flying in after unlocking. This animation has always grated on me. It takes about a full second to play; I wish it took about half that time because it makes the system feel much slower than it actually is.

Animations like these are most effective when they imply meaning — a sense of space, or an action. This has long been something that iOS does pretty well. For example, when you tap on a message in the Mail inbox, the whole UI slides to the left to show the message, as though it were lying just to the right of what the screen could contain. This animation is combined with the familiar right-pointing chevron (›) placed in each cell, completing the spatial relationship between the inbox and each message.

In iOS 7, the rather confusing spatial relationship between Springboard elements was organized into a more straightforward hierarchy. However, some animations and interactions were not fully considered; as a result, this hierarchy did not maintain consistency. The folder animation, in particular, was confusing: tapping on a folder would hide all of the home screen icons and perform some kind of hyperspace zoom into the folder area.

This has been fixed in iOS 10. Folders now appear to expand and sit overtop the rest of the home screen which, naturally, blurs. This animation feels a lot faster and more logical, while preserving the order of depth established in iOS 7.

The Hidden UI

You may have noticed that many of the most exciting new features I’ve mentioned so far — like additional options in Control Centre, and expanding notifications — make heavy use of 3D Touch. Plenty more of the enhancements that I’ll chat about later do too. In iOS 10, 3D Touch has been upgraded from a curious optional extra to a functional aspect of the system, and there are some complexities that are inherent to such a shift.

Because 3D Touch adds depth to a system that is, by the nature of pixels on a piece of glass, flat, its functionality is not obvious unless you know it’s there first. Paradoxically, the expansion of 3D Touch ought to make it feel much more like an expectation than an option, but there remains a steep learning curve for users to understand that 3D Touch is not necessarily consistent between apps.
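
To give a sense of what developers are working with, here’s a minimal sketch of reading press depth in a custom view. Note the capability check: apps are expected to degrade gracefully on hardware without 3D Touch, which is part of why its behaviour varies between apps:

    import UIKit

    class PressureView: UIView {
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesMoved(touches, with: event)
            // Force readings are only meaningful on 3D Touch hardware.
            guard traitCollection.forceTouchCapability == .available,
                let touch = touches.first else { return }

            // Normalized press depth, from 0 (resting) toward 1 (deepest).
            let depth = touch.force / touch.maximumPossibleForce

            // An arbitrary visual response for this sketch.
            alpha = 1.0 - (depth * 0.5)
        }
    }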

3D Touch is also a bit of an anomaly across the iOS lineup. Apple says that they have over a billion iOS devices in use around the world, but only the iPhones 6S and to-be-released 7 support it. They sold a little over 200 million iPhones in the year since the 6S was introduced, which means that a maximum of about 20% of the entire iOS base is able to use those features.

This doesn’t even begin to touch on the questionable UI choices for the iPad. More on that later.
Notifications on the iPad.

Without 3D Touch, the user experience of a feature like rich notifications really breaks down. Instead of pressing on the notification bubble, it’s necessary to swipe the notification to the left and tap the “View” button that appears, to see its options. Of course, this is a compromise that will scarcely be a memory in a couple of years, but about 80% of existing iOS device users will, on launch day, have a less-than-satisfactory experience.

Keyboard

Of all of the images of Steve Jobs onstage at an Apple event, there are few more instantly memorable than this moment at Macworld 2007:

“What’s wrong with their user interfaces? Well, the problem with them is really sort of in the bottom 40 there.”
Steve Jobs showing then-current smartphones at Macworld 2007

You might remember Jobs explaining that the keyboards “fixed in plastic” are a core issue with these phones, and that changing to a touch screen would allow for optimized controls for each application.

But one thing he didn’t mention — at least, not explicitly — is that the keyboard itself would see significant changes over the next nine versions of the operating system. From international keyboards and dictation, to the Predictive bar and case switching on the keycaps, the keyboard has come a long way since 2007. But it has always primarily been an explicit, active means of user input.

In iOS 10, the keyboard becomes a little more passive and a lot smarter by way of the QuickType bar. Instead of merely predicting what word you should type next based on what you’ve been typing so far, it now suggests inputs based on contextual prompts.

Phone numbers in QuickType

For example, if a webpage has a field for your email address, QuickType will suggest two of your email addresses. Or, if a friend texts you asking “Where are you?”, the keyboard will prompt you to send your current location.
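
On the developer side, these suggestions are driven by semantic tags on input fields. A minimal sketch for a native app, assuming a plain UITextField:

    import UIKit

    let emailField = UITextField()
    // Declaring the field’s semantic role lets the iOS 10 QuickType bar
    // offer the user’s own email addresses as one-tap suggestions.
    emailField.textContentType = .emailAddress
    emailField.keyboardType = .emailAddress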

And the QuickType bar just keeps getting better: as you’re typing, it can also suggest an appropriate emoji. Type “love” and you’ll see a heart; type “ugh”, and you’ll be prompted to add a straight-faced emoji. Unfortunately, as Apple is a strenuously PG-rated company, typing “shit” will not suggest the “pile of poo” emoji — though “crap” will — and typing “penis” won’t suggest the eggplant.

There are also some improvements to autocorrect. For users who type in multiple languages or mix languages, iOS now ostensibly handles corrections and suggestions in those other languages on the fly. For the most part, I’m monolingual, but I know a few sentences in other languages. Even after adding those languages as keyboards in Settings, I couldn’t get autocorrections in them without manually selecting their keyboards.

Yup yup yup
The only time I ever saw a language switch in the QuickType bar without manually selecting another keyboard is when my girlfriend sent me a text reading “Yup yup yup”. QuickType decided that I should reply in what appears to be Turkish. I’ve noticed that these reviews get harder to write when I’m able to explain less about how the system works.

I’m entirely the wrong person to be trying this out; that it didn’t work for me means nothing. Maybe read Ticci’s review — that guy knows what he’s talking about.

3D Touch support has also been enhanced in the keyboard. The trackpad gesture now works far more reliably, and pressing harder on the delete key will erase text at about twice the speed.

Differential Privacy

Apple has long prided itself on standing up for the privacy of its users. They’ve fought the FBI, and have long resisted taking the relatively easy route of uploading all of their users’ data to their own servers to diddle around with in any way they want.

But there comes a time when even they will agree that it’s in the best interests of their users to detect trends, for instance, or enhance certain machine learning qualities.

In iOS 10, Apple is using a fairly esoteric field of study to enhance their machine learning capabilities. It’s called “differential privacy” and, to start, they’re using it only in the keyboard, to learn new words.

You’ve probably heard a million explanations of how differential privacy works, so here’s the elevator pitch version, for reference: the keyboard tracks the words that you enter and how Autocorrect responds, and blends all of that with a lot of statistical noise. The data from you and hundreds of millions of other iOS users gets combined and the noise is averaged out, leaving certain trending words behind when they’re used by a significant number of people.
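
To make that a little more concrete, here is an illustrative sketch of randomized response, one classic differential privacy technique. To be clear, this is not Apple’s actual implementation, just the flavour of the idea: every device sometimes lies, so no individual report can be trusted, but the lies cancel out in aggregate.

    import Foundation

    // Randomized response: report whether this device typed a given
    // word, but flip a coin first so any single answer is deniable.
    func privatizedReport(didTypeWord: Bool) -> Bool {
        if Bool.random() {
            return didTypeWord   // heads: answer honestly
        } else {
            return Bool.random() // tails: answer at random
        }
    }

    // Server side: since P(yes) = 0.5p + 0.25 for a true rate p, the
    // noise averages out and the real count can be estimated.
    func estimatedTrueCount(yesReports: Int, totalReports: Int) -> Double {
        return 2.0 * Double(yesReports) - 0.5 * Double(totalReports)
    }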

This isn’t a technique invented by Apple, but they’re the first to deploy it at this kind of scale. Some people doubt that it will work, but there’s no way to tell whether it’s making a meaningful impact on our typing until iOS 10 reaches mass deployment.

Emoji

As part of the iOS 10 update, Apple has redesigned most of the characters in the “Smileys & People” category, along with a bunch of others in several more categories. The redesigned characters look a little more saturated to my eye, and a tiny bit softer. I really like them.

In addition to the redesigned characters, there are also a bunch of new and more diverse emoji that depict women in professions and activities previously represented only by men, as well as more variations for family characters. This is a good step forward — showing police officers, detectives, and swimmers as men while displaying women only as brides and princesses was clearly not representative of reality.

However, unlike on MacOS, there still isn’t a means to search for emoji in iOS 10. The keyboard may provide suggestions while typing, but it’s not the same as search: there’s only one suggestion, which necessitates a more precise guess to find the right emoji. I wish I could swipe down on the emoji keyboard to see a proper search field.

Siri

Before I jump into what’s new in Siri this year, I want to elaborate a little bit on where I see Siri today. To understand the current state of Siri is to understand why there are now APIs available to third parties.

The best place to start, I think, is with Steven Levy’s August profile of Apple’s artificial intelligence and machine learning technologies:

As far as the core [Siri] product is concerned, [Eddy] Cue cites four components of the product: speech recognition (to understand when you talk to it), natural language understanding (to grasp what you’re saying), execution (to fulfill a query or request), and response (to talk back to you). “Machine learning has impacted all of those in hugely significant ways,” he says.

I think it’s critical that we understand all four of these components: how they work on their own, in sequence, and how the unreliability of any component affects Siri as a whole.

So, let’s start with the first: speech recognition. One thing that has become consistently better with Siri’s ongoing development is its ability to clearly and accurately transcribe our speech. Even just a few years ago, it. was. necessary. to. speak. to. Siri. in. a. halting. manner. William Shatner likely had few problems with Siri, but the rest of us found this frustrating.

In 2014, Apple transitioned Siri from a backend largely reliant upon third parties to one of their own design. The result was a noticeable and, perhaps, dramatic improvement in Siri’s speed and accuracy, to the extent that Apple felt confident enough to add real-time dictation with iOS 8.

But the quality of Siri’s transcription of homonyms and more esoteric words often leaves a lot to be desired, due in part to inconsistencies with the second component cited by Cue: the interpretation of what is being said. Here’s an easily reproducible example that you can try right now: tell Siri “remind me to sew my cardigan tomorrow at noon”. Siri doesn’t understand the context of the word “sew” nor its relationship to the word “cardigan”, so it always — or, at least, every time I’ve tried this — transcribes it as “so”.

Speech recognition and interpretation are, I would argue, two parts of a single “input” step in a given Siri interaction. The next two parts — execution and response — can also be combined into a single “output” step, and I think that step has far deeper and more fundamental problems.

Nearly any frustration we have with any computer or any piece of software tends to boil down to a single truth: the output is not what we had expected, based on our input. Whether that’s because we open an app and it crashes, or our email doesn’t refresh on a timely basis, or perhaps because autocorrect inserts the wrong word every ducking time — these are regular irritations because they defy our expectations.

In many ways, Siri is truly amazing, typically answering our requests faster than we could ever type them out. But because Siri can do so much, we experiment, and rightfully expect that similar queries would behave similarly in their response.

Let’s start with a basic request — for instance, “hey Siri, how long would it take me to drive to work?” As expected, Siri will happily respond with information about the current traffic conditions and the amount of time it will take to get there. Now, change the word “drive” to “walk” in the exact same query, and witness an entirely different result:

How long would it take me to drive to work?
How long would it take me to walk to work?

These requests are nearly identical, but are treated vastly differently. The driving example works perfectly; the walking example doesn’t answer my question — I’m not looking for directions, I’m asking for a time estimate.

Worse still is when Siri fails to provide an answer to a specific request. Siri is akin to pushing the “I’m Feeling Lucky” button in Google: it ought to be the shortest, straightest line between asking something and getting an answer. If I ask Siri to “find me a recipe for banana bread”, I want a recipe, not a web search that gives me a choice of recipes. If I wanted options, I would have asked for them.

As Siri’s speech recognition and interpretation become more reliable, this becomes more of a problem. Based solely on anecdotal observations, I think that users will be more tolerant of an occasional mismatched result than of extra rounds of back-and-forth with Siri, so long as it remains fast and reliable.

With that, I’d like to propose a few guidelines for what a virtual assistant ought to be and do.

  1. Speech recognition and transcription should prioritize context over a direct phonetic interpretation.

  2. Similar commands should perform similarly.

  3. Returning an absolute answer should be the highest priority. A web search should be seen as a last-ditch fallback effort, and every effort should be made to minimize its use. User interaction should, overall, be minimized.

These bullet points are, I’m sure, much more difficult to implement than I’ve made them out to be. Contextualizing a phrase to interpret which words are most likely to be spoken in relation to one another requires a great depth of machine learning, for example; however, I see these guidelines as a baseline for all virtual assistants to behave predictably.

SiriKit and Intents

While Apple is busy working on the fundamental components of Siri, they’ve opened up its capabilities to third-party developers who have been chomping at the bit since Siri was launched in 2011. Much like multitasking in iOS 4, the functionality of SiriKit is limited to unique scopes or domains:

  • VoIP

  • Messaging

  • Payments

  • Photo search

  • Workouts

  • CarPlay

  • Ride hailing

These individual scopes each have their own “Intents” and vocabulary, and these can be defined by developers. For example, Uber provides different levels of ride hailing service, and they can define those levels for Siri in their app’s metadata; or, a payment service could define different methods of payment. Developers can include shorthand and alternate variants of their app’s terminology within their app’s vocabulary metadata.
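
The user-specific side of that vocabulary is registered at runtime. A minimal sketch for the workouts domain, with entirely hypothetical workout names:

    import Intents

    // A hypothetical workout app registering its users’ custom workout
    // names, so a phrase like "start my Leg Day Blaster" can resolve.
    let workoutNames = NSOrderedSet(array: ["Leg Day Blaster", "7-Minute Scorcher"])
    INVocabulary.shared().setVocabularyStrings(workoutNames, of: .workoutActivityName)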

All of this stuff sounds like it’s going to be a great way to expand the capabilities of Siri without Apple having to chase down individual partnerships. Unfortunately, these precise app categories tend to be dominated by big players who wouldn’t care to let me test their new apps. I’m looking forward to seeing what I can do with these apps once they’re released into the wild, though, because I have lots of questions.

Maps

The first thing you’ll notice about Maps in iOS 10 is that it’s received a complete makeover. With its bold card-based layout, floating controls, and Proactive suggestions, it now looks like the app Apple has wanted it to be since they dropped Google and went their own way back in iOS 6. It has taken on some of the design cues established in Music and News, though not to the same degree. I find it even easier to use than the old version, though it does retain some of its — uh — charms.

I could do with some waffles and a glass of Prosecco right now.
Booking the Beltliner through Maps
The bigger news in Maps doesn’t come from Apple, though: third-party developers can now integrate their apps directly into Maps’ UI, using Intents and similar code to SiriKit. Only two kinds of Intents are available for Maps integration: ride hailing and restaurant reservations. Third-party restaurant reservation integration is only supported in Maps; Siri has supported OpenTable integration since iOS 6. It’s not a particularly glamorous integration, but it is a useful one. This could be taken one step further by adding an Intent for delivery services as well.

I couldn’t test any of the ride hailing stuff because Uber threw a hissy-fit over Calgary’s requirements that drivers carry proper licensing and that vehicles are inspected, so they don’t offer ride sharing here.

Photos

About a year ago, Benedict Evans posed an intriguing question: just how many photos are being taken today? Given that there are a couple of billion smartphones in the world, it’s probably a lot:

How many more were taken and not shared? Again, there’s no solid data for this (though Apple and Google probably have some). Some image sharing is probably 1:1 for taken:shared (Snapchat, perhaps) but other people on other services will take hundreds and share only a few. So it could be double the number of photos shared or it could be 10x. Meanwhile, estimates of the total number of photos ever taken on film range from 2.5-3.5 trillion. That in turn would suggest that more photos will be taken this year than were taken on film in the entire history of the analogue camera business.

That was last year; this year, there will no doubt be a far greater number of photos taken due to the continued proliferation of smartphones worldwide. We all know this, and we all know how difficult it has become to manage those photos.

A few months before Evans wrote that article on photos, Google tried to combat this problem by introducing Photos, to much critical and public acclaim. Instead of worrying about storing those photos on your device — a worry that will be present so long as companies like Apple continue to include inadequate local storage in their smartphone lineups — Google reasoned that it would make more sense to allow users to stash their photos in a cloud storage system. Not only does this free up local space on the device, it allows photos to benefit from the ridiculous redundancy built into Google’s cloud storage facilities.

To sweeten the deal, Google built software that would analyze the photos as they’re stored in Google Photos. It could identify objects and people within photos, which means that finding that one photo of your dog licking a burger became as quick and easy as a Google search.

By all accounts, Google Photos has been a rousing success; it became quite clear in the intervening year that these kinds of improvements were expected from Apple, too. But this intelligence has long been presumed to require a sacrifice on user privacy — a sacrifice that has seemed unlikely for Apple to make. Om Malik wrote what is perhaps the most cogent explanation of this assumed contradiction for the New Yorker in June 2015:

The battle between Google and Apple has shifted from devices, operating systems, and apps to a new, amorphous idea called “contextual computing.” We have become data-spewing factories, and the only way to make sense of it all is through context. Google’s approach to context is using billions of data points in its cloud and matching them to our personal usage of the Google-powered Web; Apple’s approach is to string together personal streams of data on devices, without trying to own any of it. If Google is taking an Internet approach to personal context, then Apple’s way is like an intranet.

From the surface, Google’s approach seems superior. Understanding context is all about data, and the company is collecting a lot more of it. Apple has your phone; Google has access to almost everything. […]

And one day, I wouldn’t be surprised to see an executive from Apple come onstage at the Moscone Center, take a page from its rival, and say that they’re doing the same things with your data that Google is.

That day came, kind of, about a year later, on June 13, 2016. An Apple executive — Craig Federighi, naturally — took the stage at the Bill Graham Civic Auditorium to explain that they’re not doing the same things with your data that Google is. Apple claimed that they were able to do the same kind of facial and scene recognition on your photos entirely locally.

That sounds pretty compelling: a marriage of privacy and capabilities. All of the power, yet none of the drawbacks. So: has it worked?

Well, there are lots of criteria one could use to judge that. At its core, it’s a simple question of “can you search for objects and see photos you’ve taken of them?”, to which the answer is “yes, probably”. But it would be disingenuous and irresponsible of me to view Photos in a vacuum.

While this won’t be a full Apple Photos vs. Google Photos comparison, it seems appropriate to have at least some idea of a benchmark. With that in mind, I uploaded about 1,400 photos that I’d taken through June and July to Google Photos; those same photos also live in my iCloud Photo Library. But, before we get to that, let’s see what Photos has to offer on its own terms.

Upon updating to iOS 10, your existing photo library will be analyzed while your iPhone or iPad is plugged in and locked. How long this will take obviously depends on how many photos you have — my library of about 22,000 photos took a few days of overnight analysis to complete. However, new photos taken on an iPhone are analyzed as they make their way into the Camera Roll. Apple says that they make eleven billion calculations on each photo to determine whether there’s a horse in it. For real:

In fact, we do 11 billion computations per photo to be able to detect things like there’s a horse, there’s water, there’s a mountain.

And those calculations have determined that there are, in fact, horses in some of my photos:

Editor’s note: dragonflies are not horses.
Searching for and finding horses in my photo library

There are lots more searches that are possible, too — an article from earlier this year by Kay Yin pegs the total number of scenes and objects that Photos will detect at 4,432. Yin told me in an email that they acquired the list through an analysis of Apple’s private PhotoAnalysis.framework. It includes everything from the obvious — food, museums, and musical instruments, to name a few — to the peculiar and surprising: ungulates, marine museums, and tympans all make an appearance on the list.

Weirdly, though, some searches still return zero results in Photos. You can’t search for photos by type — like screenshots, panoramas, or Live Photos — nor can you search by camera brand or model. This information is embedded in pretty much any photo, but is not indexed by Photos for reasons that aren’t entirely clear to me. Perhaps very few people will search for photos taken on their Canon DSLR, but it doesn’t make much sense to me to disallow it; it feels like an artificial limitation. The only way to find Live Photos within your library on your iPhone is still to thumb through each photo individually until you see some sense of movement.
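
For what it’s worth, that metadata really is right there in the file. A quick sketch using the ImageIO framework, with a hypothetical file path, shows how little effort it takes to read the camera make and model:

    import Foundation
    import ImageIO

    // Read the TIFF metadata block from a photo on disk; the path is hypothetical.
    let url = URL(fileURLWithPath: "/tmp/IMG_0001.JPG")
    if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
       let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
       let tiff = properties[kCGImagePropertyTIFFDictionary] as? [CFString: Any] {
        // Prints something like "Apple iPhone 6s" for a shot from the built-in camera.
        print(tiff[kCGImagePropertyTIFFMake] ?? "?", tiff[kCGImagePropertyTIFFModel] ?? "?")
    }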

For the myriad keywords Photos does support, however, there’s plenty of good news. After it has finished analyzing and indexing the photo library, searches are fast and respectably accurate, though not perfect. In that “horse” screenshot above, you can see a photo of a dragonfly, for instance. A search of my library for “receipts” shows plenty of receipts that were indexed, but also some recipes, a photo of a railway timetable, and a photo of my wristband from when I was in the hospital a couple of years ago. In general, it seems to err in favour of showing too many photos — those that might be, say, a 70-80% match — rather than being too fine-grained and excluding potential matches.

Perhaps my biggest complaint with Photos’ search is that it isn’t available in the media picker. That wasn’t as big a deal in previous versions of iOS, but with the depth and quality of indexing in iOS 10, it would be really nice to be able to search within Messages or in an image picking sheet for a specific photo to send.

Apple’s facial recognition is also quite good, generally speaking. It’s reasonably adept at identifying photos of the same person when the face is somewhat square with the camera, but differences in hair length, glasses, mediocre lighting, and photos with a more sideways profile-like perspective tend to trip it up.

If you’re a completionist about this sort of thing, you’ll likely become frustrated with the most obvious mechanism for dealing with photos misidentified as being from different people. It’s not that it’s hard; it is, however, extremely tedious. To get to it, tap on the Albums tab within Photos, then tap People, then tap Add People. You’ll be presented with a grid of all of the faces identified in your photos.

The thumbnails are sorted in descending order of the number of photos found per face detected. The first few screens of these thumbnails will look fine — 21 instances of a face here, 40-odd there — but as you scroll, the number of photos per face drops precipitously. I got about a quarter of the way through my thumbnails before I started seeing instances of a single photo per detected face. You can add each to your People album, and assign a set of photos to a contact. If you’ve already started collecting photos with a specific contact in them, it will offer to merge any new photos you add to that contact.

Tapping more than one thumbnail in the Add People view will activate the Merge button in the lower-left corner. This allows you to select multiple photos featuring the same face and teach Photos that they are the same person. Unfortunately, it’s still quite laborious to sort through photos one-by-one, in some cases. To make matters worse, thumbnails will sometimes feature the faces of two people, making it difficult to determine which of them is being detected in this instance.

This is a time-consuming way of handling multiple photos from a single person. Despite its utility, I find this view to be frustrating.

Is this Kai?
Happily, there’s an easier method of teaching Photos which faces belong to which contact. If you tap on one of the faces you’ve already taught Photos about and scroll to the bottom of the screen, past the Details view — more on that later — you’ll see a Confirm Additional Photos option. Tap on it, and you’ll get a well-designed “yes” or “no” way of confirming additional photos of that person. There’s even some really pleasant audio feedback, making it feel a little bit like a game.

Unlike object detection, which seems to err on the side of including too many photos so as to miss as few potential matches as possible, facial detection errs on the side of caution. It may be much pickier about which faces are of the same person, but I haven’t seen a single false-positive. If there is a false-positive, the process for disassociating a photo from it is a bit bizarre: the button for Not This Person is hidden in the Share sheet.

But is all of this stuff as good as Google Photos? I’m not adequately prepared to fully answer that question, but here’s the short version: it seems real close.

I have praise common to both. Each successfully identified obvious objects within photos most of the time. Both also had the occasional miss, either identifying an object incorrectly or failing to identify it at all. Both struggle with plurals in searches, too: a search for “mushroom” in both apps returns photos I took of a cluster of mushrooms at the base of a tree, but searching “mushrooms” does not.

In this set, Google Photos falsely identified a river as a road, something which Apple’s Photos app didn’t do. We all make mistakes.
Four roads and a river

Had Apple’s Photos search detected that river as a road, it would have been in this screenshot.
Road search in Photos
I found that both apps were similarly successful at recognizing faces, with a slight edge for Google. However, I’m not sure the pool of photos I uploaded to Google was comprehensive enough for me to figure out how good it is at recognizing a lot of different faces; my iCloud Photo Library has far more images in it, with lots more faces. I’d love it if someone uploaded an identical batch of tens of thousands of photos to both, and did a more thorough investigation.

My main concern with Apple’s attempt at photo recognition and categorization was that it wouldn’t be anywhere near competitive with Google’s offering. My (admittedly brief) comparison indicates that this simply isn’t the case. Apple’s offering is properly good.

But, perhaps because it’s doing all of the object and facial recognition on the device, locally, it doesn’t sync any of this stuff within iCloud Photo Library. I hope you enjoyed the tedium of assigning names to faces and confirming which photos contain each of your friends, because you’re going to have to do that for every device that you own. Have fun!

There’s also a rather nice Details view for each photo. You can tap on the Details button in the upper right or, in a completely non-obvious manoeuvre, you can scroll the photo vertically. There, you’ll see a map of where the photo was taken, any people identified within the image, and related Memories.

And I haven’t even mentioned my favourite new feature.

Memories

Don Draper:

Teddy told me that in Greek, “nostalgia” literally means “the pain from an old wound.” It’s a twinge in your heart far more powerful than memory alone. This device isn’t a spaceship, it’s a time machine. It goes backwards, and forwards… it takes us to a place where we ache to go again.

You’ve probably read that quote in a dozen other reviews and articles about photo-related things, but there’s a good reason for that: photographs are a near-flawless conduit from your eyeballs to your heart. Trite as that excerpt may be, I couldn’t think of a better summary of Memories in iOS 10.

See, the photo analysis that iOS 10 does is not “dumb”; it doesn’t simply leave the data that it collects sitting there for you to comb through. Photos actually tries to do something with the information embedded in all of those images and videos: locations, dates, and — now, in photos — people and objects. It assesses that data looking for anything that might tie certain sets of images together, like those taken within a certain timeframe, or a set of photos from a trip abroad. It automatically groups those photos and videos together into small albums, and creates a short slideshow video from them. That, in a nutshell, is Memories.
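
Apple hasn’t documented how this grouping works, and I wouldn’t pretend to know. But, as a toy illustration of the general idea, imagine clustering photo timestamps into candidate memories wherever the gap between consecutive shots grows too large:

    import Foundation

    // Emphatically not Apple's algorithm: a toy sketch that groups photos into
    // candidate "memories" whenever consecutive timestamps fall close together.
    func clusterByTime(_ dates: [Date], maxGap: TimeInterval = 36 * 3600) -> [[Date]] {
        var clusters: [[Date]] = []
        for date in dates.sorted() {
            if let previous = clusters.last?.last, date.timeIntervalSince(previous) <= maxGap {
                clusters[clusters.count - 1].append(date)
            } else {
                clusters.append([date])
            }
        }
        return clusters
    }

The real system clearly weighs locations, faces, and scene types as well, but you can see how a trip abroad or a night out would fall out of data like this quite naturally.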

I usually take pictures of buildings and empty fields, so my first set of Memories were not particularly inspiring. Don’t get me wrong — the albums were fine, but none were moving or emotional.

And then, one day, Photos surprised me by generating an album of photos of me and my girlfriend over the course of the past year. I guess it figured out that there are a few photos on my phone of us together, and a lot of her, and it put together an album and a video.

Despite all that I know about how automated and mechanized this stuff is, I was and remain deeply touched by the effect.

I’m trying not to sound too sappy here, but, in a way, I want to be extra sappy — I want you to know how powerful this feature can be. Sure, it’s all made by shuffling some bits around and associating whatever data it can find, but when you wake up to find a slideshow of you and your significant other over the past year, it really is pretty powerful.

I can’t confirm this, but Memories seems to prefer edited and liked photos, which makes sense — those are probably the best ones in a given set. It also incorporates Live Photos and video in a really nice way.

If you don’t like the auto-generated video, you can customize it. A theme is assigned by default, but you can pick your own, too, from options like “sentimental” and “dreamy” to “epic” and “extreme”, with music and title styles to match. If you don’t like the soundtrack, just tap the adjustment button in the lower-right corner and you can pick from nearly one hundred provided songs, plus all of the ones in your library. You can also adjust nearly all attributes of the video, including title style and the precise photo selection. But I’ve found that the auto-generated Memories are, generally speaking, just fine.

The nature of this feature is such that most of the ones that it made for me are quite personal in nature — more on that in a minute. Thankfully, I do take enough photos of buildings and whatnot that it has produced a couple that I feel comfortable sharing. First up is one that was generated entirely automatically:

Here, as they say, is one I made earlier, with a few modifications:

You can imagine that if these were videos of your family or your significant other, they would be much more meaningful. I hope these examples provide you with a sense of what’s possible.

There’s something else unique about Memories, compared to — say — Timehop, or the “On This Day” posts that Facebook dredges up from years past. Because apps like these tend to use primarily public posts, they’re pre-selected based on the kind of image we project of ourselves. But we take far more photos that never get posted for all kinds of reasons.

I have a series of photos from mid-August of a trio of ducks fighting over a fish. I didn’t post them publicly because they’re of little-to-no interest to anyone, I presume, but they remind me of watching those ducks duke it out on the river. That’s a memory particular to me, and it’s the kind of thing that will one day be served up by Memories.

I will say that I’ve seen it have some problems with facial recognition when cropping portrait-oriented photos to fit within a 16:9 video frame. More than once, the people in the photos have had their heads cut off. Sometimes, it’s only my head visible, instead of the faces of those I’m with; that seems to be the inverse of the most appropriate way to crop an image — who wants to look at themselves?

Regardless, Memories is probably my favourite new feature in iOS 10’s Photos app, and maybe in the entirety of iOS 10. It’s a beautifully-executed and completely effortless high-test nostalgia delivery system.

RAW and Live Photos

iOS 10 unlocks new capabilities for developers as well. Third-party applications can now shoot Live Photos, and encode and decode RAW images. The former capability is fine — I’m happy to have it for those times I want to have a little more control over a Live Photo than the default camera app can provide.

The latter capability, though: oh my. The default camera app doesn’t encode RAW images, but the newest versions of Obscura and Manual can, and they’re totally worth buying just to try RAW shooting on your phone. It’s clear that a lot of detail is obliterated when the photo is processed and compressed as a JPEG; a RAW image is three-to-four times the file size of the same JPEG, and it’s completely lossless. The easiest way to demonstrate the difference is with a simple, unedited comparison of two photos I shot one after another:

Shot as a JPEG, 100% zoom.
Shot as JPEG

Shot as a RAW image, 100% zoom.
Shot as RAW

In the image shot as a JPEG, the trees become a blocky, gestural mess. The fine lines on the outside of the green building on the left are incomplete and chunky. The whole thing looks a little more like an oil painting than a photograph.

In order to process the RAW for web use, I simply applied Photoshop’s “auto” Camera Raw setting; it may have flattened out the shadows, which is why the roof of the castle-looking building looks darker in the JPEG. But, even with that minimal processing, you can clearly see individual tree branches instead of a blocky mess. The train tracks on the overpass are clearly distinct. You can almost make out the windows on the sandstone school in the distance, in the middle of this crop. Every detail is captured far better.

Of course, the JPEG variant looks far better at the size photos are typically viewed on Facebook, for example, where these shots usually end up. And, naturally, the lack of any processing means that a full spectrum of noise is also captured; it’s not quite fine enough to be considered pleasantly grainy. But for those of us who want more control over individual attributes, the capability of shooting RAW is extremely exciting. It presents far more flexibility, provided third-party photo editing apps jump on the bandwagon. Snapseed already handles RAW in post, and I’ve heard from the developers of several other apps that they will soon support RAW editing, too.
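
For the developers among you, here’s roughly what requesting a RAW capture looks like, using the iOS 10-era AVCapturePhotoOutput names as I understand them; session configuration and error handling are pared to the bone:

    import AVFoundation
    import Foundation

    final class RawCapturer: NSObject, AVCapturePhotoCaptureDelegate {
        let session = AVCaptureSession()
        let output = AVCapturePhotoOutput()

        func start() throws {
            let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)!
            session.addInput(try AVCaptureDeviceInput(device: camera))
            session.addOutput(output)
            session.startRunning()
        }

        func captureRaw() {
            // Ask for the first Bayer RAW pixel format the hardware offers.
            guard let rawFormat = output.availableRawPhotoPixelFormatTypes.first else { return }
            let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat.uint32Value)
            output.capturePhoto(with: settings, delegate: self)
        }

        // The RAW sample buffer arrives here and gets packaged as a DNG file.
        func capture(_ captureOutput: AVCapturePhotoOutput,
                     didFinishProcessingRawPhotoSampleBuffer rawSampleBuffer: CMSampleBuffer?,
                     previewPhotoSampleBuffer: CMSampleBuffer?,
                     resolvedSettings: AVCaptureResolvedPhotoSettings,
                     bracketSettings: AVCaptureBracketedStillImageSettings?,
                     error: Error?) {
            guard let raw = rawSampleBuffer,
                  let dng = AVCapturePhotoOutput.dngPhotoDataRepresentation(
                      forRawSampleBuffer: raw,
                      previewPhotoSampleBuffer: previewPhotoSampleBuffer) else { return }
            try? dng.write(to: URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("shot.dng"))
        }
    }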

For an utterly unfair comparison, I shot a similar photo on my DSLR, a Canon XSi with a 12 megapixel sensor — the same rating as the one in my iPhone. Of course, the APS-C sensor in it is far larger, and the lens I have on it — the cheap and cheerful 40mm pancake — is much nicer and has a completely different field of view from my iPhone’s. Even so:

Shot as a RAW image on my DSLR, 100% zoom.
Shot as RAW on my DSLR

There’s a long way for the iPhone’s camera to go to become comparable to a professional DSLR — in fact, I’m not sure it ever can compete on that level. But, with RAW shooting capabilities, I see this as one of the single biggest jumps in image quality in the history of the iPhone. It is properly, exceedingly, brilliantly good.

Messages

Like most of you, I can think of few apps I use more on my iPhone than Messages — Safari, Mail, and Tweetbot are the only three that come to mind as contenders. Its popularity is a direct result of its simplicity and versatility, with few apps making text-based conversation as straightforward.

Perhaps because of that simplicity, Messages has seen few updates in its lifetime. iPhone OS 3 brought MMS support, iOS 5 introduced iMessage, and iOS 8 added more messaging types and a better Details view. But Messages is so ubiquitous that even those few improvements were among the most significant changes to the utility of any app on iOS. While Apple hasn’t released a monthly active user count for iMessage, I bet that it’s one of the most popular messaging standards in the world.

But, while you’ve always been able to send text, pictures, and video through iMessage, the experience has always been rather static. Until now.

In iOS 10, you can now send handwritten and Digital Touch messages through iMessage on your iOS device. What was once a niche feature for Apple Watch owners takes very kindly to the larger displays of the iPhone and iPad, allowing you to send Snapchat-like sketches through iMessage. The heartbeat option is even available if an Apple Watch is paired, and you can mark up photos and videos right from the media picker. In some ways, it seems that Apple is still chasing a bit of Snapchat’s unique style of photo-based messaging.

Messages media picker, buttons hidden
Messages media picker, buttons shown

The media picker has, by the way, been completely redesigned. There’s now a tiny camera preview right in the picker, alongside a double row of recent photos. Swiping right on the picker will show buttons to open the camera or show the entire Camera Roll.

This redesign is simultaneously brilliant and confusing. I love the camera preview, and I think the recent photo picker is fine. But the hidden buttons are far too hidden for my liking, and it’s somewhat easy to miss the small arrow that provides a visual clue. Once you find them, they’re easy to use; but I have, on more than one occasion, forgotten where the button to access the full Camera Roll picker now resides.

But what if you want to communicate in a more textual way? Well, iOS 10 has plenty of new features there. After you type out a message, you can tap on the keyboard switcher icon to replace words in your message with emoji. Relevant words or phrases will be highlighted in orange, and tapping on them will either suggest emoji to replace them with, or simply swap the word out if only one character seems to fit. Yet, despite the extent to which I already communicate through emoji, I could never really get the hang of this feature. The QuickType bar already provides good-enough emoji suggestions throughout the OS, while this tap-to-replace flow exists only in Messages; it simply doesn’t match the way I think when I bash out a text message. Your mileage may vary.

And then there’s the stuff I got a little obsessed with while testing iOS 10 this summer. Apple has added a whole host of weird and wonderful effects for when you send an iMessage. Press on the Send button, and a full-screen sheet will appear with a bunch of available effects. Some message effects display inline, while others will take over the entire screen the first time the message is read. Some are interactive: “Invisible Ink” requires the recipient to touch over the message to reveal it. An effect like “Lasers” turns the whole display into a nightclub, replete with bangin’ choons. What’s more, sending some messages — like “Happy Birthday” or “Congrats!” — will automatically fill the recipient’s screen with balloons.

Imagine the President getting their daily security briefing with these effects.
Lasers effect on the iPad

I make no bones about how much I love these effects. I’ve only been screwing around with them for the past few months with a handful of people, but they bring so much excitement and joy to any conversation that they’re easy to over-use, potentially to the chagrin of anyone else you’re talking to.

If you hate fun, you’ll probably be disappointed that there’s no way to opt out of receiving them, with the exception of switching on the “Reduce Motion” option in Accessibility settings — but that has all sorts of other side effects, too.

Message effects regression on the Mac

I’ve also noticed that these effects don’t regress very well. Users on devices running older versions of iOS or OS X will see the message followed by a second message reading “(sent with Loud Effect)”, or whatever the effect might be.

Messages has also learned some lessons from Slack. Links to webpages now show inline previews if the message was sent from a device running iOS 10 or MacOS Sierra. These previews can be pretty clever, too: a pasted Twitter link will show the whole tweet, the user it’s from, and any attached media; and, for YouTube links, you can actually play the video inline (but, curiously, not for Vimeo links). You can also react to individual messages with one of six different emotions by tapping and holding on a message bubble, a feature Apple calls “Tapback”, or with stickers from apps — more on that in a moment. Messages containing just emoji, up to three, will display much larger. All of these relatively small tweaks combine to produce some of the most welcome improvements to an app we use dozens of times a day.

Curiously enough, Messages in iOS 10 actually loses some functionality as well. In iOS 8, Apple attempted their take on Snapchat. You’ll recall that tapping and sliding on the camera icon would immediately send a disappearing photo or video. There is no longer a way to do that in iOS 10. Not that anyone would notice, of course — as I noted at the time, that feature was often more frustrating than helpful. I don’t know anyone who used that shortcut to send photos. I suspect few will notice its removal.

But I think that everyone will notice that developers can now add to Messages in a really big way.

iMessage Apps and Stickers

For the past few releases of iOS, Apple has rapidly been opening up their first-party apps to third-party developers. From sharing sheets to Safari, extension points now exist throughout iOS to make the system vastly more capable, efficient, and personalized. And now, they’re giving developers perhaps one of the biggest opportunities in years: apps and stickers in Messages.

Stickers are probably easiest to understand because they sound exactly like what they are: packs of images — still or animated — that you can stick to messages in a conversation. If the success of stickers in every other chat app is to be believed, they’re going to be one of the hottest new features for users and developers alike.

Actually, even saying “developers” is a misnomer here. Creating a sticker pack does not require writing a single line of code. The only things anyone needs to build a sticker pack are Xcode, correctly-sized artwork for the stickers in common image file formats, and an icon in different sizes, which means that virtually any idiot can make one. And I can tell you that because this idiot, right here, made a sticker pack in about ten minutes, excluding the amount of time I spent fighting with Xcode. It could scarcely be simpler: drag your sticker image assets into one tab of Xcode, drag a bunch of icon sizes into the other, and build. Unfortunately, you do have to subscribe to Apple’s Developer Program in order to test the app on your device; you can’t use a free Apple ID to build a sticker pack just for yourself.

As a result of this simplicity, I think a lot of artists and designers are going to have a field day making all kinds of sticker packs and selling them. Aside from free stickers — plenty of which will be from brands half-assing their marketing efforts — I’m guessing that the one-dollar price point will be the sweet spot for a typical pack.

From a user’s perspective, these stickers will be a fun addition to pretty much any conversation. They can be dropped — with a slick animation — on top of any message, or they can be sent as plain images in the chat. Some users may get frustrated that stickers typically overlap a message, which can make it hard to read. You can tap and hold on any message bubble to temporarily hide stickers and get more information about what stickers were used.

Stickers are a hoot for users and developers alike. But, of course, if you want more functionality, you’re going to have to write some code and put together an app for Messages. Apple says that developers can create all sorts of interactive environments, optimized for short-duration usage: think back-and-forth games, peer-to-peer payments, and the like.
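
As a hedged sketch of what that means in practice, an iMessage app builds an MSMessage, ties it to the conversation’s session, and inserts it into the transcript. The class name, captions, and URL below are all invented:

    import Messages
    import UIKit

    final class TurnViewController: MSMessagesAppViewController {
        func sendTurn(in conversation: MSConversation) {
            let layout = MSMessageTemplateLayout()
            layout.caption = "Your move"
            layout.subcaption = "Tap to play"

            // Reusing the selected message's session makes successive turns
            // collapse into a single, updating bubble in the transcript.
            let session = conversation.selectedMessage?.session ?? MSSession()
            let message = MSMessage(session: session)
            message.layout = layout
            message.url = URL(string: "https://example.com/game?state=abc") // hypothetical game state

            conversation.insert(message) { error in
                if let error = error { print(error) }
            }
        }
    }

The session detail is the clever bit: it’s what lets a back-and-forth game occupy one bubble instead of littering the conversation with every move.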

It’s telling that they call these “iMessage Apps”, and not “Apps for Messages” or some variant thereof. While apps that confine themselves to sending just images or links will work fine over SMS, any of the truly cool interactive apps won’t work.

Apple ships two examples with iOS 10: Music and “#images”. The former, of course, lets you share your most recently-played tracks with friends. Instead of having to switch to Music from a conversation and tapping on the Share button, the track is served to you from within the thread. When combined with rich previews for Apple Music links, the app provides a totally seamless experience.

The “#images” app — I will continue to use quotation marks because I cannot stand that name — is a much-needed enhancement for those of us who like to spice up any conversation with various movie and T.V. references. It appears to use the same Bing-powered image search engine as Siri on the Mac, except perhaps more tailored for Messages. That is to say, it seems more GIF-oriented, and it appears to suggest images based on the conversation. There are even two buttons across the top that are pre-populated with likely search terms. Like any Messages app or sticker pack, you can tap on the arrow in the lower-right corner to expand its view, but in “#images” you can also press on any image’s thumbnail to see a full preview.

“#images” has been the bane of my friends’ discussions with me for the past few months. GIFs are way better than emoji, of course, and any opportunity to reply to a message with Homer Simpson starting a bowl of corn flakes on fire really is a tremendous ability. If I’m completely honest, though, I don’t really need every movie reference on the planet; I only need clips from the Simpsons. I do hope a Frinkiac app is on the way.

Unlike other app extensions, apps running in Messages are entirely independent, and don’t require a container app; however, developers can use their existing and new iOS apps, if they so choose.

And, as with pretty much every other extension point on the system, there’s no indication when a newly-installed app features a Messages extension. Unlike every other extension point, though, there’s a switch that allows you to automatically activate any new Messages apps. I think a similar option should be available for other extension types, like keyboards and share sheet items, as the current method of determining whether an app has installed a new extension is, at best, a matter of trial and error.

Apps and sticker packs are installed in Messages similarly, in a pseudo-Springboard sheet that appears in place of the keyboard. It behaves like Springboard, too: you can tap and hold on an icon to change the order of the apps, or tap the x in the corner to remove the extension. There’s even a row of page indicator dots across the bottom; if you install a lot of apps, it doesn’t scale particularly gracefully.

Too many iMessage apps
I managed to run up this tally just by downloading all of the iOS 10 updates issued to apps already on my phone. Nearly every app that I updated today included a Messages extension. Imagine what it’s going to be like if you really dive deep into the iMessage App Store.

I’m sure that these apps are going to be insanely popular. Consider, for comparison, the popularity of emoji keyboards like Bitmoji or Kimoji. Perhaps a handful of apps will take over, but I anticipate plenty of users overrunning the page dot capacity. I’m surprised that this is not handled more gracefully.

Music

I wrote at length earlier about the interface design changes in Music and News; here, I want to spend a little more time on how those updates affect the usability of the app.

I want to start with the five tabs across the bottom. To me, their relatively subtle change has radically improved how I use Music. Previously, the tabs in Music were, from left to right: For You, New, Radio, Connect, and My Music.

The redesigned version of Music makes a subtle but critical change to its overall usability, simply by adjusting the five tabs that appear across the bottom: Library, For You, Browse, Radio, and Search. The implication of this change is a promotion of Library from the lowest priority item to the highest, where it belongs.

Arguably the most significant improvement to usability directly gained from the adjustments to the tab bar is the promotion of Search. After all, when you’re looking for something — whether in Apple Music or your local library — you probably use search. Its previous placement, in the toolbar across the top, was an awkward place for it, primarily because results ended up in the New tab, for reasons I can’t quite explain. By simply adding a search tab bar item, the usability of Music is far better than it used to be.

Library tab
For You tab

Even the rather vaguely-named Browse tab is a boon. The old New tab indicated that you’d only find new releases within; Browse, while more generic, allows Apple to add sub-categories for Curated Playlists, Top Charts, Genres, and the previously-buried Videos feature.

Meanwhile, the Connect features have been moved to the For You tab, and individual artist pages. I don’t know if that will improve its popularity among artists or users; I suspect not.

Within the Library tab, Music loses the weird drop-down picker that previously allowed you to browse by artists, genres, and so forth. This has been replaced by a simple, straightforward list, and it’s much better for it. There’s very little hunting around in this version of the Music app; most everything is pretty much where you’d expect it.

But, while Apple resolved most of the usability issues of the old app, they created a few new ones as well. “Loving” tracks and playlists — a critical component of the Apple Music experience and the only way to train the For You selection — is now a multi-step process. There is no longer a heart button on the lock screen, nor is there one on the playback screen. Instead, you need to unlock your device and tap the ellipsis icon on the playback screen, or beside the item in a list. It’s a little odd to see so much emphasis placed on the ellipsis icon; it’s a metaphor that’s more frequently used on Android.

I love the artwork-as-drop-shadow effect used on this screen.
Now Playing screen
The playback screen is, overall, probably the least-successful element of the redesigned Music app, from a usability perspective. It took me a few days with it before I realized that it was possible to scroll the screen vertically, exposing the shuffle and repeat buttons, adjustable playback queue, and lyrics, when available. There’s simply no visual indicator that it’s possible to scroll this screen. My bug report on this was marked as a duplicate, so I suppose I’m not the only person who feels this way.

There are some holes in other parts of the app as well. There’s still no option to sort albums from an artist by year, otherwise known as “the only acceptable way to sort albums by a single artist”. There’s still no way to filter or search for music by year.

If you want a list of songs from a particular artist, you’ll want to use the Songs menu item to get a giant list of all songs, sorted by artist. There’s no way to do this from within the Artists menu item, which makes no sense to me. If I’m looking for songs by an artist, I’m going to start by looking in Artists; I bet you’d probably do the same.

Aside from the occasional usability bafflement, I’m certain that this version of Music is a much more successful organization of its myriad features. I’ve said many times that my ideal streaming service would feel like a massively extended version of my local library, and Music in iOS 10 comes closest to accomplishing that, even without enabling iCloud Music Library.

Lyrics in iOS 10
So what about some of the new features in the app, like lyrics support and new recommendations in Apple Music? Well, while lyrics are ostensibly supported, I had a hell of a time finding a song where that’s the case. After trying a bunch of different tracks from lots of different genres, I found that lyrics were shown for tracks from Drake’s “Views” album and Kanye West’s “The Life of Pablo”.

Lyrics only display for Apple Music songs, and I do mean only. My purchased-from-iTunes copy of “Views” doesn’t have lyrics, but if I stream the same song from that album on Apple Music, it does.

However, with the notable exception of Kim Mitchell’s truly terrible “Patio Lanterns”, just being able to read the lyrics doesn’t usually communicate the intent or meaning of a song. For that, you need something like Genius — not to be confused with the iTunes feature of the same name. I think it would be more useful if there were some substance behind displaying the lyrics.

While there’s no indication that adjustments have been made to the recommendation algorithms that power For You, there are two playlists that are served up on a weekly basis: the Favourites Mix, refreshed every Wednesday, and the New Releases Mix, refreshed every Friday. Unlike most of the pre-made playlists on Apple Music, these are algorithmically generated, but I’ve found them to be pretty good.

The first New Releases Mix that I got was a decent sampler plate of new music that I generally enjoyed. Of the 25 tracks in the first mix, I’d say that only two or three were misses. From my experience with both Apple Music and Spotify, that success rate compares favourably to the Discover Weekly mix in the latter service. Apple’s mix is, however, focused entirely on new releases a user might like; there doesn’t appear to be an automatically-generated discovery playlist in the vein of Spotify’s.

All told, I think this iteration of Music is markedly more successful than the outgoing one, which grated on me more and more as the year wore on. I haven’t felt that with this version. Though it’s not yet perfect, it’s far better than its predecessor.

Continuity

Universal Clipboard

After debuting with a robust set of features as part of iOS 8, the overarching concept of Continuity has been updated to support a frequently-requested addition: a universal clipboard.

The idea is simple: copy a piece of text, or an image, or a URL, or whatever on any device you own and have the ability to paste it on a completely different device. Apps like Copied, CloudClip, and Command-C filled in the gap left by the lack of official support for this functionality.

But, now, there is official support for clipboard sync, and it’s pretty good for my very basic uses. Like Handoff, Apple says that the clipboard is encrypted and synced entirely locally over WiFi and Bluetooth; your iCloud account is only used to ensure that it’s you copying or pasting on both devices.

As I said, my use-case for this is extraordinarily simple. Sometimes, I’ll have read something on my iPhone and want to link to it within a post. I can either open a new Safari tab on my iPad or Mac and wade through my iCloud Tabs until I find the right one, or I can just copy it on my iPhone and paste it on my other device. Or, sometimes, I’ll have something copied in a Mac-only app like TextMate that I can paste into an email message on my iPad. It’s pretty cool.

Unfortunately, there’s no visual indication of when an item is available to paste from a different device. I haven’t yet run into an instance where I’ve pasted in the entirely wrong thing from a different device, and the lack of a visual indicator strikes me as very deliberate: Universal Clipboard isn’t something you should have to think about — it “just works”.

Universal Clipboard lacks some of the more power-friendly options of the third-party apps mentioned earlier, like clipboard history and saved snippets, but it does a perfectly cromulent job fulfilling a basic use case for clipboard syncing. It works pretty well for me.
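
There’s a developer-facing side to this, too. As I read the iOS 10 SDK, UIPasteboard gains options that let an app keep a sensitive copy off the universal clipboard entirely, or give it an expiry date. A minimal sketch:

    import UIKit

    // Keep a sensitive copy local to this device and expire it after a minute.
    let items: [[String: Any]] = [["public.utf8-plain-text": "one-time code 123456"]]
    UIPasteboard.general.setItems(items, options: [
        .localOnly: true,
        .expirationDate: Date().addingTimeInterval(60)
    ])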

Apple Pay

Apple Pay was only introduced in Canada this June, but I’ve already become accustomed to paying for all kinds of stuff with it. Most payment terminals have supported tap-to-pay for a long time, but Apple Pay is more secure and, from my experience, faster and more reliable.

That it’s come to the web is a good thing; that I no longer have to use PayPal or submit my credit card details to an online store is a very good thing.

None of the places I typically order online from have yet added Apple Pay to their checkout options, so I tried using Stripe’s Apple Pay demo and it seemed to work pretty well.

I’ve dumped this feature into the Continuity section because Apple Pay is also supported in Safari on MacOS Sierra. You just start the purchase on your Mac, and authenticate on your iPhone. Strangely, though, this same cross-device functionality isn’t supported to authenticate an iPad purchase using an iPhone.

iPad

After several years of minor adjustments tailored for the iPad, iOS 9 brought serious systemwide improvements: proper multitasking, keyboard shortcuts, ⌘-Tab application switching, and lots more. iOS 9 was the significant boost the iPad needed, particularly since there are now two iPads named “Pro”. I, perhaps naïvely, thought that this was a renaissance for the iPad — a wakeup call for a platform that should feel like its own experience.

I was wrong. iOS 10 brings very few changes specifically designed for the iPad, and a whole lot of changes that feel like they were scaled-up from the iPhone.

The scaled-up Today view in Notification Centre pulls off an amazing visual trick: it manages to look both cramped and inefficient in its use of the iPad’s larger display:

iPad Notification Centre

Control Centre also looks a bit odd on the iPad’s larger display, featuring gigantic buttons for AirDrop, AirPlay Mirroring, and Night Shift:

Control Centre, page 1

Half the space in the second Control Centre tile is occupied by a playback output destination list:

Control Centre, page 2

Instead of a list of output devices — something which I doubt most users will be adjusting with enough frequency to merit its equal priority to playback controls — why not show the “What’s Next” queue or additional Apple Music controls?

There are plenty of instances where the iPad simply doesn’t utilize the available screen space effectively. While not every pixel should be filled, shouldn’t playlist descriptions in Apple Music expand to fill the available space?

Music descriptions aren't full-height

Shouldn’t I see more than this in my library?

Music on the iPad

Shouldn’t the timer feel a little more deliberate?

The rotational dial looks so lost on this screen.
Timer screen on the iPad

Then there are the aspects of the iPad’s interface and features that remain, inexplicably, unchanged. The 12.9-inch iPad Pro retains the 5 × 4 (plus dock) home screen layout of the original iPad. The Slide Over drawer still shows the same large rounded cells around each icon, and its lack of scalability has become apparent as more apps support Slide Over.

That’s not to say that no new iPad features debuted this year. You can now run two instances of Safari side-by-side on iPads that support multitasking; however, it is the only app where this is possible.

The limitations created by the iPad’s form factor — a finger-based touch screen with a bare minimum of hardware buttons — have required ingenious design solutions for common tasks. Windowing, complex toolbars, and other UI components taken for granted elsewhere were, and are, either impossible or impractical on the iPad. Similar problems were solved when the iPhone was developed. But, while there’s a good argument for retaining some consistency with the iPhone, the iPad is its own experience, and it should be treated as such.

There’s a glimmer of hope for iPad users: Federico Viticci has heard that more iPad-specific features are “in the pipeline”, presumably for an iOS 10.x release. Their absence from the 10.0 release is, however, noteworthy.

Grab Bag

As ever, in addition to the big headlining updates to iOS, there are a bunch of smaller updates to all sorts of apps. This year, though, there’s a deep-level system update as well.

System

Of all of the enhancements rumoured to be coming to iOS, not one revolved around a new file system. Yet that’s exactly what Apple previewed this year. It’s not finished yet, and it’s projected to arrive as part of a system update next year, but it sounds like a thoroughly modern, fast, and future-friendly file system. I’m nowhere near intelligent enough to fully understand APFS, as it’s called, but Lee Hutchinson of Ars Technica wrote a very good early look at it back in June that you should read.

Phone and Contacts

It’s telling that I’ve buried what is ostensibly the core functionality of a smartphone — that is, being a telephone — all the way down here. We don’t really think of our iPhone as a telephone; it’s more of an always-connected internet box in our pants. But that doesn’t mean that its phone functions can’t be improved.

For those of you who use alternative voice or video calling apps, there’s a new API that allows those apps to present a similar ringing screen as the default phone app. And there’s another API that allows third-party apps to flag incoming phone calls as spam and scams. I don’t get that many unsolicited calls, blessedly, but I hope that apps like these can help get rid of the telemarketing industry once and for all.
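
Based on my reading of the new CallKit framework, the spam-flagging half is handled by a Call Directory extension. A minimal sketch, with hypothetical phone numbers, looks something like this:

    import CallKit

    // The extension's principal class; iOS queries it when its data needs refreshing.
    final class CallDirectoryHandler: CXCallDirectoryProvider {
        override func beginRequest(with context: CXCallDirectoryExtensionContext) {
            // Numbers must be supplied in ascending order.
            context.addBlockingEntry(withNextSequentialPhoneNumber: 14035550100)
            context.addIdentificationEntry(withNextSequentialPhoneNumber: 14035550199,
                                           label: "Suspected Telemarketer")
            context.completeRequest()
        }
    }

Notably, the system calls the extension ahead of time and caches its entries, so the app never learns who is actually calling you.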

The phone app also promises to transcribe voicemails using Siri’s speech-to-text engine. My cellular provider doesn’t support visual voicemail, so I wasn’t able to test this feature.

In addition, Apple says that you can set third-party apps as the primary means of contact for different people.

Mail

Mail has an entirely new look within a message thread, with a conversational view very similar to that of Mail on the Mac. This makes a long conversation much easier to follow, and allows you to take action on individual messages by sliding them to either side.

Additionally, there’s a new button in the bottom-left of message lists to filter which messages are shown. After tapping the filter button, you can tap the “filtered by” text that appears in the middle of the bottom toolbar to select filtering criteria; the default is unread messages across all inboxes.

This filter is similar to the Unread inbox introduced in iOS 9; but, with the ability to define much more stringent criteria, it’s far more powerful. I’ve been using it for the past couple of months to try to tame my unruly inbox with an unread count that keeps spiralling out of control.

Mail also offers to unsubscribe you when it detects a message was sent from a mailing list. That can save a great deal of time hunting through the email to find the unsubscribe link and then, inevitably, being asked to fill out a survey or getting caught in some other UI dark pattern. I used it on a couple of newsletters and it seems to have worked with just a tap.

Safari

Safari now supports “unlimited” tabs, up from 36 in iOS 8, and 24 prior to that. I checked this claim out, and got to 260 open tabs before I got bored. Of course, not all those tabs will be in memory, but they’ll be open for your tab hoarding pleasure. In addition, a long-press on the tab button in the lower-right lets you close all 260 of those tabs at once, should A&E show up with a film crew.

Sharing

Ever since iOS 8 allowed third-party developers to add actions from their apps to the Share sheet, I’ve wanted to see this feature enabled systemwide for pretty much anything I could conceivably share. As you can imagine, I save a lot of links to Pinboard and Instapaper. I also subscribe to a bunch of great newsletters, like NextDraft and CNN’s excellent Reliable Sources. But, while third-party email apps have long allowed you to share the contents of emails using the system Share sheet, the default Mail client hasn’t.

It’s a similar story in Safari: you’ve been limited to sharing just the frontmost tab’s URL using the Share sheet, and no other links on the page.

Previously, touching and holding on a link would pull up a series of options, one of which was to send the link to your Reading List. Now, for those of us who don’t use Safari’s Reading List, there’s a far better option available: touching and holding on any link will display a “Share…” option, which launches the system Share sheet. It’s terrific — truly, one of my favourite details in iOS 10.

App Store

As previewed in the week prior to WWDC, this year’s round of major updates brings with it some changes to the way the App Store works. Most of these changes have trickled out in a limited way this summer, including faster review times, and a beta test of ads in the American App Store. I’m Canadian, so I’m still not seeing ads, and that’s fine with me.

One thing that wasn’t clarified initially was the handling of the new Share feature for every third-party app. At the time, I wrote:

I sincerely hope that’s not just an additional item in every third-party app’s 3D Touch menu, because that will get pretty gross pretty fast.

Well, guess what?

That’s exactly how that feature works.

It isn’t as bad as I was expecting it to be. The Share menu item is always farthest away from the app icon in the 3D Touch list, and it means that every icon on the home screen is 3D Touch-able, even if the app hasn’t been updated in ages.

For TestFlight apps, the Share item becomes a “Send Beta Feedback” item, which is a truly helpful reminder to do that.

Improvements for Apple Watch

While I won’t be writing a WatchOS 3 review — at least, not for today — there are a couple of noteworthy changes for Apple Watch owners on the iPhone.

There’s a new tab along the bottom of the Watch app for a “Face Gallery”. In this tab, Apple showcases different ways to use each of the built-in faces and how they look with a multitude of options and Complications set. I’m not one to speculate too much, but this appears to set the groundwork for many more faces coming to the Watch. I don’t think just any developer will be able to create faces any time soon, but I wouldn’t be surprised to see more partnerships with fashion and fitness brands on unique faces.

In addition, the Apple Watch has been added to the Find My iPhone app — and, yes, it’s still called “Find My iPhone”, despite finding iPhones being literally one-quarter of its functionality. Your guess is as good as mine.

Settings

With all sorts of systemwide adjustments comes the annual reshuffling of the Settings app. This year, the longstanding combined “Mail, Contacts, Calendars” settings screen has become the separate “Mail”, “Contacts”, and “Calendars” settings screens, as it’s now possible to individually delete any of those apps.

Additionally, Siri has been promoted from being buried in “General” to a top-level item, complete with that totally gorgeous new icon. It doesn’t really go with the rest of the icons in Settings, to be fair, but it is awfully pretty.

Game Centre options
As Game Centre no longer has a front-end interface, its options have been scaled back to the point of near-pointlessness. There is no longer an option to allow invitations from friends, nor can you enable friend recommendations from Contacts or Facebook. The only option under “Friends Management” is to remove all friends in Game Centre. There is no longer a way to find a list of your Game Centre friends anywhere on iOS or MacOS. Yet, for some reason, the framework lives on. Given these ill-considered omissions, if I were a developer, I wouldn’t necessarily build a new app that’s dependent on it. Just a hunch.

There are a bunch of little tweaks throughout Settings as well. It now warns you if you connect to an insecure WiFi network, and — for some reason — the option to bypass password authentication for free apps has been removed.

Sounds

There may not be any new wallpapers in iOS this year, but a few of the system sounds have been refreshed. Instead of the noise of a padlock clicking shut, the revised lock sound is more reminiscent of a door closing. Perhaps it’s my affinity for the old lock sound, but the new one hasn’t grown on me. It feels comparatively light and thin — more like a screen door than a bank vault.

The new keyboard clicks, however, sound good enough that I kept them on for most of the beta period, and I really hate keyboard noises on smartphones. There’s a subtle difference in the noise between a letter key and a function key — such as shift or the return key — which should help those with reduced vision and those of us who type while walking.

I should say, however, that my dislike of keyboard sounds eventually caught up with me and I switched them back off. It’s a smartphone, not a typewriter.

Conclusion

iOS 10 is a fascinating update to me. Every other version of iOS has had a single defining feature, from the App Store in iPhone OS 2 and multitasking in iOS 4, to the iOS 7 redesign, iOS 8’s inter-app interoperability, and iOS 9’s iPad focus.

iOS 10 seems to buck this trend with its sheer quantity of updates. Developers have been asking for a Siri API for years, and it’s here, albeit in a limited form. The number of developers using the deep integrations in Messages and Maps is already higher than I had anticipated at this stage of iOS 10’s release, and I’m writing this the night before it launches.

Then there are the little things sprinkled throughout the system that I didn’t have time to cover in this review: breaking news notifications and subscriptions in individual News channels, a redesigned back button, CarPlay integrations, and so much more.

I may regularly bemoan individual parts of iOS. There are certain places where I wish Apple had made more progress than they did, but there are also aspects of the system that have been greatly enhanced in ways I’d never have expected. Saying that iOS 10 is the best release of iOS yet is a bit trite — you’d kind of hope the latest version would be, right?

But there’s so much that has gone into this version of iOS that I deeply appreciate. The experience of using it, from when I wake up in the morning to when I go to bed at night — oh, yeah, there’s a great bedtime alarm built into the Clock app — is such that I can’t imagine going back to a previous version of iOS, or to a different platform. It feels like a unified, integrated system across all of my devices. Some people may call this sort of thing “lock-in”, but I like to think of it as a form of customer appreciation.

Whatever the case, I highly recommend updating to iOS 10 as soon as you can. I hope it surprises and delights you the way it did me the first time someone sent me an iMessage with a goofy effect, or the last time it made a Memories slide show for me. These are little things, and they’re mechanized and automated like crazy, but they feel alive, in a sense. iOS 10 isn’t just the best version of iOS to date; it’s the most human release.


A big thank you to Sam Gross for proofreading this review, and to Janik Baumgartner for assisting with some iPad verification. Thanks must also go to all the developers who provided beta versions of their apps.

Adrian Chen, writing for the New Yorker:

I worked at Gawker for four years, walking the tightrope. The immediacy of publishing encouraged me to be extremely sure of arguments and facts and to write things I truly believed, since I had nobody to fall back on but myself. And, in order to find an audience, I had to be entertaining and provocative. At the site’s best, these two often conflicting impulses encouraged writing with a spontaneity, humor, and self-assuredness that wasn’t like anything else on the Internet. At its worst, it led to gratuitous meanness and a bad lack of self-awareness. I know I’m talking in generalities, but looking back on one’s old writing is rarely a fruitful prospect, even when it was produced under the most considered circumstances. There are plenty of posts that I’m proud of, and others that make me cringe to think about. Regardless, I can’t imagine having had a better place to develop as a journalist than Gawker.

I empathize.

For the past few years, tech companies have been publicly releasing the diversity statistics of their employees. Over the same amount of time, I’ve compared their numbers to United States national statistics, via the Bureau of Labor Statistics’ releases — you can see that in the 2015 and 2014 editions.

This year, it’s more of the same, in more ways than one: I’ll be comparing those stats side-by-side in the same categories as defined by the federal EEO-1 form — which limits the available racial and gender identity information — followed by some brief analysis. New this year, I’m also noting the year-over-year percentage-point difference. Please be aware that rounding errors and other factors may create imperfect differences from last year’s figures; even so, these differences are worthwhile guidance.
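
To be explicit about the arithmetic: each Δ below is a simple percentage-point subtraction between this year’s figure and last year’s, not a relative change. A minimal sketch of the calculation, using invented numbers for illustration:

```python
# Year-over-year change in percentage points, as noted in the tables below.
# The figures here are invented for illustration, not any company's data.
def delta(current: float, previous: float) -> float:
    """Return the percentage-point difference, e.g. 23.0 vs. 22.0 -> 1.0."""
    return round(current - previous, 1)

women_in_tech = {"2015": 22.0, "2016": 23.0}
print(delta(women_in_tech["2016"], women_in_tech["2015"]))  # 1.0
```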

One more note: last year, LinkedIn and Yahoo released their stats at the beginning of June and July, respectively, while Amazon and Twitter released theirs later in August. A Yahoo spokesperson told me that their diversity report will be available in September, while a LinkedIn spokesperson is tracking down their report internally. I will update this article should their figures become available.

Gender Diversity

Gender stats are reported by all companies on a global level; ethnic diversity is typically reported at a U.S.-only level. In the past, I’ve compared both sets of stats against U.S. figures; this year, I’m adding worldwide labour participation rates for genders, for a more complete set of stats. The World Bank only reports female labour force participation for their worldwide stats; the male labour force participation has been inferred based on the binary gender system currently used for these reports.

Gender Diversity, U.S.A.
Category: Male / Female
U.S.A. Overall (approx.): 49% / 51%
U.S.A. Workforce (PDF): 53.2% (Δ 0) / 46.8% (Δ 0)
Worldwide Workforce (inferred): 60% / 40%

Gender Diversity in Tech Positions

Amazon does not separate tech and non-tech positions, so the same data has been used for both.

Company: Male / Female
Amazon: 63% / 37%
Apple: 77% (Δ -2) / 23% (Δ +1)
Facebook: 83% (Δ -1) / 17% (Δ +1)
Google: 81% (Δ -1) / 19% (Δ +1)
LinkedIn: 80% (Δ -2) / 20% (Δ +2)
Microsoft: 83.0% (Δ +0.2) / 16.9% (Δ +0.2)
Twitter: 87% / 13%
Yahoo: 83% (Δ -1) / 17% (Δ +1)

Gender Diversity in Non-Tech Positions

Amazon does not separate tech and non-tech positions, so the same data has been used for both.

Company: Male / Female
Amazon: 63% / 37%
Apple: 62% (Δ -1) / 38% (Δ +1)
Facebook: 47% (Δ -1) / 53% (Δ +1)
Google: 53% (Δ 0) / 47% (Δ 0)
LinkedIn: 48% (Δ -2) / 52% (Δ +2)
Microsoft: 58.1% (Δ +1.3) / 41.7% (Δ -1.1)
Twitter: 50% / 50%
Yahoo: 48% (Δ +3) / 52% (Δ -3)

Gender Diversity in Leadership/Executive Positions

The “U.S.A.” row uses the “management, business, and financial operations” data row from the BLS report, as a rough and imperfect approximation.

Company: Male / Female
U.S.A. (PDF, pgs. 23-25): 56.3% (Δ -0.4) / 43.8% (Δ +0.5)
Amazon: 75% / 25%
Apple: 72% (Δ 0) / 28% (Δ 0)
Facebook: 73% (Δ -4) / 27% (Δ +4)
Google: 76% (Δ -2) / 24% (Δ +2)
LinkedIn: 65% (Δ -5) / 35% (Δ +5)
Microsoft: 82.6% (Δ +0.1) / 17.3% (Δ -0.1)
Twitter: 78% / 22%
Yahoo: 79% (Δ +3) / 21% (Δ -3)

Ethnic Diversity

As Google says in their report, “ethnicity refers to the EEO-1 categories which we know are imperfect categorizations of race and ethnicity, but reflect the US government reporting requirements”. Please keep that in mind.

The “U.S.A. Workforce” row uses data provided by the Bureau of Labor Statistics (PDF). Their demographics information (on page 9) is kind of a pain in the ass, though: the unemployed column is a percentage of the labour force, but the employed column is a percentage of the total population. I’ve done the math, and the results are what’s shown below. In addition, the BLS does not separate out those of Hispanic descent because “[p]eople whose ethnicity is identified as Hispanic or Latino may be of any race.” As such, the row will not add to 100%, but the percentage of Hispanics in the workforce has been noted per the table on page 10.

Similarly, the “U.S.A. Overall” row uses data from the CIA World Factbook, and they, too, do not note those of Hispanic descent separately. This row will also not add to 100%.
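
For the curious, “doing the math” above amounts to putting the employed figures for each group onto a common base before computing shares. Here is a rough sketch of that kind of normalization; the populations and employment-population ratios are invented for illustration, not the BLS’ actual figures:

```python
# Rough sketch: convert per-group employment figures into each group's
# share of all employed workers. All numbers are invented for illustration.
# The BLS reports "employed" as a share of each group's own population,
# so the groups can't be compared until they're put on a common base.

# group: (population in thousands, employment-population ratio)
groups = {
    "White": (197_000, 0.59),
    "Black": (31_000, 0.56),
    "Asian": (14_000, 0.61),
}

employed = {name: pop * ratio for name, (pop, ratio) in groups.items()}
total_employed = sum(employed.values())

for name, count in employed.items():
    # Each group's share of the total employed workforce.
    print(f"{name}: {count / total_employed:.1%}")
```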

Ethnic Diversity, U.S.A.
Category: White / Asian / Hispanic / Black / Mixed / Other or Undeclared
U.S.A. Overall: 79.96% / 4.43% / 15.1% / 12.85% / 1.61% / 1.15%
U.S.A. Workforce (PDF): 79.1% (Δ -0.3) / 5.6% (Δ +0.1) / 16.3% (Δ +0.4) / 12.1% (Δ +0.2) / 1.8% (Δ +0.5) / 1.4% (Δ +0.2)

Ethnic Diversity in Tech Positions

This year, I’ve added a row for the U.S.A. tech workforce as a whole, for comparison. It uses the “computer and mathematical operations” data row from the BLS report. Amazon does not separate tech and non-tech employees.

Company: White / Asian / Hispanic / Black / Mixed / Other or Undeclared
U.S.A. (PDF, pg. 26): 70.0% (Δ -0.9) / 19.2% (Δ +0.7) / 6.6% (Δ +0.3) / 9.7% (Δ +1.4) / N/A / N/A
Amazon: 60% / 13% / 9% / 15% / N/A / 3%
Apple: 55% (Δ +2) / 27% (Δ +2) / 8% (Δ 0) / 8% (Δ +1) / 2% (Δ 0) / 1% (Δ -5)
Facebook: 48% (Δ -3) / 46% (Δ +3) / 3% (Δ 0) / 1% (Δ 0) / 2% (Δ 0) / <1% (Δ +<1)
Google: 57% (Δ -2) / 37% (Δ +2) / 3% (Δ +1) / 1% (Δ 0) / 3% (Δ 0) / <1% (Δ 0)
LinkedIn: 35% (Δ +1) / 59% (Δ -2) / 3% (Δ 0) / 1% (Δ 0) / 2% (Δ +1) / <1% (Δ 0)
Microsoft: 55.5% (Δ -0.3) / 35.8% (Δ +0.4) / 3.9% (Δ 0) / 2.3% (Δ +0.1) / 1.3% (Δ +0.1) / 0.8% (Δ +0.1)
Twitter: 56% / 37% / 3% / 1% / 1% / 2%
Yahoo: 31% (Δ 0) / 62% (Δ +1) / 2% (Δ -1) / 1% (Δ 0) / 1% (Δ 0) / 3% (Δ 0)

Ethnic Diversity in Non-Tech Positions

Amazon does not separate tech and non-tech employees.

Company: White / Asian / Hispanic / Black / Mixed / Other or Undeclared
Amazon: 60% / 13% / 9% / 15% / N/A / 3%
Apple: 58% (Δ +3) / 12% (Δ +1) / 16% (Δ +2) / 11% (Δ +1) / 3% (Δ 0) / 1% (Δ -6)
Facebook: 60% (Δ -2) / 25% (Δ +1) / 7% (Δ 0) / 5% (Δ +2) / 3% (Δ 0) / 1% (Δ 0)
Google: 63% (Δ -1) / 23% (Δ 0) / 5% (Δ +1) / 4% (Δ 0) / 4% (Δ 0) / <1% (Δ 0)
LinkedIn: 67% (Δ +1) / 20% (Δ -5) / 6% (Δ +2) / 4% (Δ +1) / 3% (Δ +1) / <1% (Δ 0)
Microsoft: 67.6% (Δ 0) / 13.8% (Δ +0.7) / 8.6% (Δ +0.6) / 6.2% (Δ +0.1) / 1.4% (Δ +0.1) / 0.8% (Δ 0)
Twitter: 62% / 24% / 4% / 4% / 1% / 5%
Yahoo: 67% (Δ +1) / 18% (Δ -1) / 6% (Δ 0) / 3% (Δ 0) / 3% (Δ 0) / 4% (Δ +1)

Ethnic Diversity in Leadership/Executive Positions

The “U.S.A.” row uses the “management, business, and financial operations” data from the BLS report, as a rough and imperfect approximation of the broad US national trend.

Company: White / Asian / Hispanic / Black / Mixed / Other or Undeclared
U.S.A. (PDF, pg. 25): 84.2% (Δ -0.1) / 6.1% (Δ 0) / 8.9% (Δ +0.5) / 7.5% (Δ +0.1) / N/A / N/A
Amazon: 71% / 18% / 4% / 4% / N/A / 3%
Apple: 67% (Δ +4) / 21% (Δ 0) / 7% (Δ +1) / 3% (Δ 0) / 1% (Δ N/A) / 0% (Δ -6)
Facebook: 71% (Δ -2) / 21% (Δ 0) / 3% (Δ 0) / 3% (Δ +1) / 2% (Δ +1) / <1% (Δ N/A)
Google: 70% (Δ -2) / 25% (Δ +2) / 1% (Δ 0) / 2% (Δ 0) / 2% (Δ +1) / <1% (Δ 0)
LinkedIn: 63% (Δ 0) / 30% (Δ 0) / 3% (Δ -1) / 1% (Δ 0) / 3% (Δ +1) / 0% (Δ 0)
Microsoft: 70.1% (Δ -1.0) / 22.4% (Δ +1.1) / 4.0% (Δ +0.1) / 2.1% (Δ -0.1) / 0.7% (Δ 0) / 0.4% (Δ 0)
Twitter: 72% / 28% / 0% / 0% / 0% / 0%
Yahoo: 72% (Δ -1) / 22% (Δ +3) / 3% (Δ +1) / 0% (Δ -1) / 0% (Δ -2) / 4% (Δ +1)

Analysis

Let’s get something out of the way: I’m a white twenty-something Canadian who graduated from art college. Analysis of statistics of racial and gender diversity at American tech companies is not exactly my strongest suit. But, hey, you’ve made it this far. I want to be as fair as possible to everyone represented in these stats and at these companies. If there’s a problem, please let me know.

  • Apple notes this year that they achieved pay equity for all U.S. employees.

  • Apple also says that they reduced the number of employees who chose not to declare their race or ethnicity compared to previous years. The majority of those identified as white.

  • Microsoft was a real mixed bag this year, becoming whiter and more male in a few areas — and, in some, significantly so.

  • Facebook made a relatively large 8 percentage-point shift in favour of women in leadership roles. No other company reported as large a gain in any demographic.

  • Facebook also became the first company to highlight their LGBTQ community, with 7% of their staff identifying as such.

  • However, a disproportionately low presence of black employees continues at Facebook, Google, and Microsoft. All three companies have released products with flaws experienced by black and darker-skinned users — issues that, if those companies had a greater proportion of black employees, would likely have been found and corrected.

  • I will reiterate that one of the excuses most frequently cited by tech companies for their lack of diversity is a small selection of underrepresented prospective employees coming out of colleges and universities in the United States. This is false.

  • Across the board, most gains are on the order of one or two percentage points, or even less. This is similar to last year’s incremental improvements.

  • Even though half the companies I survey annually have yet to release their latest data, I don’t anticipate much difference from last year. As I said at the top, however, I will update this should those figures become available.

  • Something that, unfortunately, comes with reporting any stats on gender and ethnicity is that angry white men use it to try to support their thesis that the white male is oppressed. These people can quietly fuck themselves.

Update Aug 15: A LinkedIn spokesperson has told me that their stats will be out by the beginning of October, but noted that their numbers are “looking strong”. We shall see.

Update Oct 19: LinkedIn’s figures are now current for 2016. LinkedIn reported some of the most positive gains overall, especially for women at the company. LinkedIn remains one of the few companies where the non-tech category has more women than men. Even so, an 80/20 split for tech employees puts them in the middle of a pack led by Amazon and Apple.

Update Oct 31: Yahoo’s data is now current for 2016. Their non-tech staff actually became whiter and more male overall, while leadership staff also became more male. There are some minor indications of improvements, but this year’s report from Yahoo generally shows a regressing trend — completely the opposite of the claims of a recent lawsuit against Yahoo.

Just a taste of Anna Wiener’s brilliant essay for N+1 magazine:

An old high school friend emails out of the blue to introduce me to his college buddy: a developer, new to the city, “always a great time!” The developer and I agree to meet for drinks. It’s not clear whether we’re meeting for a date or networking. Not that there’s always a difference: I have one friend who found a job by swiping right and know countless others who go to industry conferences just to fuck — nothing gets them hard like a nonsmoking room charged to the company AmEx. The developer is very handsome and stiltedly sweet. He seems like someone who has opinions about fonts, and he does. It’s clear from the start that we’re there to talk shop. We go to a tiny cocktail bar in the Tenderloin with textured wallpaper and a scrawny bouncer. Photographs are forbidden, which means the place is designed for social media. This city is changing, and I am disgusted by my own complicity.

“There’s no menu, so you can’t just order, you know, a martini,” the developer says, as if I would ever. “You tell the bartender three adjectives, and he’ll customize a drink for you accordingly. It’s great. It’s creative! I’ve been thinking about my adjectives all day.”

This is so very, very good.

Last month, Kickstarter hired Mark Harris to investigate the circumstances around the failure of the most-funded European project in their history: a drone called Zano, to be built by a company called Torquing. It’s the evergreen story of a lack of understanding colliding with ambition and the product’s creeping scope:

On 18 November, the axe fell. Torquing announced via a Kickstarter update that it was entering a creditor’s voluntary liquidation, the UK equivalent roughly of an American “Chapter 7” bankruptcy filing. It appointed a liquidator who would bring its business operations to a close and attempt to sell the company’s remaining assets to pay its outstanding bills. Legal documents show that Torquing had not only burned through the £2.5m from its Kickstarter campaign, it had run up another £1m in debt. It was Kickstarter’s most spectacular flame-out to date.

No more Zanos would be made or sent out. Staff were sent home, and Torquing’s supercomputer was switched off and would be sold for parts. Because the Zano drone checks in over the internet with Torquing’s servers each time it powers up to retrieve calibration data and updates, the few drones in the wild were instantly and permanently grounded, like a dastardly robot army foiled in the last reel of a bad sci-fi film. After an abrupt final post on Kickstarter, Zano’s creators retreated offline and refused to engage with either backers or Kickstarter itself, contrary to the platform’s policies for failed campaigns.

It’s long — Medium estimates a 53-minute read time — but it’s worth spending some time with. Terrific reporting and storytelling make for a compelling autopsy.

At around 9:00 at night, the temperature in Magelang finally drops to a more hospitable 28°C from the 37° or so that it’s been hovering at. My girlfriend and I are here, just outside Yogyakarta, for this leg of the trip and we’ve stopped at a warung for dinner — think of a small trailer, pulled behind a bicycle, that serves ridiculously tasty food. This warung is known for several noodle dishes, but we’ve asked for mie godog — literally, “boiled noodles”. The broth from this cart is made with candlenut and it’s cooking overtop some hot coals in a wok with spring onions, garlic, some mustard greens, and the aforementioned egg noodles. Every few seconds, someone on a scooter or motorbike putters past, inches from the trio of younger men sitting and smoking on the stoop of the karaoke bar next door.

I’ve taken a couple of Live Photos of the scene, and when I play them back, I realize they’ve captured the sights and sounds well enough to show my friends and parents back in Canada. But something’s missing: the smell of this place. It’s a distinct blend of engine fumes, clove cigarette smoke, burning wood, and this incredible food. This, to me, says worlds about the sense of place Live Photos capture, as well as their limitations. They are a welcome step closer to capturing a moment in time, but the technology isn’t quite good enough yet for this moment.

A warung in Magelang.

I’ve been using an iPhone 6S since launch day — “Space Grey”, 128 GB, non-Plus — and I’ve read all the reviews that matter. But when I boarded a plane on October 24 from Calgary to Surabaya, I was unprepared for the way that this product would impact my travels, and how my travelling would impact my understanding of mobile technology.


We begin this story during a stopover at Vancouver International Airport. As this is an upgrade from an iPhone 5S, I’m still getting used to the size of the 4.7-inch 6S. After just the short hop from Calgary, I’ve noticed that my 6S feels less comfortable in my front-right jeans pocket, to the point where it becomes an obligation to remove it upon sitting down in a tight airplane seat.

This issue is exacerbated by the addition of a case. I never use one, but I felt that it would make my shiny new phone last a little longer in the rougher conditions I’d be experiencing at some points of my trip. My Peel case didn’t show up in time — something about a fulfillment issue — so I settled for Apple’s mid-brown leather model. It’s nice, but even after just a couple of days, I’m already seeing staining on the edge of the case, where it wraps around the display.

At least it gets rid of that damn camera bump.

My girlfriend and I kill some time by hopping on the moving walkways and checking out some of the kitschy tourist shops that dot the halls. I pull out my phone and take a few short videos across the different available video quality settings. I’m sure 4K looks better, but I don’t have a display that can take advantage of that resolution; 60fps looks great, but also looks too much like a home movie. I kind of wish Apple would add a 24fps mode, for a more cinematic feel. I settle on 30fps 1080p: it’s not exotic or technologically advanced these days, but it works pretty much everywhere and looks gorgeous. Even without the optical stability of the 6S Plus, I’m still impressed by how well the camera cancels out shakiness.

After a couple of hours, we board the twelve-plus-hour flight to Taipei. I pull my phone out, sit down, and notice that the Airbus seats lack power outlets. I check my phone, and it looks like there’s 50-odd percent left. In airplane mode, it should be fine for listening to music for much of the flight and taking the odd photo and video. Maybe without much battery life to spare, I’d even get some sleep.

Taipei from above, 5:28 AM.

We land in Taipei bright and early, and steer immediately to the complimentary showers to freshen up. My iPhone is on the last drips of power in low battery mode, but the shower room has an outlet to top it up. We have an eight-hour layover here which we’ll be spending entirely in the airport — thankfully, with free and reasonably speedy WiFi.

I review the photos I’ve taken while circling the city earlier and am pleasantly surprised at their quality in the dim twilight and smog.

In a little over two hours, we’ve seen most of the airport, which, like every other airport, consists of duty-free shops only occasionally separated by small cafés and restaurants. There are plenty of tech-related shops selling iPhones, MacBooks, and iPads, all at prices much greater than the exchange rate would suggest. Even outside of the airport, Apple products in particular are expensive in this part of the world, especially for those in middle- or lower-income brackets.

I try to log into Tumblr, an account on which I’ve enabled two-factor authentication via text message. I realize that I cannot receive the confirmation message as I’ve turned off all data on my phone to avoid exorbitant roaming charges. Damn.

After another few hours spent walking throughout the airport in a fruitless search for a basic and inexpensive shirt, it’s finally time to board the flight to Surabaya via Singapore.


Despite taking the same plane and the same seats for the second half of this flight, it’s necessary — for some reason — to leave the aircraft and turn around, passing through a security check again. This irritates me, as my pockets and bag are full of crap that I’ve accumulated in entirely “secure” zones, yet cannot be brought back onto the flight.

To make matters worse, the WiFi at Singapore’s airport requires text message authentication, which I cannot receive (see my earlier trouble logging into Tumblr). It’s usually possible to get a code from an attendant instead, but none are present because it’s late at night, of course.

Thanks to the extra memory in the A9 SoC, I still have plenty of Safari tabs cached so I don’t necessarily need a live internet connection. Unfortunately, it hasn’t preserved all of them — the camera still takes as much memory as it can. My pet theory is that Apple could put desktop levels of RAM in the iPhone and the camera would still purge Safari tabs from the cache.


It’s 11-something at night by the time we land in Surabaya. My phone retains a decent charge despite none of the planes including seat-back power outlets. We exit the airport into the overwhelming Indonesian humidity and heat, and hop into a small van to take us to our hotel.

As we wind through the city, I try recording with my phone pressed against the window. If you’ve ever filmed anything at night in even a moderately well-lit city, you know how difficult this is; in Surabaya, with its extremely dim lighting, almost nothing is visible. I put my phone on the seat and watch the city scroll by.


In the morning, we head over to the mall to pick up a SIM card for my time here. On Telekomsel, 4 GB of 3G data plus plenty of messages and call time costs me just 250,000 Rupiah, or about $25 Canadian. I later learn that it should have cost about half that, but I’m a tourist. Whatever the case, that’s a remarkable deal; at home, I pay $55 per month for 1 GB of data.

I’ve never previously tried swapping my SIM while iMessage is active, or adding a phone number to an existing iMessage account. I have to power-cycle my phone once so that Telekomsel can activate the SIM, and again, after re-enabling cellular data, to get it to work with iMessage.

But it doesn’t quite work correctly. I’m presented with a prompt to “update” my Apple ID password, and I can’t figure out whether I need to set a new password or simply type it in again. I try the latter and find that the WiFi hotspot I’m connected to is too slow to reach the Apple ID servers. I try a few times, try a third power cycle, pop in my Apple ID password again, and iMessage is finally activated.

I try messaging a friend in Calgary. To my surprise, it fails. I realize that I must add the country code; despite several years of prior correspondence while I was in Canada, iMessage does not resolve this automatically. My friend reports that he’s receiving messages from both my new Indonesian number and my iCloud email address. I chalk this up as another instance where iMessage doesn’t understand that we typically want to message people, not numbers or addresses.

I get a tap on the wrist: my Apple Watch notifies me that it is using iMessage with the same email addresses that I’ve been using for years. Sigh.


After two days in Surabaya, we board a plane for Bali. Destination: Ubud, near the middle of the island. After checking into our hotel, I grab my “proper” camera and head off on a short walking tour of the area.

I’ve opted to bring my seven-year-old Canon XSi — coincidentally sporting the same 12-megapixel count as the iPhone 6S — and Canon’s cheap and cheerful 40mm portrait lens, plus a polarizer, on this vacation (those are affiliate links). It’s not the latest gear, but it’s versatile enough when accompanied by my phone.


Ubud is a fascinating little town. It isn’t coastal, so we don’t get any beach time, but it’s an artistic and vibrant place. It is extremely hot in the early afternoon, which makes spending any extended time under the sun uncomfortable and pushes our exploring later into the day. Due to Bali’s proximity to the Equator, the sun sets somewhere between 5:30 and 6:00, and “magic hour” seems to last the entirety of late afternoon. That’s great news for my vacation photos.

In spite of the heat, we take a walk one afternoon in search of some sandals; the ones provided by the hotel are fine for the room, but not for walking around the city. We duck into a small restaurant for lunch, and my girlfriend orders sate. It’s served in a miniature clay grill overtop hot coals, and I realize that this is the kind of moment the Live Photo feature was built for.

Other reviews have pointed out that it’s sometimes hard to remember that the video component continues to record after taking the still photo. I find it difficult to remember that it begins recording video before I tap the shutter button, so I must remember to wait a couple of seconds between tapping to focus and snapping the still; I must also remember to keep my phone raised after taking the picture. It takes me a few tries to get the hang of it, but I’m pleased by the result. Yet I cannot share it with anyone — a month after the 6S’ release, it seems that none of the popular services I use support Live Photos.

The next night, we explore the city later in the afternoon, when it’s a tiny bit cooler. I haven’t brought my DSLR, as we only plan on going for dinner and poking around some boutiques.

We spot a sign directing passers-by to a rice field “50 metres” away, and decide to take a look. After a walk of probably double that distance along a very sketchy path, with sheer drops on one side, we arrive at one of the most breathtaking landscapes I’ve ever seen. Rice paddy fields stretch from both sides of the single-track lane, framed by coconut trees. A rooster crows in the distance. The sun is low in the sky behind a bit of smog, so it’s a perfect golden hue.

Rice paddy fields in Ubud.

It’s so beautiful that it takes me a few minutes to remember to pull out my phone and, low-ish battery be damned, begin snapping. I snap plenty of landscapes on either side, take the requisite panorama, and even a few Live Photos. Later at the hotel, I review these photos and realize that I can’t remember which ones are “Live” and which are not. I look in vain for a Live Photos album; despite every other “special” photo and video format on the iPhone being filtered into its own album, no such album exists for Live Photos. I try searching “live”, or looking for an icon in the thumbnail view — neither works.

I finally stumble across them as I swipe through the photos I shot on the rice fields that day and notice a slight amount of motion. This is apparently the only indicator of a Live Photo, and the only way to find one. Not easy.

But, as I take a look at the few I’ve shot so far, I see great value in the feature. Live Photos can’t capture everything, but they greatly enhance an otherwise static scene. The sound and video snippet add context and a better sense of place: the rooster crowing, the crickets, and even the steam and smoke curling up from that sate the previous day. I have some perfectly fine still photos, too, but their context is entirely imagined; every Live Photo I’ve taken so far does a better job bringing the memory back. It’s too bad that the heat and smell of the place can’t yet be captured as well.

In any case, I don’t think Live Photos are the gimmick some see them as. They’re a little bit cute, but they work remarkably well.


We spend a day travelling from Ubud to Seminyak and seeing the sights there. Our driver, Sandi, had in his car — among the television screens, faux fur on the dash, and short shag roof liner — a USB outlet for passengers to charge their phones. But, he tells me as I plug mine in, most people just use their power banks. I tell him that I’ve noticed a lot of portable batteries around and he says that some people carry two or more, just in case. He says that this is completely normal.

I’m starting to question the power consumption of my own phone. I’ve used it for long enough in Calgary that I know that I can get a full day’s use out of it, from 7:00 in the morning until late at night. Here, I’m not getting even half that. I check the battery statistics and see that all of my usual web-connected apps have a “low signal” notice.

Not only is 3G service somewhat slower than you might expect in this region, it also has patchier coverage. That eats battery life at a much more significant rate, particularly if you have background services polling for data regularly. iOS is supposed to compensate for this, but if you have email inboxes set to refresh on a timed schedule, it seems as though it will obey that regardless of signal strength.

The low battery mode in iOS 9 does a good job of substantially increasing battery life when cellular coverage is less than ideal. I find it indispensable: coverage is too poor for my inboxes or Tweetbot to refresh regularly, and I don’t really want to check my email much while on holiday anyway.

After dropping our bags at the hotel, we head to Uluwatu for the world-famous kecak dance, performed at sunset on a cliff above the sea. I am so captivated by the dance that I all but forget to take photos until the climax, where the dancer playing the monkey is encircled by fire.

We hang around following the dance to take photos with some of the performers. There are a couple of floodlights illuminating the stage area, but it’s still pretty dark. We get our turn to take a selfie with the monkey performer, and I turn on the new front-facing flash. The photo comes out well — great white balance, well-exposed, and not too grainy — but we look sweaty and tired; I shall spare you that sight.


The next day, we head to the beach. Our hotel is just two blocks away, but even that feels sweltering; the cool waters of the Indian Ocean are beckoning. I shoot with both my iPhone and DSLR here. Normally, I’d be very cautious about stepping into the waves for some more immersive shots with my iPhone pocketed, but the increased water resistance of the 6S gives me more confidence that a few light splashes won’t be an issue, especially with a case.

When we get back to the chairs by the side of the beach, I notice that some lint from my pocket has accumulated around the edges of the case. I pop my phone out to dust it off and notice just how nice it feels without one. It is not, to my eyes, the best-looking iPhone industrial design — that would be the 5S, followed by the original model — but it is the best-built by far, and it feels fantastic in the hand despite its size. I’m looking forward to using it regularly without a case again at home.


We weave through Seminyak, then onto Yogyakarta, Magelang, and Rembang. Dinner in the latter two cities is often spent at warungs — it is some of the best food you can have anywhere, provided you know which ones are clean.

Our last dinner in Rembang is in a particularly interesting warung. The proprietor is known for his interpretation of nasi tahu — literally translated as rice and tofu. He sets up his preparation table surrounded on three sides by small, low benches, each of which can seat no more than three or four people. Tarps are draped overtop to protect against the possibility of rain — ’tis the season, after all.

We’ve squeezed ourselves onto the bench directly opposite the cook, currently mashing together peanuts, garlic, lime, and some broth into a paste while frying fist-sized lumps of tofu. It’s crowded and, with a bubbling wok of oil behind the cook, it’s hot, but the combination of every sensation makes the scene unforgettable. I want to show people at home, so I take a few photos on my iPhone of the cook at work, trying also to capture the close quarters of the space.

A warung in Rembang serving nasi tahu.

It occurs to me that taking photographs in this environment would be unnatural and straining were it not for a camera as compact and unassuming as my iPhone. Even my DSLR equipped with a pancake-style portrait lens — which I’ve specifically chosen to be less imposing — would be too obtrusive in this moment.


The final few days of our vacation are spent at a home in Surabaya that doesn’t have WiFi. That’s fine in terms of my data consumption, but the slower 3G connection tends to choke on any modern media-heavy site. Every unnecessary tracking script and every bloated cover image brings my web browsing to a crawl.

Then, I run into an issue where my connection refuses to complete. My iPhone shows me a dialog box informing me that there has been a “PDP authentication failure”. I do not know what PDP is, why it must authenticate, or why its failure means I can’t load anything online. I reset my network settings and that seems to work for a while, only for PDP to be unauthenticated again, or whatever.

I reset and search the great IT help desk that is Google for an answer. The top result is a Reddit thread, so I tap on it, only for it to fail to load. I page back and try an Apple Support thread link and it works fine; though, of course, it has no answers. Reddit, specifically, will not load on my 3G connection.

I get sidetracked from my PDP issue and do a little bit of digging. It turns out that Indonesian pornography laws prohibit both porn itself and any conduit for it. Though Indonesia does not have a nationwide firewall à la China, the federal government has pressured the major ISPs and cellular networks to block major sites that allow access to porn.

Later in the day, we get carried away at Historica Coffee and forget to grab dinner. There’s not much open at midnight on a Wednesday, particularly if you’re not that interested in what I had been warned was maybe-it’s-food from sketchier vendors.

I swipe to the right on my home screen expecting to see shortcuts to late night food, but that feature isn’t enabled here.

I open Yelp. “Yelp is not available in your country.”

We opt for a nearby late night Chinese food place, and it’s pretty damn good.


On the long series of flights home, I get a chance to review photos from both my DSLR and iPhone while triaging my Instapaper queue. I have more than a few articles saved that proclaim that the camera in an iPhone today is good enough to be considered a camera, not just a smartphone camera. These articles began percolating around the time of the iPhone 4S, and they are a perennial curiosity for me, especially as I glance at my screen of crisp photos taken on my DSLR.

There’s no question that an iPhone has never had a better camera than those in the 6S and 6S Plus today, with the latter edging out the former due to its optical stabilization. iPhones — and smartphones in general — have taken very good quality photos for the past few years, and I would not hesitate to print or frame any of them. In fact, every photo in this travelogue is unedited, and I think they all look pretty good.

But I’m looking now at photos from that paddy field back in Ubud, and there is an inescapable muddiness to the trees in the background. I didn’t bring my DSLR on that walk to compare, but I’ve no doubt it would render a vastly clearer and more detailed image.

iPhone on the left; Canon on the right, both at 100% size. Both feature 12 MP sensors, but the iPhone has a much wider-angle lens and a significantly smaller sensor.

Similarly, I have photos taken on both cameras from atop a cliff near Uluwatu of surfers paddling in the waves. The wide-angle lens of my iPhone provides a better idea of the scope of the scene, but the surfers are reduced to dark blobs. The images captured on my “real” camera show the clarity in the water, and the surfers are clearly human beings.

This is, of course, a completely unfair comparison: the APS-C sensor in my XSi has far more area than the iPhone’s sensor (nearly twenty times, going by published dimensions), and it’s paired with a much bigger lens that allows far more light in. But it does illustrate just how different the image quality from each device is.
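
For a rough sense of scale, here is the back-of-the-envelope arithmetic, assuming the commonly published dimensions of 22.2 × 14.8 mm for the XSi’s APS-C sensor and roughly 4.8 × 3.6 mm for a 1/3-inch type smartphone sensor:

```python
# Back-of-the-envelope sensor area comparison; dimensions are the commonly
# published ones, so treat the result as approximate.
aps_c = 22.2 * 14.8       # Canon XSi APS-C sensor area, ~328.6 mm^2
one_third = 4.8 * 3.6     # 1/3" type smartphone sensor area, ~17.3 mm^2
print(aps_c / one_third)  # ~19x
```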

There are all kinds of tricks that are easier with a DSLR, too, like tracking focus on or from a moving object. For example, I will look through the windshield of a moving car for potentially interesting roadside scenes. Upon spotting one, I’ll grab focus on something at a similar focal distance to the objects of the scene, then move my camera in the opposite direction of travel at a similar speed. This is much easier on highways where speeds are constant, so I’m able to develop a rhythm of sorts. With my DSLR, this is tricky but something I can reliably do; I have never succeeded with this technique on my iPhone. It might be the rolling shutter, or something I’m not doing quite right, but I also have not heard of anyone else doing something similar.

I offer this not as a complaint with the iPhone’s camera, but as clarification that there is still great value to having a camera with a big-ass sensor and a great lens. I’m under no illusions; I am an optimistic hobbyist photographer, at best, but I can’t shake the feeling that I made the right decision in bringing my DSLR as well. It’s bulky, cumbersome, old, has “hot” pixels on the sensor, and creates gigantic RAW files that occupy a lot of space on my MacBook Air.1 However, it creates beautiful images to last a lifetime, and that’s what counts most for me.


I’ve spent an hour or so in an “e-library” in Taipei’s international airport wrapping up this travelogue. Nobody seems to use the e-libraries here, so they function as a pseudo-private lounge, and a pretty great place to find a power outlet. It’s quite nice.

There were some things I expected about bringing my iPhone to Indonesia. I anticipated that I’d use it to keep in touch with a few people at home, look up addresses and directions, and be able to take great-quality photos anywhere, any time. But I learned a lot about the phone, too: Live Photos showed their brilliance, and I was able to extend my battery life despite none of the aircraft including seatback power. I found out just how well the camera works for capturing intimate moments that would feel artificial or posed if I were to use my DSLR, and figured out some new iMessage limitations.

What I learned most, though, isn’t about the iPhone 6S directly; it’s about the role of technology and the internet in a developing nation.

In most developing nations, the proliferation of technology is limited by policy and economics; Indonesia is no exception to this. But, while I was there, I saw people regularly carry two, three, or more smartphones: usually an inexpensive Android phone — something like a OnePlus or a Xiaomi — plus either an iPhone or a BlackBerry. Apple’s products are still very much a luxury: an iPhone is about a third more expensive in Indonesia than it is in the United States, while the median income is half that of the U.S.2

The Jakarta Post reports that only about 29% of Indonesians are connected to the internet, and the internet they’re connected to is different from the one you and I are used to. But they’re making the most of what they’ve got, and have established their own rules and norms — it isn’t rude to use your phone at the dinner table, for instance, and Path remains alive (remember Path?). Not all the services and products you and I have come to expect have made their way there, and if you think search in Apple Maps is poor where you live, you’ve seen nothing yet.

I escaped to Indonesia for a relaxing vacation in a part of the world I’d never visited. I vowed to get off the beaten path and out of my cushy boutique hotel. In doing so, I left with a hint — but only a hint — of what life is like for hundreds of millions of Indonesians, and I learned a little about how they use technology; their smartphone is often their only computer and only connection to the internet.

There is something further to consider here: we — designers, developers, and product people — spend a lot of time worrying about how our new product looks and works in U.S. English on an LTE connection, for the tastes of an American (or, at least, Euro-centric) audience. We spend little time asking how it will function for people who fall outside those parameters — parameters which, by the way, describe an ever-narrower slice of users as more people get connected to the web. My quip about Path earlier is indicative of this: we assume Path is dead because we don’t use it; yet it has, as far as I can work out, a respectable user base in Southeast Asia, and that market grows every day.

I’m not pretending to be fully educated about the country after spending just three weeks there, but I am certain I understand it better than I did three weeks ago. Indonesia is beautiful, breathtaking, delicious, and full of the nicest and most accommodating people I’ve ever met, and I’m Canadian. You should go. Bring a camera.


  1. Not to mention post-processing in Photos on OS X, which remains an app that is hard to love. My workflow for a trip like this is to shoot hundreds of images, import them all into one album for the trip, and then pick my selects from that album.

    In Aperture, I’d give five-star ratings to the images I was certain about, four-star ratings to those that might have potential, and no stars to images I wouldn’t bother using. (The digital packrat in me doesn’t delete them — just in case, I suppose.) Then, I could simply filter to four-star-or-better images and edit within that set, upgrading some to five-stars if I deemed them worthy. Exporting was as simple as selecting the already-filtered set within the album.

    Photos doesn’t have this level of granularity: you either “like” a photo, or you do not. That keeps things a lot simpler, and I don’t mind that. What I do mind is that there appears to be no way to find only the photos I’ve liked within an album. My workaround has been to create a smart album with those filtering criteria, but that seems like a workaround, not a solution. ↥︎

  2. This has other effects, too: a couple of years ago, I guessed that data problems and inconsistencies in Apple Maps would be less frequent in places with more iPhone users, and I think that’s true. With less penetration in Indonesia, Apple Maps often lacked significant local points-of-interest. ↥︎

Alex Guyot reviewed watchOS 2 for MacStories:

Software defines the Apple Watch as much, if not more so, than the hardware which embodies it. But as well designed and integrated as the first iteration of the Apple Watch’s software (aptly named, though questionably capitalized, watchOS) was, it wasn’t perfect. Some might argue it wasn’t even acceptable. Third-party apps were underpowered and massively handicapped by long load times; when disconnected from its iPhone, the Watch was rendered useless for almost anything except telling the time; and with no Activation Lock, stolen Watches could be reset and resold with ease.

Enter, watchOS 2.

One thing I didn’t notice in Guyot’s review is a comment on overall battery life. When I was running watchOS 1, I’d get home with probably 15-20% life at the end of the day; with watchOS 2, I’m seeing 40-60% left at the end of the day, and I am certainly not using it less.

Don’t miss the footnotes in this review; some of his best observations are buried therein.