Month: June 2015

The year after the release of iOS 7 was a mad dash for designers and developers updating their apps to make them fit with the new user interface language. Many developers took this time to reconsider how their apps should work, too, in a sort of conceptual, flow-oriented way. They could ask themselves questions like “Does this screen have the best layout possible?” or “Is this glyph as clear as it should be?” But there was comparatively little in the way of genuinely new functionality. Yes, there were thousands of new APIs and new frameworks like SpriteKit. But the vast majority of innovations, from both Apple and iOS developers, came from strides made in UI design.

If iOS 7 tipped the scales a bit too much in the direction of the “how it looks” part of design, iOS 8 went directly for “how it works”. In addition to the plethora of new features available to end users — Health, predictive text, Continuity, and iCloud Photo Library, to name a few — iOS 8 also dumped colossal new capabilities, unprecedented on the platform, onto developers in the form of App Extensions. Because these APIs were so new to iOS, few developers were able to put together really effective ways of using Extensions while I was working on my review. But, over the last eight months, we’ve seen enhancements to apps that we could only have dreamt about previously.

September’s release of iOS 8 — and my launch-day review — only told half the story of the current version of the operating system. This is the second half of that story.

iOS 8, Revisited

It’s only possible to understand how one really uses an operating system after its novelty has worn off, and we are well past the honeymoon period here, people. We’re on the cusp of the introduction of an updated version of iOS. How’s iOS 8 holding up?

Messages

The headlining new feature in Messages was the addition of disappearing audio and video messages. When I was using prerelease versions of iOS 8, I was curious to see what kind of adoption these features would have when the general public got their hands on the OS. And, many months in, I have yet to receive a single intentional audio or video message from my iPhone-using friends, though I have received a number of accidental audio clips because the gesture used to send them is sometimes a little finicky.

The lack of video messages doesn’t surprise me in the slightest. While iMessage may be one of the world’s most popular messaging services, everyone I know uses Snapchat to quickly send photos and videos. Unlike Messages, which sends giant photos and full-res HD video, Snapchat heavily compresses everything. It looks a little shitty, even on a tiny phone screen, but it’s fast and doesn’t eat up your capped data plan.

But I haven’t sent or received a single intentional audio message, and that’s not what I expected:

Leaving a brief audio message is a great way of sending a message that’s more personal than a text message, and less interruptive than a phone call. Apple has executed many aspects of this feature remarkably well, too. The resulting audio files are tiny and heavily-compressed — typically less than 1 KB per second — making them perfect for a simple voice message that sends and receives quickly. When the other end receives the message, they don’t have to interact with the notification at all. They can simply raise their phone to their ear and the audio message will play.

Nothing about the execution seems flawed to me, save for the easy-to-trigger gesture that sends these recordings. I don’t think my friends are especially rude, either. Perhaps it’s just easy enough to decide whether to send a text message or make a phone call, and there isn’t much wiggle room in between.

Keyboard

I have complaints.

I’ve been using the soft iOS keyboard since 2007, so I’ve become acclimated to its rather unique characteristics. When Apple changes something about it, I notice. And, oh boy, have I noticed some changes.

The most significant change, by far, is the introduction of predictive typing, dubbed QuickType. Appearing in a strip above the main keyboard area are three cells guessing at what you might type next. Sometimes, it’s pretty clever: when given an a or b choice in a text message, for example, it will display both a and b as predictive options. I like that.

What I don’t like is what this has done to autocorrect. I’m not sure if it’s entirely a side effect of the predictive typing engine, but autocorrect’s behaviour has changed in iOS 8 and it drives me crazy.

When the QuickType bar is enabled, the autocorrect suggestion will appear in the middle cell of the bar instead of as a floating balloon above the word, as it has done since the very first version of iOS. I find this far too subtle. Even more subtle is the way you ignore the autocorrect suggestion: since the bubble doesn’t exist for you to tap on to ignore it, you tap on the leftmost cell of the QuickType bar with your verbatim spelling. And that feels really weird to me.

This behaviour is something I never got used to, so I turned off the predictive keyboard days after publishing my review in September. This brings the keyboard back to a more iOS 7-like state, with classic autocorrect bubbles. But I still think something’s going on under the hood with the autocorrect engine. I can’t prove it, and it may simply be confirmation bias, but suggested corrections seem to have become substantially worse since I upgraded to iOS 8, and I’ve heard similar stories from others.

Apple also still has yet to fix the awful shift key added in iOS 7.1. I’ve heard rumours that the iOS 9 keyboard’s shift key has been redesigned. We’ll see.

In better news, the emoji keyboard was significantly improved in iOS 8.3, with an infinitely scrolling palette instead of siloed sections. It makes a lot more sense, and it’s a welcome change. Unlike its OS X counterpart, however, it doesn’t provide search functionality, nor does it include a full extended character palette. It would be great if a future version of iOS included these features, especially on iPad.

Camera

I take a lot of pictures on my phone, so I’m almost certain that few new iOS 8 features have seen more use from me than manual exposure compensation. Oh, sure, the iPhone still has the best ratio of photo quality to the amount of effort required to take it. But now you can put in a hair more effort in a simple way, and get a hell of a lot better photo out of it.

One of the things I discovered through using this feature all the time is that it’s possible to layer exposure compensation with focus lock or HDR. This means it’s possible to capture far better images of high-contrast scenes, like sunrises and sunsets, or live concerts. It’s also possible to abuse this feature to create excessively over- or under-exposed scenes, which can be used to interesting effect.

Like its slow-mo counterpart, the new time-lapse video feature isn’t something that anyone I know of — save Casey Neistat — uses very often, but is great to have when you want it. It’s the kind of feature that seems unnecessary most of the time, but when you need it, you probably don’t want to dig around for a third-party app. It’s better to have it built-in. The results are excellent, rivalling — in practical terms — the kind of results I get doing something similar on my DSLR without doing the kind of work that a DSLR time-lapse requires.

Photos

The biggest enhancement to Photos in iOS 8 was iCloud Photo Library. I covered my experiences with iCPL in my review of Photos for OS X; here’s an excerpt:

Uploading [gigabytes of] photos on my home broadband connection took what I imagine is a long time, but I’m not certain exactly how long because it’s completely invisible to the user. It runs in the background on your Mac and on your iPhone, when you’re charging and connected to WiFi. I can’t find any setting that would allow you to do this over LTE, but I’m not sure why you’d want to — photos taken on my 5S are something like 2-3 MB apiece. (I’m aware that this paragraph is going to sound incredibly dated in a few years’ time, for lots of reasons.)

And this is primarily what sets iCPL apart from backup solutions like Backblaze, or other “automatic” photo uploaders like Google+ or Dropbox: it’s automatic and baked in at the system level. Dropbox can’t do that because it can’t stay running in the background, or spawn daemons to do the same. On a Mac, it’s a similar story. Because Power Nap doesn’t have a public API, competing services can only sync while the Mac is awake. iCPL, on the other hand, can take full advantage of being a first-party app with full private API access, so it continues to sync overnight. Nice, right?

In short, it’s a pretty nice multi-device backup and syncing solution for your photos and videos. One thing I neglected to mention in that review, though, is an obvious caveat: it’s an Apple-devices-only black box. So if you’re in a mixed-device household, or you are reasonably skeptical of putting all your photos in one cloud, iCPL is probably not for you.

You can now search your Photos library, too, by location — including “Nearby”, which is a nice touch — title, description, and other metadata. Much of this information is weirdly only editable in other applications, like any of Apple’s OS X photo apps; you can’t categorize photos in this fashion on iOS. I’ve found that it’s still way too much work to try to tag even some of my photos with extended metadata. If this functionality is actually to be used by a significant user base, it needs to be far less work and far more automated.

Health

Here’s a feature I was really interested in using over a long period of time. My iPhone has a record of pretty much every step I’ve taken since August, and that paints an intriguing picture of my life since then. On a daily or weekly basis, I can identify my sedentary desk job with peaks of activity surrounding it, then a sharp spike in my weekend activity. Over the course of the past several months, I can see exactly when winter hit, and a rise since the beginning of May when it started to get warm again.

Of course, these sorts of data points are expected. You could probably draw a similar activity graph based purely on guesswork, without needing to record each individual step. But the collected aggregate data feels meaningful specifically because it is not guesswork. You carry your phone and, if you have one, your Apple Watch with you pretty much everywhere, so it’s probably one of the best ways to gather data on your physical activity.

But Health is not a great way to actually view a lot of that information. Steps are charted, for example, but the y-axis only has minimum and maximum markings, so it’s not possible to see precisely how many steps you took on any day but today. Maybe that’s a good thing; maybe, as Federico Viticci alluded to, it isn’t necessary to precisely document everything, because ten, or twenty, or fifty steps in either direction isn’t actually going to make that much of a difference. But it’s not easy to estimate, because the axis’ scale varies day-to-day, especially if you have an erratic activity cycle.

The next level of granularity can be found by tapping on the chart, then tapping on “Show All Data”. This takes it from a level of detail that is incomprehensible because it lacks detail to one that is incomprehensible because it offers far too much. This view is a simple table of every step you’ve taken, grouped into activity “sessions”, as best the system can discern them. For me, it displays — after taking many minutes to load — brief stints of seconds-to-minutes of activity, with tens-to-hundreds of steps. Tapping any given cell will allow you to edit the number of steps. That’s it. This is the same view as the calorie counter, or calcium intake, or sleep tracker, or any of the other HealthKit functions, but it simply doesn’t scale well to the number of steps taken in a day.

HealthKit

The trouble with Health is that it doesn’t actually do anything with the information it collects. Sure, I can see that I took about 11,000 steps yesterday, but is that good? Does that mean anything? Health feels like it’s only a dashboard and settings panel for a bunch of third-party functionality by way of HealthKit. Apple explains the framework thus:

HealthKit allows apps that provide health and fitness services to share their data with the new Health app and with each other. A user’s health information is stored in a centralized and secure location and the user decides which data should be shared with your app.

In a nutshell, it’s a unique, specific, and secure centralized database for health and fitness information.
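To make that concrete, here’s a minimal sketch of how a third-party app might ask that database for today’s step count. The authorization request, type identifier, and statistics query are real HealthKit calls; the structure around them is only illustrative.

```swift
import HealthKit

let store = HKHealthStore()

func fetchTodaysStepCount() {
    // Step count is one of the many quantity types HealthKit tracks.
    guard let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount) else { return }

    // The user decides whether to grant read access; `success` only reports that the request
    // was processed, since read permissions themselves stay hidden from the app.
    store.requestAuthorization(toShare: [], read: [stepType]) { success, _ in
        guard success else { return }

        let startOfDay = Calendar.current.startOfDay(for: Date())
        let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: Date(),
                                                    options: .strictStartDate)

        // Sum every step sample recorded today, across whichever apps and devices wrote them.
        let query = HKStatisticsQuery(quantityType: stepType,
                                      quantitySamplePredicate: predicate,
                                      options: .cumulativeSum) { _, result, _ in
            let steps = result?.sumQuantity()?.doubleValue(for: .count()) ?? 0
            print("Steps today: \(Int(steps))")
        }
        store.execute(query)
    }
}
```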

I spent some time with a few different apps that tap into HealthKit, including Strava, Lifesum, UP Coffee, Human, and Sleep Better. I’m not going to review each app individually, but there are common threads among many of the HealthKit apps: most require some kind of manual data entry, and this can be tedious.

If you want reasonably accurate meal tracking with Lifesum, you need to enter every single food item you eat. That can be hard if you cook most meals yourself using fresh or raw ingredients, and don’t eat out at chain restaurants. (I’m not being preachy or elitist; it’s just my lifestyle.) Not all ingredients or meals will have all nutritional data associated with them, so it’s not entirely accurate or helpful for tracking specific intakes, like iron or vitamin B12.

Similarly, tracking my caffeine intake with UP Coffee requires me to manually input my caffeinated beverage consumption. That’s somewhat easier than meal tracking because I typically drink coffee fewer times per day than I eat.

But apps that are able to more passively collect this data, such as Sleep Better or Strava, are naturally much more intuitive because using them doesn’t feel like data entry or labour. I understand the limitations of meal tracking and how monumentally difficult it would be to automate something like that, but the side effect of manual data entry means that I’m less likely to follow through. Since I’m at a healthy weight and average fitness level, I suppose that tracking this information is less compelling than it would be for, say, someone with health or fitness goals.

I looked for apps that could provide some recommendations based on my HealthKit data. Aside from apps that remind me to stand up every so often, there’s not a lot that I could find on the Store, though I suspect there are legal reasons for that. I also looked for apps that could provide this data to my local health care provider, but I can’t find any hospital or clinic in my area that has an app with that functionality.

Health and HealthKit haven’t radically transformed my life or made me healthier, but they have made me more aware of my activity levels, my food intake, and my sleep habits. Moreover, some apps have made it downright fun to keep track of my activities. I feel pretty good when I see I’ve walked 20,000 steps in a day, or that my sleep was 90% efficient. I bet if I combined this with an Apple Watch, I’d have a fantastic time ensuring I stay physically active, like I’m being coached.

Continuity

Perhaps the quietest improvement in iOS 8 is Continuity, a set of functions that use WiFi and Bluetooth to make working between iOS and OS X hardware more seamless.

Cellular Over WiFi and Bluetooth

Some functions allow non-cellular devices to bridge to the cellular network via an iPhone, thereby allowing you to send and receive text messages, make and receive phone calls, and create an instant personal hotspot. The latter isn’t new, per se, but it is vastly enhanced. Previously, you had to fish around for your phone, open Settings, toggle Personal Hotspot to “on”, and type the password on your Mac. A true ordeal. Now, you can just select your phone from your Mac’s WiFi menu, even if Personal Hotspot isn’t on.

I’ve found these technologies useful roughly in the order in which I’ve listed them. I send and receive texts all the time on my Mac, and it’s wonderful. Sometimes, I’ll be trying to organize a gathering with a few friends, some of whom use iPhones, and some who do not. It’s very nice to be able to switch between the conversations as each person replies, and not have to pick up my phone to answer messages from the non-iPhone users. My biggest quibble is that the read status of text messages is not synced, so messages I’ve read or deleted on my iPhone remain “unread” on my Mac.

I have made and received phone calls on my Mac, too, and I actually quite like it. It’s very nice to be able to chat on the phone the same way I take FaceTime calls, for example. The biggest limitation, for me, has been the lack of a keypad on OS X, which means I can’t buzz people into my apartment from the comfort and convenience of my Mac.

The always-available personal hotspot function is something I have used only a couple of times, but has been effortless and seamless each time. It’s kind of like my WiFi-only iPad has transformed into the WiFi and cellular model, or my MacBook Air turned into that 3G-capable MacBook Pro prototype. Better than either of those two, though, is that I don’t have to buy an extra cellular data plan. It’s not like my phone didn’t have this functionality before, but making it so easily accessible is a very nice touch.

Handoff

The final bit of Continuity technology is Handoff, which allows you to start a task on one device and pick it up on another. If this sounds familiar, it’s because it’s something Apple tried to do in 2011 with iCloud Sync, kind of. You could start bashing out a document in Pages on your iPhone on your commute, then open Pages on your laptop at work and open the document from iCloud. Apple even bragged that the text insertion point would be in the same place.

Unlike iCloud Document Sync, which was just a document syncing model, Handoff is an activity syncing model.

Handoff is much more clever. Basically, if you have devices signed in with the same iCloud account, an activity that you start in the foremost app on one device can be picked up in the same app on another device. That means you can find directions in Maps on your Mac, then grab your iPhone, slide up on the little Maps icon on the lock screen — opposite the Camera shortcut — and now your directions are loaded onto your iPhone. Seamless, in theory.
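Under the hood, an app advertises its current activity with NSUserActivity, and the receiving device’s copy of the app restores it. Here’s a rough sketch, with an activity type and user-info key I’ve made up for illustration:

```swift
import UIKit

// On the originating device: advertise what the user is doing right now.
// (The activity type also has to be listed under NSUserActivityTypes in the app's Info.plist.)
func startReadingActivity(for articleURL: URL) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.reader.article") // hypothetical type
    activity.title = "Reading an article"
    activity.userInfo = ["articleURL": articleURL.absoluteString]             // hypothetical key
    activity.webpageURL = articleURL  // a Mac without the app can hand off to Safari instead
    activity.becomeCurrent()
    return activity
}

// On the receiving device: the app delegate is asked to continue the activity.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        guard userActivity.activityType == "com.example.reader.article",
              let urlString = userActivity.userInfo?["articleURL"] as? String,
              let url = URL(string: urlString) else { return false }
        print("Continue reading \(url)")  // navigate back to the article here
        return true
    }
}
```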

In practice, I’ve found Handoff to be significantly less reliable than its other Continuity counterparts. I can almost always send text messages from my Mac and answer phone calls, but transferring a webpage from my Mac to my iPhone is a dice roll. It will go for days without working, then behave correctly for a while, then not work again, in precisely the same conditions. I’m at a complete loss to explain why.

When it works, though, it’s pretty cool. I can start reading an article on my Mac, then realize it’s getting late and hop on my iPad in my bedroom, and it’s right there. It’s a subtle and quiet feature, but it’s very impressive. When it works.

AirDrop

It always struck me as bizarre that iOS devices and Macs both had a technology called AirDrop, but they behaved completely differently and didn’t work with each other. As of iOS 8 and Yosemite, the madness has ended: AirDrop works the same way on both platforms, and you can share between both kinds of devices.

And I must say that it works pretty well. I AirDrop stuff all the time to my friends, and between my own devices. It’s an extremely simple way of instantly sharing pretty much anything to someone nearby, and I do mean “instant”: whatever you share will open automatically on the receiving device. If that’s not necessarily what you want, you can share via Messages or something; there is, as far as I know, no way to change this behaviour.

Spotlight

Oh boy, here’s something I love. Spotlight has been turned from a simple and basic way of searching your device into a powerhouse search engine.

Spotlight integrates itself into iOS in two ways: the now-familiar yet still-hidden swipe-down gesture on the home screen, and in Safari’s address bar. Search results powered by the same engine are also available via Siri. The engine surfaces recently-popular or breaking news stories, Wikipedia articles, suggested websites, and items from Apple’s online stores.

In practice, this means I can go directly to the stories and items that are immediately relevant, bypassing the previously-requisite Google search or typing Wikipedia’s address into Safari. If I’m curious about something these days, I just type it into Spotlight, and it usually finds something I’m looking for.

A Brief Word on Apple’s Increasing Self-Promotion

Back in iOS 5, Apple started encroaching on their mobile OS with self-promotion by adding an iTunes button to the Music app. It was a small tweak, but a gesture that signified that they wanted to push additional purchasing options into the OS. In the releases since, the self-promotion opportunities within iOS have only increased.

As I mentioned above, items from Apple’s online stores sit amongst the results delivered by Spotlight. If I search for “Kendrick Lamar”, it will present me with an iTunes link as the top hit, not the Wikipedia bio I — and, I’d venture a guess, most people — would be looking for.

iOS 8 also has a feature that suggests apps from the App Store based on your location. Passing by a Starbucks? If you don’t have the Starbucks app on your phone, you might see a Starbucks logo in the lower-left corner of your lock screen — the same place where a Handoff app usually appears.

Along similar lines is the rise in non-removable default apps. As of iOS 8.2, Apple has added six compared to iOS 7: Podcasts, Tips, Health, iBooks, FaceTime, and Apple Watch. There are 31 total apps on a default iPhone running iOS 8.2 or higher that the user cannot remove, and every single person I know with an iPhone has a folder on one of their home screens where they stash at least half of these default apps. These are not tech-savvy people; they do not read Daring Fireball, nor do they know what an API is. But they know this sucks.

We all put up with tech company bullshit. When you throw in your hat with Google, you know that the reason they’re really good at “big data” things is because they’re mining your information along with everyone else’s to build comprehensive associative databases. When you buy into Apple’s ecosystem, you know that their primary income source is your purchase of their products and service offerings. It makes business sense to integrate prompts for those offerings into an operating system with a reach in the hundreds of millions. It’s marketing that the company controls and for which they do not pay a dime.

Yet being constantly reminded that I’m using an Apple OS on an Apple phone, with Apple’s services and “oh, hey, look: they make a watch now” always in the mix, is grating. I wouldn’t mind it so much if there were a way to tone down pretty much any of these pain points individually, but the options are limited. There’s some relief: you can disable suggested apps on the lock screen, and screenshots of iOS 8.4 suggest the Store button is being removed from Music. I’m not suggesting Apple throw away their self-promotional activities entirely, but I think it would be prudent to evaluate just how many of them are tolerable. It’s stretching the limits of user-friendliness.

Third-Party Extensibility

Much in the way that iOS 7 was kind of a soft reboot of the OS — a second take on iOS 1.0, if you like — iOS 8 is the second take on the 2.0 “developer” release. Apple delivered in spades at WWDC last year, with new APIs that allow for the kind of inter-app operability and deep integration that makes the OS better for developers and users alike.

App Extensions

App Extensions have completely and fundamentally changed the way I use iOS. That’s this section, in a nutshell. There was a lot of major news at WWDC last year, from an entirely new programming language to a full OS X redesign, but App Extensions are among the most significant enhancements. And, as in my review last year, I want to tackle each of the six extension points individually. Kind of.

Sharing and Actions

I’m going to start with these two in conjunction because there seems to be little understanding or consistency as to how they are distinct. To me, Sharing extensions should mean “take this thing I’m looking at and send it to another app”, while Action extensions should mean “do something to this thing I’m looking at, in place”. Or, perhaps Share should mean “pull up a dialog for me to take further actions on this thing I’m looking at”, and Action extensions should mean “take this thing I’m looking at out of this app and into another”.

But even with my fuzzy understanding, there are seemingly no rules. Pinner and Instapaper both have modal-type Share extensions, but adding a todo to Things with its Action extension also pulls up a modal sheet. Meanwhile, the Bing translation Action translates in-place on any webpage. Both kinds can accept the same kinds of data, as declared by the developer in the extension’s activation rules, and both amount to doing stuff in one app via another.

The best way I can think of distinguishing between the two types is that a Sharing extension always displays a modal interface overtop an app, while an Action may or may not.
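From the host app’s point of view, both rows come from the same place: the standard share sheet. Here’s a minimal sketch of all an app has to do to surface them; the URL is just an example item.

```swift
import UIKit

final class ArticleViewController: UIViewController {
    // Presenting the system share sheet is all the host app does; the Sharing and Action
    // rows are populated by whichever extensions the user has installed and enabled.
    @objc func shareTapped(_ sender: UIBarButtonItem) {
        let articleURL = URL(string: "https://example.com/ios-8-revisited")! // example item
        let sheet = UIActivityViewController(activityItems: [articleURL], applicationActivities: nil)
        sheet.popoverPresentationController?.barButtonItem = sender // required anchor on iPad
        present(sheet, animated: true)
    }
}
```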

In any case, I’ve found both kinds of extensions extremely useful. Where previously a host app had to decide whether to allow sharing to another app, now the client app gets to make that decision, for the most part. It makes more sense: you decide what mix of apps go on your device, and those apps should defer to your choices. Now, I get to decide what goes in my Share sheets. I can even — surprise, surprise — turn off some of Apple’s defaults. Don’t use the default Facebook or Twitter integration? No problem – just flip the toggle off and you won’t see them. (This, unfortunately, doesn’t apply to Action extensions, so you’ll be seeing that printer icon everywhere whether you like it or not.) Want to see some third party Sharing extensions, but not others? Just flip their toggles. You can also sort your Share and Action extensions in any order you’d like.

That brings me to the biggest problem with third-party extensions: newly-installed extensions are completely undiscoverable. There is no visual indication when an app is updated with Share or Action extension support, and extensions come disabled by default. You will only figure it out if you scroll right to the end of your row of extensions, tap the “More” button, then scroll through the list. The only other way that you may find out is if the developer has included it in their update notes and you bother to check the changelog, which you won’t since you, like most people, probably have automatic updates enabled and haven’t seen the Updates tab of the App Store in, like, forever. Even after all the time I’ve spent using iOS 8, and all the time apps have had to support it, I’m still finding new extensions in Share sheets.

What’s fantastic about Share sheet extensions is that they make any old app that uses the default sharing API feel instantly tailored. It means developers don’t have to support every bookmarking service individually, or pick and choose the ones they want to support; they can just tell the sharing API to handle it. I use Pinboard and Instapaper; you may prefer Pocket, Pinterest, or whatever new thing A16Z is investing in. That’s a lot of different sharing APIs to support. Even the login experience is far better for users, who now only have to sign in once with the client app, instead of each app individually.

I simply can’t say enough good things about Sharing and Action extensions.

Today Widgets

I can, however, say far fewer good things about the widgets that can occupy the Today view in Notification Centre. Generally, this isn’t Apple’s fault; rather, it’s the problem of developers wanting to use a new feature, but not having a solid justification or considered concept for doing so. This mindset has led to the creation of widgets such as Dropbox’s, which displays the last three files that were updated, or Tumblr’s trending topics widget. Then there are widgets like the NYT Cooking widget that suggests meals to cook, which sounds great in theory, but makes no sense as a random-access widget.

None of these widgets account for the ways normal people use Notification Centre. I’ve found that widgets that succeed in the Today view are conceptually similar to WatchKit apps: at-a-glance information that requires little interaction. Human’s widget, for instance, displays your current progress to 30, 60, or 90 minutes of daily activity. It requires no user interaction and is just informative enough.

Apple’s own widgets can be hidden in the Today view, too, with the exception of the date at the top. That creates an excellent opportunity for third parties to create more specific interpretations of those widgets. For example, I don’t drive to work, so Apple’s estimated commute time widget is useless to me. I do, however, wish to know when the next train is arriving, so I keep Transit’s excellent widget in my Today view. Similarly, Apple’s weather reading sometimes doesn’t show current conditions. The Fresh Air widget, on the other hand, always does, and it forecasts the weather for calendar events.

But while the best Today widgets display low-barrier glanceable information, none of them feel particularly instantaneous. This is due in large part to their standby state; or, more specifically, it’s due to the time it takes to recover from their standby state. Today widgets are required to be low-power, low-memory kinds of deals which only refresh when the user is viewing Notification Centre. While that makes sense, iOS only refreshes widgets when the animation that shows Notification Centre is fully complete. So: you drag from the top of the screen to show Notification Centre, see glimpses of cached information as you drag the sheet down, see a flash of every widget refreshing, then you can interact with any of them. It only takes a couple of seconds, but it makes for a user experience that is rougher than it should be for timely widgets like these. It would feel a lot more instantaneous if Today widgets, or at least the first few, refreshed as Notification Centre is being activated, rather than at the end of the activation.
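For what it’s worth, that standby-and-refresh cycle is baked into the widget API itself: iOS pokes the widget with a single update callback and expects a quick answer. Here’s a rough sketch of the boilerplate every Today widget implements; the cached-value check is my own illustration, not something Apple requires.

```swift
import UIKit
import NotificationCenter

class TodayViewController: UIViewController, NCWidgetProviding {
    @IBOutlet weak var summaryLabel: UILabel!

    // iOS calls this when Notification Centre appears (and again when the widget scrolls
    // back into view); until then the widget sits in its low-power standby state.
    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        let latest = currentSummary() // hypothetical, fast local fetch

        if latest == summaryLabel.text {
            // Reporting no change lets iOS keep showing the cached snapshot.
            completionHandler(.noData)
        } else {
            summaryLabel.text = latest
            completionHandler(.newData)
        }
    }

    private func currentSummary() -> String {
        // Placeholder for whatever glanceable value the widget displays.
        return "12,482 steps so far today"
    }
}
```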

Furthermore, widgets refresh when you scroll to bring them into view. I have Fresh Air at the top of my Today view, and Fantastical just offscreen below it. If I invoke Notification Centre, Fresh Air will refresh; when I scroll, Fantastical will refresh; then, when I scroll back up, Fresh Air will refresh again. This behaviour is apparently something that iOS does, and something that developers cannot control.

Finally, though Apple provides several of their own Today widgets, there doesn’t seem to be an agreed-upon set of visual interface rules. Most widgets respect the same left-side padding of the default ones, and many have similar typographic and hierarchical treatments, but then you get the odd monstrosity like Yahoo’s Weather widget or the aforementioned Tumblr one.

Today widgets should feel like smooth, passive ways to get small snippets of timely or location-related information. Instead, they come off a little janky. They’re a real missed opportunity for some developers, and a kludgy add-on for most. Hopefully the Watch will force the kind of focus demanded by widgets in iOS.

Photo Editing

Photo editors on iOS are kind of my thing. I’ve used all of the popular ones and, though I’ve settled on a workflow that I like, I keep a bunch of others on my iPhone just in case. Yet, after nearly nine months of the possibility of extensions to Apple’s default Photos app, just two apps on my phone — Afterlight and Litely — have such an extension, and I’ve tried a few dozen of the most popular ones. Even my beloved VSCOcam doesn’t have a Photos extension, despite being used in the demo of this API at WWDC. (I reached out to VSCO for comment on this, but I haven’t heard back from them.)

As for the apps that do have an extension for Photos, well, they’re okay. I find that they’re really hidden — instead of residing in the palette of editing tools at the bottom, there’s a little ellipsis in the upper-left corner of the app. Tap on it, then tap the extension you’d like to use, or tap More to see if any other extensions have been installed since you last did this — Photo extensions are hidden and disabled by default, like Sharing extensions.

It’s hard to give a generalized take on what Photo extensions are like, or typify their experience, but I’m going to try to do that. In order to do so, I had to go grab a few more apps. I downloaded Fragment, which does some rather trippy kaleidoscopic effects, and Camera+, which pained me to download because I think John Casasanta is kind of an asshole. But let’s not dwell on the past.

The extensions from both Fragment and Litely are somewhat lighter-weight versions of their parent apps, while Camera+ and Afterlight provide near-full experiences. That’s kind of cool to have in an extension: nearly running one app inside of another. You can make your edits, then tap Done, and the photo will be saved in-place in a flattened state; there is no granular undo post-save, nor is there a way to modify the applied edits. The original copy of the photo is saved, however, so you can revert entirely, but this, of course, destroys all of the edits made to a photo.
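That flattened-save behaviour falls out of the extension API itself: the extension hands Photos a fully rendered image plus a blob of “adjustment data” that, in practice, only the same extension knows how to reopen. Here’s a heavily trimmed sketch of the protocol a Photo Editing extension implements; the format identifier and the rendering step are placeholders.

```swift
import UIKit
import Photos
import PhotosUI

class FilterEditingController: UIViewController, PHContentEditingController {
    private var input: PHContentEditingInput?

    // Photos asks whether we can resume an edit from previously saved adjustment data.
    func canHandle(_ adjustmentData: PHAdjustmentData) -> Bool {
        return adjustmentData.formatIdentifier == "com.example.filter" // hypothetical identifier
    }

    func startContentEditing(with contentEditingInput: PHContentEditingInput, placeholderImage: UIImage) {
        input = contentEditingInput
        // Show contentEditingInput.displaySizeImage in the editing UI here.
    }

    func finishContentEditing(completionHandler: @escaping (PHContentEditingOutput?) -> Void) {
        guard let input = input else { completionHandler(nil); return }
        let output = PHContentEditingOutput(contentEditingInput: input)
        // Describe the edit so a later session of this extension could reproduce it.
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.filter",
                                                 formatVersion: "1.0",
                                                 data: Data())
        // Write the full-size rendered JPEG to output.renderedContentURL, then hand it back.
        completionHandler(output)
    }

    var shouldShowCancelConfirmation: Bool { return false }
    func cancelContentEditing() {}
}
```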

I’m struggling to understand the practical purpose of this extension point as it is right now. The full app is still required to exist somewhere on your phone; even if it’s buried in a folder somewhere, it must exist. Perhaps an ideal world would require you only to open Photos any time you wanted to make an edit, and future versions of the API will allow for nondestructive editing between several extensions. But I don’t see the power of this yet. It seems too hidden for average users, and not powerful enough for people who wish to have a full post-production environment on their phone or tablet.

Keyboards

There are some things I imagined Apple would never allow on iOS; third-party keyboards are among those things. Yet, here we are, with third-party keyboards supported natively in iOS, before the introduction of a third-party Siri API, for example.

I have tried pretty much all of the popular third-party keyboards for iOS — Fleksy, Swype, SwiftKey, Minuum, and so forth — running them for days to weeks at a time. And the keyboard that has stuck with me most has been — [dramatic pause] — the default one, for a singular reason: it’s the only one that feels fast.

Sure, pretty much all of the third-party keyboards you can find have a way better shift key than the default, and plenty are more capable. But I don’t type one-handed frequently enough to get much use out of a gestural keyboard like Swype; most of the time, I find these gestures distracting. Third-party keyboards also don’t have access to the system’s autocorrect dictionary, which means that developers need to build in their own autocorrect logic and users need to train the new keyboard. I didn’t think this would be as frustrating as it turned out to be. Third-party keyboards also can’t automatically switch languages depending on which Messages conversation you’re in, which is something I don’t use, but plenty of people I know do.
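Those constraints are visible right in the API: a custom keyboard is just a view controller that pushes text through a proxy, and the closest it gets to the system dictionary is a small supplementary lexicon of contact names and shortcuts. A bare-bones sketch:

```swift
import UIKit

class MinimalKeyboardViewController: UIInputViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Every third-party keyboard has to offer a way to switch to the next keyboard.
        let nextKeyboardButton = UIButton(type: .system)
        nextKeyboardButton.setTitle("🌐", for: .normal)
        nextKeyboardButton.frame = CGRect(x: 8, y: 8, width: 44, height: 44)
        nextKeyboardButton.addTarget(self, action: #selector(advanceToNextInputMode), for: .touchUpInside)
        view.addSubview(nextKeyboardButton)

        // The only dictionary help the system offers: a short lexicon of contact names and
        // text shortcuts, nothing like the full autocorrect engine the stock keyboard uses.
        requestSupplementaryLexicon { lexicon in
            print("Received \(lexicon.entries.count) lexicon entries")
        }
    }

    // Keys ultimately just push characters through the text document proxy.
    @objc func keyTapped(_ sender: UIButton) {
        textDocumentProxy.insertText(sender.currentTitle ?? "")
    }
}
```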

But, as I wrote above, the main reason I stuck with the iOS keyboard is that it’s the fastest one. It launches immediately when it’s called and key taps are registered as fast as I can type with my two thumbs. That’s not to imply that I don’t have complaints with the default keyboard — I do, or have you not been reading? — but it’s simply the best option for me. And, judging by the people I’ve talked to, it’s the best option for most of them as well. Like me, they tried the most popular ones and, like me, most of them are back with the default.

The ones who have stuck with third-party keyboards have done so for reasons I didn’t necessarily think of. Kristap Sauters, for example, has found that SwiftKey is far better at swapping languages dynamically: he can start a sentence in one language, type a word from another, and it will detect this change better than the default. This is not a feature I would have found because it isn’t one I use.

The best third-party keyboards for my usage are those that do not try to replace the default alphanumeric one, but rather try to do things it can’t. Popkey, for example, is a ridiculous animated GIF library, but it’s smartly packaged as a keyboard so you can reply to text messages and emails just so. David Smith’s Emoji++ is another great third-party keyboard that effectively replaced Apple’s segmented emoji keyboard prior to 8.2, but it was Sherlocked with iOS 8.3.

I’m not sure whether the issues I have with third-party keyboards are the fault of iOS’ implementation, or the keyboards’ developers. Whatever the case, it’s enough to prevent me from using a non-default keyboard on a regular basis.

Documents and Files

iOS now has an exposed file system! Kind of.

Technically two extension points, Document and File providers allow an app to identify itself as a place where other apps can send and receive files. iCloud Drive is an example of a file provider, but now third parties like Dropbox can provide documents to an app that supports it. So you can store your Pages documents in Dropbox instead of iCloud Drive, and have a similar level of syncing between your Mac and iPad.
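From a client app’s side, the mechanics look roughly like this; the document type is just an example, I’m using the current delegate method names, and error handling is elided, so treat it as a sketch.

```swift
import UIKit

final class NotesViewController: UIViewController, UIDocumentPickerDelegate {
    // Ask for a document from any installed provider: iCloud Drive, Dropbox, Transmit, and so on.
    @objc func openDocumentTapped() {
        let picker = UIDocumentPickerViewController(documentTypes: ["public.plain-text"], in: .open)
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
        guard let url = urls.first else { return }
        // In .open mode the file stays with its provider; read it inside a security scope.
        guard url.startAccessingSecurityScopedResource() else { return }
        defer { url.stopAccessingSecurityScopedResource() }
        let text = (try? String(contentsOf: url, encoding: .utf8)) ?? ""
        print("Opened \(url.lastPathComponent): \(text.count) characters")
    }
}
```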

Better still is Panic’s creative interpretation of this capability. You can open documents and files from Transmit in other apps, and since Transmit is an FTP client, that basically means that you can open any file you have access to in supported apps. That’s amazingly powerful.

This isn’t a type of extension with which I’ve spent a great deal of time. I’m not Federico Viticci, and I don’t have his automation prowess. But for power users or people who use their iPad as more than a kick-back-and-read device, it seems pretty great.

Notifications

I find it fascinating how iOS and OS X are built by the same company at the same time, but often do not share features; or, at least, their feature additions come at different rates.

The push notification API is the perfect example of the staggered rollout and feature incongruence across Apple’s operating systems. Though notifications have existed since the beginning of the iPhone, they were modal and, at first, weren’t opened up to developers. By the time iOS 5 rolled around in 2011, notifications had become far more scalable with the introduction of Notification Centre; it took until 2012 for them to be brought to OS X to replace, for most developers, the venerable Growl. In 2013, OS X notifications gained inline actions and replies, but iOS remained stubbornly without either.

So it was a relief when iOS 8 brought actionable notifications to Apple’s mobile platform. Onstage, they demoed archiving an email, replying to an iMessage, and the third-party potential of — for instance — liking a wall post on Facebook. This left me with the impression that I’d be able to reply to third-party notifications inline, too. But it turns out that third-party developers don’t have access to the inline reply API, which is a real bummer.
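For reference, registering those action buttons uses the iOS 8-era UIUserNotification API, which has since been superseded by the UserNotifications framework. The identifiers below are made up; the point is that the app declares its buttons up front and hears about taps through its app delegate.

```swift
import UIKit

func registerNotificationActions(with application: UIApplication) {
    // Declare the button that appears under the notification.
    let favAction = UIMutableUserNotificationAction()
    favAction.identifier = "FAV_ACTION"    // hypothetical identifier
    favAction.title = "Fav"
    favAction.activationMode = .background // handle the tap without opening the app

    // Group actions into a category that individual notifications reference by name.
    let mentionCategory = UIMutableUserNotificationCategory()
    mentionCategory.identifier = "MENTION" // hypothetical identifier
    mentionCategory.setActions([favAction], for: .default)
    mentionCategory.setActions([favAction], for: .minimal) // the trimmed banner and lock screen context

    // Register the categories along with the alert styles the app wants to use.
    let settings = UIUserNotificationSettings(types: [.alert, .badge, .sound],
                                              categories: [mentionCategory])
    application.registerUserNotificationSettings(settings)
    // Taps then arrive via the app delegate's
    // application(_:handleActionWithIdentifier:forRemoteNotification:completionHandler:) method.
}
```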

Interactive notifications are fabulous otherwise, though. I suspect my email open rate has gone down dramatically since I can just deal with new messages as they arrive in both Mailbox and Spark, the two email apps I typically use. I use the “fav” button in Tweetbot notifications frequently as well, and Fantastical’s “snooze” feature for reminder notifications is perfect.

Unfortunately, plenty of third-party developers still haven’t added interactive notifications to their apps. Facebook’s Paper app doesn’t have them, nor does NYT Now, where I could imagine saving a breaking news story for later reading. On the other hand, perhaps it’s best that most developers don’t seem to be trying to shoehorn this feature into apps where it doesn’t belong.

I’m looking forward to further improvements in this space. Ideally, developers will be able to add inline replying, and perhaps they’ll even be able to draw their own custom UIs in notifications — Fantastical could, for example, present snooze options inline. There’s so much potential for notifications, especially in conjunction with the Watch.

The State of iOS

Apple’s mobile operating system has matured into an incredibly robust platform. They’ve spent a lot of time over the past two years rebuilding parts of the OS to make it last another seven or eight years, and things are coming up Milhouse.

But the last two years of defining an entirely new visual design for the platform and an updated functional design have also clearly taken their toll. There have been bugs — a lot of bugs. After several months with iOS 8, I’ve gotten used to some of its little foibles. I learned not to tap the space between the keyboard and the notification area while replying to a message, until that was fixed nearly 300 days after first being reported as a bug in the earliest iOS 8 betas. I learned all sorts of things that I shouldn’t do, and ways of troubleshooting core parts of the OS that really shouldn’t need troubleshooting.

It’s been a really rough ride for developers, too. As a plethora of new capabilities were given to third parties, the app review team applied widely varying interpretations of the new API usage guidelines. Things that were perfectly fine in one app already on the Store might not be okay in a different app. Inconsistent rejections hampered developers this year and eroded their confidence in the platform.

There’s a lot for Apple to do this year. They always have a long to-do list, but this year’s feels more urgent than most. Apple’s sales have never been better, but the confidence of developers and users feels a little shakier than it has for a while, for both iOS and OS X.

I am excited, as always, for Monday’s keynote. I can’t wait to see what new things developers get to take advantage of, from really big things — like a Siri API, perhaps — to littler things. One thing is for certain: there’s no shortage of things for Apple to do with their platforms. Every year, I feel the same way, no matter how robust and mature their platforms get: they’re just getting started.

Edward Snowden, in an op-ed for the NY Times:

We are witnessing the emergence of a post-terror generation, one that rejects a worldview defined by a singular tragedy. For the first time since the attacks of Sept. 11, 2001, we see the outline of a politics that turns away from reaction and fear in favor of resilience and reason. With each court victory, with every change in the law, we demonstrate facts are more convincing than fear. As a society, we rediscover that the value of a right is not in what it hides, but in what it protects.

Incredibly well stated.

The events of the past two years have made it clear that members of the US government were knowingly lying to or misleading the American people about the nature of their intelligence programs. This week’s victory, albeit a small one, in passing the USA FREEDOM Act would not have occurred without Snowden’s disclosures. The USA PATRIOT Act1 would have been re-signed without a second thought or a word of debate had Snowden not made the disclosures he did.

I know it doesn’t exactly work this way, but this evidence should justify a full pardon. This debate needed to happen, and it clearly wasn’t going to unless such disclosures were made. They may have been unauthorized, but they were wholly necessary.


  1. These acronyms are stupid. I will continue to capitalize them in this fashion to illustrate this. ↥︎

Hey, remember yesterday when I somewhat cynically tempered your expectations over the USA FREEDOM act?

For practical purposes, this does not limit the surveillance capabilities of the U.S. government much, especially since the NSA programs exposed by Snowden have likely not been curbed, and there’s no way of knowing their standing until the next Snowden comes along.

Well, big news: the next Snowden has come along and his name is, uh, Edward Snowden. Charlie Savage, et al., report for the New York Times:

Without public notice or debate, the Obama administration has expanded the National Security Agency’s warrantless surveillance of Americans’ international Internet traffic to search for evidence of malicious computer hacking, according to classified N.S.A. documents.

[…]

While the Senate passed legislation this week limiting some of the N.S.A.’s authority, it involved provisions in the U.S.A. Patriot Act and did not apply to the warrantless wiretapping program.

Many of the NSA programs exposed by Snowden operate under similar legal jurisdiction as this one, so the USA FREEDOM Act doesn’t apply to them, either.

[A sad trombone and slide whistle play as a gigantic “Mission Accomplished” banner unfurls behind the President.]

John Gruber reacts to Tim Cook’s privacy-oriented speech from Tuesday, and Thomas Ricker’s response:

Apple needs to provide best-of-breed services and privacy, not second-best-but-more-private services. Many people will and do choose convenience and reliability over privacy. Apple’s superior position on privacy needs to be the icing on the cake, not their primary selling point.

The argument being that Apple, or any tech company, should not have to choose between offering great services and protecting user privacy.

I’ve alluded to this before, but I’m not sure Apple can provide a Google-equivalent quality of cloud service while keeping things private. Take, for example, one of the most impressive features of Google Photos: its ability to catalogue and tag the contents of your photos automatically, without requiring any intervention on the user’s part. They search “cat”, and they get all the pictures they’ve taken of cats. Magic.

To do this, they need a lot of information on what cats look like from every conceivable angle, in a wide range of lighting conditions. The good news is that Google has billions upon billions of users’ image searches of cats. They’ve crawled the web for the past fifteen years and found a whole bunch of pictures of cats based on the alt and title tags of the images, their captions, and the content around them on the originating page.

While they haven’t released details on exactly how their image recognition tech works, it’s reasonable to guess that Google has been tracking a user’s image search for “cat” to the images they click on. The results that get more clicks — in aggregate — are probably going to be the most accurate representation of the search term. Google can therefore take the knowledge garnered from billions of searches a day on myriad search terms and create a way for their Photos product to detect the subject matter of user images.

We have, in simple terms, been providing Google with keywords and an indication of their accuracy for years, which they can use across all of their services.

Apple struggles with this kind of machine learning prowess because they don’t operate services in the same way that Google does. They don’t analyze user-provided data in aggregate; they often even keep a single user’s information siloed across multiple services. Google, by contrast, governs most of their services under a single privacy policy that allows them to blend all data collected under this policy. This allows them to offer a breezy first-run experience and services that require multiple kinds of data cross-referenced and associated with one another.

Google Now is the perfect example of such a service. It mines your email, calendar, location, todos, Google’s collected search data, and other information to try to guess what you will need to see at a given moment. If you are at the airport at 14:39 and you have a plane ticket in your Gmail for 15:48, it can identify a flight number and show you whether your flight is on time or not. Word on the street is that Apple is going to compete with Now, but without the kind of deep integration and cross-comparison that Google’s consolidated privacy policy allows, I’m curious to see how Apple could implement something in a privacy-friendly way.

Maps is another example. Google’s data is, generally, better than Apple’s1 because it’s been around for a while, it’s free to use, and they’re able to check their data against pages they’ve crawled. They also sent out a bunch of cars to verify data and take Street View pictures. Apple partnered with a bunch of other companies to build their data index, but it’s hard to compete against Google crawling the web every single day for all kinds of information that could be used to bolster map results.

I’ve gotten this far without talking about the money aspect. It is in Google’s best interests to be accurate and to create an index that is as robust as possible, cross-referenced on a per-user basis and in aggregate. The more they understand, the more relevant ads they can sell.

Apple doesn’t have such an incentive, and that’s a value I cherish. But this is not an excuse for creating worse products. I want the very best, of course, and Google is the high-water mark for almost every web service. But I fear that it is unrealistic for Apple to improve their machine learning capabilities and web service quality without relaxing their firm stance on privacy. I’m not sure what I worry about more: that Apple’s service quality will suffer, or that I will have to give up a little bit of my privacy to prevent it from doing so.


  1. If you disagree with this in your area, that’s fair. I’d still be willing to bet you that search is way, way better in Google Maps. ↥︎

On Monday, Recode’s Dawn Chmielewski and Peter Kafka reported that Apple’s upcoming television and streaming service isn’t ready for a WWDC announcement. Now, today, Brian X Chen of the New York Times reports that the new Apple TV hardware won’t be at WWDC either:

The company planned as recently as mid-May to use the event to spotlight new Apple TV hardware, along with an improved remote control and a tool kit for developers to make apps for the entertainment device. But those plans were postponed partly because the product was not ready for prime time, according to two people briefed on the product.

Apple seems to be doing pretty well with managing expectations.

I haven’t noticed the removal of any TBA-type sessions from this year’s schedule, so it looks like the rumoured SDK might still be a go. A support document still refers to the third-generation1 Apple TV as the hub for HomeKit, so I wouldn’t be surprised to see further functionality rolled out soon.


  1. “Or later”. ↥︎

Yoni Heisler, BGR:

Ahead of its planned split from eBay, PayPal is planning to roll out a new terms of service agreement for its customers which would allow the company to pepper its userbase with robocalls and text messages. What’s more, the updated terms of service would allow PayPal to contact users at either their designated phone number or even an undisclosed number PayPal managed to obtain through other means. Set to go into effect on July 1, PayPal’s updated user agreement is not an opt-in type of deal, which makes it all the more worrisome.

I’m sure there’s someone out there who is dying to receive automated phone calls at a number which is unlisted yet PayPal managed to uncover, but that person is not me, nor anyone I know. But, hey, can you expect anything better from PayPal?

The Washington Post’s Mike DeBonis, with some better news for your privacy:

The USA Freedom Act represents the first legislative overhaul passed in response to the 2013 disclosures of former National Security Agency contractor Edward Snowden, who revealed the NSA’s bulk collection of telephone “metadata” and the legal rationale for it — the little-noticed Section 215 of the USA Patriot Act, passed in the months after the Sept. 11, 2001, attacks.

The new legislation places additional curbs on that authority, most significantly by mandating a six-month transition to a system in which the call data — which includes call numbers, times and durations — would remain in private company hands but could be searched on a case-by-case basis under a court order. One supporter, Sen. Patrick J. Leahy (D-Vt.), described the legislation as “the most significant surveillance reform in decades.”

Can we clear something up to start? “USA FREEDOM” is apparently an acronym for “Uniting and Strengthening America by Fulfilling Rights and Ending Eavesdropping, Dragnet-collection and Online Monitoring”, which is at least as mangled and forced as the USA PATRIOT Act’s “Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism”. American lawmakers need to stop this. It isn’t clever.

There’s a lot to like in this bill, aside from its name. FISA courts, processes, and decisions are now much more accessible, and there are more stringent requirements for gaining phone records. But it also makes Section 215 and “roving” wiretaps valid until the end of 2019. For practical purposes, this does not limit the surveillance capabilities of the U.S. government much, especially since the NSA programs exposed by Snowden have likely not been curbed, and there’s no way of knowing their standing until the next Snowden comes along.

Yesterday, the Associated Press reported that the Cessnas some Americans have seen flying counterclockwise for hours above their cities are, indeed, FBI surveillance aircraft. FAA regulations make it possible to track these flights in real time, if you know the tail numbers. But the FBI is apparently in the process of scrubbing a lot of their records from popular flight trackers — the aircraft mentioned in this Reddit comment apparently hasn’t flown in nearly a year. According to the same commenter, grain of salt implied, several of their aircraft are now sharing tail numbers.

It’s understandable that new surveillance methods are required as technology changes, but a federal agency flying a plane overhead equipped with technology to intercept cellular calls seems more intrusive than it could possibly be worth. Or, at least, until the FBI produces evidence otherwise, if they can.

Matthew Panzarino, TechCrunch:

“We believe the customer should be in control of their own information. You might like these so-called free services, but we don’t think they’re worth having your email, your search history and now even your family photos data mined and sold off for god knows what advertising purpose. And we think some day, customers will see this for what it is.”

[…]

It’s a masterful stroke of speechifying. As I’ve mentioned before, by taking this stance (which I do not believe to be disingenuous, their profit centers support it), Apple has put all other cloud companies in the unfortunate position of digging themselves out of a moral communications hole to prove their altruism when it comes to user data.

I’m not saying that Cook is correct in brutalizing the motives of companies like Google or Facebook — but it does craft a strong portrait — because Apple is safer and ‘not interested’ in your data casts a cloud (ahem) of doubt over pretty much every other company in its league.

This is awfully compelling. USB-C kind of replicates a lot of Thunderbolt’s functionality, but in a worse way. This allows for the full PCI-e spectrum of things to connect via the same cable that provides power and two 4K video streams. In Apple terms, a next-generation Thunderbolt Display paired with a next-generation MacBook could connect via a single cable and output to a 5K Retina display while being charged from that one cable. That sounds pretty much perfect.

When I linked to Matt Gemmell’s excellent “Respect Metrics” piece yesterday, I said:

Whenever I go to a site that is nothing more than endless rewrites of other people’s posts wrapped in teaser headlines and served alongside endless ads, related content, modal lightboxes, “toasters”, and a hundred different sharing options — I’m looking in your direction, Business Insider and the Verge — I internally question whether the site is actually proud of its authors’ writing.

I’m not sure this was fair to Business Insider. Why would I suggest they limit their contempt of writers to those on their payroll?

Kara Bloomgarden-Smoke, of the Observer:

“Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future” by Bloomberg BusinessWeek technology reporter Ashlee Vance, which came out last month, became the source for many BI stories about the eccentric tech billionaire. Although the practice of pulling juicy material from a book or longer article and making a story out of it isn’t unique to BI, the number of stories that originated in Mr. Vance’s book was notable.

“Everyone will do a post or two, but BI takes it to the next level,” Mr. Vance told the Observer. “They were serializing the book in lots of little posts.”

As Bloomgarden-Smoke reports, Vance asked Business Insider to stop a couple of times, and then they began redirecting posts based on the book to Amazon. Except it’s far from altruistic, because this story

http://www.businessinsider.com/elon-musk-quotes-success-book-2015-5

redirects to

http://www.amazon.com/gp/product/0062301233/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=0062301233&linkCode=as2&tag=thebusiinsi-20&linkId=FESV2NZEEZ4INJDH.

Among all the crap in that URL, I’d like to direct your attention to this: tag=thebusiinsi-20. Not only did they pull stories en masse from Vance’s book, they are now redirecting those links to an Amazon referral page. Slimy, even by Business Insider’s low, low standards.

Matt Gemmell:

I have a list of metrics that I automatically – even subconsciously – use when visiting a web site, to determine whether it’s worth my focus. Am I just a pair of eyeballs, or is this author really speaking to me? Have they given due thought to showing their work in the best light, or just thrown it up there? You can tell a lot about how a site’s author, or owning company, feels about you by how they balance the various tensions of design, content, monetisation, functionality, audience retention, and more.

Whenever I go to a site that is nothing more than endless rewrites of other people’s posts wrapped in teaser headlines and served alongside endless ads, related content, modal lightboxes, “toasters”, and a hundred different sharing options — I’m looking in your direction, Business Insider and the Verge — I internally question whether the site is actually proud of its authors’ writing. In short, does the publication respect its writers? In most cases, the answer is “no”. And if it doesn’t respect its writers, you can bet it doesn’t respect readers, either.

Jason Koebler, Vice:

For now, the company is primarily experimenting with the HBO model of pitching its own original programming to viewers. The company is only showing trailers for shows like Orange Is the New Black and House of Cards — it has not attempted to sell third party ads, and the company told me that, for the moment, only specific users in specific markets are seeing ads.

[…]

It’s worth noting that, though Netflix hasn’t had any ads so far, it has the potential to deliver much more targeted ads (which can be sold for higher rates) than a standard cable company. Netflix has a detailed history of every show you’ve ever watched, meaning it can infer your interests and so on.

One of the key selling features of Netflix was its lack of ads. You plop down on your couch in front of your Roku, Apple TV, or your computer, and fire up the show or movie you want to watch. It’s a compelling pitch. Now, they’re bailing on that. And they could potentially — as pointed out in a style of FUD that Vice does so well — combine the frustration you feel when sitting through an ad break with the disgust you feel when you realize just how targeted the advertising is.

Vlad Savov kicked things off at the Verge by speculating on the use and nature of the appropriated crown on the righthand side of the device:

Like the Apple Watch, the ZenWatch 2 has a metal crown, which gives you “a new way to interact” with the Android Wear interface. Asus hasn’t yet detailed the specifics of how this will work, however, and Android Wear doesn’t have the same software support as Apple has for scrolling with the digital crown in its own Watch. So this seems likely to just be an external button.

Pure speculation — Asus’ press release never claimed that this button could be anything more, but nor did it clarify precisely what it would do. Upon clarification, Savov made a subtle change to that paragraph. See if you can spot it:

Like the Apple Watch, the ZenWatch 2 has a metal crown, which gives you “a new way to interact” with the Android Wear interface. It initially seemed as though this would work like the digital crown on Apple’s watch, however it turns out to simply be a power button with a fancy title.

The speculative text has magically vanished.

But it didn’t vanish fast enough for publications like Fortune, who suck up other sites’ writing like blue whales swallow krill. Reporter Robert Hackett felt so bold that he titled his interpretation of Savov’s article “This Company Just Copied the Apple Watch’s Best Feature”: “this company” being Asus, “copied” not really happening, and this headline format reaching the point of parody back when the Onion launched ClickHole.

Oh, yeah, and the watch is oddly reminiscent of the Apple Watch in a great deal of ways, not least in the slow-motion product porn shots of the bands that you can buy with it, but let’s leave that aside for now.