The year after the release of iOS 7 was a mad dash for designers and developers updating their apps to fit the new user interface language. Many developers took this time to reconsider how their apps should work, too, in a conceptual, flow-oriented way. They could ask themselves questions like “Does this screen have the best layout possible?” or “Is this glyph as clear as it should be?” But there was comparatively little in the way of genuinely new functionality. Yes, there were thousands of new APIs and new frameworks like SpriteKit. But the vast majority of innovations, from Apple and iOS developers alike, came from strides made in UI design.
If iOS 7 tipped the scales a bit too much in the direction of the “how it looks” part of design, iOS 8 went directly for “how it works”. In addition to the plethora of new features available to end users — Health, predictive text, Continuity, and iCloud Photo Library, to name a few — iOS 8 also handed developers colossal new capabilities, unprecedented on the platform, in the form of App Extensions. Because these APIs were so new to iOS, few developers had put together really effective ways of using Extensions while I was working on my review. But, over the last eight months, we’ve seen enhancements to apps that we could only have dreamt about previously.
September’s release of iOS 8 — and my launch-day review — only told half the story of the current version of the operating system. This is the second half of that story.
iOS 8, Revisited
It’s only possible to understand how one really uses an operating system after its novelty has worn off, and we are well past the honeymoon period here, people. We’re on the cusp of the introduction of an updated version of iOS. How’s iOS 8 holding up?
Messages
The headlining new feature in Messages was the addition of disappearing audio and video messages. When I was using prerelease versions of iOS 8, I was curious to see what kind of adoption these features would have when the general public got their hands on the OS. And, many months in, I have yet to receive a single intentional audio or video message from my iPhone-using friends, though I have received a number of accidental audio clips because the gesture used to send them is sometimes a little finicky.
The lack of video messages doesn’t surprise me in the slightest. While iMessage may be one of the world’s most popular messaging services, everyone I know uses Snapchat to quickly send photos and videos. Unlike Messages, which sends giant photos and full-res HD video, Snapchat heavily compresses everything. It looks a little shitty, even on a tiny phone screen, but it’s fast and doesn’t eat up your capped data plan.
But I haven’t sent or received a single audio message, and that’s not what I expected:
Leaving a brief audio message is a great way of sending a message that’s more personal than a text message, and less interruptive than a phone call. Apple has executed many aspects of this feature remarkably well, too. The resulting audio files are tiny and heavily-compressed — typically less than 1 KB per second — making them perfect for a simple voice message that sends and receives quickly. When the other end receives the message, they don’t have to interact with the notification at all. They can simply raise their phone to their ear and the audio message will play.
Nothing about the execution seems flawed to me, save for the easy-to-trigger gesture for sending these recordings. I don’t think my friends are especially rude, either. Perhaps it’s just easy enough to decide whether to send a text message or make a phone call, and there isn’t much wiggle-room in between.
Keyboard
I have complaints.
I’ve been using the soft iOS keyboard since 2007, so I’ve become acclimated to its rather unique characteristics. When Apple changes something about it, I notice. And, oh boy, have I noticed some changes.
The most significant change, by far, is the introduction of predictive typing, dubbed QuickType. Appearing in a strip above the main keyboard area are three cells guessing at what you might type next. Sometimes, it’s pretty clever: when given an a or b choice in a text message, for example, it will display both a and b as predictive options. I like that.
What I don’t like is what this has done to autocorrect. I’m not sure if it’s entirely a side effect of the predictive typing engine, but autocorrect’s behaviour has changed in iOS 8 and it drives me crazy.
When the QuickType bar is enabled, the autocorrect suggestion will appear in the middle cell of the bar instead of as a floating balloon above the word, as it has done since the very first version of iOS. I find this far too subtle. Even more subtle is the way you ignore the autocorrect suggestion: since the bubble doesn’t exist for you to tap on to ignore it, you tap on the leftmost cell of the QuickType bar with your verbatim spelling. And that feels really weird to me.
This behaviour is something I never got used to, so I turned off the predictive keyboard days after publishing my review in September. This brings the keyboard back to a more iOS 7-like state, with classic autocorrect bubbles. But I still think something’s going on under the hood with the autocorrect engine. I can’t prove it, but suggested corrections have become substantially worse after I upgraded to iOS 8, and I’ve heard similar stories from others. I’m not sure my perception matches reality; it might simply be confirmation bias. But it feels like I’m getting worse suggestions than I did previously.
Apple also has yet to fix the awful shift key added in iOS 7.1. I’ve heard rumours that the iOS 9 keyboard’s shift key has been redesigned. We’ll see.
In better news, the emoji keyboard was significantly improved in iOS 8.3, with an infinitely scrolling palette instead of siloed sections. It makes a lot more sense, and it’s a welcome change. Unlike its OS X counterpart, however, it doesn’t provide search functionality, nor does it include a full extended character palette. It would be great if a future version of iOS included these features, especially on iPad.
Camera
I take a lot of pictures on my phone, so I’m almost certain that few new iOS 8 features have seen more use from me than manual exposure compensation. Oh, sure, the iPhone still has the best ratio of photo quality to the amount of effort required to take it. But now you can put in a hair more effort, in a simple way, and get a hell of a lot better photo out of it.
One of the things I discovered through using this feature all the time is that it’s possible to layer exposure compensation with focus lock or HDR. This means it’s possible to capture far better images of high-contrast scenes, like sunrises and sunsets, or live concerts. It’s also possible to abuse this feature to create excessively over- or under-exposed scenes, which can be used to interesting effect.
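The slider in the Camera app sits on top of the manual camera controls that AVFoundation gained in iOS 8, and third-party camera apps can layer an exposure bias onto a locked focus in much the same way. Here is a minimal sketch of that underlying API; the function name and the `device` parameter are mine, not Apple’s, and this is not how Apple’s Camera app is necessarily implemented.

```swift
import AVFoundation

// Rough sketch of the manual exposure control iOS 8 exposes to camera apps.
// `device` is assumed to be the active AVCaptureDevice for the back camera.
func nudgeExposure(on device: AVCaptureDevice, stops bias: Float) {
    do {
        try device.lockForConfiguration()

        // Lock focus first so the exposure tweak is layered on top of it,
        // much like tapping to focus and then dragging the sun slider.
        if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }

        // Clamp the requested bias to what the hardware supports, then apply it.
        let clamped = max(device.minExposureTargetBias,
                          min(bias, device.maxExposureTargetBias))
        device.setExposureTargetBias(clamped, completionHandler: nil)

        device.unlockForConfiguration()
    } catch {
        print("Could not lock the camera for configuration: \(error)")
    }
}
```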
Like its slow-mo counterpart, the new time-lapse video feature isn’t something anyone I know — save Casey Neistat — uses very often, but it’s great to have when you want it. It’s the kind of feature that seems unnecessary most of the time, but when you need it, you probably don’t want to dig around for a third-party app; it’s better to have it built in. The results are excellent, rivalling — in practical terms — what I get doing something similar on my DSLR, without any of the work a DSLR time-lapse requires.
Photos
The biggest enhancement to Photos in iOS 8 was iCloud Photo Library. I covered my experiences with iCPL in my review of Photos for OS X; here’s an excerpt:
Uploading [gigabytes of] photos on my home broadband connection took what I imagine is a long time, but I’m not certain exactly how long because it’s completely invisible to the user. It runs in the background on your Mac and on your iPhone, when you’re charging and connected to WiFi. I can’t find any setting that would allow you to do this over LTE, but I’m not sure why you’d want to — photos taken on my 5S are something like 2-3 MB apiece. (I’m aware that this paragraph is going to sound incredibly dated in a few years’ time, for lots of reasons.)
And this is primarily what sets iCPL apart from backup solutions like Backblaze, or other “automatic” photo uploaders like Google+ or Dropbox: it’s automatic and baked in at the system level. Dropbox can’t do that on iOS because it can’t stay running in the background or spawn daemons to do the same. On a Mac, it’s a similar story: because Power Nap doesn’t have a public API, competing services can only sync while the Mac is awake. iCPL, on the other hand, can take full advantage of being a first-party app with full private API access, so it continues to sync overnight. Nice, right?
In short, it’s a pretty nice multi-device backup and syncing solution for your photos and videos. One thing I neglected to mention in that review, though, is an obvious caveat: it’s an Apple-devices-only black box. So if you’re in a mixed-device household, or you are reasonably skeptical of putting all your photos in one cloud, iCPL is probably not for you.
You can now search your Photos library, too, by location — including “Nearby”, which is a nice touch — title, description, and other metadata. Much of this information is weirdly only editable in other applications, like any of Apple’s OS X photo apps; you can’t categorize photos in this fashion on iOS. I’ve found that it’s still way too much work to try to tag even some of my photos with extended metadata. If this functionality is actually to be used by a significant user base, it needs to be far less work and far more automated.
Health
Here’s a feature I was really interested in using over a long period of time. My iPhone has a record of pretty much every step I’ve taken since August, and that paints an intriguing picture of my life since then. On a daily or weekly basis, I can identify my sedentary desk job with peaks of activity surrounding it, then a sharp spike in my weekend activity. Over the course of the past several months, I can see exactly when winter hit, and a rise since the beginning of May, when it started to get warm again.
Of course, these sorts of data points are expected. You could probably draw a similar activity graph based purely on guesswork, without needing to record each individual step. But the collected aggregate data feels meaningful specifically because it is not guesswork. You carry your phone and, if you have one, your Apple Watch with you pretty much everywhere, so it’s probably one of the best ways to gather data on your physical activity.
But Health is not a great way to actually view a lot of that information. Steps are charted, for example, but the y-axis only has minimum and maximum markings, so it’s not possible to see precisely how many steps you took on any day but today. Maybe that’s a good thing; maybe, as Federico Viticci alluded to, it isn’t necessary to precisely document everything, because ten, or twenty, or fifty steps in either direction isn’t actually going to make that much of a difference. But it’s not easy to estimate, because the axis’ scale varies day-to-day, especially if you have an erratic activity cycle.
The next level of granularity can be found by tapping on the chart, then tapping on “Show All Data”. This turns it from a level of detail that is incomprehensible because it is lacking detail into a level of detail that is incomprehensible because it offers far too much detail. This view is a simple table of every step you’ve taken, grouped into activity “sessions”, as best the system can discern it. For me, it displays — after taking many minutes to load — brief stints of seconds-to-minutes of activity, with tens-to-hundreds of steps. Tapping any given cell will allow you to edit the number of steps. That’s it. This is the same view as the calorie counter, or calcium intake, or sleep tracker, or any of the other HealthKit functions, but it simply doesn’t scale well to the number of steps taken in a day.
HealthKit
The trouble with Health is that it doesn’t actually do anything with the information it collects. Sure, I can see that I took about 11,000 steps yesterday, but is that good? Does that mean anything? Health feels like it’s only a dashboard and settings panel for a bunch of third-party functionality by way of HealthKit. Apple explains the framework thus:
HealthKit allows apps that provide health and fitness services to share their data with the new Health app and with each other. A user’s health information is stored in a centralized and secure location and the user decides which data should be shared with your app.
In a nutshell, it’s a unique, specific, and secure centralized database for health and fitness information.
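In code, that database is fronted by HKHealthStore: an app asks the user for permission to read specific types, then queries them. Here is a rough sketch of reading today’s step count, roughly what a dashboard-style app would do; the structure follows the public HealthKit API, but the surrounding code and output are mine.

```swift
import HealthKit

let store = HKHealthStore()
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!  // stepCount always exists

// The user decides what to share: reading steps requires explicit authorization.
store.requestAuthorization(toShare: nil, read: [stepType]) { granted, _ in
    guard granted else { return }

    // Sum every step sample recorded since midnight.
    let sinceMidnight = HKQuery.predicateForSamples(
        withStart: Calendar.current.startOfDay(for: Date()),
        end: Date(),
        options: .strictStartDate)
    let query = HKStatisticsQuery(quantityType: stepType,
                                  quantitySamplePredicate: sinceMidnight,
                                  options: .cumulativeSum) { _, stats, _ in
        let steps = stats?.sumQuantity()?.doubleValue(for: .count()) ?? 0
        print("Steps so far today: \(Int(steps))")
    }
    store.execute(query)
}
```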
I spent some time with a few different apps that tap into HealthKit, including Strava, Lifesum, UP Coffee, Human, and Sleep Better. I’m not going to review each app individually, but there are common threads among many of the HealthKit apps: most require some kind of manual data entry, and this can be tedious.
If you want reasonably accurate meal tracking with Lifesum, you need to enter every single food item you eat. That can be hard if you cook most meals yourself using fresh or raw ingredients, and don’t eat out at chain restaurants. (I’m not being preachy or elitist; it’s just my lifestyle.) Not all ingredients or meals will have all nutritional data associated with them, so it’s not entirely accurate or helpful for tracking specific intakes, like iron or vitamin B12.
Similarly, tracking my caffeine intake with UP Coffee requires me to manually input my caffeinated beverage consumption. That’s somewhat easier than meal tracking because I typically drink coffee fewer times per day than I eat.
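For what it’s worth, the write side of that manual entry is tiny: logging a coffee amounts to saving a single quantity sample into the shared database. This is a hypothetical sketch, not UP Coffee’s actual code, and the 95 mg default is just a ballpark figure for a cup of brewed coffee.

```swift
import HealthKit

// Hypothetical sketch of logging one cup of coffee as a caffeine sample,
// roughly what a tracker like UP Coffee writes into the shared database.
func logCoffee(to store: HKHealthStore, milligrams: Double = 95) {
    guard let caffeine = HKQuantityType.quantityType(forIdentifier: .dietaryCaffeine) else { return }

    let amount = HKQuantity(unit: .gramUnit(with: .milli), doubleValue: milligrams)
    let now = Date()
    let sample = HKQuantitySample(type: caffeine, quantity: amount, start: now, end: now)

    store.save(sample) { success, error in
        if !success { print("Couldn't save the caffeine sample: \(String(describing: error))") }
    }
}
```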
But apps that are able to more passively collect this data, such as Sleep Better or Strava, are naturally much more intuitive because using them doesn’t feel like data entry or labour. I understand the limitations of meal tracking and how monumentally difficult it would be to automate something like that, but the side effect of manual data entry means that I’m less likely to follow through. Since I’m at a healthy weight and average fitness level, I suppose that tracking this information is less compelling than it would be for, say, someone with health or fitness goals.
I looked for apps that could provide some recommendations based on my HealthKit data. Aside from apps that remind me to stand up every so often, there’s not a lot that I could find on the Store, though I suspect there are legal reasons for that. I also looked for apps that could provide this data to my local health care provider, but I couldn’t find any hospital or clinic in my area that has an app with that functionality.
Health and HealthKit haven’t radically transformed my life or made me healthier, but they have made me more aware of my activity levels, my food intake, and my sleep habits. Moreover, some apps have made it downright fun to keep track of my activities. I feel pretty good when I see I’ve walked 20,000 steps in a day, or that my sleep was 90% efficient. I bet if I combined this with an Apple Watch, I’d have a fantastic time ensuring I stay physically active, like I’m being coached.
Continuity
Perhaps the quietest improvement to iOS 8 is Continuity, a set of functions that use WiFi and Bluetooth to make working between iOS and OS X hardware more seamless.
Cellular Over WiFi and Bluetooth
Some functions allow non-cellular devices to bridge to the cellular network via an iPhone, thereby allowing you to send and receive text messages, make and receive phone calls, and create an instant personal hotspot. The latter isn’t new, per se, but it is vastly enhanced. Previously, you had to fish around for your phone, open Settings, toggle Personal Hotspot to “on”, and type the password on your Mac. A true ordeal. Now, you can just select your phone from your Mac’s WiFi menu, even if Personal Hotspot isn’t on.
I’ve found these technologies useful roughly in the order in which I’ve listed them. I send and receive texts all the time on my Mac, and it’s wonderful. Sometimes, I’ll be trying to organize a gathering with a few friends, some of whom use iPhones, and some that do not. It’s very nice to be able to switch between the conversations as each person replies, and not have to pick up my phone to answer messages from the non-iPhone users. My biggest quibble is that the read status of text messages is not synced, so messages I’ve read or deleted on my iPhone remain “unread” on my Mac.
I have made and received phone calls on my Mac, too, and I actually quite like it. It’s very nice to be able to chat on the phone the same way I take FaceTime calls, for example. The biggest limitation, for me, has been the lack of a keypad on OS X, which means I can’t buzz people into my apartment from the comfort and convenience of my Mac.
The always-available personal hotspot function is something I have used only a couple of times, but has been effortless and seamless each time. It’s kind of like my WiFi-only iPad has transformed into the WiFi and cellular model, or my MacBook Air turned into that 3G-capable MacBook Pro prototype. Better than either of those two, though, is that I don’t have to buy an extra cellular data plan. It’s not like my phone didn’t have this functionality before, but making it so easily accessible is a very nice touch.
Handoff
The final bit of Continuity technology is Handoff, which allows you to start a task on one device and continue it on another. If this sounds familiar, it’s because it’s something Apple tried to do in 2011 with iCloud Sync, kind of. You could start bashing out a document in Pages on your iPhone on your commute, then open Pages on your laptop at work and open the document from iCloud. Apple even bragged that the text insertion point would be in the same place.
Unlike iCloud Document Sync, which was just a document syncing model, Handoff is an activity syncing model.
Handoff is much more clever. Basically, if you have devices signed in with the same iCloud account, an activity that you start in the foremost app on one device can be picked up in the same app on another device. That means you can find directions in Maps on your Mac, then grab your iPhone, slide up on the little Maps icon on the lock screen — opposite the Camera shortcut — and now your directions are loaded onto your iPhone. Seamless, in theory.
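The machinery behind this is NSUserActivity: the frontmost app publishes a small activity object, and nearby devices signed into the same iCloud account advertise it on the lock screen. A minimal sketch, with a made-up activity type and function name:

```swift
import UIKit

// A minimal sketch of how an app advertises a Handoff activity.
// "com.example.reader.article" is a hypothetical activity type string.
func startHandingOff(articleURL: URL, from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.reader.article")
    activity.title = "Reading an article"
    activity.webpageURL = articleURL          // lets Safari pick it up if the app isn't installed
    activity.userInfo = ["url": articleURL.absoluteString]

    viewController.userActivity = activity
    activity.becomeCurrent()                  // this is what makes the icon appear on other devices
}

// On the receiving device, the app delegate restores the activity in
// application(_:continue:restorationHandler:) and rebuilds its UI from userInfo.
```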
In practice, I’ve found Handoff to be significantly less reliable than its other Continuity counterparts. I can almost always send text messages from my Mac and answer phone calls, but transferring a webpage from my Mac to my iPhone is a dice roll. It will go for days without working, then behave correctly for a while, then not work again, in precisely the same conditions. I’m at a complete loss to explain why.
When it works, though, it’s pretty cool. I can start reading an article on my Mac, then realize it’s getting late and hop on my iPad in my bedroom, and it’s right there. It’s a subtle and quiet feature, but it’s very impressive. When it works.
AirDrop
It always struck me as bizarre that iOS devices and Macs both had a technology called AirDrop, but they behaved completely differently and didn’t work with each other. As of iOS 8 and Yosemite, the madness has ended: AirDrop works the same way on both platforms, and you can share between both kinds of devices.
And I must say that it works pretty well. I AirDrop stuff all the time to my friends, and between my own devices. It’s an extremely simple way of instantly sharing pretty much anything to someone nearby, and I do mean “instant”: whatever you share will open automatically on the receiving device. If that’s not necessarily what you want, you can share via Messages or something; there is, as far as I know, no way to change this behaviour.
Spotlight
Oh boy, here’s something I love. Spotlight has been turned from a simple and basic way of searching your device into a powerhouse search engine.
Spotlight integrates itself into iOS in two ways: the now-familiar yet still-hidden swipe-down gesture on the home screen, and in Safari’s address bar. Search results powered by the same engine are also available via Siri. The engine surfaces recently-popular or breaking news stories, Wikipedia articles, suggested websites, and items from Apple’s online stores.
In practice, this means I can go directly to the stories and items that are immediately relevant, bypassing the previously-requisite Google search or typing Wikipedia’s address into Safari. If I’m curious about something these days, I just type it into Spotlight, and it usually finds something I’m looking for.
A Brief Word on Apple’s Increasing Self-Promotion
Back in iOS 5, Apple began encroaching on their mobile OS with self-promotion, adding an iTunes button to the Music app. It was a small tweak, but a gesture that signified they wanted to push additional purchasing options into the OS. In the releases since, the self-promotion opportunities within iOS have only increased.
As I mentioned above, items from Apple’s online stores sit amongst the results delivered by Spotlight. If I search for “Kendrick Lamar”, it will present me with an iTunes link as the top hit, not the Wikipedia bio I — and, I’d venture a guess, most people — would be looking for.
iOS 8 also has a feature that suggests apps from the App Store based on your location. Passing by a Starbucks? If you don’t have the Starbucks app on your phone, you might see a Starbucks logo in the lower-left corner of your lock screen — the same place where a Handoff app usually appears.
In a similar vein is the rise of non-removable default apps. As of iOS 8.2, Apple added six compared to iOS 7: Podcasts, Tips, Health, iBooks, FaceTime, and Apple Watch. There are 31 total apps on a default iPhone running iOS 8.2 or higher that the user cannot remove, and every single person I know with an iPhone has a folder on one of their home screens where they stash at least half of these default apps. These are not tech-savvy people; they do not read Daring Fireball, nor do they know what an API is. But they know this sucks.
We all put up with tech company bullshit. When you throw in your lot with Google, you know that the reason they’re really good at “big data” things is because they’re mining your information, along with everyone else’s, to build comprehensive associative databases. When you buy into Apple’s ecosystem, you know that their primary income source is your purchase of their products and service offerings. It makes business sense to integrate prompts for those offerings into an operating system with a reach in the hundreds of millions. It’s marketing that the company controls and for which they do not pay a dime.
Yet the constant reminders that I’m using an Apple OS on an Apple phone, complete with Apple’s services and “oh, hey, look: they make a watch now”, are grating. I wouldn’t mind it so much if there were a way to dial back pretty much any of these pain points individually, but the options are limited. There’s some relief: you can disable suggested apps on the lock screen, and screenshots of iOS 8.4 suggest the Store button is being removed from Music. I’m not suggesting Apple throw away their self-promotional activities entirely, but I think it would be prudent to evaluate just how many of them are tolerable. It’s stretching the limits of user-friendliness.
Third-Party Extensibility
Much in the way that iOS 7 was kind of a soft reboot for the OS — a new 1.0, if you like — iOS 8 is the 2.0 “developer” release. Apple delivered in spades at WWDC last year, with new APIs that allow for the kind of inter-app operability and deep integration that makes the OS better for developers and users alike.
App Extensions
App Extensions have completely and fundamentally changed the way I use iOS. That’s this section, in a nutshell. There was a lot of major news at WWDC last year, from an entirely new programming language to a full OS X redesign, but App Extensions are among the most significant enhancements. And, as in my review last year, I want to tackle each of the six extension points individually. Kind of.
Sharing and Actions
I’m going to start with these two in conjunction because there seems to be little understanding or consistency as to how they are distinct. To me, Sharing extensions should mean “take this thing I’m looking at and send it to another app”, while Action extensions should mean “do something to this thing I’m looking at, in place”. Or, perhaps Share should mean “pull up a dialog for me to take further actions on this thing I’m looking at”, and Action extensions should mean “take this thing I’m looking at out of this app and into another”.
But even with my fuzzy understanding, there are seemingly no rules. Pinner and Instapaper both have modal-type Share extensions, but adding a todo to Things with its Action extension also pulls up a modal sheet. Meanwhile, the Bing translation Action translates in-place on any webpage. Both kinds can accept the same kinds of data, as declared by the developer in the extension’s activation rules, and both amount to doing stuff in one app via another.
The best way I can think of distinguishing between the two types is that a Sharing extension always displays a modal interface overtop an app, while an Action may or may not.
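Concretely, a Share extension is a view controller that the system presents over the host app, with the selected items attached. Here is a minimal sketch of a modal Share extension built on SLComposeServiceViewController, assuming a hypothetical bookmarking service; the class name and the destination are illustrative, not from any shipping app.

```swift
import Social
import MobileCoreServices

// Minimal sketch of a modal Share extension for a hypothetical bookmarking
// service. The system presents this sheet over the host app.
class ShareViewController: SLComposeServiceViewController {

    override func isContentValid() -> Bool {
        // Enable the Post button whenever there's any text at all.
        return !contentText.isEmpty
    }

    override func didSelectPost() {
        // The host app hands over the selected items as attachments.
        let attachments = (extensionContext?.inputItems.first as? NSExtensionItem)?.attachments ?? []
        for provider in attachments where provider.hasItemConformingToTypeIdentifier(kUTTypeURL as String) {
            provider.loadItem(forTypeIdentifier: kUTTypeURL as String, options: nil) { url, _ in
                // Send `url` and the sheet's text off to the bookmarking service here.
            }
        }
        extensionContext?.completeRequest(returningItems: [], completionHandler: nil)
    }
}
```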
In any case, I’ve found both kinds of extensions extremely useful. Where previously the app you’re sharing from had to decide which destinations to support, now the receiving app gets to make that decision, for the most part. It makes more sense: you decide what mix of apps go on your device, and those apps should defer to your choices. Now, I get to decide what goes in my Share sheets. I can even — surprise, surprise — turn off some of Apple’s defaults. Don’t use the default Facebook or Twitter integration? No problem – just flip the toggle off and you won’t see them. (This, unfortunately, doesn’t apply to Action extensions, so you’ll be seeing that printer icon everywhere whether you like it or not.) Want to see some third-party Sharing extensions, but not others? Just flip their toggles. You can also sort your Share and Action extensions in any order you’d like.
That brings me to the biggest problem with third-party extensions: newly-installed extensions are completely undiscoverable. There is no visual indication when an app is updated with Share or Action extension support, and extensions come disabled by default. You will only figure it out if you scroll right to the end of your row of extensions, tap the “More” button, then scroll through the list. The only other way you may find out is if the developer has mentioned it in their update notes and you bother to check the changelog, which you won’t, since you, like most people, probably have automatic updates enabled and haven’t seen the Updates tab of the App Store in, like, forever. Even after all the time I’ve spent using iOS 8, and all the time apps have had to support it, I’m still finding new extensions in Share sheets.
What’s fantastic about Share sheet extensions is that they make any old app that uses the default sharing API feel instantly tailored. It means developers don’t have to support every bookmarking service individually, or pick and choose the ones they want to support; they can just tell the sharing API to handle it. I use Pinboard and Instapaper; you may prefer Pocket, Pinterest, or whatever new thing A16Z is investing in. That’s a lot of different sharing APIs to support. Even the login experience is far better for users, who now only have to sign in once with the client app, instead of each app individually.
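From the sharing app’s side, all of that comes along with the stock share sheet: present a UIActivityViewController with the items to share, and whatever extensions the user has enabled appear automatically. A rough sketch, with hypothetical function and parameter names:

```swift
import UIKit

// From the sharing app's side, the whole ecosystem of Share and Action
// extensions comes along for free with the stock share sheet.
func share(url: URL, text: String, from viewController: UIViewController) {
    let sheet = UIActivityViewController(activityItems: [text, url],
                                         applicationActivities: nil)

    // An app can still hide stock activities it doesn't want...
    sheet.excludedActivityTypes = [.assignToContact, .print]

    // ...but the user's own third-party extensions appear automatically.
    viewController.present(sheet, animated: true)
}
```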
I simply can’t say enough good things about Sharing and Action extensions.
Today Widgets
I can, however, say far fewer good things about the widgets that can occupy the Today view in Notification Centre. Generally, this isn’t Apple’s fault; rather, it’s the problem of developers wanting to use a new feature, but not having a solid justification or considered concept for doing so. This mindset has led to the creation of widgets such as Dropbox’s, which displays the last three files that were updated, or Tumblr’s trending topics widget. Then there are widgets like the NYT Cooking widget that suggests meals to cook, which sounds great in theory, but makes no sense as a random-access widget.
None of these widgets account for the ways normal people use Notification Centre. I’ve found that widgets that succeed in the Today view are conceptually similar to WatchKit apps: at-a-glance information that requires little interaction. Human’s widget, for instance, displays your current progress to 30, 60, or 90 minutes of daily activity. It requires no user interaction and is just informative enough.
Apple’s own widgets can be hidden in the Today view, too, with the exception of the date at the top. That creates an excellent opportunity for third parties to create more specific interpretations of those widgets. For example, I don’t drive to work, so Apple’s estimated commute time widget is useless to me. I do, however, wish to know when the next train is arriving, so I keep Transit’s excellent widget in my Today view. Similarly, Apple’s weather reading sometimes doesn’t show current conditions. The Fresh Air widget, on the other hand, always does, and it forecasts the weather for calendar events.
But while the best Today widgets display low-barrier glanceable information, none of them feel particularly instantaneous. This is due in large part to their standby state; or, more specifically, it’s due to the time it takes to recover from their standby state. Today widgets are required to be low-power, low-memory kinds of deals which only refresh when the user is viewing Notification Centre. While that makes sense, iOS only refreshes widgets when the animation that shows Notification Centre is fully complete. So: you drag from the top of the screen to show Notification Centre, see glimpses of cached information as you drag the sheet down, see a flash of every widget refreshing, then you can interact with any of them. It only takes a couple of seconds, but it makes for a user experience that is rougher than it should be for timely widgets like these. It would feel a lot more instantaneous if Today widgets, or at least the first few, refreshed as Notification Centre is being activated, rather than at the end of the activation.
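That standby-and-catch-up cycle is driven by the NCWidgetProviding protocol: iOS calls widgetPerformUpdate when the widget is about to be shown, and the widget reports whether anything changed. A minimal sketch, assuming a hypothetical weather widget with a single label:

```swift
import UIKit
import NotificationCenter

// Minimal sketch of the refresh cycle for a hypothetical weather widget.
class WeatherWidgetController: UIViewController, NCWidgetProviding {
    @IBOutlet private var temperatureLabel: UILabel!
    private var lastReading: Int?

    // iOS calls this when the widget is about to be shown (or scrolled into view),
    // which is why widgets visibly "catch up" after Notification Centre appears.
    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        fetchTemperature { [weak self] reading in
            guard let self = self else { return }
            if reading == self.lastReading {
                completionHandler(.noData)      // nothing changed; keep the cached snapshot
            } else {
                self.lastReading = reading
                self.temperatureLabel.text = "\(reading)°"
                completionHandler(.newData)
            }
        }
    }

    // Placeholder for whatever network or cache lookup the widget actually does.
    private func fetchTemperature(_ completion: @escaping (Int) -> Void) {
        completion(21)
    }
}
```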
Furthermore, widgets refresh when you scroll to bring them into view. I have Fresh Air at the top of my Today view, and Fantastical just offscreen below it. If I invoke Notification Centre, Fresh Air will refresh; when I scroll, Fantastical will refresh; then, when I scroll back up, Fresh Air will refresh again. This behaviour is apparently something that iOS does, and something that developers cannot control.
Finally, though Apple provides several of their own Today widgets, there doesn’t seem to be an agreed-upon set of visual interface rules. Most widgets respect the same left-side padding of the default ones, and many have similar typographic and hierarchical treatments, but then you get the odd monstrosity like Yahoo’s Weather widget or the aforementioned Tumblr one.
Today widgets should feel like smooth, passive ways to get small snippets of timely or location-related information. Instead, they come off a little janky. They’re a real missed opportunity for some developers, and a kludgy add-on for most. Hopefully the Watch will force the kind of focus demanded by widgets in iOS.
Photo Editing
Photo editors on iOS are kind of my thing. I’ve used all of the popular ones and, though I’ve settled on a workflow that I like, I keep a bunch of others on my iPhone just in case. Yet, after nearly nine months of the possibility of extensions to Apple’s default Photos app, just two apps on my phone — Afterlight and Litely — have such an extension, and I’ve tried a few dozen of the most popular ones. Even my beloved VSCOcam doesn’t have a Photos extension, despite being used in the demo of this API at WWDC. (I reached out to VSCO for comment on this, but I haven’t heard back from them.)
As for the apps that do have an extension for Photos, well, they’re okay. I find that they’re really hidden — instead of residing in the palette of editing tools at the bottom, there’s a little ellipsis in the upper-left corner of the app. Tap on it, then tap the extension you’d like to use, or tap More to see if any other extensions have been installed since you last did this — Photo extensions are hidden and disabled by default, like Sharing extensions.
It’s hard to give a generalized take on what Photo extensions are like, or typify their experience, but I’m going to try to do that. In order to do so, I had to go grab a few more apps. I downloaded Fragment, which does some rather trippy kaleidoscopic effects, and Camera+, which pained me to download because I think John Casasanta is kind of an asshole. But let’s not dwell on the past.
The extensions from both Fragment and Litely are somewhat lighter-weight versions of their parent apps, while Camera+ and Afterlight provide near-full experiences. That’s kind of cool to have in an extension: nearly running one app inside of another. You can make your edits, then tap Done, and the photo will be saved in-place in a flattened state; there is no granular undo post-save, nor is there a way to modify the applied edits. The original copy of the photo is saved, however, so you can revert entirely, but this, of course, destroys all of the edits made to a photo.
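That flattened result is partly baked into the extension point itself: a photo extension is a PHContentEditingController that hands back a rendered image plus an opaque blob of adjustment data, which only the same extension can later reinterpret. A skeletal sketch, with a made-up format identifier:

```swift
import Photos
import PhotosUI

// Skeleton of a photo editing extension. The edited image is returned
// flattened; the adjustment data is an opaque blob only this extension
// (identified by a hypothetical identifier) can later reinterpret.
class FilterEditingController: UIViewController, PHContentEditingController {
    private var input: PHContentEditingInput?

    func canHandle(_ adjustmentData: PHAdjustmentData) -> Bool {
        // Returning true would let us resume our own previous edits.
        return adjustmentData.formatIdentifier == "com.example.filters"
    }

    func startContentEditing(with contentEditingInput: PHContentEditingInput, placeholderImage: UIImage) {
        input = contentEditingInput   // the full-size image URL lives on this object
    }

    func finishContentEditing(completionHandler: @escaping (PHContentEditingOutput?) -> Void) {
        guard let input = input else { completionHandler(nil); return }
        let output = PHContentEditingOutput(contentEditingInput: input)
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.filters",
                                                 formatVersion: "1.0",
                                                 data: Data())   // would encode the filter settings
        // Render the filtered image, write a JPEG to output.renderedContentURL, then:
        completionHandler(output)
    }

    var shouldShowCancelConfirmation: Bool { return false }
    func cancelContentEditing() {}
}
```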
I’m struggling to understand the practical purpose of this extension point as it exists right now. The full apps are still required to exist somewhere on your phone; even if they’re buried in a folder somewhere, they must be installed. Perhaps an ideal world would require you only to open Photos any time you wanted to make an edit, and future versions of the API will allow for nondestructive editing between several extensions. But I don’t see the power of this yet. It seems too hidden for average users, and not powerful enough for people who want a full post-production environment on their phone or tablet.
Keyboards
There are some things I imagined Apple would never allow on iOS; third-party keyboards are among those things. Yet, here we are, with third-party keyboards supported natively in iOS, before the introduction of a third-party Siri API, for example.
I have tried pretty much all of the popular third-party keyboards for iOS — Fleksy, Swype, SwiftKey, Minuum, and so forth — running them for days to weeks at a time. And the keyboard I’ve stuck with has been — [dramatic pause] — the default one, for a singular reason: it’s the only one that feels fast.
Sure, pretty much all of the third-party keyboards you can find have a way better shift key than the default, and plenty are more capable. But I don’t type one-handed frequently enough to get much use out of a gestural keyboard like Swype; most of the time, I find these gestures distracting. Third-party keyboards also don’t have access to the system’s autocorrect dictionary, which means that developers need to build in their own autocorrect logic and users need to train the new keyboard. I didn’t think this would be as frustrating as it turned out to be. Third-party keyboards also can’t automatically switch languages depending on which Messages conversation you’re in, which is something I don’t use, but plenty of people I know do.
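For context, a custom keyboard is a UIInputViewController that can only insert and delete text through a document proxy; nothing in the API hands it the system’s dictionary, which is why every third-party keyboard ships its own engine. A bare-bones sketch, with a single hypothetical key:

```swift
import UIKit

// Bare-bones sketch of a custom keyboard. It can only touch the text it's
// typing into through textDocumentProxy, and it gets no access to the
// system's autocorrect dictionary.
class MinimalKeyboardViewController: UIInputViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let key = UIButton(type: .system)
        key.setTitle("hello", for: .normal)
        key.addTarget(self, action: #selector(insertHello), for: .touchUpInside)

        key.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(key)
        NSLayoutConstraint.activate([
            key.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            key.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }

    @objc private func insertHello() {
        textDocumentProxy.insertText("hello ")
    }

    // Every keyboard must also offer a way to switch to the next keyboard;
    // UIInputViewController provides advanceToNextInputMode() for that.
}
```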
But, as I wrote above, the main reason I stuck with the iOS keyboard is that it’s the fastest one. It launches immediately when it’s called and key taps are registered as fast as I can type with my two thumbs. That’s not to imply that I don’t have complaints with the default keyboard — I do, or have you not been reading? — but it’s simply the best option for me. And, judging by the people I’ve talked to, it’s the best option for most of them as well. Like me, they tried the most popular ones and, like me, most of them are back with the default.
The ones who have stuck with third-party keyboards have done so for reasons I didn’t necessarily think of. Kristap Sauters, for example, has found that SwiftKey is far better at swapping languages dynamically: he can start a sentence in one language, type a word from another, and it will detect this change better than the default. This is not a feature I would have found because it isn’t one I use.
The best third-party keyboards for my usage are those that do not try to replace the default alphanumeric one, but rather try to do things it can’t. Popkey, for example, is a ridiculous animated GIF library, but it’s smartly packaged as a keyboard so you can reply to text messages and emails just so. David Smith’s Emoji++ is another great third-party keyboard that effectively replaced Apple’s segmented emoji keyboard prior to 8.2, but it was Sherlocked with iOS 8.3.
I’m not sure whether the issues I have with third-party keyboards are the fault of iOS’ implementation, or the keyboards’ developers. Whatever the case, it’s enough to prevent me from using a non-default keyboard on a regular basis.
Documents and Files
iOS now has an exposed file system! Kind of.
Technically two extension points, Document and File providers allow an app to identify itself as a place where other apps can send and receive files. iCloud Drive is an example of a file provider, but now third parties like Dropbox can provide documents to any app that supports it. So you can store your Pages documents in Dropbox instead of iCloud Drive, and have a similar level of syncing between your Mac and iPad.
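From the host app’s side, all of these providers surface through one document picker, whether the backing store is iCloud Drive, Dropbox, or Transmit. A rough sketch of opening a plain-text file that way; the class and method names around the picker are mine:

```swift
import UIKit
import MobileCoreServices

// Sketch of the host-app side: one picker surfaces every document provider
// the user has installed, whether that's iCloud Drive, Dropbox, or Transmit.
class DocumentBrowsingViewController: UIViewController, UIDocumentPickerDelegate {

    func openDocument() {
        let picker = UIDocumentPickerViewController(documentTypes: [kUTTypePlainText as String],
                                                    in: .open)
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
        guard let url = urls.first else { return }
        // Files from a provider are security-scoped; wrap access accordingly.
        if url.startAccessingSecurityScopedResource() {
            defer { url.stopAccessingSecurityScopedResource() }
            let contents = try? String(contentsOf: url, encoding: .utf8)
            print("Opened \(url.lastPathComponent): \(contents?.count ?? 0) characters")
        }
    }
}
```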
Better still is Panic’s creative interpretation of this capability. You can open documents and files from Transmit in other apps, and since Transmit is an FTP client, that basically means that you can open any file you have access to in supported apps. That’s amazingly powerful.
This isn’t a type of extension with which I’ve spent a great deal of time. I’m not Federico Viticci, and I don’t have his automation prowess. But for power users or people who use their iPad as more than a kick-back-and-read device, it seems pretty great.
Notifications
I find it fascinating how iOS and OS X are built by the same company at the same time, but often do not share features; or, at least, their feature additions come at different rates.
The push notification API is the perfect example of the staggered rollout and feature incongruence across Apple’s operating systems. Though notifications have existed since the beginning of the iPhone, they were modal and, at first, weren’t opened up to developers at all. By the time iOS 5 rolled around in 2011, notifications became far more scalable with the introduction of Notification Centre; it took until 2012 for them to be brought to OS X to replace, for most developers, the venerable Growl. In 2013, OS X notifications gained inline actions and replies, but iOS remained stubbornly without either.
So it was a relief when iOS 8 brought actionable notifications to Apple’s mobile platform. Onstage, they demoed archiving an email, replying to an iMessage, and the third-party potential of — for instance — liking a wall post on Facebook. This left me with the impression that I’d be able to reply to third-party notifications inline, too. But it turns out that third-party developers don’t have access to the inline reply API, which is a real bummer.
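For reference, the buttons that third parties do get are declared up front: an iOS 8 app registers its notification actions and categories with the system, and the system draws them under the notification. A sketch of that era’s API, with hypothetical identifiers:

```swift
import UIKit

// Sketch of how an iOS 8-era app declares notification action buttons.
// "SNOOZE_CATEGORY" and the action identifiers are hypothetical.
func registerReminderActions(with application: UIApplication) {
    let snooze = UIMutableUserNotificationAction()
    snooze.identifier = "SNOOZE_ACTION"
    snooze.title = "Snooze"
    snooze.activationMode = .background   // handled without opening the app

    let complete = UIMutableUserNotificationAction()
    complete.identifier = "COMPLETE_ACTION"
    complete.title = "Complete"
    complete.activationMode = .background

    let category = UIMutableUserNotificationCategory()
    category.identifier = "SNOOZE_CATEGORY"
    category.setActions([snooze, complete], for: .default)

    let settings = UIUserNotificationSettings(types: [.alert, .sound], categories: [category])
    application.registerUserNotificationSettings(settings)
}
```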
Interactive notifications are fabulous otherwise, though. I suspect my email open rate has gone down dramatically since I can just deal with new messages as they arrive in both Mailbox and Spark, the two email apps I typically use. I use the “fav” button in Tweetbot notifications frequently as well, and Fantastical’s “snooze” feature for reminder notifications is perfect.
Unfortunately, plenty of third-party developers still haven’t added interactive notifications to their apps. Facebook’s Paper app doesn’t have them, nor does NYT Now, where I could imagine saving a breaking news story for later reading. On the other hand, perhaps it’s best that most developers don’t seem to be trying to shoehorn this feature into apps where it doesn’t belong.
I’m looking forward to further improvements in this space. Ideally, developers will be able to add inline replying, and perhaps they’ll even be able to draw their own custom UIs in notifications — Fantastical could, for example, present snooze options inline. There’s so much potential for notifications, especially in conjunction with the Watch.
The State of iOS
Apple’s mobile operating system has matured into an incredibly robust platform. They’ve spent a lot of time over the past two years rebuilding parts of the OS to make it last another seven or eight years, and things are coming up Milhouse.
But the last two years of defining an entirely new visual design for the platform and an updated functional design have also clearly taken their toll. There have been bugs — a lot of bugs. After several months with iOS 8, I’ve gotten used to some of its little foibles. I learned not to tap the space between the keyboard and the notification area while replying to a message, until that was fixed nearly 300 days after first being reported as a bug in the earliest iOS 8 betas. I learned all sorts of things that I shouldn’t do, and ways of troubleshooting core parts of the OS that really shouldn’t need troubleshooting.
It’s been a really rough ride for developers, too. As a plethora of new capabilities were handed to third parties, the app review team applied widely varying interpretations of the new API usage guidelines. Things that were perfectly fine in one app already sold in the store might not be okay in a different app. Inconsistent rejections hampered developers this year and eroded their confidence in the platform.
There’s a lot for Apple to do this year. They always have a long todo list, but this year’s feels more urgent than most. Apple’s sales have never been better, but the confidence of developers and users feels a little shakier than it has for a while, for both iOS and OS X.
I am excited, as always, for Monday’s keynote. I can’t wait to see what new things developers get to take advantage of, from really big things — like a Siri API, perhaps — to littler things. One thing is for certain: there’s no shortage of things for Apple to do with their platforms. Every year, I feel the same way, no matter how robust and mature their platforms get: they’re just getting started.