Apple Understands Photography, I’m Pretty Sure

Boris Veldhuijzen van Zanten, co-founder of The Next Web, in an article headlined “Apple Doesn’t Understand Photography” that’s getting quite a lot of traction:

The most innovative thing Apple did with their Photos app recently was the addition of a ‘Selfie’ folder. You can find the folder in your Photos app, and yeah, it is filled with selfies.

Apart from that, Apple still thinks we use photography as we did 30 years ago: we go on a trip, take a bunch of photos, then struggle with how to show our friends these photos when we get back from our trip.

I think van Zanten must have missed WWDC, because I can’t otherwise figure out why he wrote this article. Case in point:

What is the problem that needs fixing? It is that photography is changing. I showed my girlfriend some tiny text on the back of a credit card. Without hesitating she pulled out her camera, took a photo, and then zoomed in on the photo to read the text.

The camera in your iPhone is a zoom-in device for small text or objects.

iOS 10 will ship with exactly this: Magnifier, an accessibility feature that turns the camera into a dedicated zoom-in device, invoked via a triple-click of the home button.

Now you could argue there are different apps for different purposes and I should simply use Evernote for notes, Snapchat for disposable stuff and remember to delete the photos I no longer want. But that’s not how life works. I’m paying for lunch and take a quick pic of the receipt. That’s two actions: Swipe up, take photo.

I could also launch my receipts app. That means unlocking my phone, finding the app, launching it, clicking the ‘Add receipt photo’ button, taking the photo, etc. That’s easily 10 steps. Nobody has time for that.

Adding a note in Evernote is also more complicated than just swiping up and launching the camera. But Apple could easily make this easier. If Apple can detect a face in a photo it should be able to detect a receipt as well. If it can detect a selfie surely it can differentiate between ‘holiday photos’ and regular snapshots. In fact, if the photo was taken on a weekday, during work hours, and close to work or home there’s a 95% chance this isn’t some kind of holiday event that needs a photo album.

iOS 10 will also ship with object detection built into the Photos app and, yes, it detects receipts. It doesn’t create albums for any of this stuff; it just tags the photo. This was all announced at WWDC; van Zanten’s article was written nearly a week after the opening keynote.
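And detecting a receipt doesn’t even require private APIs. As a rough sketch of how a third-party app could flag receipt-like photos today, consider Core Image’s public CIDetector, which has shipped a rectangle detector since iOS 8 and a text detector since iOS 9. (The `looksLikeReceipt` helper and its thresholds below are my own illustration, not Apple’s actual pipeline, which is private and surely far more sophisticated.)

```swift
import UIKit
import CoreImage

// Toy heuristic, not Apple's pipeline: a receipt is, roughly, a
// document-shaped rectangle that's dense with printed text.
func looksLikeReceipt(_ image: UIImage) -> Bool {
    guard let ciImage = CIImage(image: image) else { return false }
    let options: [String: Any] = [CIDetectorAccuracy: CIDetectorAccuracyHigh]

    // Rectangle detection has been public since iOS 8.
    let rectangles = CIDetector(ofType: CIDetectorTypeRectangle,
                                context: nil,
                                options: options)?
        .features(in: ciImage) ?? []

    // Text detection has been public since iOS 9.
    let textBlocks = CIDetector(ofType: CIDetectorTypeText,
                                context: nil,
                                options: options)?
        .features(in: ciImage) ?? []

    // Arbitrary illustrative thresholds: one document-shaped
    // rectangle plus several blocks of text.
    return !rectangles.isEmpty && textBlocks.count >= 3
}
```

The point isn’t that this toy heuristic is any good; it’s that the building blocks have been publicly available for years, and what Apple announced for Photos goes well beyond them.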

None of those images are meant to be saved ‘for later’. A year from now nobody will care what I did at 9:06 AM while waiting in line at the coffee bar. It might be interesting to one other person (the person I’m getting coffee for), but it can safely disappear into the void an hour later.

Automatically deleting photos taken with the Camera strikes me as a terrible idea; for photos taken from within Messages, I can see the value. But that’s still the system deleting user data on its own, and that’s almost never okay. Photos already has features that help solve this problem, like iCloud Photo Library and device-optimized storage, so deleting photos is rarely necessary. And the automatic scene detection coming in iOS 10 means far less user intervention is needed to organize and maintain a large photo library.

Are there loads of features I hope to see in Photos and the Camera app in future versions of iOS? Sure. But to argue that Apple “doesn’t understand photography” by citing problems that were already solved at WWDC is careless and lazy.