Computational Photography’s Uncanny Valley

Kyle Chayka, the New Yorker:

In January, I traded my iPhone 7 for an iPhone 12 Pro, and I’ve been dismayed by the camera’s performance. On the 7, the slight roughness of the images I took seemed like a logical product of the camera’s limited capabilities. I didn’t mind imperfections like the “digital noise” that occurred when a subject was underlit or too far away, and I liked that any editing of photos was up to me. On the 12 Pro, by contrast, the digital manipulations are aggressive and unsolicited. One expects a person’s face in front of a sunlit window to appear darkened, for instance, since a traditional camera lens, like the human eye, can only let light in through a single aperture size in a given instant. But on my iPhone 12 Pro even a backlit face appears strangely illuminated. The editing might make for a theoretically improved photo—it’s nice to see faces—yet the effect is creepy. When I press the shutter button to take a picture, the image in the frame often appears for an instant as it did to my naked eye. Then it clarifies and brightens into something unrecognizable, and there’s no way of reversing the process. David Fitt, a professional photographer based in Paris, also went from an iPhone 7 to a 12 Pro, in 2020, and he still prefers the 7’s less powerful camera. On the 12 Pro, “I shoot it and it looks overprocessed,” he said. “They bring details back in the highlights and in the shadows that often are more than what you see in real life. It looks over-real.”

I find Chayka’s specific word choice — “digital manipulations [that] are […] unsolicited” — worth thinking about. It is the same sort of question raised by Lux’s look at the iPhone 13 camera system, which Chayka also links to. These are processing choices that go far beyond the decisions made by the developers of film stock or DSLR settings. An iPhone’s camera slices the image into its individual parts and tunes each one, producing images of wildly varying quality.
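To make that "slice and tune" idea concrete, here is a minimal sketch of segmentation-driven processing: split the frame into regions by mask and adjust each region separately. The masks, gains, and simple linear-gain model are my own illustrative assumptions; nothing here reflects Apple’s actual pipeline.

```python
import numpy as np

def tune_regions(image, masks, gains):
    # Apply an exposure gain to each masked region independently — a toy
    # stand-in for per-subject tuning (the linear-gain model is an assumption).
    out = image.astype(np.float32)
    for mask, gain in zip(masks, gains):
        out[mask] *= gain  # e.g. lift a backlit face while leaving the window alone
    return np.clip(out, 0, 255).astype(np.uint8)

# Toy scene: a dark "face" region in front of a bright "window".
scene = np.full((4, 4, 3), 60, dtype=np.uint8)
scene[:, 3] = 230  # rightmost column is the bright window
face = np.zeros((4, 4), dtype=bool)
face[1:3, 1:2] = True

tuned = tune_regions(scene, [face, ~face], [3.0, 1.0])
print(tuned[1, 1], tuned[0, 3])  # face pixels lifted 3x; window untouched
```

This is exactly the kind of result Chayka describes: the backlit face comes out brighter than the optics alone would ever render it.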

There are many things about the default camera app’s processing that are not to my taste, but one tops the list: its aggressive noise reduction. I wish it would back off and permit a little more grain, which gives images texture and sacrifices less fine detail than heavy smoothing does.
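As a toy illustration of that tradeoff — using plain Gaussian smoothing as a stand-in for whatever denoiser Apple actually uses, which is an assumption on my part — the heavier the smoothing, the more the fine structure is flattened along with the grain:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
detail = np.zeros(64)
detail[::8] = 1.0                          # fine structure: sparse sharp "edges"
noisy = detail + rng.normal(0, 0.05, 64)   # grainy, but the texture is intact

gentle = gaussian_filter(noisy, sigma=0.8)      # grain survives, edges mostly do too
aggressive = gaussian_filter(noisy, sigma=3.0)  # clean, but the edges smear away

# The sharp peaks are attenuated far more by the heavy pass:
print(round(noisy.max(), 2), round(gentle.max(), 2), round(aggressive.max(), 2))
```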

Right now, the iPhone’s image processing pipeline sometimes feels like it lacks confidence in the camera’s abilities. As anyone who shoots RAW on their iPhone can attest, the lens and sensor are very capable. They can be allowed to breathe a little more.

This piece opens with the story of a mother of two who recently upgraded her iPhone — from an “iPhone 10”, per the New Yorker’s house style — and was unhappy with her 12 Pro’s photo capabilities. I found the closing part of that story funny:

[…] The new iPhone promises “next level” photography with push-button ease. But the results look odd and uncanny. “Make it less smart — I’m serious,” she said. Lately she’s taken to carrying a Pixel, from Google’s line of smartphones, for the sole purpose of taking pictures.

Google’s Pixel was the phone that really kicked off this computational photography stuff, and its interpretation of images is apparently, to this owner at least, less intrusive. But I do not see the issues she raises, such as desaturated warm tones, in my own 12 Pro’s photos.