What Phone Cameras See in 2023
As I’ve already mentioned, these smartphone cameras are so much [about] software now that the photo you get when you hit the shutter button isn’t so much reality as it is the computer’s best interpretation of what it thinks you want reality to look like.
When you snap a photo on your phone, you’re not necessarily getting back a capture of what was really in front of you. The software bends it in many ways. The iPhone’s “thing” is that, when you take a photo, it likes to identify faces and evenly light them. It tries every time.
This is a ‘clever’ step in the iPhone’s photography processing: because it can rapidly segment an image into components, like human subjects, it can apply selective adjustments. What matters is the degree and quality of those adjustments. I was a bit disappointed to find this adjustment seems just as heavy-handed as on the previous iPhone: I have honestly never seen it make for a better photo. The result is simply jarring.
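Apple has not published how this part of the pipeline actually works, but the general technique is easy to sketch: a segmentation model produces a per-pixel mask for a subject, and an adjustment is applied only inside that mask. Here is a toy illustration in Python — the function name, the gamma-style lift, and the `gain` value are my own assumptions for demonstration, not anything from Apple:

```python
import numpy as np

def lift_subject(image, mask, gain=1.3):
    """Brighten only the masked (subject) region of a luminance image.

    image: H x W float array with values in [0, 1]
    mask:  H x W boolean array, True where a subject was detected
    gain:  strength of the lift; 1.0 means no change (assumed parameter)
    """
    out = image.copy()
    # Gamma-style lift: raises dark values proportionally more than
    # bright ones, which is why shadowed faces change the most.
    out[mask] = np.clip(out[mask] ** (1.0 / gain), 0.0, 1.0)
    return out
```

Even this toy version shows why the result can look jarring: pixels just inside the mask are treated differently from pixels just outside it, so any error in the mask or any aggressive `gain` produces a subject that no longer matches the light of the scene around it.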
That’s precisely the issue here. The iPhone’s camera hardware is outstanding, but how iOS interprets and remixes the data it gets fed from the camera often leads to results that I find … boring and uninspired unless I manually touch them up with edits and effects.
The camera system in the iPhone XS was the first time Apple marketed its computational photography efforts with Smart HDR and, perhaps not coincidentally, it was also the first time I can remember there being complaints about over-processed images. The quirks kept coming: last year, the New Yorker carried an article about unnatural-looking iPhone photos.
I wish Apple would offer a way to adjust how aggressive the processing is and/or bring back the Keep Normal Photo option.
Maybe I should be using a third-party camera app, but I haven’t seen this particular option in Halide — I don’t want to save huge RAW files — and there’s still no way to change the default camera app.
After I watched Brownlee’s video, I wondered whether it would make sense for someone to create a third-party camera app focused on a lighter processing touch. I do not know enough about the camera APIs to know whether that is plausible. But, interestingly, Halide has a setting to save only processed HEIC photos, and another setting to turn off Deep Fusion and Smart HDR; Deep Fusion is Apple’s term for its system for improving texture and detail in lower-light photos. That gets partway toward what I want to see.
I tested the effects of this setting by taking two photos on my iPhone 12 Pro in Halide: one with the “Smartest Processing” toggle on, and another of the same scene with it switched off. I found turning it off creates the worst of both worlds: the dynamic range and detail of photos are noticeably compromised, but photos are still passed through the same overly aggressive noise reduction as any other image. In a photo of my unlit dining room shot against a nearby window, the wood grain of the table was evident with “Smartest Processing” turned on, as was the crisp edge of the table top. With “Smartest Processing” turned off, the table was rendered as a brown smear and the edge was uneven. Images with “Smartest Processing” turned on sometimes appear oversharpened, but they are, overall, a better interpretation of the scene.
I also tested this with some photos of my partner, including in dramatic light. I did not see the bizarre face flattening Brownlee saw, but in each example the highlights were handled in ways that made neither the “Smartest Processing” version nor the less processed one look correct.
The problems do not appear to be overprocessing so much as unnatural or unexpected results of processing. Deep Fusion is great; Portrait Mode, as an option, is often excellent as well. But some of the selective enhancements made by the iPhone — the way it slices a scene into individual components for separate adjustments — sometimes fail to resolve into a satisfying final photo. Again, I tested only that one toggle in Halide on my iPhone 12 Pro, and there are probably major differences in photos from any more recent iPhone. There are also many components of the iPhone’s image processing pipeline that have nothing to do with that toggle. However, the same kinds of complaints are being raised by iPhone 14 Pro users, whose phones have a larger high-resolution sensor and far more processing power.
I am on an approximately three-year iPhone upgrade cycle, so I hope Apple relaxes its unnatural photo processing in the 15 Pro models. There is a vast middle ground between the completely unprocessed RAW images nerds like me enjoy working with and the photos produced by the default Camera app. There is room to create images with more character that are also better representations of the scene. Sometimes, the imperfections in a photo — the grain, some slightly blown-out highlights, a white balance that is way too warm — are what give it an emotional quality, and trying to smooth those things out can make it feel sterile and inhuman.
Computers are good at taking precise instructions literally, and there are many ways in which digital versions of things are superior to their analogue counterparts. But that does not always make them better. It is tangential, but I am reminded a little of iTunes’ shuffle function, which would play songs in as jumbled an order as a computer’s random number generator could produce. Users hated it when two songs by the same artist played back-to-back because it felt less random. So Apple introduced Smart Shuffle, which decreased true randomness to create a more varied experience that felt more random. Sometimes, the result to strive for is the one that is not technically correct but feels the most correct.
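Apple never published how Smart Shuffle actually worked, but the idea of trading true randomness for perceived randomness can be sketched: shuffle the playlist uniformly, then make a greedy pass that swaps later tracks forward to break up back-to-back plays by the same artist. This is a minimal illustration of that general idea, not Apple’s algorithm:

```python
import random

def artist_aware_shuffle(tracks, rng=random):
    """Shuffle (title, artist) pairs, then greedily repair the order
    so the same artist is less likely to play twice in a row.

    A hypothetical sketch of "less random so it feels more random" --
    the repair pass is best-effort, not a guarantee.
    """
    order = list(tracks)
    rng.shuffle(order)  # start from a uniformly random order
    for i in range(1, len(order)):
        if order[i][1] == order[i - 1][1]:
            # Same artist twice in a row: look ahead for a track by a
            # different artist and swap it into this slot.
            for j in range(i + 1, len(order)):
                if order[j][1] != order[i - 1][1]:
                    order[i], order[j] = order[j], order[i]
                    break
    return order
```

Note that the repair pass can still leave two same-artist tracks adjacent when one artist dominates the playlist or the conflict lands at the very end, which is fitting: the point is not a mathematical guarantee but a result that feels right to a listener.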