Apple Explains How It Thinks About Photography in an A.I. Era theverge.com

For a review of this year’s new iPhone Pro models, Nilay Patel, of the Verge, asked Apple about the company’s view of photography. There is a really good, on-the-record response which seems to draw a clear line on how images are processed. Put simply, Apple’s perspective seems to be to accurately capture a scene as it occurred. While iPhone images have taken on a too-processed look for my liking, the intention seems to be to capture light as it was, not to simulate a memory which never occurred.

Patel:

That’s a sharp and clear answer, but I’m curious how Apple contends with the relentless addition of AI editing to the iPhone’s competitors. The company is already taking small steps in that direction: a feature called “Clean Up” will arrive with Apple Intelligence, which will allow you to remove objects from photos like Google’s Magic Eraser. McCormack told me that feature will somehow mark the resulting images as having been generatively edited, although he didn’t say how.

In my testing of Clean Up on an image in the latest iOS 18.1 beta build, Apple adds EXIF tags marking the image as having been edited with generative A.I. tools. Those tags can be stripped, though, and I do not see any other indicators; it is possible they exist and I missed them.
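For anyone who wants to poke at this themselves, a minimal sketch of the kind of check I mean follows, using Apple’s ImageIO framework to dump an image’s metadata dictionaries. It assumes the edited photo has been exported to a file; the path is a placeholder, and I am not claiming which specific tag Apple writes, only that printing everything makes any generative-edit marker easy to spot.

```swift
import Foundation
import ImageIO

// Placeholder path to a photo that has been edited with Clean Up and exported.
let url = URL(fileURLWithPath: "/path/to/cleaned-up-photo.heic") as CFURL

// Open the file and copy the metadata for its first image.
guard let source = CGImageSourceCreateWithURL(url, nil),
      let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
    fatalError("could not read image metadata")
}

// Print every metadata dictionary (EXIF, TIFF, IPTC, and so on) so any
// generative-edit annotation stands out when scanning the output.
for (key, value) in properties {
    print("\(key): \(value)")
}
```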

Apple’s tools are, so far, more cautious than those from its competitors. Even if you include the unreleased Image Playground product (something I do not see much value in Apple releasing at all), nothing the company is doing on the generative A.I. front allows people to create entirely fraudulent photos. It is possible Apple does not have technology comparable to Google’s Magic Editor, so perhaps this is an unfair comparison. If it does, though, it should elect not to release it, a choice Google ought to have made as well.