Moon Zoom Questions reddit.com

I thought this exploration by Reddit user ibreakphotos of Samsung’s Moon detection feature was well done. It is a smart test, and its conclusion lines up with previous experiments: these pictures are simply more detailed than is possible from even an impressive zoom lens.

Samsung has explained how its camera works for pictures of the Moon, and it is what you would probably expect: the camera software has been trained to identify the Moon and, because it is such a predictable object, it can reliably infer details that are not actually present. Whether these images and others like them are enhanced or generated seems increasingly like a distinction without a difference in a world where the most popular cameras rely heavily on computational power to make images better than their optics are capable of. Also, these Moon pictures sure seem like a gimmick; how many mediocre pictures of the Moon with nothing else in the frame need to be posted to Instagram?
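
To make the shape of that logic concrete, here is a rough sketch of what such a pipeline could look like in principle. This is not Samsung’s implementation; the classifier, the blending step, and every function name here are my own placeholders, standing in for whatever trained models the real software uses:

```python
import numpy as np

def looks_like_moon(frame: np.ndarray) -> bool:
    # Stand-in for a trained scene classifier: a small bright blob on
    # an otherwise dark frame is treated as "probably the Moon".
    bright = frame > 0.8
    if not (0.001 < bright.mean() < 0.2):
        return False
    return frame[~bright].mean() < 0.1

def synthesize_detail(frame: np.ndarray) -> np.ndarray:
    # Stand-in for a learned enhancer: blend the capture with a stored
    # reference texture. A real system would use a neural network, but
    # the effect is the same: the output contains detail the sensor
    # never recorded. (Random noise is a placeholder for that texture.)
    rng = np.random.default_rng(0)
    reference = rng.random(frame.shape)
    return np.clip(0.6 * frame + 0.4 * reference, 0.0, 1.0)

def process(frame: np.ndarray) -> np.ndarray:
    # Only a recognized scene gets the extra "detail"; everything else
    # passes through untouched.
    return synthesize_detail(frame) if looks_like_moon(frame) else frame
```

The point of the sketch is the branch: the added texture is keyed to what the software believes it is looking at, not to anything the sensor actually captured.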

If it sounds like I am being flippant, it is only because I think this becomes more worrisome if it moves beyond marketing stunts, as it seems likely to. In the case of these Moon pictures, it seems pretty clear to me that Samsung’s camera software is mapping known detail onto a known object. It is untruthful, but not too impactful. What happens as the categories of known scene types are expanded? As I wrote last month, there were already problems with the compression algorithm used in some Xerox copiers, which substituted similar-looking patches of scanned pages for one another and ended up subtly changing numbers in documents. Without getting into FUD territory, imagine those sorts of errors or assumptions in photography, and with the complexity of a machine learning black box in the capture pipeline.
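
For reference, the mechanism behind the Xerox bug was pattern-matching compression: the scanner kept one stored copy of each “unique” glyph-sized patch and reused it for anything it judged close enough, so a 6 that nearly matched a stored 8 came back as an 8. A toy sketch of that substitution logic, with made-up bitmaps and a naive pixel-difference threshold:

```python
import numpy as np

def compress_by_substitution(patches, threshold=3):
    # Toy version of pattern-matching compression: keep a dictionary of
    # representative patches and replace anything "close enough" with a
    # stored copy. The reconstructed page uses the stored patch, not the
    # pixels that were actually scanned.
    dictionary, output = [], []
    for patch in patches:
        match = next((d for d in dictionary
                      if np.abs(patch - d).sum() <= threshold), None)
        if match is None:
            dictionary.append(patch)
            match = patch
        output.append(match)
    return output

# Two glyphs whose bitmaps differ by only a pixel or two come back as
# the *same* glyph once the threshold swallows the difference.
six = np.array([[1, 1, 1], [1, 1, 0], [1, 1, 1]])
eight = np.array([[1, 1, 1], [1, 1, 1], [1, 1, 1]])
reconstructed = compress_by_substitution([six, eight])
print(np.array_equal(reconstructed[0], reconstructed[1]))  # True
```

The failure is silent by design: the output still looks like a crisp, plausible page, which is exactly what makes the same class of error hard to spot when it happens inside a camera.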