Inside Apple’s iOS 16 Remake of the iPhone’s Iconic Lock Screen techradar.com

Lance Ulanoff, TechRadar:

The “magazine look” [Alan] Dye mentioned is more than just the overall composition of Lock Screen elements. It’s that fluff of dog fur or the swell of flowing hair that intersects with the time element and, instead of sitting behind the numbers, layers on top of it. It’s an arresting – and professional – look that’s created automatically. Apple calls it “segmentation.”

Creating this look is something Dye and his design team dreamed of for years.

“We’ve been wanting to achieve this look, but the segmentation’s gotten so good that we really feel comfortable putting [it in there]. Unless the segmentation is just ridiculously good, it breaks the illusion.”

If you have not installed the iOS 16 beta, trust me on this: the segmentation effect is really, really good. Not every photo I have tried works perfectly, but only a handful of images that seem like they should work have not; most portraits create a fabulous, compelling Lock Screen. What an upgrade.
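Apple has not said which machinery powers the Lock Screen effect, but its public Vision framework has shipped a per-pixel person segmentation API since iOS 15, and the layering trick falls out of a mask like that almost directly. Here is a minimal sketch, assuming VNGeneratePersonSegmentationRequest as a stand-in for whatever Apple actually uses (the real feature evidently handles pets and other subjects too, which this API does not); the function name is mine:

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins
import CoreVideo

// Illustrative only: layer a "clock" image between a photo's background
// and its segmented subject, in the spirit of the iOS 16 Lock Screen.
// Assumes Vision's person segmentation; Apple hasn't confirmed its approach.
func layeredLockScreenLook(photo: CGImage, clock: CIImage) throws -> CIImage? {
    // 1. Ask Vision for a per-pixel subject mask.
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate                       // best mask, slowest
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: photo, options: [:])
    try handler.perform([request])

    guard let maskBuffer = request.results?.first?.pixelBuffer else {
        return nil                                         // no subject found
    }

    let photoImage = CIImage(cgImage: photo)

    // Scale the (lower-resolution) mask up to the photo's dimensions.
    var maskImage = CIImage(cvPixelBuffer: maskBuffer)
    let scaleX = photoImage.extent.width / maskImage.extent.width
    let scaleY = photoImage.extent.height / maskImage.extent.height
    maskImage = maskImage.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // 2. Background layer: the photo with the clock composited over it.
    let withClock = clock.composited(over: photoImage)

    // 3. Foreground layer: the masked subject drawn back on top, so fur
    //    or hair overlaps the time instead of sitting behind it.
    let blend = CIFilter.blendWithMask()
    blend.inputImage = photoImage        // subject pixels
    blend.backgroundImage = withClock    // photo + clock
    blend.maskImage = maskImage          // white = subject
    return blend.outputImage
}
```

Render the returned CIImage through a CIContext as usual. The interesting part is step 3: the whole “magazine look” is just compositing the same photo twice, once under the clock and once over it through the mask.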

Ulanoff:

Dye told us that if the system doesn’t think the photo will look great, it won’t suggest it, a point of care and attention that helps guide the user towards more visually arresting Lock Screens.

This is the only thing that worries me. I am sure Apple has trained its machine learning models on a wide array of images: people, animals, and so forth. Yet I have a hard time shaking Maciej Cegłowski’s description of machine learning as “money laundering for bias”. Apple’s changes have been thoughtful, as demonstrated in this interview with Dye and Craig Federighi, but I always have that worry about subjective choices made by machines.