Apple to Roll Out Child Safety Feature That Locally Screens Messages for Nudity to U.K., Canada ⇥ theguardian.com
Alex Hern, the Guardian:
A safety feature that uses AI technology to scan messages sent to and from children will soon hit British iPhones, Apple has announced.
The feature, referred to as “communication safety in Messages”, allows parents to turn on warnings for their children’s iPhones. When enabled, all photos sent or received by the child using the Messages app will be scanned for nudity.
[…]
Apple has also dropped several controversial options from the update before release. In its initial announcement of its plans, the company suggested that parents would be automatically alerted if young children, under 13, sent or received such images; in the final release, those alerts are nowhere to be found.
Hern repeatedly describes this as an “iPhone” feature, but Apple says it is available on iPads and Macs, too. Rene Ritchie says the feature will also be coming to Canadian devices. Ritchie does not say when it will roll out, but I bet it will happen in the same software updates as the U.K. launch.
I maintain this feature is a welcome one and should be an option for all users, at least on the receiving side. This is not the far more controversial CSAM detection feature, which Apple has yet to release or provide updates about. Apple first rolled out communication safety in the U.S. with iOS 15.2 in December. I remain concerned about the power of an algorithmic process unaudited by a third party, and whether it will intervene with Goldilocks sensitivity. If it uses a similar photo recognition process to the one in the Photos app, that is not the most confidence-inspiring start.
Even so, in this case, I truly believe doing something is better than doing nothing. If its false positive rate is acceptably low, it may feel more trustworthy, though I think Apple needs to better communicate its use of on-device processing for such a sensitive feature; recall the ‘brassiere’ incident of 2017. The flip-side worry is its false negative rate. That is obviously a concern but, it must be noted, the worst-case scenario of failing to flag nudity is the present situation.