Apple Plans to Allow Developers to Add Communication Safety Features to Their Apps wired.com

Lily Hay Newman, of Wired, also reported on today’s privacy announcements from Apple. In addition to confirming it has stopped its iCloud photo scanning efforts, Apple told her about its plans for its existing child safety features:

The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, the company is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.

I re-read what I wrote about Apple’s announcements today and I am worried I came off as indifferent to the problem of CSAM and how it is enabled by the widespread adoption of internet-connected devices, especially those with cameras. There are few problems, perhaps none, of more pressing universal concern than ensuring children are not exploited and their safety is not put at risk. But I am also worried about these heinous crimes being used to make increasing user privacy and security more difficult, or a public relations liability.

This is a difficult needle to thread, but I appreciate these efforts to balance the privacy needs of the many against the risks of creating unnecessary roadblocks for law enforcement or of enabling criminals.