Jack Nicas, New York Times:
They all tell a similar story: They ran apps that helped people limit the time they and their children spent on iPhones. Then Apple created its own screen-time tracker. And then Apple made staying in business very, very difficult.
Over the past year, Apple has removed or restricted at least 11 of the 17 most downloaded screen-time and parental-control apps, according to an analysis by The New York Times and Sensor Tower, an app-data firm. Apple has also clamped down on a number of lesser-known apps.
In some cases, Apple forced companies to remove features that allowed parents to control their children’s devices or that blocked children’s access to certain apps and adult content. In other cases, it simply pulled the apps from its App Store.
The Times is eager to suggest anticompetitive behaviour by Apple, but I’m not so sure. Apps on iOS are sandboxed, which means that they’re highly restricted in how they may interact with other third-party apps on the system.
The affected developers had been using a variety of methods to track screen time, as there was no official API for this data: background location, VPN configurations, and MDM (mobile device management) profiles, sometimes in combination.
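For a sense of why Apple objected to the MDM approach: these apps block other apps by installing a device-management configuration profile, the same mechanism companies use to manage employee devices. A minimal sketch of such a profile follows; the identifiers and UUIDs are hypothetical placeholders, and the `blacklistedAppBundleIDs` restriction only takes effect on supervised devices.

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadType</key>
    <string>Configuration</string>
    <key>PayloadIdentifier</key>
    <string>com.example.parental-controls</string> <!-- hypothetical identifier -->
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000000</string> <!-- placeholder -->
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadContent</key>
    <array>
        <dict>
            <!-- Restrictions payload: hides the listed apps on supervised devices -->
            <key>PayloadType</key>
            <string>com.apple.applicationaccess</string>
            <key>PayloadIdentifier</key>
            <string>com.example.parental-controls.restrictions</string>
            <key>PayloadUUID</key>
            <string>00000000-0000-0000-0000-000000000001</string> <!-- placeholder -->
            <key>PayloadVersion</key>
            <integer>1</integer>
            <key>blacklistedAppBundleIDs</key>
            <array>
                <string>com.example.game</string> <!-- hypothetical app to block -->
            </array>
        </dict>
    </array>
</dict>
</plist>
```

A profile like this gives the controlling app — and whoever operates its MDM server — considerable power over the device, which is the privacy concern Apple cites.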
Some of the developers, we understand, were told they were in violation of App Store developer guideline 2.5.4, which specifies when multitasking apps are allowed to use background location. Specifically, developers were told they were “misusing background location mode for purposes other than location-related features.”
Others were told their app violated developer guideline 2.5.1, which references using public APIs in an unapproved manner.
Combine this with a statement given by an Apple spokesperson to the Times that these apps are potential privacy violators, and I’m not surprised that they’re being restricted or even removed from the App Store.
What this reporting illustrates most of all is just how poor Apple’s communication with developers continues to be. Case in point, from Nicas:
On Jan. 19, Mr. Ramasubbu received a message from Apple that said he had 30 days to change the Mobicip app or it would be removed from the App Store. “If you have any questions about this information, please reply to this message to let us know,” the note said. “Best regards, App Store Review.”
Over the next 27 days, Mr. Ramasubbu responded four times seeking more information. He eventually resubmitted the app with changes he hoped would satisfy Apple’s demands.
Then, with Mobicip’s deadline just a few days away, Apple responded three times to his earlier detailed questions — with virtually the same message: “Your app uses public A.P.I.s in an unapproved manner, which does not comply with guideline 2.5.1 of the App Store Review Guidelines.”
App Review should, at the very least, prevent rule breakers from getting into the App Store in the first place. It failed to do that here, approving high-profile parental-control apps that cannot work without violating its rules. At a minimum, Apple should be clear about the specific circumstances of a violation, particularly when an app has already been approved.
It’s also clear that there is a demand for these apps. I think it would be great if there were APIs for Screen Time data, perhaps tied into HealthKit. Of course, it’s worth worrying about what Facebook is likely to do with that kind of information.
Update: In an email republished by MacRumors, Phil Schiller confirms that the company told developers to stop using MDM profiles to monitor or limit device use in non-enterprise contexts. It is also notable that the Times did not publish in full the statement it received from Apple.
Notably, Nicas does not cite Sarah Perez’s earlier story in his piece, continuing the Times’ long and less-than-proud history of failing to credit others’ original reporting. ↩︎