Apple Suspends Human Analysis of Siri Responses in Response to Privacy Concerns techcrunch.com

Matthew Panzarino, TechCrunch:

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement to TechCrunch. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

[…]

An explicit way for users to agree to the audio being used this way is table stakes in this kind of business. I’m glad Apple says it will be adding one.

It also aligns better with the way Apple handles other data, like the app performance data that developers can use to identify and fix bugs in their software. Currently, when you set up your iPhone, you are explicitly asked to grant Apple permission before that data is transmitted.

What’s truly bizarre to me is that there is already a way to prevent Siri logging — it’s just not user-exposed. What good reason is there for not letting users choose whether to participate?

For what it’s worth, if I had been presented with the option to let Apple use my Siri requests to improve the service overall, I’d have at least considered it; but because this was done in such a sneaky way, I’m more eager to switch it off. The net result is the same, but the integrity of Apple’s communications matters.

Still, this response is far better than Apple’s first.