Written by Nick Heer.

In Three Cases, the FTC Has Demanded Deletion of Algorithms, Code, or Models Derived From Illegally Collected Data

Kate Kaye, Protocol:

When it comes to today’s data-centric business models, algorithmic systems and the data used to build and train them are intellectual property, products that are core to how many companies operate and generate revenue. While in the past the FTC has required companies to disgorge ill-gotten monetary gains obtained through deceptive practices, forcing them to delete algorithmic systems built with ill-gotten data could become a more routine approach, one that modernizes FTC enforcement to directly affect how companies do business.

[…]

The winds inside the FTC seem to be shifting. “Commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that derive much of their value from ill-gotten data,” former FTC Commissioner Rohit Chopra, now director of the Consumer Financial Protection Bureau, wrote in a statement related to the Everalbum case. He said requiring the company to “forfeit the fruits of its deception” was “an important course correction.”

As with authorities requiring Clearview to delete identification data, I am confused about how it is possible to extricate illegally acquired materials from software like machine learning models. In the Everalbum case (PDF), for example, the FTC ordered the company to delete data derived from the collection of faces from users who deactivated their accounts, as well as any algorithms or models created from that information. But it is possible some or all of those photos were used to train the machine learning models used by all users. Without rolling the model back to a state before the creation of any data relevant to the FTC’s order, how is this possible? I am genuinely curious. This sounds like a fair way to treat businesses that have exploited illegally acquired data, but I am unclear on how it works.
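To make the difficulty concrete, here is a toy sketch — invented data, NumPy only, nothing to do with Everalbum’s actual systems — in which the “model” is just the average of user face embeddings. Deleting a user’s raw data after training does not touch the trained parameters; removing their influence means retraining from the remaining data:

```python
import numpy as np

# Hypothetical illustration: the "model" is simply the centroid of
# user face embeddings. All names and data here are invented.
rng = np.random.default_rng(0)
embeddings = {f"user{i}": rng.normal(size=4) for i in range(5)}

# Model trained on everyone, including a user who later deactivates.
model_all = np.mean(list(embeddings.values()), axis=0)

# Deleting the user's raw data does not change the trained parameters:
# their influence is baked into model_all.
del embeddings["user3"]
model_after_delete = model_all  # parameters unchanged

# Removing that influence means retraining on the remaining data.
model_retrained = np.mean(list(embeddings.values()), axis=0)

# The two models differ, which is the crux of the enforcement problem.
print(np.allclose(model_after_delete, model_retrained))
```

Real models are vastly more complicated than a centroid, which is presumably why the FTC’s orders target the models themselves rather than asking companies to surgically remove one user’s contribution.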

The headline on this article is ridiculous, by the way. It claims this penalty strategy “spells death for algorithms”, but the article clarifies “the term ‘algorithm’ can cover any piece of code that can make a software application do a set of actions”. Whoever picked this headline entirely divorced it from Kaye’s excellent reporting.