Lina Khan’s FTC Gets Results Against Data Broker X-Mode ⇥ ftc.gov
In 2020, Joseph Cox of Vice published an investigation into HYAS, explaining how it received precise location data from X-Mode; I linked to this story at the time. The latter company, now known as Outlogic, obtained that data, according to Cox’s reporting, from an SDK embedded “in over 400 apps [which] gathers information on 60 million global monthly users on average”. It sold access to that data to marketers, law enforcement, and intelligence firms. Months later, Apple and Google said apps in their stores would be prohibited from embedding X-Mode’s SDK.
Even in the famously permissive privacy environment of the United States, it turns out some aspects of the company’s behaviour could be illegal and, in 2022, the FTC filed a complaint (PDF) alleging seven counts of “unfair and deceptive” trade practices. Today, the Commission has announced a settlement.
Lesley Fair of the FTC:
[…] Among other things, the proposed order puts substantial limits on sharing certain sensitive location data and requires the company to develop a comprehensive sensitive location data program to prevent the use and sale of consumers’ sensitive location data. X-Mode/Outlogic also must take steps to prevent clients from associating consumers with locations that provide services to LGBTQ+ individuals or with locations of public gatherings like marches or protests. In addition, the company must take effective steps to see to it that clients don’t use their location data to determine the identity or location of a specific individual’s home. And even for location data that may not reveal visits to sensitive locations, X-Mode/Outlogic must ensure consumers provide informed consent before it uses that data. Finally, X-Mode/Outlogic must delete or render non-sensitive the historical data it collected from its own apps or SDK and must tell its customers about the FTC’s requirement that such data should be deleted or rendered non-sensitive.
This all sounds good — it really does — but a closer reading of the reasoning behind the consent order (PDF) leaves a lot to be desired. Here are the seven counts from the original complaint (linked above), each described by its section title:
“X-Mode’s location data could be used to identify people and track them to sensitive locations”
“X-Mode failed to honour consumers’ privacy choices”
“X-Mode failed to notify users of its own apps of the purposes for which their location data would be used”
“X-Mode has provided app publishers with deceptive consumer disclosures”
“X-Mode fails to verify that third-party apps notified consumers of the purposes for which their location data would be used”
“X-Mode has targeted consumers based on sensitive characteristics”
“X-Mode’s business practices cause or are likely to cause substantial injury to consumers”
These are not entirely objections to X-Mode’s sale of location data in a gross violation of consumers’ privacy. These are mostly procedural violations, which you can see more clearly in the analysis of the proposed order (PDF). The first and sixth counts are both violations of the rights of protected classes; the second is an allegation of data collection after users had opted out. But the others mostly concern insufficient notice or consent, which is the kind of weak procedural justification that illustrates the limits of U.S. privacy law. Meaningful privacy regulation would not allow the exploitation of real-time location data even if a user had nominally agreed to it. Khan’s FTC is clearly working with the legal frameworks that are available, not the ones that are needed.
Sen. Ron Wyden’s office, which ran an investigation into X-Mode’s practices, is optimistic with reservations. Wyden correctly observes that this should not be decided on a case-by-case basis; everyone deserves a minimum standard of privacy. Though this post and this case are U.S.-focused, that expectation is true worldwide, and we ought to pass much stricter privacy laws here in Canada.