Search Results for: data broker

Om Malik:

The companies we should be worried about are the many smaller and mid-sized companies that most of us have never heard about. Whether it is app developers surreptitiously selling information to third parties, data breaches at retailers (and their digital platforms), or data-brokers with security systems that have more holes than swiss cheese, these companies will continue to be the cause of most headaches in our digital lives. And they are the group more likely to take liberties with data and privacy.

[…]

And at the top of the list are companies that have always been hostile to their customers: telephone companies, electric utilities, insurance companies, for-profit hospital systems, big airlines, and other such organizations. They will only use “smart data” to amplify their past bad behavior.

There is good reason to focus on the biggest and most valuable companies, since they, by default, have the most influence. But setting the bar so high — the bills proposed in the United States only apply to companies with a market cap over $600 billion — neglects the many smaller companies and industries that are begging for better oversight.

This is something I have been concerned about for years because an overwhelming focus on the biggest tech firms means far less scrutiny of companies that are big enough to do real harm and anonymous enough to avoid consequences. Every company is a technology company now, to some extent or another: oil refineries rely on computer control and proprietary software; lumber mills use automated multi-axis saws; airplanes are computers with engines, seats, and wings, all of which have their own computers. All of these connected systems come with risks for the market and for consumers from the supply chain level up.

Malik cites a recent case of utilities in Texas changing customers’ smart thermostats as an example of a previously unthinkable concern. I would point to cellular carriers, ad tech companies, and data brokers as industries that exploit privacy vulnerabilities without consent for their own gain.

The high value bar of the legislation currently proposed in the U.S. means that many sectors with little competition remain unaddressed. I have been looking at hotel reservations for an upcoming trip, and I was reminded of the lack of competition in travel booking websites. Booking Holdings owns Booking.com — obviously — plus Priceline, Agoda, and Kayak, among several other brands. In addition to Expedia.com, Expedia Group owns many other companies such as Hotels.com, Hotwire, Orbitz, Travelocity, and Trivago. If you are from North America or Europe and you are booking a hotel room online, chances are that it will be through a service owned by one of these two companies which, combined, represent 92% of the U.S. market. But neither one is worth anywhere near $600 billion, so they would not be required to divest any of their brands should the current crop of U.S. tech company bills become law.

After a bit of a bummer post — sorry — I wanted to highlight a few things that impressed me after today’s WWDC opener, beginning with privacy features.

Shoshana Wodinsky, Gizmodo:

First up is Mail Privacy Protection, which is a new tab in Apple’s Mail app that’s meant to do what the name implies: letting users decide what data the program shares. Under this new tab, users can choose to hide their IP address and location details from email senders, not unlike the recent iOS 14 updates that keep apps from slurping up details like precise location and a phone’s mobile ad ID. As an added benefit, Apple says its new mailbox settings will keep people from tracking whether you opened the email they sent you and when that email was opened.

This is an interesting twist on the tracker blocking features of some other email apps. Instead of trying to block trackers, the Mail app in iOS 15, iPadOS 15, and MacOS Monterey will download everything in every message, even ones you never open. And it will do so indirectly, “routed through multiple proxy servers”, in Apple’s words. It appears that marketers will still get a very approximate idea of your location — Apple says that it is at a “region” level — but will not know whether you opened a message.

This is pretty clever. Any image can theoretically be used as a tracker, so it is a constant cat-and-mouse game for apps like Hey to find and block them while still displaying legitimate pictures. This is the “I am Spartacus” gambit: instead of fighting the trackers, this technique embraces them all, rendering them useless for measuring open rates or tracking any user.

Marketers, take note.
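For illustration, the open-tracking mechanism — and why indiscriminate prefetching defeats it — can be sketched in a few lines of Python. The URL scheme, addresses, and numbers here are hypothetical:

```python
import uuid

def pixel_url(campaign: str, recipient: str) -> str:
    # Hypothetical per-recipient tracking pixel: a unique token means any
    # request for this image tells the sender exactly who opened the
    # message, and when.
    token = uuid.uuid5(uuid.NAMESPACE_URL, f"{campaign}:{recipient}")
    return f"https://tracker.example.com/open.gif?t={token}"

def observed_open_rate(recipients, actually_opened, prefetch_all=False):
    # What the sender's logs show. With prefetch_all=True — the Mail
    # Privacy Protection approach — every pixel is fetched through proxies
    # whether or not the message was read, so the "open" signal collapses
    # to 100% and distinguishes no one.
    fetched = set(recipients) if prefetch_all else set(actually_opened)
    return len(fetched) / len(recipients)

recipients = ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]
opened = ["a@example.com", "c@example.com"]

print(observed_open_rate(recipients, opened))                     # 0.5
print(observed_open_rate(recipients, opened, prefetch_all=True))  # 1.0
```

Fetching every pixel is indistinguishable, to the tracker, from every recipient opening every message — which is the same as no signal at all.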

Wodinsky:

On top of the inbox updates, the company also announced new “app privacy reports,” which will surface more detailed intel about how non-Apple apps are tracking your activity across your device. Similar to Safari privacy reports, these will break down which apps on your device are accessing what kind of data, and how much of that data gets sent to specific third-party trackers. As part of that report, users will also get an overview of how often a given app accessed your microphone, camera, or precise location over the past week. Think of it as a quick list to shame the worst privacy offenders on your phone.

In a preemptive counterstrike, Facebook announced today that it would begin showing creators a breakdown of how much of their earnings from in-app purchases are going to Apple and Google. Tag, you’re it.

Wodinsky:

Apple introduced a slew of new features for iCloud on the privacy front. First, the company announced Private Relay, a new VPN service built into iCloud that will let users browsing on Safari completely encrypt their traffic. Apple says this setting ensures that “no one between the user and the website they are visiting can access and read” any data sent over Private Relay, not even Apple or the user’s network provider. […]

This comes with iCloud Plus, which is Apple’s new name for all of its paid iCloud plans. iCloud Private Relay does not allow you to pick a different country and only works in Safari; you should not think of it as a replacement for a VPN in many circumstances. As such, it should play nicely with personal and corporate VPNs.

iCloud Private Relay will not be available in several countries, including Belarus, China, the Philippines, and Saudi Arabia.

Michael Grothaus of Fast Company was briefed on these features before today’s keynote, and spoke with Craig Federighi about them:

Federighi explains that governments are often reactive when it comes to technology – and there’s no way for them to get around that. At least on the consumer front, companies do most of the innovating. They’re also the ones who find new ways to exploit data. So governments can put rules around technologies or processes only after they’ve become a problem. Those rules often lag far behind the speed of such innovations. That’s why even if governments were more proactive, it would still fall on companies such as Apple to develop new privacy-enhancing technologies.

That being said, Federighi believes that “there’s absolutely a role where government can look at what companies like Apple are doing and say, ‘You know, that thing is such a universal good – such an important recognition of customer rights – and Apple has proven it’s possible. So maybe it should be something that becomes a more of a requirement.’ But that may tend to lag [Apple’s privacy] innovation and creation of some new thing that they can evaluate and decide to make essentially the law.”

I am sure regulation will not preemptively correct every privacy ill, but surely there are good reasons that the data broker industry is uniquely capable and creepy in the United States compared to other developed countries. Privacy problems are not a U.S.-only problem, but they are a U.S.-mostly problem — and, because so much personal information of users worldwide is stored on servers controlled by U.S. entities under U.S. laws, we are all sucked into the failure of the U.S. to legislate.

Although App Tracking Transparency only shipped this week as part of iOS 14.5, Apple announced it last year, and it got Facebook all riled up. The company has aggressively campaigned against the feature, arguing that it will harm small businesses because, as Facebook’s Dan Levy wrote, precisely targeted ads bring businesses’ costs down:

This affects not just app developers, but also small businesses that rely on personalized ads to grow. Here’s why. Small businesses have small budgets. For these small budgets to work, they have to be targeted at the customers that matter to small businesses. It doesn’t do a local wedding planner any good to reach people who aren’t planning a wedding. Likewise, it doesn’t do a small ecommerce outfit selling customized dog leashes any good to reach cat owners. Put simply, by dramatically limiting the effectiveness of personalized advertising, Apple’s policy will make it much harder for small businesses to reach their target audience, which will limit their growth and their ability to compete with big companies.

This line of reasoning was thoroughly debunked by former Facebook employees and the Electronic Frontier Foundation’s Andrés Arrieta, who pointed out that behaviourally targeted ads are often more expensive than less precisely targeted versions because of the many intermediaries taking their cut. These types of ads produce mixed results for advertisers, have little benefit for publishers, are not very well targeted, and require us to sacrifice our privacy with few ways of opting out.

Then, in a Clubhouse chat with Josh Constine last month, Mark Zuckerberg said that Facebook “may even be in a stronger position” after the introduction of App Tracking Transparency because of Facebook’s uniquely large amount of user data. But that was contradicted somewhat in today’s quarterly earnings report in a comment from CFO David Wehner (emphasis mine):

We expect second quarter 2021 year-over-year total revenue growth to remain stable or modestly accelerate relative to the growth rate in the first quarter of 2021 as we lap slower growth related to the pandemic during the second quarter of 2020. In the third and fourth quarters of 2021, we expect year-over-year total revenue growth rates to significantly decelerate sequentially as we lap periods of increasingly strong growth. We continue to expect increased ad targeting headwinds in 2021 from regulatory and platform changes, notably the recently-launched iOS 14.5 update, which we expect to begin having an impact in the second quarter. This is factored into our outlook.

On the call, Wehner said that the impact would be “manageable” due to the company’s increased investments in e-commerce. The effect on Facebook’s own revenue will, as the company says, become clear later this year. This quarter, however, there are no such worries for Facebook.

Barbara Ortutay, Associated Press:

The company said it earned $9.5 billion, or $3.30 per share, in the January-March period. That’s up 94% from $4.9 billion, or $1.71 per share, a year earlier.

Revenue grew 48% to $26.17 billion from $17.44 billion.

But for the small businesses Facebook ostensibly cares about, things got more expensive:

The average price of ads on Facebook grew 30% from a year earlier, while the number of ads increased by 12%.

Alex Heath of the Information on Twitter:

Takeaway from Facebook earnings:

  • Its pricing power for ads is increasing dramatically as Apple makes cheap ads less efficient

  • The business is becoming more efficient as it grows (43% operating margin!) […]

As is often the case for stories about privacy changes — whether regulatory or at a platform level — much of the coverage about App Tracking Transparency has been centred around its potential effects on the giants of the industry: Amazon, Facebook, and Google. But this may actually have a greater impact on smaller ad tech companies and data brokers. That is fine; I have repeatedly highlighted the surreptitious danger of these companies that are not household names. But Facebook and Google can adapt and avoid major hits to their businesses because they are massive — and they may, as Zuckerberg said, do even better. They are certainly charging more for ads.

That is not to say that we should give up and accept that these businesses destroy our privacy to enrich themselves and their shareholders. If we threw in the towel every time we realized that lawmaking was difficult or that laws would be broken sometimes, we wouldn’t have any laws.

You may have noticed my pivot from Apple’s platform rules to a more regulated approach. That is because I maintain that a legal solution is the only correct one. While I am glad this new control exists in iOS, privacy is not something people should have to buy. And, judging by Facebook’s earnings and forecast, companies should not profit from the increased scarcity of data brought about by better privacy controls.

While many news organizations were satisfied with covering today’s launch of App Tracking Transparency in iOS 14.5 as a feature that, at most, illustrates a key difference between Apple and Facebook, for example, Mike Isaac and Jack Nicas of the New York Times decided to write a parallel article about the apparently fractured relationship between the companies’ CEOs. And it is a doozy.

I do not like these kinds of articles at the best of times. Regardless of how closely executives are tied to the companies they are involved with, I do not think there is much value in seeing them as inextricably linked. I do not think we can extrapolate personal animosity from competitiveness, and I think the CEO-as-celebrity narrative is a worrisome premise.

So this is the kind of article that I am going to approach with trepidation. Sure enough, it is chock full of anecdotes that portray Apple and Facebook not simply as two companies with some competitive overlap and very different approaches to privacy, but as bitter rivals, with Tim Cook and Mark Zuckerberg locked in an “all-out war”. I did not learn much but, as I re-read the article, a single paragraph stuck out:

Those contrasts have widened with their deeply divergent visions for the digital future. Mr. Cook wants people to pay a premium — often to Apple — for a safer, more private version of the internet. It is a strategy that keeps Apple firmly in control. But Mr. Zuckerberg champions an “open” internet where services like Facebook are effectively free. In that scenario, advertisers foot the bill.

This reads like a Facebook PR person has spun it already, since it is the distillation of the company’s false compromise between privacy and revenue. It also misrepresents how lock-in and opt-in work on the internet.

If you want to talk about control over the internet, you really have to start with Facebook, Google — and, to a lesser extent, Amazon. All three companies insidiously lock people into their data-mining platforms without presenting a real means of consent or opting out. In addition to being de facto infrastructure, these companies never really stop tracking you. They can stop showing you ads based on the personalized data they have collected, but they may continue to slurp up behavioural information anyhow. And that’s only the three biggest companies in this space; there are thousands of other ad tech businesses and data brokers gorging themselves on data you never meaningfully consented to sharing.

Apple’s apparent control over the internet is comparatively meagre. If you rid yourself of all Apple hardware and software, you quit using its services, and you delete your iCloud account, you have zero affiliation with Apple. As far as it knows, you no longer exist. This is undoubtedly a tedious, time-consuming, and expensive thing to do — but you can entirely opt out of Apple’s ecosystem. I know many people who have.

It is hard to see how Apple’s greater emphasis on privacy enables it to have more control over the internet in the long run. You would have to be a deeply cynical person who believes Apple would oppose a strict national privacy law — something Cook has repeatedly called for — because it creates a market for Apple’s more privacy-friendly products, and you would have to ignore the overwhelming majority of people who demand greater privacy online for that to be true. Of course Tim Cook, CEO of Apple, would rather you buy your technology products from Apple, but this company policy is not mere veneer. It is a longstanding commitment — though it is imperfect and has its limits — as is the company’s stance towards an open internet.1

But an open internet does not mean one in which all advertising is individually targeted using data farmed through independent apps and websites that serve as proxies for the surveillance practices of Facebook and Google. In the history of advertising, the privacy-hostile premise that these companies are selling is fairly recent. Shooting for pinpoint relevancy is a waste of time and privacy when relevant enough ads can be targeted to someone browsing a list of coffee cake recipes, an article about wedding locations, or a local news story. Mediocre ad targeting was good enough to buy an entire Batmobile.

Forget the apparent “war” between Cook and Zuckerberg personally, or even between the companies they chair. Both Apple and Facebook believe that many users, when presented with the option of whether to allow third parties to track their activity, will say no. But the new thing is not the tracking, it is the request for explicit permission — and Facebook appears to think that it will struggle to convince people it should be allowed to strip-mine their behaviour. We ought to be asking whether this was ever ethical; it seems most people would say it was not.

Ads can keep funding the internet; Apple is not eradicating advertising from its platform. It is only requiring that users give consent to how much they would like to be surveilled. It speaks volumes about Facebook that it believes those are necessarily the same thing.


  1. A non-exhaustive list of privacy commitments: device encryption; masking Bluetooth and MAC addresses; Safari’s tracking prevention mechanisms, including ITP and share button tracking; local categorization of images in Photos; privacy labels in the App Store; non-specific location data in apps; and background location notifications. Many of these features are not recent. For example, since the mid-2000s, Safari defaulted to allowing only first-party cookies and cookies from websites you visited. ↥︎

Alfred Ng and Maddy Varner, the Markup:

All in all, we found 25 companies whose combined spending on federal lobbying totaled $29 million in 2020. Many of the top spenders were not pure data brokers but companies that nonetheless have massive data operations. Oracle, which has spent the past decade acquiring companies that collect data, spent the most by far, with disclosure documents showing $9,570,000 spent on federal lobbying.

For comparison, of the Big Tech firms with heavy lobbying presences, Facebook spent $19,680,000, Amazon $18,725,000, and Google $8,850,000 in the same period, according to the Center for Responsive Politics. Public Citizen, a consumer advocacy group, found that Big Tech spent $108 million collectively on lobbying in 2020.

Oracle has its own data collection arm but has also built its portfolio by buying up companies like DataRaker, Compendium, and Crosswise. The companies, which were acquired in 2012, 2013, and 2016, respectively, take data from a variety of sources. DataRaker gets data from millions of smart meters and sensors for utilities companies, while Compendium delivers targeted ads. Crosswise allows Oracle to track people across devices, claiming to process data from billions of devices every month.

The data broker industry is not new to frequent readers of this website, but it does not receive nearly as much public attention as Facebook and Google. That is probably because data brokers deliberately avoid a public presence, while Facebook and Google have many public-facing products.

Another feature of the data broker industry is its ubiquity. While it is extraordinarily difficult to opt out of Facebook and Google’s tracking mechanisms, it is effectively impossible to eliminate yourself from the data broker industry — especially in the United States. The Office of the Privacy Commissioner of Canada put together a great 2019 report on data brokers in Canada:

The data brokerage industry occupies a region of the economy that is opaque to consumers, its objects of commerce. It is difficult for consumers to appreciate the mechanisms by which data brokers collect, use and trade in consumers’ personal information, and so the usual mechanisms by which markets discipline businesses are not in place. The industry is complex, with multiple kinds of actors collecting, processing, and aggregating data to create and use consumer profiles. Reporting by [the Canadian Internet Policy and Public Interest Clinic] and others on the activities of the industry are insufficient to overcome this difficulty.

This report recommended more investigation and oversight, but it has had limited effect. At the very least, Canadians’ personal information has some national and “substantially similar” provincial protections through legislation; in the United States, a 2014 report found, this is not the case, so far more private data is collected, traded, combined, and sold.

Patrick McGee and Hannah Murphy, Financial Times:

According to recent internal documents seen by the Financial Times, Snap wanted to gather data from companies that analyse whether people have responded to ad campaigns, including aggregated IP addresses, the labels that identify devices connected to the internet.

It hoped it could take that data and cross-reference it against the information it holds on its own users to identify and track them, in a technique known as “probabilistic matching”, according to several people familiar with its plans.

After being contacted by the FT about its plans, Snap acknowledged it had run a probabilistic matching programme for several months to test the impact of Apple’s new policies, but said it had always intended to discontinue the program after Apple introduces its changes, as such a system would not be compliant.

Expect to see a lot more of this sort of thing as marketing companies and data brokers you’ve never heard of try to find surreptitious ways of tracking users instead of just asking permission.
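For the curious, “probabilistic matching” of the kind described in the FT report can be sketched roughly like this. Real systems weigh many more signals than IP addresses and timestamps; every name and number below is made up for illustration:

```python
from collections import Counter

def probabilistic_match(ad_events, user_sessions, window=600):
    # Score each known user by how often one of their sessions shares an
    # IP address within `window` seconds of an "anonymous" ad event.
    # Enough coincidences and the anonymity evaporates.
    scores = Counter()
    for ev_ip, ev_ts in ad_events:
        for user, ip, ts in user_sessions:
            if ip == ev_ip and abs(ts - ev_ts) <= window:
                scores[user] += 1
    return scores.most_common(1)[0][0] if scores else None

# "Aggregated" ad-campaign events: (IP address, unix timestamp)
ad_events = [("203.0.113.7", 1_000_100), ("203.0.113.7", 1_086_500)]

# First-party session logs: (user, IP address, unix timestamp)
user_sessions = [
    ("alice", "203.0.113.7", 1_000_000),
    ("alice", "203.0.113.7", 1_086_400),
    ("bob",   "198.51.100.2", 1_000_050),
]

print(probabilistic_match(ad_events, user_sessions))  # alice
```

The point is that none of the inputs are secret or individually identifying; the identification emerges from cross-referencing two data sets that were each supposedly harmless on their own.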

This piece from the New York Times editorial board just three days ago sets the tone for the main topic, I think:

Americans have become inured to the relentless collection of their personal information online. Imagine, for example, if getting your suit pressed at the dry cleaner’s automatically and permanently signed you up to have scores of inferences about you — measurements, gender, race, language, fabric preferences, credit card type — shared with retailers, cleaning product advertisers and hundreds of other dry cleaners, who themselves had arrangements to share that data with others. It might give you pause.

[…]

One straightforward solution is to let people opt in to data collection on apps and websites. Today, with few exceptions, loads of personal data are collected automatically by default unless consumers take action to opt out of the practice — which, in most cases, requires dropping the service entirely.

Drew FitzGerald, Wall Street Journal:

T-Mobile US Inc. will automatically enroll its phone subscribers in an advertising program informed by their online activity, testing businesses’ appetite for information that other companies have restricted.

[…]

AT&T Inc. automatically enrolls wireless subscribers in a basic ad program that pools them into groups based on inferred interests, such as sports or buying a car. An enhanced version of the program shares more-detailed personal information with partners from customers who opt into it.

Verizon Communications Inc. likewise pools subscriber data before sharing inferences about them with advertisers, with a more-detailed sharing program called Verizon Selects for users who enroll. Its separate Verizon Media division shares data gathered through its Yahoo and AOL brands.

Ask pretty much anyone about their modern-day privacy concerns and you will get an earful about Facebook and Google. That’s understandable — they run a two-sided economy of users and advertisers, and have little competition in many of their markets. But the ad tech ecosystem is so gigantic that it is insufficient to focus solely on those two companies.

I have been writing for years that the market for private data needs to be curbed or even eliminated. Now that personal information is available in unfathomable supply, has huge demand, and trades in an effectively unregulated market, everyone seems to want in on it. Scott Brinker tracks the companies involved in marketing technologies. In 2020, the category with by far the greatest growth was data.1 Even the example the Times editorial board used in that lede is pretty much identical to an agreement between Google and Mastercard.

Even as Facebook and Google have become bywords for creepy online behaviour and have begun to spin privacy narratives around isolationist changes, the anti-privacy business is booming. There are thousands of companies only too eager to buy and sell whatever data they can get their hands on and “enrich” it by matching identifiers in different data sets. I maintain that this entire industry is illegitimate but, at the very least, it needs regulation and clear user protections.

This is also a reminder that antitrust investigations focused solely on tech companies are woefully inadequate.


  1. Within the data category, Brinker recorded a 68% growth in “governance, compliance, and privacy” firms. However, that does not mean that the data category grew primarily because of a large increase in compliance companies, perhaps spurred by increased regulation. If you actually look at the infographic, that subcategory went from a handful to a much larger handful — but it is vastly overshadowed by the number of analytics, “customer intelligence”, and “data enhancement” companies. ↥︎

Andrés Arrieta of the Electronic Frontier Foundation:

In reality, a number of studies have shown that most of the money made from targeted advertising does not reach the creators of the content—the app developers and the content they host.  Instead, the majority of any extra money earned by targeted ads ends up in the pockets of these data brokers. Some names are very well-known, like Facebook and Google, but many more are shady companies that most users have never even heard of.  

Bottom line: “The Association of National Advertisers estimates that, when the ‘ad tech tax’ is taken into account, publishers are only taking home between 30 and 40 cents of every dollar [spent on ads].” The rest goes to third-party data brokers who keep the lights on by exploiting your information, and not to small businesses trying to work within a broken system to reach their customers.

Flawed and insufficient as current privacy legislation may be, this is one reason I remain a supporter of its principles — even though giants are virtually unaffected. The bright spotlight placed on Amazon, Facebook, and Google allows the proliferation of shadowy ad tech companies that are orders of magnitude less valuable and, therefore, are virtually invisible to most web users. The market for bulk surveillance should not be a legitimate one. These laws may make it harder for startups to compete with giants, but vigorous competition between providers of creepy tracking should not be the goal.

Taking on the giants requires a comparable giant. Most of the biggest companies in tech treat privacy violations as a core product offering for advertisers. Apple is an exception. It may be taking this stance because it does not affect its own business, but it is only able to be such an aggressive campaigner for user privacy because it does not build its business around being creepy. That is not an accident. Further legislation will take a while, and antitrust lawsuits will be batted between expensive lawyers for years before they reach trial. But this is something users can choose to opt into or out of starting next year because we are, at last, being given a choice. Good.

Overall, AppTrackingTransparency is a great step forward for Apple. When a company does the right thing for its users, EFF will stand with it, just as we will come down hard on companies that do the wrong thing. Here, Apple is right and Facebook is wrong. Next step: Android should follow with the same protections. Your move, Google.

I like your optimism, Arrieta.

Byron Tau, Wall Street Journal:

Apple Inc. and Alphabet Inc.’s Google will ban the data broker X-Mode Social Inc. from collecting any location information drawn from mobile devices running their operating systems in the wake of revelations about the company’s national-security work.

The two largest mobile-phone platforms told developers this week that they must remove X-Mode’s tracking software from any app present in their app stores or risk losing access to any phones running Apple’s or Google’s mobile operating systems.

Both Apple and Google disclosed their decision to ban X-Mode to investigators working for Sen. Ron Wyden (D., Ore.), who has been conducting an investigation into the sale of location data to government entities.

For months, Tau has been covering the purchase of X-Mode location data without a warrant along with Joseph Cox of Motherboard. X-Mode says that it is just one of many advertising and location data providers, and it is right: I hope there are continued efforts to crack down on abuses of data collection. But those efforts should be coming in the form of legislation and regulation, not by companies playing Whack-a-Mole after reports like these.

Hamed Aleaziz and Caroline Haskins, Buzzfeed News:

In an internal memo obtained by BuzzFeed News, the DHS’s top attorney, Chad Mizelle, outlined how ICE officials can look up locations and track cellphone data activity to make decisions on enforcement.

[…]

The document says that ICE and CBP purchased people’s mobile data from a data broker, although the document does not identify which one. All of the data is stored in an indexed, searchable database accessible through a “web portal.”

ICE and CBP buy advertising identifier data, or “AdID,” which typically includes information about where a person is located, what device they’re using, what language they use, which websites they’re visiting, and which websites they buy things from. All of this information isn’t linked to a person’s name, but to a randomly generated string of characters.

[…]

The document states that the DHS purchased AdID data that is anonymized and only shows “timestamped signal location(s) within a specific time period” — or where one device has been, and when. This in and of itself doesn’t tell ICE and CBP who a person is. But the document notes that it’s possible to “combine” the data “with other information and analysis to identify an individual user.”

The idea that bulk data collection can be anonymized is a lie, as is the notion that it is complex to de-anonymize it. This has long been known in industries able to exploit it, to everyone’s detriment. Now, it is powering a vast public-private partnership of surveillance by an organization that claims jurisdiction over a generous amount of U.S. territory with increasingly invasive capabilities and little accountability.
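A sketch of why “timestamped signal locations” alone are enough to identify someone: the classic trick is to find the spot where a device sits overnight. Knowing where a device sleeps usually identifies its owner. The coordinates and hours below are invented for illustration:

```python
from collections import Counter

def infer_home(pings, night_hours=range(0, 6), precision=3):
    # Bucket "anonymized" pings into coarse grid cells (rounding
    # coordinates to `precision` decimal places, roughly city-block
    # scale) and return the cell the device occupies most often
    # during sleeping hours.
    cells = Counter()
    for lat, lon, hour in pings:
        if hour in night_hours:
            cells[(round(lat, precision), round(lon, precision))] += 1
    return cells.most_common(1)[0][0] if cells else None

pings = [
    (51.0447, -114.0719, 14),  # downtown, afternoon — ignored
    (51.0861, -114.1318, 2),   # same block, night after night
    (51.0861, -114.1318, 3),
    (51.0862, -114.1318, 4),
]

print(infer_home(pings))  # (51.086, -114.132)
```

Join that grid cell against a property or voter roll and the “randomly generated string of characters” resolves to a name — which is exactly the “combine with other information” step the DHS memo describes.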

There is a remarkable series of stories that Joseph Cox of Motherboard has been reporting over the past couple of months, describing the ways location data, IP addresses, and other private information are being sold to vendors and, eventually, law enforcement. I think these articles are best presented together, for the fullest context.

Here’s the first — police are purchasing illegally obtained website data through intermediaries:

Hackers break into websites, steal information, and then publish that data all the time, with other hackers or scammers then using it for their own ends. But breached data now has another customer: law enforcement.

Some companies are selling government agencies access to data stolen from websites in the hope that it can generate investigative leads, with the data including passwords, email addresses, IP addresses, and more.

Motherboard obtained webinar slides by a company called SpyCloud presented to prospective customers. In that webinar, the company claimed to “empower investigators from law enforcement agencies and enterprises around the world to more quickly and efficiently bring malicious actors to justice.” The slides were shared by a source who was concerned about law enforcement agencies buying access to hacked data. SpyCloud confirmed the slides were authentic to Motherboard.

Here’s another — the United States Secret Service purchased a license to Babel Street’s Locate X. You may remember that name from a story that appeared in the Wall Street Journal last month, which I covered, that showed multiple U.S. government agencies had contracts with private location tracking companies. Cox:

The Secret Service paid for a product that gives the agency access to location data generated by ordinary apps installed on peoples’ smartphones, an internal Secret Service document confirms.

The sale highlights the issue of law enforcement agencies buying information, and in particular location data, that they would ordinarily need a warrant or court order to obtain. This contract relates to the sale of Locate X, a product from a company called Babel Street.

Finally, published yesterday, a story about a private spy company that buys location data:

A threat intelligence firm called HYAS, a private company that tries to prevent or investigates hacks against its clients, is buying location data harvested from ordinary apps installed on peoples’ phones around the world, and using it to unmask hackers. The company is a business, not a law enforcement agency, and claims to be able to track people to their “doorstep.”

[…]

Motherboard found several location data companies that list HYAS in their privacy policies. One of those is X-Mode, a company that plants its own code into ordinary smartphone apps to then harvest location information. An X-Mode spokesperson told Motherboard in an email that the company’s data collecting code, or software development kit (SDK), is in over 400 apps and gathers information on 60 million global monthly users on average. X-Mode also develops some of its own apps which use location data, including parental monitoring app PlanC and fitness tracker Burn App.

Many of these apps are distributed by a developer called Launch LLC. So you think you’re downloading a simple app from some no-name developer, and it’s actually from this X-Mode data brokerage company that sells your data to HYAS which, in turn, distributes it to law enforcement and intelligence agencies to mine without a warrant.

The fact that these marketplaces are even possible is absurd and outrageous. A lack of strict regulations for the collection and use of personal data — particularly in the United States, given the number of tech companies based there — puts everyone at risk.

Just a couple of months ago, a massive Oracle BlueKai database was found to be leaking data from an estimated 1% of all traffic on the web. A report released last week indicated that just a handful of often-visited websites are needed to reliably “fingerprint” someone, and dozens of companies have the potential to do so.
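The mechanics of fingerprinting are simple enough to sketch in a few lines. This is an illustration of the general technique, not any specific tracker's code: a handful of individually innocuous browser attributes hash into an identifier that is stable across every site that collects them, no cookies required:

```python
# Illustrative sketch of browser fingerprinting (not any real tracker's
# code): combine a few reported attributes into one stable identifier.
import hashlib

def fingerprint(attributes):
    """Canonicalize the attribute set and hash it into a short ID."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Example attributes a page can read without asking permission
visitor = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)",
    "screen": "2560x1440x30",
    "timezone": "America/Edmonton",
    "language": "en-CA",
    "fonts": "Helvetica,Avenir,Charter",
}
# The same browser yields the same ID on every site running this code.
print(fingerprint(visitor))
```

None of these attributes identifies anyone on its own; in combination, they are distinctive enough that a few visited sites suffice to recognize a returning browser.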

We constantly generate so much private data on the smartphones we carry everywhere. Yet the collection, use, and resale of that data is basically unregulated. The scale of it is unknown, since many of the organizations responsible go out of their way to hide their activities. I am sure that all of this has the potential to catch criminals, but at what cost?

Yesterday, the U.S. Court of Appeals for the Ninth Circuit unanimously confirmed that the NSA’s bulk collection of Americans’ phone records was illegal, and found no evidence that the program ever helped catch a single terrorist. But, even if it had helped, the program would still have been illegal because bulk surveillance is antithetical to a healthy democracy. If anything, this decision demonstrated that federal agencies are more constrained than private companies in their ability to collect information like this. That makes sense — the state should not be spying on citizens — but Cox’s reporting shows that the private sector has provided a convenient workaround.

Perhaps it is possible to update the law to require a warrant for surveillance by proxy, and for it to be more targeted, but it is highly unethical to be collecting this much information in the first place for the purposes of stockpiling and bulk sales. This circumstance should not be possible — even in theory. That is not to make legitimate investigations harder, but to ensure privacy and security for everyone. The software we use should not be snitching on our location to some two-bit private intelligence firm for resale to whomever it deems an agreeable customer. You might be comfortable with the U.S. Secret Service buying access to your location; maybe you’re fine with other law enforcement agencies and private companies that may have similar contracts. But, sooner or later, I am certain we will find out that some disagreeable entity — maybe a company that behaves unethically, or maybe some authoritarian state — also tracks people around the world. Then what? Stopping this data brokerage industry is not paranoia; it is pragmatic.

Byron Tau, Wall Street Journal:

Anomaly Six LLC, a Virginia-based company founded by two U.S. military veterans with a background in intelligence, said in marketing material it is able to draw location data from more than 500 mobile applications, in part through its own software development kit, or SDK, that is embedded directly in some of the apps. An SDK allows the company to obtain the phone’s location if consumers have allowed the app containing the software to access the phone’s GPS coordinates.

App publishers often allow third-party companies, for a fee, to insert SDKs into their apps. The SDK maker then sells the consumer data harvested from the app, and the app publisher gets a chunk of revenue. But consumers have no way to know whether SDKs are embedded in apps; most privacy policies don’t disclose that information. Anomaly Six says it embeds its own SDK in some apps, and in other cases gets location data from other partners.

Tau reports that Anomaly Six tracks the location of “hundreds of millions of mobile phones” but, citing the opacity of the data brokerage world, was not able to determine which apps include its SDK. Tau also reports that the founders of Anomaly Six used to work for Babel Street, which offers a similar privacy hellscape called “Locate X”:

Babel Street doesn’t publicly advertise Locate X and binds clients and users to secrecy about even its existence, according to contracts and user agreements reviewed by the Journal. Developed with input from U.S. government officials, according to court records, Locate X is widely used by military intelligence units who work on gathering “open source” intelligence, or information taken from publicly available sources. Babel Street also has contracts with the Department of Homeland Security, the Justice Department, and many other civilian agencies, federal contracting data shows. Babel Street didn’t respond to a request for comment.

So the U.S. government is comfy with the risk of starting another Cold War over apparent mass privacy violations by foreign actors on moral absolutist grounds, and is also content with having location back doors into hundreds of millions of phones. Got it.

Bruce Schneier, in an op-ed for the New York Times:

Regulating this system means addressing all three steps of the process. A ban on facial recognition won’t make any difference if, in response, surveillance systems switch to identifying people by smartphone MAC addresses. The problem is that we are being identified without our knowledge or consent, and society needs rules about when that is permissible.

Similarly, we need rules about how our data can be combined with other data, and then bought and sold without our knowledge or consent. The data broker industry is almost entirely unregulated; there’s only one law — passed in Vermont in 2018 — that requires data brokers to register and explain in broad terms what kind of data they collect. The large internet surveillance companies like Facebook and Google collect dossiers on us more detailed than those of any police state of the previous century. Reasonable laws would prevent the worst of their abuses.

Finally, we need better rules about when and how it is permissible for companies to discriminate. Discrimination based on protected characteristics like race and gender is already illegal, but those rules are ineffectual against the current technologies of surveillance and control. When people can be identified and their data correlated at a speed and scale previously unseen, we need new rules.

This is a timely article — not only because of the publicity Clearview AI has received, but also because the European Commission is considering a ban on the use of facial recognition in public, which London’s Metropolitan Police have announced they will use widely.

There’s a silly dismissal of privacy laws that goes something like this: because these laws require that data processors get opt-in consent from users, they empower Facebook and Google, which means these laws are failures on a grand scale. I thought this argument was absurd when it first appeared last year in relation to Europe’s GDPR, but California’s new CCPA has made it ripe and juicy again.

Mike Masnick of Techdirt covered a story by Nick Kostov and Sam Schechner of the Wall Street Journal last year:

We warned folks that these big attempts to “regulate” the internet as a way to “punish” Google and Facebook would only help those companies. Last fall, about six months into the GDPR, we noted that there appeared to be one big winner from the law: Google. And now, the Wall Street Journal notes that it’s increasingly looking like Facebook and Google have grown thanks to the GDPR, while the competition has been wiped out.

“GDPR has tended to hand power to the big platforms because they have the ability to collect and process the data,” says Mark Read, CEO of advertising giant WPP PLC. It has “entrenched the interests of the incumbent, and made it harder for smaller ad-tech companies, who ironically tend to be European.”

So, great work, EU. In your hatred for the big US internet companies, you handed them the market, while destroying the local European companies.

Antonio García Martínez:

The result is that not only is there a privacy/convenience tradeoff that users must navigate, there’s a privacy/competition one that regulators must navigate as well.

You want users to have transparent, wide-ranging choice in how their data is used, with companies they know?

Then you’ve got to limit data use to first-party companies with a big public brand and lots of public scrutiny, rather than a complex ecosystem of many data producers and vendors.

There is absolutely nothing wrong with making it harder for any company — large or small, American or European — to abuse users’ privacy. Besides, it isn’t as though most big websites carry only one tracker. The fewer companies that are able to build highly personalized profiles, the better.

More relevant, though, is that you probably can’t name many of these smaller ad tech companies, but you can name the three biggest ones: Google, Facebook, and Amazon. That’s probably because you have a profile with at least one of them, if not all three, so of course it’s easier for them to get consent from you. If you have a user account, they already have your consent.

I doubt that compliance costs — in the sense of documentation or technical support — prevent smaller firms from competing with the big three. It’s the first-party relationship that these companies have with their users. Remember: Google is not a software and services company; it is an advertising company with several interactive and useful features. Facebook is not a family of social networks and chat apps, but a personalized advertising company that entices you to hand over as much data as you can. Amazon — well, they’re everything, but they’re also a big fan of showing you ads for Amazon listings of the things you just bought on Amazon.

Complying with GDPR really is much harder for a company nobody has ever heard of that asks permission to keep a copy of your name, phone number, email address, and anything else you submit to an unrelated service. But why shouldn’t it be?

These privacy laws are not perfect, yet they’ve had an immediate impact. In the year and a half since GDPR came into effect, hundreds of millions of Euros worth of fines have been issued. Plenty of companies have had to tighten their privacy and security measures as a result. But, yes, Google, Facebook, and Amazon have become stronger as a result of their ease of compliance.

And that’s probably why the E.U. also has antitrust concerns about all three of these companies. There are currently open investigations into Amazon and Facebook. Google was fined a billion and a half Euros for abusing its dominance in online advertising, which is particularly important since it has controlled the most widely used ad exchanges since before GDPR went into effect.

GDPR and CCPA are largely good — if imperfect — first steps towards regulating the unhinged worlds of advertising technology firms and data brokerages. We should encourage our public representatives to set broad expectations about how our data may be collected and used. We also ought to fight for more people-friendly interpretations of antitrust law. It isn’t a failure that privacy laws fail to address antitrust concerns any more than it is a failure that restaurant sanitation requirements don’t rein in corn subsidies.

It’s possible to do both, and it isn’t indicative of poor policy that we should do both. Well, it isn’t indicative of poor privacy regulations, anyhow; it absolutely does point to missed opportunities for decades. Now is as good a time as any to fix those shortcomings.

Adam Walser, ABC Tampa Bay:

I-Team Investigator Adam Walser obtained records showing the state sold information on Florida drivers and ID cardholders to more than 30 private companies, including marketing firms, bill collectors, insurance companies and data brokers in the business of reselling information.

The Florida Department of Highway Safety and Motor Vehicles raked in more than $77 million for driver and ID cardholder information sales in fiscal 2017.

The I-Team wanted to know how much of that money came from marketing firms, but the agency in charge of driver information estimated it would take 154 hours of research and cost nearly $3,000 for the state to give taxpayers an answer.

TechCrunch reporter Sarah Perez pointed to several similar stories from South Carolina, Pennsylvania, Alabama, and other states.

It’s no wonder policymakers are loath to strictly regulate the use and dissemination of private data — they’re in on the grift.

Katie Notopoulos, BuzzFeed News:

Facebook launched a transparency tool this week that will give people a little more information about how their targeted ads work (good!). Now you can see more details about why you’re seeing an ad in your feed, how it is linked to an ad agency or data broker, and how to opt out of interest-based ad campaigns run by businesses that have your information. The bad news is that looking at it may end up just making you feel worse about how your data is passed around by third-party data brokers — credit reporting bureaus and marketing agencies — like Halloween candy.

This should at least partially solve the mysterious presence of cross-country car dealerships and furniture stores — typically, other clients associated with these data brokers — appearing on the advertising settings page for many users. But this doesn’t go far enough. If we’re going to put up with behaviourally-targeted advertising — and we should not, because it is deeply corrosive to our privacy, unethical, and not particularly effective — but if we are, then these ads should be required to list every single targeting method they’re using, plus all of the companies that had a hand in placing that ad on your screen.

Jeremy Burge:

For years Facebook claimed the adding a phone number for 2FA was only for security. Now it can be searched and there’s no way to disable that.

Facebook 2FA numbers are also shared with Instagram which prompts you ‘is this your phone number?’ once you add to FB.

Zack Whittaker, TechCrunch:

Last year, Facebook was forced to admit that after months of pestering its users to switch on two-factor by signing up their phone number, it was also using those phone numbers to target users with ads. But some users are finding out just now that Facebook’s default setting allows everyone — with or without an account — to look up a user profile based off the same phone number previously added to their account.

This isn’t just yet another example of Facebook behaving outrageously when it comes to the company’s pathological need to slurp up everything about its users’ every living moment. It also has the potential to reduce the likelihood that users will adopt two-factor authentication. Technically-literate people have been preaching two-factor authentication for a long time, but average users have been slow to enable it; if they get the impression that it’s yet another piece of data that creepy companies can use to track them, they will be even more hesitant.

Also, I’d like to address something about two-factor authentication that has been bugging me for a while. Ever since fears about SIM hijacking began spreading, some people have been claiming that using SMS-based two-factor authentication is worse than not using two-factor at all. I think that’s silly and myopic. It is worth noting that SIM hijacking is pretty easy for someone who has access — directly or indirectly — to a carrier’s SIM backend. But the circumstances under which someone’s phone number would be hijacked are pretty rare for the vast majority of us. People with coveted short usernames or high-value social media accounts, higher-ranking employees, activists, journalists, wealthy individuals, and public figures are more susceptible to these kinds of attacks. Most of us, however, are not any of those things, and will likely benefit from using any kind of two-factor authentication. You should use a code generator or a hardware key like a YubiKey wherever you can, but SMS authentication is not necessarily terrible, and it is certainly not worse than using no second factor at all.

However, that is entirely theoretical, and there’s an enormous caveat you should be aware of: while you may have loads of email accounts — it’s trivial to create a throwaway one — you probably have only one phone number. Therefore, it is critical that you give your phone number only to services and apps you really trust. Many unscrupulous apps will include your phone number in information they send to data brokers and advertising companies like Facebook. You should be extremely careful about providing your phone number anywhere. Treat it as you would a unique personal identifier, like a Social Security Number or a Social Insurance Number. Assume it has been compromised, but protect it nevertheless.
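For reference, the app-based code generators recommended above implement TOTP (RFC 6238): the shared secret stays on your device and never crosses the carrier network, which is exactly what makes them resistant to SIM hijacking. A minimal sketch using only the Python standard library — the secret below is the RFC’s published test value, not a real credential:

```python
# Minimal RFC 6238 TOTP sketch. The secret never leaves the device,
# so a hijacked SIM gets an attacker nothing.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, at=None, digits=6, period=30):
    """Derive the current one-time code from a base32 shared secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time() if at is None else at) // period
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890", time 59 s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59, digits=8))  # → 94287082
```

Both sides compute the same code from the shared secret and the current 30-second window, so nothing secret is ever transmitted at login time.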

Tim Cook in Time:

Meaningful, comprehensive federal privacy legislation should not only aim to put consumers in control of their data, it should also shine a light on actors trafficking in your data behind the scenes. Some state laws are looking to accomplish just that, but right now there is no federal standard protecting Americans from these practices. That’s why we believe the Federal Trade Commission should establish a data-broker clearinghouse, requiring all data brokers to register, enabling consumers to track the transactions that have bundled and sold their data from place to place, and giving users the power to delete their data on demand, freely, easily and online, once and for all.

Setting aside the actual point of this essay — which, by the way, is an excellent summary of why privacy legislation for user data is sorely needed — something that’s kind of interesting is how it reads in a very Tim Cook kind of way. Much as Steve Jobs had a unique voice in both his speaking and writing, so, too, does Cook. It doesn’t feel like it’s an essay produced by some copywriter to which Cook’s name is later affixed; it’s clear to me that this is something that truly matters to him, and which he probably wrote himself.

Nilay Patel linked to the essay on Twitter and attached a screenshot of the top apps in the App Store, clarifying in a reply:

The iPhone’s value is built on these services. Pushing for a law is great, but it is telling that they won’t use their own platform dominance to forbid these practices.

“These services” refers to apps like Gmail and Instagram, both of which are run by companies that show virtually no respect for users’ privacy. This seems to come up frequently in conversations about Apple, and it’s something lots of people mention, so I don’t mean to single out Patel here. But I think it’s a horribly lazy take.

Can you imagine the scale of the shit fit that would be thrown if Apple completely prohibited apps from Google and Facebook in the App Store? Not just from users, either: tech publications, mainstream newspapers, and regulators would be apoplectic, given the obvious antitrust questions that would likely be provoked by this kind of power move.

For what it’s worth, Apple has strict guidelines on the collection and use of users’ data. These rules require that developers collect opt-in permission, prohibit apps from requiring personal data unless it’s necessary for the app’s core functionality, and encourage minimization of data collection overall. That’s not to say the company is perfect. For example, you cannot post photos to an Instagram story if you’ve denied microphone access to Instagram, despite Apple’s developer guidelines using basically this scenario as a prohibited example. I wish Apple were stricter in enforcing the rules they already have, but their reluctance to create a PR and antitrust catastrophe is understandable.

In general, however, enforcement of online privacy should not be Apple’s job. Cook is right in stating that this should be dealt with in policy, not on individual companies’ terms. Frankly, Apple’s ability to use privacy as a differentiating characteristic is embarrassing for the tech industry and its regulators. Users should not have to be wary of compromising their privacy or worried that they will lose control over their personal details every time they use a tech product, an app, or a website.

David McCabe and Scott Rosenberg, Axios:

For several years it has made sense, in some quarters, to lump together the tech giants — chiefly Google, Facebook, Apple, and Amazon, sometimes also including Netflix or Microsoft. But talking about “big tech” is beginning to offer diminishing returns.

[…]

As different pressures come to bear on each of these companies, they are likely to end up taking roads that differentiate them from their competitors — and make “big tech” less useful as an idea or a category.

A suspicion I’ve long harboured is that the bad actors in the tech industry make it much harder to trust any company. I know a few people who refuse to use Touch ID or Face ID on their devices because they’re convinced that their fingerprints and faces are being sent to Apple. The company is also increasingly focused on health, which makes some people skittish. And there’s a fair reason for that; users should be cautious about which companies they’re sharing their most personal details with.

Yet sales of in-home devices from Amazon and Google — with microphones and, in many cases, cameras — are up every year. User tracking is becoming more pervasive and difficult to avoid, and huge data brokers that are not household names aggregate even more information. In a survey last year, more people said they believed Amazon and Google care about user privacy than said the same of Apple. This situation is getting worse, not better, and it is eroding confidence that any part of the tech industry can be good.

The last time the tech industry was the subject of widespread worries about trust was in the wake of Edward Snowden’s NSA disclosures. This isn’t external; it can’t be smoothed over by denials or press releases bragging about how secure the databases are. This is internal, and it has effects throughout the industry on good actors and bad. But this discussion needs a greater level of specificity and nuance.

Aliya Ram and Madhumita Murgia, Financial Times:

Data brokers mine a treasure trove of personal, locational and transactional data to paint a picture of an individual’s life. Tastes in books or music, hobbies, dating preferences, political or religious leanings, and personality traits are all packaged and sold by data brokers to a range of industries, chiefly banks and insurers, retailers, telecoms, media companies and even governments. The European Commission forecasts the data market in Europe could be worth as much as €106.8bn by 2020. 

“The explosive growth of online data has led to the emergence of the super data broker — the ‘privacy deathstars’, such as Oracle, Nielsen and Salesforce, that provide one-stop shopping for hundreds of different data points which can be added into a single person’s file,” says Jeffrey Chester, executive director of the Center for Digital Democracy based in Washington. “As a result, everyone now is invisibly attached to a living, breathing database that tracks their every move.”

Over the past five years, the data broker industry expanded aggressively in what amounted to a virtual regulatory vacuum. The rise of internet-connected devices has fuelled an enhanced industry of “cross-device tracking” that matches people’s data collected from across their smartphones, tablets, televisions and other connected devices. It can also connect people’s behaviours in the real world with what they are doing online. 

The reluctance in virtually every country to restrict the purchase and sharing of user data without explicit consent is a complete regulatory failure. Nobody would tolerate being asked to submit a daily list of everything they’ve bought, every page they’ve seen online, every ad they’ve viewed, and everywhere they’ve been — not because that would be a lot of work, but because it would feel invasive. There shouldn’t be a “data market” at all.