Month: January 2020

One of the things I touched upon in my initial commentary on today’s Reuters report is that Apple uses “encrypted” to mean different things in different contexts. For years now, I have been frustrated by the context switching in this one knowledge base article on Apple’s support website:

This article can help you decide which backup method is best for you. In case you ever need an alternative backup, you can make a backup in iCloud and another using your computer.

iCloud

[…]

  • Always encrypts your backups

[…]

Computer

[…]

  • Offers encrypted backups (off by default)

A related article about encrypted backups makes a similarly misleading claim:

[…] To encrypt a backup in the Finder or iTunes for the first time, turn on the password-protected Encrypt Backup option. Backups for your device will automatically be encrypted from then on. You can also make a backup in iCloud, which automatically encrypts your information every time.

There is nothing technically incorrect about this explanation. iCloud backups are, indeed, encrypted every time; local backups have encryption as an option. But whether a backup is “encrypted” is not enough information to decide which method is more secure. Apple holds the keys to iCloud backups, but only users know their local backup key.

It’s worth noting that Apple has been evaluating whether to offer encrypted iCloud backups since at least February of 2016, and possibly earlier. Today’s Reuters report suggests that Apple dropped that plan at some point in early 2018, though an October 2018 interview with Tim Cook in Spiegel indicated that the company was still working on it. I’m not sure what the correct timeline is, but I hope that renewed public pressure can encourage the company to make it a priority. It is imperative that users know exactly how their data is being used, and there is no reason that enabling backups should compromise their security and privacy.

Update: Lawrence Velázquez:

This reminds me of the Facebook 2FA fiasco, a more egregious case of something positive (2FA) being needlessly tainted (abuse of SMS numbers for non-2FA purposes).

This is exactly right. One of the effects of Apple’s confusing language around backups and encryption is that some people may not trust either. It’s not a great day when Apple is getting unfavourably compared to Facebook on privacy and security matters.

Nelson Minar, building on a report Bellingcat published last month about Yandex’s superior reverse image search:

Right now an ordinary person still can’t, for free, take a random photo of a stranger and find the name for him or her. But with Yandex they can. Yandex has been around a long time and is one of the few companies in the world that is competitive to Google. Their index is heavily biased to Eastern European data, but they have enough global data to find me and Andrew Yang.

If you use Google Photos or Facebook you’ve probably encountered their facial recognition. It’s magic, the matching works great. It’s also very limited. Facebook seems to only show you names for faces of people you have some sort of Facebook connection to. Google Photos similarly doesn’t volunteer random names. They could do more; Facebook could match a face to any Facebook user, for instance. But both services seem to have made a deliberate decision not to be a general purpose facial recognition service to identify strangers.

At the time that I linked to the Bellingcat report, I wondered why Google’s reverse image search, in particular, was so bad in comparison. In tests, it even missed imagery from Google Street View, despite Google regularly promoting its abilities in machine learning, image identification, and so on. In what I can only explain as a massive and regrettable oversight on my part, I missed the obvious answer: Google’s image search is so bad because Google designed it that way. Otherwise, Google would have launched something like Yandex or Clearview AI, and that would be dangerous.

Google’s restraint is admirable. What’s deeply worrying is that it is optional — that Google could, at any time, change its mind. There are few regulations in the United States that would prevent Google or any other company from launching its own invasive and creepy facial recognition system. Recall that the only violation that could be ascribed to Clearview’s behaviour — other than an extraordinary violation of simple ethics — is that the company scraped social media sites’ images without permission. It’s a pretty stupid idea to solely rely upon copyright law as a means of reining in facial recognition.

Mike Masnick of Techdirt responds to the Reuters report from earlier today claiming that Apple dropped a plan to implement end-to-end encryption on iCloud backups at the behest of the FBI:

Of course, the other way one might look at this decision is that if Apple had gone forward with fully encrypting backups, then the DOJ, FBI and other law enforcement would have gone even more ballistic in demanding a regulatory approach that blocks pretty much all real encryption. If you buy that argument, then failing to encrypt backups is a bit of appeasement. Of course, with Barr’s recent attacks on device encryption, it seems reasonable to argue that this “compromise” isn’t enough (and, frankly, probably would never be enough) for authoritarian law enforcement folks like Barr, and thus, it’s silly for Apple to even bother to try to appease them in such a manner.

Indeed, all of this seems like an argument for why Apple should actually cooperate less with law enforcement, rather than more, as the administration keeps asking. Because even when Apple tries to work with law enforcement, it gets attacked as if it has done nothing. It seems like the only reasonable move at this point is to argue that the DOJ is a hostile actor, and Apple should act accordingly.

Even though Apple attempts to explain how iCloud backups work, I don’t think it does a good job, and that is one reason the Reuters report today had such a profound impact: a lot of people have been surprised that their iCloud backups are less private than their phones. Yet, as bad as this is for Apple, it is an equally poor look for the Department of Justice, which has publicly been whining about its inability to extract device data while privately accepting Apple’s cooperation.

Joseph Menn, Reuters:

More than two years ago, Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud, according to one current and three former FBI officials and one current and one former Apple employee.

Under that plan, primarily designed to thwart hackers, Apple would no longer have a key to unlock the encrypted data, meaning it would not be able to turn material over to authorities in a readable form even under court order.

In private talks with Apple soon after, representatives of the FBI’s cyber crime agents and its operational technology division objected to the plan, arguing it would deny them the most effective means for gaining evidence against iPhone-using suspects, the government sources said.

When Apple spoke privately to the FBI about its work on phone security the following year, the end-to-end encryption plan had been dropped, according to the six sources. Reuters could not determine why exactly Apple dropped the plan.

Apple describes both local iPhone storage and iCloud backups as “encrypted”, but that word means different things depending on context. An iPhone’s files cannot be decrypted unless the passcode is known, which typically means that only the device’s user has full access. But an iCloud backup’s key is held by Apple, so it has just as much access as the user does. Importantly, that also means there is a way of recovering the data should the user’s key fail for some reason. It is possible that part of the reason Apple scrapped its plan for end-to-end encryption of iCloud backups is that it would frustrate customers who, in some circumstances, would be unable to recover their backups.
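The distinction is easy to demonstrate. Here is a minimal sketch in Swift using CryptoKit; it is emphatically not Apple’s actual scheme, which involves per-file keys, hardware entanglement, and proper password stretching, but it isolates the only difference that matters: who can produce the key.

    import CryptoKit
    import Foundation

    // Minimal sketch, not Apple's actual scheme. Both outputs are
    // "encrypted"; the difference is who holds the key.

    // Local encrypted backup: the key is derived from a secret only the
    // user knows. (A real implementation would use a salted, slow KDF
    // rather than a bare hash; this is abbreviated for clarity.)
    func localBackup(_ data: Data, passcode: String) throws -> Data {
        let key = SymmetricKey(data: SHA256.hash(data: Data(passcode.utf8)))
        return try AES.GCM.seal(data, using: key).combined!
    }

    // iCloud-style backup: the provider generates the key and retains a
    // copy, so it can restore your data for you, or decrypt it under
    // court order.
    func hostedBackup(_ data: Data) throws -> (escrowedKey: SymmetricKey, blob: Data) {
        let providerKey = SymmetricKey(size: .bits256)  // held server-side
        return (providerKey, try AES.GCM.seal(data, using: providerKey).combined!)
    }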

However, it is more troubling if such a plan never came to fruition because of government pressure. I don’t think it should be the goal of Apple or any company to deliberately make the work of law enforcement impossible, but decisions like these should be made in the best interests of users. And I would expect many users believe that storing their device’s backup in iCloud should not compromise their security and privacy. At the very least, encrypting backups using a secret known only to the user should be an option for iOS users; after all, it is apparently an option on Android. Apple also ought to make it plainly obvious who holds the key to encrypted data at every level to help reduce moronic takes like that from David Carroll:

Apple’s new position on protecting iCloud data from the United States government is now remarkably similar to its position on protecting iCloud data stored in the People’s Republic of China.

This simply isn’t true on any level. For a start, this is not a “new position”, and it does not solely apply to the United States government. Apple makes public what it can and cannot supply to law enforcement, and how they respond to those requests (PDF):

All iCloud content data stored by Apple is encrypted at the location of the server. When third-party vendors are used to store data, Apple never gives them the keys. Apple retains the encryption keys in its U.S. data centers. iCloud content, as it exists in the subscriber’s account, may be provided in response to a search warrant issued upon a showing of probable cause.

For law enforcement agencies outside the U.S. (PDF), the last sentence is replaced with this paragraph:

All requests from government and law enforcement agencies outside of the United States for content, with the exception of emergency circumstances (defined above in Emergency Requests), must comply with applicable laws, including the United States Electronic Communications Privacy Act (ECPA). A request under a Mutual Legal Assistance Treaty or Agreement with the United States is in compliance with ECPA. Apple Inc. will provide subscriber content, as it exists in the subscriber’s account, only in response to such legally valid process.

The worry in China is not necessarily that the government can subpoena for iCloud data; the worry is that user data is stored on servers belonging to a company run, in part, by a corrupt single-party regime. The government of the U.S. and its various criminal justice and national security branches are worrying for myriad reasons, but they cannot accurately be compared to the situation in China.

If you are understandably worried by this report, you can back up your iPhone to your Mac, with the option of creating an encrypted backup. Unlike an iCloud backup, you control the key. And, as of recent versions of iOS, you can migrate data directly between devices.

Kashmir Hill, New York Times:

His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.

Federal and state law enforcement officers said that while they had only limited knowledge of how Clearview works and who is behind it, they had used its app to help solve shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.

Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” Some large cities, including San Francisco, have barred police from using facial recognition technology.

But without public scrutiny, more than 600 law enforcement agencies have started using Clearview in the past year, according to the company, which declined to provide a list. The computer code underlying its app, analyzed by The New York Times, includes programming language to pair it with augmented-reality glasses; users would potentially be able to identify every person they saw. The tool could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew.

And it’s not just law enforcement: Clearview has also licensed the app to at least a handful of companies for security purposes.

This investigation was published on Saturday. I’ve read it a few times and it has profoundly disturbed me on every pass, but I haven’t been surprised by it. I’m not cynical, but it doesn’t surprise me that an entirely unregulated industry motivated to push privacy ethics to their revenue-generating limits would move in this direction.

Clearview’s technology makes my skin crawl; the best you can say about the company is that its limited access prevents the most egregious privacy violations. When something like this is more widely available, it will be dangerous for those who already face greater threats to their safety and privacy — women, in particular, but also those who are marginalized for their race, skin colour, gender, and sexual orientation. Nothing will change on this front if we don’t set legal expectations that limit how technologies like this may be used.

Susan Heavey and Andrea Shalal, reporting for Reuters in an article with the headline “Mnuchin urges Apple, other tech companies to work with law enforcement”:

Apple Inc and other technology companies should cooperate with U.S. investigators, Treasury Secretary Steven Mnuchin said on Wednesday as law enforcement officials continued probing last month’s fatal shooting at a U.S. naval base in Florida.

[…]

Mnuchin later told reporters at the White House that he had not discussed the issue with Apple and did not know the specifics at hand. “I know Apple has cooperated in the past on law enforcement issues, and I expect they would continue … to cooperate.”

The Reuters article notes that Apple is, in fact, cooperating with investigators by turning over everything they have on the iPhones in question, counter to Mnuchin’s claim. But the headline on this article is misleading.

Mike Masnick, Techdirt:

When framed that way, it’s obviously dumb. But anyone reading Reuters’ coverage of the issue won’t get that. They’ll think that Apple is somehow taking some sort of stand against US law enforcement. This is what Trump, Barr, and apparently Mnuchin, would like people to think, but it’s not true, and it’s fundamentally bad journalism for Reuters to frame it that way.

To be clear, it is likely not the reporters’ fault that the story was framed with this headline. But it’s unnecessarily carrying water for a Department of Justice that is exploiting a terrorist attack and public confusion over this issue to neuter encryption.

Jamie Leach of Google, announcing the new search results page design last year:

The name of the website and its icon appear at the top of the results card to help anchor each result, so you can more easily scan the page of results and decide what to explore next. Site owners can learn more about how to choose their preferred icon for organic listings here.

When you search for a product or service and we have a useful ad to show, you’ll see a bolded ad label at the top of the card alongside the web address so you can quickly identify where the information is coming from.

Danny Sullivan, tweeting as Google’s “public liaison of search”:

Last year, our search results on mobile gained a new look. That’s now rolling out to desktop results this week, presenting site domain names and brand icons prominently, along with a bolded “Ad” label for ads.

All you get, as far as identifying where a search result comes from, is a tiny 16-by-16-point favicon and small grey text with the URL. If it’s an ad, the favicon is replaced with a little “Ad” label, but there are no other advertising identifiers. Just a few years ago, Google used to indicate ads much more prominently. If the ads are truly as “useful” as Google claims, surely it doesn’t need to trick users into clicking on them instead of regular results.

Update: Google says that they’re going to experiment with different ad and search result appearances.

Julia Alexander, the Verge:

Comcast and NBCUniversal announced today that Peacock will be available in three tiers: a free option (Peacock Free) that comes with limited programming; an ad-supported complete version that is free to existing Comcast customers and $5-a-month for everyone else; and a $10-a-month ad-free subscription option that is open to anyone. That one is known as Peacock Premium.

Peacock Free consists of 7,500 hours of programming, including next-day access to current seasons of first-year NBC shows, Universal movies, and curated content such as SNL, Vault, and Family Movie Night. The two premium tiers come in at $4.99 per month with ads and $9.99 per month with no ads. Both of these tiers will include live sports and early access to late-night shows. Peacock Premium will include non-televised Premier League soccer games beginning in August.

Between weak antitrust enforcement, mergers designed to create vertical integration, the demise of net neutrality, and exclusive distribution contracts, it’s like a slow return to the old Hollywood studio system at even greater scale and scope.

A recent cold snap seems to have increased my propensity to experience bugs. I’m usually a walking commuter to my day job, but I’ve happily accepted a lift from my partner all week long as temperatures dropped below the ‑30° C mark every morning. As I got into the car this morning, I noticed a strange notification on my lock screen:

Siri Suggestion on lock screen to take day-long commute to work

This appears to be a Siri suggestion — a nudge by the system to show a hopefully useful shortcut to a common task. As Apple puts it:

As Siri learns your routines, you get suggestions for just what you need, at just the right time. For example, if you frequently order coffee mid morning, Siri may suggest your order near the time you normally place it.

Since I go to work at a similar time every day, it tells me how long my commute will take and gives me the option to get directions. Nice, right?

Except something is plainly not right: it’s going to take me over a day to get to work? Here’s the route it thinks I should take:

Apple Maps directions across the continent

I found this hilarious — obviously — but also fascinating. How did it get this so wrong?

My assumption was that my phone knew that I commuted to work daily, so it figured out the address of my office. And then, somehow, it got confused between the location it knows and the transcribed address it has stored, and then associated that with an address in or near Rochester, New York. But that doesn’t seem right.

Then, I thought that perhaps the details in my contact card were wrong. My work address is stored in there, and Siri mines that card for information. But there’s a full address in that card including country and postal code, so I’m not sure it could get it so wrong.

I think the third option is most likely: I have my work hours as a calendar appointment every day, and the address only includes the unit and street name, not my city, country, or postal code. I guess Apple Maps’ search engine must have searched globally for that address and ended up in upstate New York.

But why? Why would it think that an appointment in my calendar is likely to be anywhere other than near where I live, particularly when it’s recurring? Why doesn’t Apple Maps’ search engine or Siri — I don’t know which is responsible in this circumstance — prioritize nearby locations? Why doesn’t it prioritize frequent locations?
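To make the question concrete, here is a toy ranking sketch in Swift. It is not Apple’s algorithm; the names, coordinates, and weights are all invented. But it shows how cheaply a geocoder could bias toward nearby results, so that a mediocre local match beats a perfect match on another continent:

    import Foundation

    // Toy geocoder ranking, not Apple's algorithm.
    struct Place {
        let name: String
        let lat: Double
        let lon: Double
        let textMatch: Double  // 0...1 relevance score from text search
    }

    // Great-circle distance between two (latitude, longitude) points, in km.
    func distanceKm(from a: (Double, Double), to b: (Double, Double)) -> Double {
        func rad(_ d: Double) -> Double { d * .pi / 180 }
        let (la1, lo1, la2, lo2) = (rad(a.0), rad(a.1), rad(b.0), rad(b.1))
        let h = pow(sin((la2 - la1) / 2), 2) +
                cos(la1) * cos(la2) * pow(sin((lo2 - lo1) / 2), 2)
        return 6371 * 2 * asin(sqrt(h))
    }

    // Blend text relevance with proximity; the 0.001 weight is arbitrary.
    func score(_ p: Place, near user: (Double, Double)) -> Double {
        p.textMatch - 0.001 * distanceKm(from: user, to: (p.lat, p.lon))
    }

    func rank(_ places: [Place], near user: (Double, Double)) -> [Place] {
        places.sorted { score($0, near: user) > score($1, near: user) }
    }

    // Invented coordinates for illustration:
    let home = (51.05, -114.07)
    let results = rank([
        Place(name: "12 Ave SE, a few blocks away", lat: 51.04, lon: -114.06, textMatch: 0.8),
        Place(name: "12th St, Rochester, New York", lat: 43.16, lon: -77.61, textMatch: 1.0)
    ], near: home)
    // results.first is the nearby street, not the one across the continent.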

If you look closely, you’ll also notice another discrepancy: the notification says that it’s going to give me directions to “12th St”, but the directions in Maps are to “12 Ave SE”. Why would this discrepancy exist?

It’s not just the bug — or, more likely, the cascading series of bugs — that fascinates me, nor the fact that it’s so wrong. It’s this era of mystery box machine learning, where the results sometimes look like magic and, at other times, are incomprehensible. Every time some lengthy IF-ELSE chain helpfully suggests driving directions for a trip across the continent, or decides that I only ever message myself, my confidence in my phone’s ability to do basic tasks is immediately eroded. How can I trust it when it makes such blatant mistakes, especially when there’s no way to tell it that it’s wrong?

David Sparks, commenting on the FBI’s latest challenge to encryption:

I sympathize with law enforcement for wanting access to this data. I worked briefly in the criminal justice system and I know how maddening it would be to know you have a magic envelope with evidence in it and no way to open that envelope. I just think the sacrifice involved with creating a back door is too much to ask.

I do think this discussion isn’t over though. Apple sells into a lot of countries. Any one of them could require they install a back door as a condition of access to the market. Apple’s principles are on a collision course with a massive loss of income. Is it just a question of time before governmental regulation and market pressures make this period of time, where all citizens have relatively secured data and communications, only a temporary phase? I sure hope not.

Apple has proved itself somewhat amenable to compromise. It stores iCloud data for Chinese and Russian users on servers located in those countries. In the case of China, the data centre provider is a state-run company. Apple maintains that it holds the encryption keys, that it won’t disclose user data without legal authorization, and that the governments of both countries have no way of getting users’ data without going through legal proceedings. But, still, the legal systems of both countries are notoriously favourable to authoritarian policies, so it’s hard to believe that Apple’s control is anything more than theoretical.

Notably, both China and Russia have extreme restrictions on end-to-end encryption. In Russia, telecom and software companies are required to retain messages and encryption data for months; but, as Telegram offers end-to-end encrypted messaging, it has no encryption keys to retain, which resulted in its ban in the country. A similar law recently came into effect in China. Yet, iMessage remains available in both countries and, presumably, Apple has made no concessions on its end-to-end encryption. In 2018, meanwhile, the Australian government passed a law requiring tech companies to assist in decrypting users’ data. Apple has continued to sell products and services encrypted by default in all three countries.

Sparks is right: there will come a time when Apple will need to choose whether it will stand behind strong privacy and security, or whether the monetary cost of doing so is simply too high.

Thomas Brand:

For most people watching the occasional stupid video isn’t that big of a problem because people tell Google who they are and the algorithm shows them what they want to watch. They do this by letting Google track their browsers or by logging into Google services. But I browse the web with tracking protection turned on and never log into Google. The algorithm remembers nothing about me, and as a result I am always shown stupid videos.

In 2020 I am watching less stupid on YouTube by skipping the algorithm. Instead of letting YouTube decide which videos it wants to show me, I am watching only the videos I want to see by subscribing to my favorite content creators via RSS.

This is a great trick. Brand explains it well, though it’s a little clunky for anyone not familiar with viewing the source of a webpage.

YouTube isn’t the only website that buries its RSS feeds in this manner. I don’t know that it’s deliberate — in the sense that they’re trying to discourage the use of RSS. I think it might be a result of product teams convincing themselves that RSS is something used only by the technically proficient, so it’s put in a place where only that group can find it. The trouble is that, as a result, only the technically proficient will end up using it, so it’s cyclical.

Update: A few people have written me to point out that you can try adding the standard channel URL to your feed reader, as many readers will automatically discover the RSS feed.
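For anyone who wants to skip the view-source step entirely, the feed endpoint itself is predictable. A short sketch; the helper name is mine and the channel ID is a placeholder:

    import Foundation

    // YouTube publishes a per-channel feed at a stable, if buried, endpoint.
    func youtubeFeedURL(channelID: String) -> URL? {
        URL(string: "https://www.youtube.com/feeds/videos.xml?channel_id=\(channelID)")
    }

    // Channel pages also advertise this feed in their <head> with a tag like
    //   <link rel="alternate" type="application/rss+xml" href="...">
    // which is what feed readers use for the autodiscovery mentioned above.
    let feed = youtubeFeedURL(channelID: "UCxxxxxxxxxxxxxxxxxxxxxx")  // placeholder ID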

Lorenzo Franceschi-Bicchierai, reporting for Vice in 2018:

The FBI argued that it had no technical way to unlock the phone or hack into it without Apple’s help. Apple argued that helping the FBI would’ve put all iPhone users in danger because it would’ve required the company to weaken the security of all iPhones. The battle ended with a whimper when an unknown “third party” gave the FBI a way to hack in and the FBI abandoned its legal request.

As it turned out, the FBI’s own hackers didn’t start working with vendors to find a way to hack into Farook’s iPhone until “the eve” of the FBI’s initial court filing demanding Apple’s assistance on February 16, 2016. Moreover, two different teams within the FBI’s Operational Technology Division (OTD), a department tasked with giving technological assistance to investigations, didn’t communicate with each other to find a solution until late in the investigation, according to the OIG report.

The tech team initially helping with the case was the Cryptologic and Electronics Analysis Unit (CEAU). It was only after a meeting on February 11 that another hacking team within the FBI, the Remote Operations Unit or ROU, started looking into it and started contacting contractors and vendors asking for help.

It beggars belief that today’s FBI is struggling to breach an iPhone 7 Plus and an iPhone 5, the latter a model of smartphone that is more than seven years old and stuck on iOS 10. That’s especially suspicious given that investigators in another case were recently able to unlock an iPhone 11 Pro. What is it about the much older phones in this case that makes them so iron-clad against the United States’ elite digital forensics teams? Are they even trying? Or does the Department of Justice just want to fight?

In 2016, ABC News’ David Muir interviewed Tim Cook about why Apple was fighting the FBI’s order to create a modified version of iOS that would allow the forced unlocking of the iPhone used by one of the San Bernardino shooting perpetrators. Memorably, he called the development of any backdoor the “software equivalent of cancer”. He also described what the FBI was asking for: a version of iOS without the setting that erases data after ten failed passcode attempts, and with the ability for the FBI to try an unlimited number of passcodes as fast as a computer could enter them. Now, they seem to be asking for something similar; the FBI, once again, wants Apple to do something to help decrypt iPhones for law enforcement.

At no point — then or now — has Cook or anyone at Apple publicly confirmed how such a backdoor may be installed, or if it’s even possible. Presumably, it would use the iOS update mechanism, but how could permission be granted if the passcode to the iPhone isn’t known? After all, you must enter your iPhone’s passcode to install a software update. When you plug an iPhone into a computer, you must enter the passcode on the phone to enable a trusted data connection. But I thought there might be a way around all of this with one of iOS’ recovery modes.

Coincidentally, I have an almost perfect environment in which to test this. I recently had to install a clean copy of MacOS Catalina on my MacBook Air¹ and had not yet connected my iPhone to that laptop, so I had something which could simulate a stranger’s computer to perform an update. And, happily, Apple released a new seed of iOS 13.3.1 today, so I had something to update to.

In the interest of testing this, I risked spending all evening restoring my iPhone’s data and followed Apple’s directions to enter recovery mode.² I was able to update my iPhone to a newer version of iOS from a local .ipsw package without once entering my passcode.

  1. I downloaded the software update package from Apple’s developer website. Presumably, this means that any software update signed by Apple could be used instead.

  2. I connected my iPhone to my MacBook Air and forced a reboot, which cleared the temporary Face ID passcode authorization from the phone’s memory. I restarted it again, this time into recovery mode.

  3. MacOS prompted me to update or restore the phone. I picked “Cancel” to close the dialog box, then option-clicked on the “Update” button in Finder so I could select the software update package instead of using one from Apple’s server. It began and completed installation, then prompted for my passcode twice before it switched to an “Attempting Data Recovery” screen. After this process completed, my iPhone booted normally.

To be clear, my iPhone still prompted for its passcode when the update had finished its installation process. This did not magically unlock my iPhone. It also doesn’t prove that passcode preferences could be changed without first entering the existing valid passcode.

But it did prove the existence of one channel where an iPhone could be forced to update to a compromised version of iOS. One that would be catastrophic in its implications for iPhones today, into the future, and for encrypted data in its entirety. It is possible; it is terrible.

Update: I’ve had a few people ask questions about what this proves, or express doubt that this would enable an iPhone to be unlocked. To be perfectly clear, a compromised software package with the workarounds the FBI has asked for would have to be signed with Apple’s key for it to be installed. The passcode would still have to be cracked for user data to be extracted from the phone. But if Apple were legally compelled to comply with the FBI’s request in San Bernardino, this proves that a software update package containing the workarounds can be installed on an iPhone without having to enter a passcode.
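To put that another way: the gate on installation is not the passcode, it is the signature. Here is a toy model of the check in Swift, using CryptoKit. Apple’s real chain of trust is far more elaborate, with per-device personalization by its signing servers, but the shape of the problem is the same:

    import CryptoKit
    import Foundation

    // Toy model of update validation, not Apple's actual boot chain. The
    // device trusts a public key baked into its hardware; anything signed
    // by the matching private key will install, passcode or no passcode.
    func deviceWillInstall(update: Data, signature: Data,
                           appleRootKey: P256.Signing.PublicKey) -> Bool {
        guard let sig = try? P256.Signing.ECDSASignature(rawRepresentation: signature) else {
            return false
        }
        return appleRootKey.isValidSignature(sig, for: update)
    }
    // If Apple were compelled to sign a build containing the FBI's
    // requested workarounds, this check, the only gate, would pass.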


  1. Long story short, my MacBook Air contains a battery and an SSD from two different third-party vendors. The battery is one year old and comes from an organization well-known for their advocacy of right-to-repair legislation, and its capacity has already been reduced by over a third. I’ve been trying to get a replacement, even though it’s just out of warranty, and had to perform a series of tests to verify the age and wear on the battery. While trying to do these tests, the third-party SSD — from a different company that’s similarly well-known for their stance on repairing electronics — also started to fail. I replaced the third-party SSD with the original one that came with the MacBook Air, wiped the drive, and did a clean install of MacOS Catalina on it.

    I have two takeaways. First, I am receiving a free replacement battery, even though the one-year warranty has lapsed. I haven’t been so lucky with the SSD. I am admittedly a month and a bit outside of the manufacturer’s three-year warranty, but it is fairly disappointing that I have all sorts of SSDs and spinning rust drives that have outlived that drive.

    The second takeaway is that, even though I share some principles and sympathy with right-to-repair advocates, I would be much more convinced about its merits if they shipped higher quality products that lasted longer. It’s entirely anecdotal and probably bad luck, in part, if not in full. But this experience underscores that — in addition to environmental and ethical reasons for device repair rather than replacement — the biggest advocates are businesses that sell parts. ↥︎

  2. This documentation is ridiculous at times:

    On a Mac with macOS Catalina 10.15, open Finder. On a Mac with macOS Mojave 10.14 or earlier, or on a PC, open iTunes. If iTunes is already open, close it.

    There’s no way to read this that makes sense. “If you’re using Mojave, open iTunes. If iTunes is now open, close it.” is the first and most literal way to read it, but is clearly wrong. “If you’re using Mojave, open iTunes. If you had iTunes open first, close it, then reopen it.” is the second way to read it, but it also seems silly. ↥︎

Seb Joseph, Digiday:

Apple’s iOS 13 update, released in September, includes regular reminders when apps are sucking up a user’s location data. The pop-up gives a user a chance to choose from the following options: allowing data collection at all times, or only when the app is open — or only one time. Four months in, ad tech sources are reporting the result that some observers had predicted: There’s less location data coming from apps.

Right now opt-in rates to share data with apps when they’re not in use are often below 50%, said Benoit Grouchko, who runs the ad tech business Teemo that creates software for apps to collect location data. Three years ago those opt-in rates were closer to 100%, he said. Higher opt-in rates prevailed when people weren’t aware that they even had a choice. Once installed on a phone, many apps would automatically start sharing a person’s location data.

Apple did not dither around trying to strike a balance between letting advertisers keep collecting location data at will and nominally protecting user privacy. Apple didn’t even block background location access. It just changed iOS so that users must deliberately allow background access, and the system now reminds users when apps actually use that access. That’s all. Yet, these simple changes have made it difficult for companies you’ve never heard of to monetize information you didn’t know you were sharing.
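For what it’s worth, the app-facing API barely changed, which is rather the point. A sketch of the relevant CoreLocation calls, with comments describing iOS 13’s behaviour as I understand it:

    import CoreLocation

    let manager = CLLocationManager()

    // Foreground-only access works as before, except the user now also
    // gets an "Allow Once" choice that expires when the app is closed.
    manager.requestWhenInUseAuthorization()

    // "Always" is no longer granted directly from the first prompt: iOS 13
    // treats it as provisional and re-confirms with the user later, with
    // periodic reminders, complete with a map of what the app collected,
    // whenever background access is actually used.
    manager.requestAlwaysAuthorization()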

Apple, not coincidentally and unlike some of its competitors, is not a company making its money off personalized advertising.

Justin Schuh of Google:

After initial dialogue with the web community, we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete. Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome. Our intention is to do this within two years. But we cannot get there alone, and that’s why we need the ecosystem to engage on these proposals. We plan to start the first origin trials by the end of this year, starting with conversion measurement and following with personalization.

Google’s Privacy Sandbox plans still require the cooperation and support of the web’s standards bodies, which is perhaps why the company is positioning itself as hindered from making privacy-supportive changes any faster. It probably is, ultimately, a privacy-friendly move, albeit one undercut by suspicions that it will further entrench Google’s business.

That wouldn’t be true if the world’s most popular browser were not owned by a personalized advertising company. C’est la vie.

Twice now, the U.S. Department of Justice has pushed Apple to help decrypt iPhones involved in high-profile crimes. Twice, Apple has pushed back. And, twice, the popular press has framed these cases in terms that do not help general-audience readers understand why Apple is refusing demands to cooperate; instead, it has used language that implicitly helps those who believe that our rights should be compromised to a lowest common denominator.

The first time the Department of Justice began this campaign was in the aftermath of the December 2015 mass shooting in San Bernardino, California. Two individuals murdered fourteen people in a workplace terrorist attack motivated by extremist views. The perpetrators were killed. One had an iPhone and, while Apple was able to provide investigators with a copy of the data stored in iCloud, they were unable to assist with the phone’s unknown passcode. The Department of Justice attempted to use the All Writs Act to compel the company to disable any passcode-guessing countermeasures that might be enabled, and Apple refused on the grounds that it would universally undermine its products’ security and set a precedent against encryption — more on that later. The two parties fought and nearly ended up in court before the FBI enlisted a third-party vendor to crack the passcode. Ultimately, nothing of investigative value was on the phone.

It has been over four years since that case first began and, in that time, officials did not again attempt to compel Apple into weakening the security of its products. That is, despite nearly seven thousand apparently inaccessible devices in the first eleven months of 2017 alone, the Department of Justice made no further requests of Apple for unlocking assistance.

Until recently, that is, when a case of horrible déjà vu struck. In December 2019, one person motivated by extremist views murdered three people in a terrorist attack at his workplace. The perpetrator had two iPhones, one of which he shot before being killed by police. Apple has provided investigators with the data they were able to access, but is not assisting with the decryption of the iPhones in question.

Which is how we arrive at today’s announcement from U.S. Attorney General William Barr that he wants more “substantive assistance” from Apple in decrypting the two phones used by the perpetrator in this most recent attack — and, more specifically, Katie Benner’s report for the New York Times:

Mr. Barr’s appeal was an escalation of an ongoing fight between the Justice Department and Apple pitting personal privacy against public safety.

This is like three paragraphs in and it is already setting up the idea that personal privacy and public safety are two opposing ends of a gradient. That’s simply not true. A society that has less personal privacy does not inherently have better public safety; Russia and Saudi Arabia are countries with respectable HDI scores, brutal censorship and surveillance, and higher murder rates than similarly advanced countries that lack an authoritarian anti-privacy stance.

More worrisome, however, is how easily the issue of encryption is minimized as being merely about personal privacy, when it is far more versatile, powerful, and useful than that. The widespread availability of data encryption is one reason many companies today are okay with employees working remotely, since company secrets can’t be obtained by those not authorized. Encryption helps journalists get better information from sources who must remain anonymous. Encryption is why I haven’t had a printed bank statement in ten years, and it is how you know you’re giving your health care information to your insurance provider and nobody else. Encryption helps marginalized people bypass unjust and oppressive laws where they may travel or live. It makes commerce work better. It prevents messages from being read by an abusive ex-partner. It gives you confidence that you can store your work and personal life on a single device.

The U.S. Department of Justice is trying to compel Apple to weaken the encryption of every iOS device. That will set a precedent that those who implement encryption technologies ought to loosen them upon request. And that means that everything we gain from it is forever undermined.

Public safety would not be improved if encryption were weakened — it would be gutted.

Knowing all that helps one see why a summary like this is wildly inaccurate:

Apple has given investigators materials from the iCloud account of the gunman, Second Lt. Mohammed Saeed Alshamrani, a member of the Saudi air force training with the American military, who killed three sailors and wounded eight others on Dec. 6. But the company has refused to help the F.B.I. open the phones themselves, which would undermine its claims that its phones are secure.

Apple is not declining to decrypt these iPhones for marketing reasons. If anything, the public reaction to its stance will be highly negative, as it was in 2015. They are refusing Barr’s request because it means, in effect, that we must forego all of the benefits we have gained from encryption.

Apple says as much in a statement they gave Scott Lucas of Buzzfeed:

We are continuing to work with the FBI, and our engineering teams recently had a call to provide additional technical assistance. Apple has great respect for the Bureau’s work, and we will work tirelessly to help them investigate this tragic attack on our nation.

We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers. Today, law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users’ data.

This is the same thing experts keep telling lawmakers, who persist in believing that it’s a matter of hard work and willpower rather than a limitation of reality — and then cast their lack of understanding in profoundly offensive terms:

“Companies shouldn’t be allowed to shield criminals and terrorists from lawful efforts to solve crimes and protect our citizens,” Senator Tom Cotton, Republican of Arkansas, said in a statement. “Apple has a notorious history of siding with terrorists over law enforcement. I hope in this case they’ll change course and actually work with the F.B.I.”

Setting aside how stupid and disliked Cotton has proved himself to be, it’s revealing that his best argument is to claim that Apple sides with terrorists. He really hasn’t got a clue.

Back to the Times report:

The San Bernardino dispute was resolved when the F.B.I. found a private company to bypass the iPhone’s encryption. Tensions between the two sides, however, remained; and Apple worked to ensure that neither the government nor private contractors could open its phones.

This is one of those instances where a reporter is so close to getting it, but ends up missing the mark and landing in dangerous territory. Apple fixed a bunch of iOS security problems; these changes simultaneously prevent investigators and criminals from gaining access to devices because both are unauthorized, as far as the security infrastructure is concerned. Any breach of that may help law enforcement, but it will also help people trying to break into, for example, the President’s iPhone. Weakening security for one weakens it for everyone.

Apple did not “ensure” that it locked law enforcement out of its products. It fixed bugs.

Apple did not respond to a request for comment. But it will not back down from its unequivocal support of encryption that is impossible to crack, people close to the company said.

This Times piece was published before Apple responded at length to reporters — as linked above — but its position has been admirably consistent. Much in the same way that it’s impossible to draw a line between security holes for good people and security holes for bad people, it’s also hard not to see this ending with encryption compromised everywhere. If the Department of Justice thinks it should be breached for this device, why not the apparently thousands of devices in storage lockers? If encryption should not apply to devices belonging to the dead, why not devices belonging to the living? If they can get into encrypted devices at rest, why wouldn’t they feel the right to decrypt communications in transit? If the relatively stable and reputable law enforcement of the United States can gain access, what about officers in other countries? Why not other branches of the justice system, or even officials more broadly? Are there any countries that you would exclude from access?

There is a point at which I expect many people will start to push back against this ever-expanding list of those allowed access to encrypted communications. From a purely technical perspective, it doesn’t matter where you stop: whether you don’t think a particular corrupt regime should be allowed to decrypt communications and devices on demand, or you object to other branches of a government having access, or you think that this policy should only apply to devices with dead owners. It simply doesn’t matter. Because, from a technical perspective, once you allow one point of access, you allow them all. Code doesn’t care whether a back door was accessed by an investigator with a warrant, a stalker, a corrupt official, or a thief.

This story is not a case of a stubborn tech company feeling like they are above the law. It is about an opportunistic Department of Justice that is making an impossible demand that devices should allow access to the authorized user, law enforcement agencies, and nobody else. They haven’t argued for that since a previous high-profile terrorist attack, so it isn’t about principle. It’s about taking advantage of a situation they know will be a hard public relations battle for Apple — in large part because the public at large doesn’t understand the unfeasibility of what is being asked. Articles like this one do nothing to explain that, and only help to push the government’s dangerous position.

After the past few years of all “big tech companies” being lumped into the same pile of public distrust, I fear they might win this time. As a result, we, the public, will lose our electronic privacy, security, and safety.

Adam Engst, TidBITS:

My search confirmed my initial hunch that there is only one official remaining use of the word “Macintosh” by today’s Apple: the default “Macintosh HD” name of the internal drive on a new Mac. Many Mac users personalize that name immediately, although less experienced Mac users often don’t realize they’re allowed to change it. (If you’ve never done it, just click the name once to select it and a second time to start editing it, just like a file or folder.)

[…]

What’s most curious about this vestigial naming is that everything about it is wrong. Besides the anachronistic use of “Macintosh,” the “HD” abbreviation for “hard disk” or “hard drive” refers to a spinning disk drive, whereas most Macs rely on SSDs (solid-state drives). Even the case-less hard drive icon in the Quick Look preview window incorrectly uses an image of a spinning disk to represent an SSD.

As Engst illustrates, the CoreTypes bundle in MacOS contains icons for all kinds of drives: different external drive types, shared drives, and even fossils like ZIP drives. But MacOS stubbornly names the built-in drive “Macintosh HD” by default and still assigns it a spinning disk icon — which, by the way, has been redrawn twice since Apple launched Macs with default SSD options.

I kind of like it. Also, I still keep hard drive icons on my desktop, so maybe this is more indicative of the kind of person I am.

Anne Steele, Wall Street Journal:

U.S. music streams on services like Spotify Technology AB, Apple Music and YouTube rose 30% last year to top one trillion for the first time, according to Nielsen Music’s annual report, fueled by big releases from artists like Taylor Swift, Billie Eilish and Post Malone.

Streaming services have upended how people listen to and pay for music, and now account for 82% of music consumption in the U.S., according to Nielsen. Sales of physical albums, meanwhile, dropped off 19% in 2019 and now make up just 9% of overall music consumption.

Since 2016, streaming has been a far bigger business than digital sales ever were. Meanwhile, vinyl records are projected to surpass compact discs in 2019 sales. This makes complete sense to me: if you’re passively listening to music, you’ll just stream it because you don’t have to pay more; but if you want to make your music listening an — and I am already regretting this word choice — experience, you’ll pick up a physical item with presence.

Amy X. Wang, Rolling Stone:

Of the litany of things you can buy for $10 — a sandwich, a box of pens, and a print magazine among them — unlimited access to a catalog of 50 million songs is one of the most bang-for-your-buck options out there. But that’s how much music-streaming subscriptions have cost for their entire existence.

The coming year may be when that finally changes, for a number of reasons. First, Spotify, the leader of the music-streaming market, recently entered its second decade of existence toting 250 million users, 110 million of whom are paying subscribers; when tech companies hit such major growth milestones, they tend to hike up prices to begin recouping previous years’ lost revenue, which is why Uber rides, Seamless deliveries, ClassPass sessions, and the products of other startups-turned-behemoths are more expensive now than they were at the start.

I can’t imagine most people a decade ago were spending $120 per year on music; given the RIAA’s sales data, it appears that streaming alone now generates more annual revenue than the entire recorded music business did in any year from 2010 through 2017. But now we are, and we can probably expect to spend much more than that pretty soon.

Chris Baraniuk, BBC:

Ophir Harpaz just wanted to get a good deal on a flight to London. She was on travel website OneTravel, scouring various options for her trip. As she browsed, she noticed a seemingly helpful prompt: “38 people are looking at this flight”. A nudge that implied the flight might soon get booked up, or perhaps that the price of a seat would rise as they became scarcer.

Except it wasn’t a true statement. As Harpaz looked at that number, “38 people”, she began to feel sceptical. Were 38 people really looking at that budget flight to London at the same exact moment?

Being a cyber-security researcher, she was familiar with web code so she decided to examine how OneTravel displayed its web pages. (Anyone can do this by using the “inspect” function on web browsers like Firefox and Chrome.) After a little bit of digging she made a startling discovery – the number wasn’t genuine. The OneTravel web page she was browsing was simply designed to claim that between 28 and 45 people were viewing a flight at any given moment. The exact figure was chosen at random.
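Paraphrasing the article’s finding in code (this is not OneTravel’s actual source), the entire “social proof” reduces to:

    // The figure shown to every visitor was simply random.
    let viewers = Int.random(in: 28...45)
    print("\(viewers) people are looking at this flight")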

I have some travel coming up, so I’ve spent a few weeks trying to get a good deal on a flight and a hotel room. I cannot imagine that any website is thirstier for you to act immediately than a travel booking website. I’d do everything I could to limit my accommodation choices to just those within my budget and in a specific location, but I’d still be offered sold-out five-star hotels nowhere near where I wanted to be — I suppose this was to encourage me to book something, anything, quickly.

Also, many of the biggest travel booking websites are owned by just a couple of companies: Booking Holdings runs Booking.com, Priceline, Kayak, and Cheapflights; the Expedia Group owns Expedia, Hotels.com, Hotwire, Orbitz, Travelocity, and Trivago. Each group shares the same inventory, and they all use the same tactics. Users simultaneously get the impression that they’re shopping around and competing with other users, when neither is true.

Apple:

The App Store is the world’s safest and most vibrant app marketplace, with over half a billion people visiting each week. It remains the safest place for users to find software and provides developers of all sizes access to customers in 155 countries. Since the App Store launched in 2008, developers have earned over $155 billion, with a quarter of those earnings coming from the past year alone. As a measure of the excitement going into 2020, App Store customers spent a record $1.42 billion between Christmas Eve and New Year’s Eve, a 16 percent increase over last year, and $386 million on New Year’s Day 2020 alone, a 20 percent increase over last year and a new single-day record.

Big numbers. Investors sure seem pleased — the stock hit a new high.

Apple News draws over 100 million monthly active users in the US, UK, Australia and Canada and has revolutionized how users access news from all their favourite sources. Apple News+ offers an all-in-one subscription to hundreds of the world’s top magazines and major newspapers.

Apple does not disclose how many paying subscribers they have for Apple News Plus, nor for Arcade or Music. Perhaps it’s simply a matter of disclosure rules regarding the company’s upcoming quarterly earnings report. Of course, there’s another possibility.