For the past several years, Apple’s car team had explored two simultaneous paths: creating a model with limited self-driving capabilities focused on steering and acceleration — similar to many current cars — or a version with full self-driving ability that doesn’t require human intervention.
Under the effort’s new leader — Apple Watch software executive Kevin Lynch — engineers are now concentrating on the second option. Lynch is pushing for a car with a full self-driving system in the first version, said the people, who asked not to be identified because the deliberations are private.
Apple is internally targeting a launch of its self-driving car in four years, faster than the five- to seven-year timeline that some engineers had been planning for earlier this year. But the timing is fluid, and hitting that 2025 target is dependent on the company’s ability to complete the self-driving system — an ambitious task on that schedule. If Apple is unable to reach its goal, it could either delay a release or initially sell a car with lesser technology.
This is the project I am most doubtful of — not just from Apple, but from the entire industry. I will believe in the possibility of a fully autonomous car when I see one driving like a human would in mixed weather conditions, construction zones, gravel roads, and twisty mountain passes — not until then.
Other companies have loudly trumpeted their attempts at autonomous vehicles, with unimpressive results, but Apple has, as you would expect, kept its efforts mostly to itself. I wonder how it is getting along. Gurman reports that a key milestone has been achieved that puts it on the path to launching in the foreseeable future, but I still cannot shake my doubts. It is not because of what we have seen from Tesla or Waymo or others; I think the best way to view Apple is through its own work. And that is a big problem because its history of automation, cartography, and machine learning has not been encouraging. “From the company that brought you Apple Maps and Siri” is not a great tagline for a vehicle weighing many tonnes and travelling at high speeds with only its own programming to guide it.
But if 2025, or even 2030, is seen internally as a reasonable timeframe for public availability of this thing, it can only be seen as a promising project. I refuse to be anywhere near one — inside or out — until it has proved its capabilities, but this is intriguing.
I’d spent my morning so far in the backseat of the Model 3 using “full self-driving,” the system that Tesla says will change the world by enabling safe and reliable autonomous vehicles. I’d watched the software nearly crash into a construction site, try to turn into a stopped truck and attempt to drive down the wrong side of the road. Angry drivers blared their horns as the system hesitated, sometimes right in the middle of an intersection.
The Model 3’s “full self-driving” needed plenty of human interventions to protect us and everyone else on the road. Sometimes that meant tapping the brake to turn off the software, so that it wouldn’t try to drive around a car in front of us. Other times we quickly jerked the wheel to avoid a crash. (Tesla tells drivers to pay constant attention to the road, and be prepared to act immediately.)
Watch the video in which CNN editor Michael Ballaban drives — well, is present in — this thing. It looks terrifying. I am not sure about you, but I would prefer to be in control at all times, rather than relying on partial automation while maintaining a driving level of focus so I may rescue the car when it screws up.
There are caveats, certainly. This is beta software, and it is certainly impressive that it can do some basic driving on its own. But this is not a self-driving car — not even close.
I also get why people are excited about Microsoft in general. This new Microsoft surprises and delights by doing things that old Microsoft would never consider. They have Visual Studio for the Mac. They make PC hardware. They even include Linux support in Windows. This new Microsoft is exciting and different, but they’ve also been around long enough to show us who they are. Nothing exemplifies that more than Windows on ARM. I think it’s great Microsoft has spent five years pushing Windows on ARM, but no one in their right mind could say they’ve been as successful at it when compared to what Apple has just accomplished. The tech community likes to pretend that Windows on ARM and the Surface Pro X are viable, if not flawed, options when they’re really not.
Wellborn’s selection of quotes from enthusiastic press coverage of Microsoft’s lukewarm ARM efforts reminded me to go look for some reactions to the early rumours and the announcement that Apple would be switching to its own processors. I want to do this not just because these things are funny to read in hindsight, but also because they illustrate why media and analyst coverage often gets this stuff wrong in the first place — especially when it comes to Apple.
Let me take you back to springtime of 2018. Perennial speculation of a shift away from Intel processors in the Mac seemed to be confirmed when Ian King and Mark Gurman of Bloomberg reported on the in-progress transition. A flurry of responses from columnists and reporters followed.
Brian Barrett of Wired seemed to think the architecture shift would necessarily include radical software changes:
Apple could also find users flummoxed at its attempt at the MacOS-iOS mashup that would apparently accompany an ARM transition. It wasn’t so long ago, after all, that Microsoft flamed out spectacularly when it attempted to bring a mobile UI to the desktop in Windows 8, an overhaul that left users feeling mostly confused and annoyed. And while Cupertino has already made some adjustments to give its desktop and mobile operating systems some common ground—its Apple File System, introduced last spring, works across both—it will have to combat years of ingrained expectations about how Apple devices behave.
In fairness, Barrett called his imagined list of problems likely to arise during this transition “surmountable”. But, still, it is a list of doom-and-gloom thoughts about how hard it will be for Apple to move away from Intel, with the assumption that it could only be for the most lightweight and entry-level uses, sort of like ARM laptops that run Windows.
Samuel Axon of Ars Technica speculated that Macs running on Apple’s processors could fit in the lineup like ARM-based Microsoft Surfaces, at least at first (emphasis mine):
While it makes sense for Apple to start sailing on this journey now, it likely won’t arrive at its destination (total independence from Intel) for several years—likely well beyond the 2020 date that Bloomberg names as the earliest launch window for a first Intel-free Mac. If an Apple-chip-powered Mac arrives in 2020, it could be a specialized product in a Mac lineup that still mostly includes Intel-based computers.
Joel Hruska of ExtremeTech was worried about entirely the wrong customer base:
But it’s genuinely surprising that Apple would choose to abandon CPU compatibility given the significant impact x86 had on its Mac product lines. Mac adoption rates shot upwards once people knew their hardware would be seamlessly compatible with Windows. Walking away from that same compatibility now seems foolish, at least as far as good customer support is concerned.
Windows on ARM theoretically presents a solution to this problem, but the WoA OS is limited to 32-bit applications, with no support for x86 drivers, Hyper-V, and limited API compatibility. Supposedly this transition won’t take place before 2020, which gives MS and Apple another 20 months to get their ducks in a row, but 20 months isn’t actually all that much time to perfect cross-OS compatibility, especially not if the goal is to add better and more robust support for 64-bit applications and various types of system drivers.
These analyses have many flaws, but one thing they share is the idea that Microsoft tried similar things and failed, so why should Apple be any different? I buy the argument that Apple’s attempt should not have been deemed a success until the company proved its bona fides, but these predictions are ludicrous: MacOS is nearly the same on Intel and M1 processors, Apple was not timid with its M1 introduction, and I do not imagine the question of Windows’ availability made anyone at Apple blink.
After Apple announced the transition at WWDC 2020 — but, critically, before it announced or shipped any consumer hardware — Alex Cranz of Gizmodo speculated on the company’s motivations with the help of an analyst:
Profits are the likely motivation behind Apple’s biggest moves — for any publicly-traded company’s biggest moves — even when those moves have altruistic outcomes like improving customer privacy. And Apple’s main profit driver is vertical integration: the practice of keeping as many elements of a supply chain in-house as possible to drive down costs, increase revenue, and maintain a hold on the markets it dominates.
“Apple hasn’t been very successful over the past five years with the Mac and most of the innovation has come from Windows vendors,” analyst Patrick Moorhead told Gizmodo. “I think Apple sees vertical integration as a way to lower costs and differentiate. We’ll see. It’s a risky and expensive move for Apple, and right now I’m scratching my head on why Apple would do this. There’s no clear benefit for developers or for users, and it appears Apple is trying to boost profits.”
Apple undoubtedly likes vertical integration, and not just for cost reasons. (By the way, have you noticed how often columnists and analysts write about Apple’s ostensible desire for control as though that is the goal, but rarely define the possible motivations for choosing to build integrated devices instead of collections of parts?) Moorhead’s inability to see benefits for anyone other than Apple looked silly at the time and has only aged worse.1
Neither Moorhead nor Cranz gives serious thought to the possibility that processors of Apple’s own design could be the foundation of Macs that perform better than their Intel counterparts and get far better battery life. Cranz dances around exploring it for a couple of paragraphs — maybe Apple’s chips will be competitive with those from Intel and AMD — but most of the article is dedicated to the supposedly taller walls of Apple’s garden. There is no clear reason why this is the case: Apple has only ever officially supported the Darwin-based versions of MacOS on its own hardware, no matter what instruction set or vendor its processors use. Moorhead, on the other hand, bet on Apple only transitioning laptops and consumer hardware to its own processors, and retaining Intel for its higher-performance Macs. “Fingers crossed,” Cranz wrote in response.
I assume few Mac users are now crossing their fingers that Apple keeps Intel processors in future products, even at the high end.
Re-reading some of the press from this time in the Mac’s history and comparing it to coverage quoted by Wellborn is a heck of a head-trip. Even without the knowledge that Apple’s own processors would instantly become the benchmark for the personal computer industry, it seems like the flaws in others’ efforts — Microsoft’s in particular — come into focus only once reliable rumours surface about Apple’s entry. Then, these writers oftentimes seem to view Apple’s attempts through exactly the same lens as any other company’s, somehow ignoring the vertical integration that so distinguishes it, or its own history of product development.
That is not to say the press should have assumed that ARM Macs would be brilliant; skepticism is often lacking in the tech press. But it seems especially egregious in the case of this transition because Apple’s previous processor architecture change is within recent memory. Why assume Apple would take a similar route as Microsoft did with Windows on ARM when it always seemed more likely that it would mimic its own past success of moving from PowerPC to Intel?
But, no, the tech press looked to the attempts of other companies as instructive of what Apple would do, despite that being a flawed speculation strategy for decades.
It’s amazing how future Microsoft products beat current Apple products time and time again, isn’t it?
An inversion of that: it is amazing how present Microsoft problems do not match the speculated doom of similar efforts from Apple, time and time again.
In the 2018 Wired article, Moorhead is also quoted as an expert analyst voice:
“Computationally I can see a Core i3 or low-end Core i5,” says Patrick Moorhead, founder of Moor Insights & Strategy, comparing ARM’s abilities to entry-level Intel chips. “I can’t imagine that by 2020 they’d have a processor anywhere near the capabilities of a Xeon or a Core i7.”
Nobody seemed to predict the astonishing power of even a base model M1 MacBook Air. But when read alongside Moorhead’s analysis cited by Cranz in 2020, it looks like he was certain this was purely a play to spend less money with Intel rather than a serious effort to do better. Why would Apple go through the effort of switching for any other reason than because it wanted more than what Intel could offer? ↩︎