An In-Depth Look at Apple’s New Map Data (justinobeirne.com)

Any new post by Justin O’Beirne is an immediate must-read for me, and this latest one is no exception. In fact, it’s maybe the one I’d recommend most, because it’s an analysis of the first leg of a four-year project Apple unveiled earlier this year. Here’s what Matthew Panzarino wrote at the time for TechCrunch:

The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place. This is massively higher-resolution and easier to see, visually. And it’s synchronized with the “panoramic” images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a really holistic view of what’s going on around the car. But the ortho view can do even more for human viewers of the data by allowing them to “see” through brush or tree cover that would normally obscure roads, buildings and addresses.
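To make the “ortho view” idea concrete, here is a minimal sketch of projecting a textured 3D point cloud straight down onto a ground-plane grid, keeping only the topmost sample in each cell. This is purely illustrative: the types, names, and parameters are my own, and Apple’s actual pipeline fuses imagery, meshes surfaces, and classifies points (e.g., filtering vegetation returns so roads under tree cover remain visible) in far more sophisticated ways.

```swift
import Foundation

// A toy "orthographic reconstruction": rasterise a coloured point cloud into
// a top-down image by binning points onto a grid and keeping, per cell, the
// colour of the highest point (the surface seen from above). All names here
// are illustrative, not Apple's.

struct CloudPoint {
    let x: Double      // east-west position, metres
    let y: Double      // north-south position, metres
    let z: Double      // height, metres
    let color: UInt32  // packed RGB sampled from imagery
}

/// Project a point cloud onto a top-down grid of `width` x `height` cells.
/// Returns a grid of optional colours; nil means no point fell in that cell.
func orthoProject(points: [CloudPoint],
                  cellSize: Double,
                  width: Int,
                  height: Int) -> [[UInt32?]] {
    var image = Array(repeating: [UInt32?](repeating: nil, count: width),
                      count: height)
    var topZ = Array(repeating: [Double](repeating: -Double.infinity, count: width),
                     count: height)

    for p in points {
        // Assumes non-negative coordinates; a real system would handle
        // georeferencing and negative offsets properly.
        let col = Int(p.x / cellSize)
        let row = Int(p.y / cellSize)
        guard (0..<width).contains(col), (0..<height).contains(row) else { continue }
        // Keep only the highest sample per cell: the surface visible from above.
        if p.z > topZ[row][col] {
            topZ[row][col] = p.z
            image[row][col] = p.color
        }
    }
    return image
}

// Example: three points; the first two land in the same cell, and the
// higher one (the roof) wins.
let cloud = [
    CloudPoint(x: 0.2, y: 0.3, z: 1.0, color: 0x00FF00),  // grass
    CloudPoint(x: 0.4, y: 0.4, z: 6.0, color: 0x888888),  // roof above it
    CloudPoint(x: 1.5, y: 0.5, z: 0.0, color: 0x333333),  // asphalt
]
let ortho = orthoProject(points: cloud, cellSize: 1.0, width: 2, height: 1)
print(ortho)  // [[Optional(8947848), Optional(3355443)]]
```

The "see through tree cover" benefit Panzarino describes comes from the step this sketch omits: classifying points so vegetation can be filtered out before rasterising, leaving the road or building surface beneath.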

O’Beirne:

Regardless of how Apple is creating all of its buildings and other shapes, Apple is filling its map with so many of them that Google now looks empty in comparison. […]

And all of these details create the impression that Apple hasn’t just closed the gap with Google — but has, in many ways, exceeded it…

[…]

But for all of the detail Apple has added, it still doesn’t have some of the businesses and places that Google has.

[…]

This suggests that Apple isn’t algorithmically extracting businesses and other places out of the imagery its vans are collecting.

Instead, all of the businesses shown on Apple’s Markleeville map seem to be coming from Yelp, Apple’s primary place data provider.

Rebuilding Maps in such a comprehensive way is going to take some time, so I read O’Beirne’s analysis as a progress report. But even keeping that in mind, it’s a little disappointing that what has seemingly been prioritized so far in this Maps update is adding more detailed shapes for terrain and foliage, rather than fixing which places are mapped and where they’re located. It isn’t as though progress isn’t being made, or that it’s entirely misdirected — roads are now far more accurate, buildings are recognizable, and city parks increasingly look like city parks — but the thing that frustrates me most about Apple Maps in my own use is that the places I want to go are incorrectly placed, missing entirely, or have inaccurate information like hours of operation.