Apple and Google continue to wage an arms race in mapping. Google is the incumbent with a dominant market share, while Apple is motivated to catch up through rapid-fire feature rollouts. Apple also has an advantaged position, at least on iPhones, and is investing considerably to gain ground.

This arms race includes emerging tech that’s increasingly infused in mapping apps. Among other things, we’re talking immersive 3D navigation and wayfinding, as well as the related ability to identify places encountered along those routes (think: storefronts and business details).

For example, Google is able to offer features like Live View AR navigation by tapping into its vast Street View database. It can “localize” a given device by matching what the device sees with Street View imagery, similar to the way autonomous vehicles use LiDAR to “see” the road.
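
To make that localization step concrete, here’s a minimal sketch in Swift of the general matching idea, using Apple’s Vision framework to compare a camera frame against a pre-indexed set of street-level reference images. It’s an illustration of the concept rather than Google’s actual pipeline; the ReferenceView type and the brute-force nearest-match search are assumptions for the example.

```swift
import Vision
import CoreGraphics

// Hypothetical reference entry: a street-level image with a known pose.
struct ReferenceView {
    let featurePrint: VNFeaturePrintObservation
    let latitude: Double
    let longitude: Double
    let heading: Double
}

// Compute a Vision feature print for a single image.
func featurePrint(for image: CGImage) throws -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    try VNImageRequestHandler(cgImage: image).perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

// "Localize" by finding the stored view that best matches the camera frame.
func localize(cameraFrame: CGImage,
              against references: [ReferenceView]) throws -> ReferenceView? {
    guard let query = try featurePrint(for: cameraFrame) else { return nil }
    var best: (view: ReferenceView, distance: Float)?
    for reference in references {
        var distance: Float = 0
        try query.computeDistance(&distance, to: reference.featurePrint)
        if best == nil || distance < best!.distance {
            best = (reference, distance)
        }
    }
    return best?.view
}
```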

Once localized, the routing software does its thing and overlays wayfinding arrows to get you to your destination. This has proven useful for urban walking directions, orienting yourself when emerging from an underground transit station, or finding your airport gate.
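
The overlay step can be sketched just as simply. Below is a simplified ARKit/SceneKit example that places arrow stand-ins along a route once the device is localized; the idea that a routing engine hands back ready-made waypoints in world coordinates is a simplifying assumption for the example.

```swift
import ARKit
import SceneKit
import UIKit

// Drops simple arrow stand-ins along a route once the device is localized.
func overlayWayfindingArrows(on sceneView: ARSCNView, waypoints: [simd_float3]) {
    for (index, position) in waypoints.enumerated() {
        // A cone standing in for a direction arrow.
        let arrow = SCNNode(geometry: SCNCone(topRadius: 0,
                                              bottomRadius: 0.1,
                                              height: 0.3))
        arrow.geometry?.firstMaterial?.diffuse.contents = UIColor.systemBlue
        arrow.simdPosition = position

        // Orient each arrow toward the next waypoint on the route.
        if index + 1 < waypoints.count {
            arrow.simdLook(at: waypoints[index + 1])
        }
        sceneView.scene.rootNode.addChildNode(arrow)
    }
}
```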

The AR Space Race, Part I: Google

Planet-Scale AR

As for Apple, it doesn’t have Street View but is starting to catch up with a competitive data asset: Look Around. This is Apple’s answer to Street View and stems from its efforts to rebuild its mapping capabilities and data, including sending Street View-style cars and cameras out on the roads.

That last part goes back to the “investing considerably” point above. But even an Apple-sized bank account is challenged by Google’s head start and longer tenure in Street View. So how will Apple catch up? One way recently came to light: collecting data while people use Apple Maps itself.

In other words, as users hold up their phones for 3D wayfinding, the camera ingests data to improve Apple’s 3D maps and point clouds. Similar, again, to LiDAR (which some iPhones have), the phone scans a given urban landscape to develop machine-readable 3D maps of building contours.
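
For a rough idea of what that opportunistic collection could look like, ARKit already exposes the sparse 3D feature points it computes for each frame. The PointCloudCollector class below is a hypothetical sketch, not Apple’s implementation; note that only geometry is retained, never the camera imagery.

```swift
import ARKit

// Accumulates sparse 3D feature points from frames the app is already
// processing for wayfinding. Only geometry is kept; the camera imagery
// is discarded with each frame.
final class PointCloudCollector: NSObject, ARSessionDelegate {
    private(set) var points: [simd_float3] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // rawFeaturePoints is ARKit's sparse point cloud for this frame.
        guard let cloud = frame.rawFeaturePoints else { return }
        points.append(contentsOf: cloud.points)
    }
}

// Usage: attach to the session already driving the AR wayfinding view.
// let collector = PointCloudCollector()
// sceneView.session.delegate = collector
```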

If this sounds familiar, Niantic employs a similar strategy to crowdsource 3D map construction by arming legions of Pokémon Go players with the ability to scan popular locales while they play the game. This is a key component of Niantic’s “planet-scale AR” ambitions.

Back to Apple: it could pull this off, given that scanning every street corner in America is aided by owning the software in everyone’s pocket. Apple will look to accomplish that through fleets of cars and vans, as well as armies of iPhone users simply trying to get where they’re going.

The AR Space Race, Part V: Niantic

Privacy-First

This new data-ingestion strategy is rolling out with iOS 17.2. And because it’s the privacy-first Apple, the company is quick to point out that it’s only collecting machine-readable data such as 3D point clouds… versus human-readable fare such as faces and license plates.
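
To illustrate the distinction, a contribution built this way could contain nothing but geometry. The payload structure below is purely hypothetical, but it shows how 3D points can leave a device without any pixels that might reveal faces or license plates.

```swift
import Foundation
import simd

// Hypothetical contribution payload: only machine-readable geometry
// leaves the device. No pixel buffers means no faces or license plates.
struct MapContribution: Codable {
    struct Point: Codable {
        let x: Float
        let y: Float
        let z: Float
    }
    let points: [Point]   // sparse 3D feature points in world space
    let capturedAt: Date  // coarse timestamp for map freshness
}

// Convert accumulated ARKit feature points into an upload-ready payload.
func makeContribution(from points: [simd_float3]) -> MapContribution {
    MapContribution(
        points: points.map { MapContribution.Point(x: $0.x, y: $0.y, z: $0.z) },
        capturedAt: Date()
    )
}
```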

Apple knows this is important to specify, as moves like this quickly raise privacy red flags. The company continues to double down on positioning itself as the privacy-safe player in big tech, a believable stance because it doesn’t need personal data the way Meta and Google do.

Meanwhile, we’ll see if Apple can pull this off and catch up to Google. It may have an opportunity to future-proof itself with such emerging technologies. If it can leapfrog Google in some of these areas, it might convert some users, especially among the camera-native Gen Z.

Lastly, anything that Apple does in the immersive realm triggers questions about how it plays into Vision Pro. The answer is that there may be direct integrations, such as navigating to destinations. But these won’t be apparent in the near term, given AVP’s stationary use case.

More likely, Apple continues to plant several seeds in its spatial computing master plan, from visionOS to Apple Watch’s gestural tap to AirPods’ low-latency spatial audio to the iPhone’s spatial video capture. These moving parts will converge in various phases over the long term.

More from AR Insider…