“Trendline” is AR Insider’s series that examines trends and events in spatial computing, and their strategic implications. For an indexed library of spatial computing insights, data, reports and multimedia, subscribe to ARtillery PRO.


One area of AR we continue to bet on is 3D navigation. A cousin of the classic AR formats that get more airtime, it involves using your smartphone to scan your surroundings, localize itself, and then overlay relevant graphics such as directions or storefront information.

Several tech giants are working towards different flavors of this vision. Snapchat is rolling out ways to leave location-anchored graphics for friends to discover. As we examined recently, this could be a step towards more practical and commerce-based use cases like storefront reviews.

Going back further, Google has long signaled its intention to use the smartphone as a visual search tool for local discovery. Google Lens lets users point their phones at various objects to contextualize them, while Live View lets them navigate urban areas in the ways described above.

But the latest company to signal moves in this direction is Apple. Separate from its recently announced GeoAnchors, Apple now lets users scan their surroundings to localize their device. For now, this refines one’s position in 2D Apple Maps, but it’s a possible step towards live AR navigation.
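
For developers, the closest public handle on this capability is ARKit 4’s geo anchors. Below is a minimal sketch of that developer-facing API, not whatever Apple Maps uses internally; the storefront coordinate is hypothetical.

```swift
import ARKit
import CoreLocation

final class GeoAnchorSketch {
    let session = ARSession()

    func start() {
        // Geo tracking only works where Apple has street-level imagery,
        // so check availability before running the session.
        ARGeoTrackingConfiguration.checkAvailability { available, error in
            guard available else {
                print("Geo tracking unavailable: \(String(describing: error))")
                return
            }
            // Geo tracking fuses GPS, compass, and camera-based localization.
            self.session.run(ARGeoTrackingConfiguration())

            // Hypothetical coordinate: a storefront to annotate in AR.
            let storefront = CLLocationCoordinate2D(latitude: 37.7956,
                                                    longitude: -122.3937)
            self.session.add(anchor: ARGeoAnchor(coordinate: storefront))
        }
    }
}
```

Rendering a label or direction marker at that anchor is then up to the app’s RealityKit or SceneKit layer.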


Internet of Places

Before diving into Apple’s latest move, a bit of background is in order. Among all the flavors of AR being pursued by tech giants, local search and navigation have a relatively clear business case. For one, they’re high-frequency utilities, an attribute that tends to map to killer apps.

The local commerce components are also attractive, as AR is well-suited to local search and discovery. In fact, one of the rallying cries of the AR cloud and “AR everywhere” is the technology’s ability to anchor and invoke spatially relevant graphics that persist across sessions and users.

This especially applies to Google, whose DNA is indexing things. It wants to index the physical world just as it indexed the web: a sort of “internet of places.” Support for this theory comes from triangulating its recent moves with its interest in future-proofing its core search business.

Payoffs include monetization potential through advertising, affiliate revenue, or other models that facilitate local offline commerce (at least in normal times). This is a logical extension of the path Google has been on for years: facilitating local commerce through search.


Where GPS Fails

Back to Apple: its latest move employs the camera to help users get a more accurate location reading, as noted. Specifically, in iOS 14, Apple Maps lets users raise their phones to scan surrounding buildings, which in turn lets the device recognize where it is and improve navigation.

This will come in handy in urban canyons, where GPS loses precision as signals bounce off buildings. Apple is solving the issue by letting the camera take over where GPS fails. If this sounds familiar, it’s the same rationale Google gives for camera-based localization in its Live View feature.
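
That coverage constraint is visible in ARKit’s own API: apps are expected to check whether visual localization is available at a given spot before relying on it. Here’s a sketch of that gating; the availability check is the real ARKit call, while the routing around it is our own illustration.

```swift
import ARKit
import CoreLocation

/// Pick a localization strategy for a given location. Only the
/// availability check is ARKit's actual API; the fallback routing
/// is illustrative.
func chooseLocalization(at coordinate: CLLocationCoordinate2D,
                        completion: @escaping (String) -> Void) {
    ARGeoTrackingConfiguration.checkAvailability(at: coordinate) { available, _ in
        if available {
            // Apple has imagery coverage here: the camera can refine
            // the fix that raw GPS gets wrong in an urban canyon.
            completion("camera-refined localization")
        } else {
            // No coverage: fall back to standard GPS/CoreLocation.
            completion("gps only")
        }
    }
}
```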

Also like Live View, Apple uses street-level imagery as a visual database against which live scans are matched. This imagery comes from the Look Around feature in Apple Maps. Look Around is Apple’s version of Street View, and a data source with several potential AR outcomes (more on that in a bit).
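
Conceptually, that matching step is a nearest-neighbor lookup: features extracted from the live camera frame are compared against features precomputed from the street-level imagery, and the best match implies where you’re standing. The toy sketch below shows only the shape of that lookup; the types, cosine scoring, and flat array scan are our simplifications, not Apple’s actual pipeline.

```swift
import Foundation

/// One reference view from street-level imagery: where it was captured,
/// plus a precomputed feature vector describing what it looks like.
struct MappedView {
    let latitude: Double
    let longitude: Double
    let descriptor: [Double]
}

/// Cosine similarity between two feature vectors.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map { $0 * $1 }.reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (normA * normB)
}

/// Return the capture point of the reference view that best matches the
/// live scan's descriptor, i.e. our estimated position.
func localize(liveScan: [Double], against database: [MappedView]) -> (lat: Double, lon: Double)? {
    let best = database.max { cosineSimilarity(liveScan, $0.descriptor) <
                              cosineSimilarity(liveScan, $1.descriptor) }
    return best.map { (lat: $0.latitude, lon: $0.longitude) }
}
```

Production systems index the database for fast lookup and verify matches geometrically, but the basic contract, live pixels in and a position out, is the same.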

Though these parallels abound, a key difference between Apple’s new localization feature and Google Live View is that the former is still used for 2D mapping. Though computer vision localizes the device, the result is location accuracy for good ol’ 2D navigation… for now.


Acclimation Play

Apple’s latest mapping feature follows ongoing efforts to reboot its underlying maps data. This is primarily to modernize a core iOS function, and alleviate the black eye still felt from last decade’s Mapgate. But it also could signal bigger moves, as we examined when the effort first launched.

Look Around is one component of that mapping effort. As noted, it’s a Street View-like feature that goes a step further with dimensional attributes (depth data), compared with Street View’s patchwork of 2D images. These 3D maps could play a key role in Apple’s AR master plan.

The immediate benefit of this 3D data is to Look Around itself. But the byproducts could parallel Google’s use of Street View imagery for Live View navigation. In other words, that data can be repurposed as baseline functionality for AR and computer-vision-based navigation.

So Apple’s latest visual-localization feature is likely one of many computer-vision-based moves we’ll see. And they’ll surely tie in with Apple’s other orbiting AR efforts. The goal, at least in part, is to acclimate the world to a more visual UX so that its eventual AR glasses hit the ground running.

