As you likely know, one of AR’s foundational principles is to fuse the digital and physical. The real world is a key part of that formula… and real-world relevance is often defined by location. That same relevance and scarcity are what drive real estate value… location, location, location.

Synthesizing these factors, one of AR’s battlegrounds will be augmenting the world in location-relevant ways. That could mean wayfinding with Google Live View, or visual search with Google Lens: point your phone (or future glasses) at places and objects to contextualize them.

As you can tell from the above examples, Google will have a key stake in this “Internet of Places.” But it’s not alone. Apple signals interest in location-relevant AR through its geo-anchors and Project Gobi. Facebook is building “Live Maps,” and Snapchat is pushing Local Lenses.

These are a few of the utilitarian, commerce, and social angles. How else will geospatial AR materialize? And what are its active ingredients, including 5G and the AR cloud? This is the theme of our new series, Space Race, where we break down who’s doing what… continuing here with Amazon.


Point & Learn

Picking up where we left off in the last Space Race installment, how does the mighty Amazon fit into geospatial AR? Admittedly, its moves and motivations aren’t as extensive as those of others in the series, namely Google and Niantic, simply because offline local commerce isn’t its core business.

Amazon has dabbled in AR over the past few years, including a partnership with Snap to power product-based visual search. Using Snap’s Scan feature, users can point their phones at items to identify them. Amazon inserted itself into this flow to drive transactions from visual searches.
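To make that flow concrete, here is a minimal sketch of a visual-search-to-purchase pipeline. Neither Snap nor Amazon has published this integration’s internals, so the function names, the ProductMatch type, and the product identifier below are hypothetical placeholders.

```python
# Hypothetical visual-search-to-purchase flow, loosely modeled on the
# Snap Scan + Amazon integration described above. All names here are
# illustrative stand-ins, not real Snap or Amazon APIs.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ProductMatch:
    asin: str           # Amazon-style product identifier (placeholder)
    title: str
    confidence: float   # how sure the recognizer is about the match


def identify_product(image_bytes: bytes) -> Optional[ProductMatch]:
    """Stand-in for the visual search step: map a camera frame to a
    catalog product. A real system would run detection plus a
    product-embedding lookup; we stub the result for illustration."""
    return ProductMatch(asin="B000EXAMPLE", title="Example Item", confidence=0.92)


def purchase_link(match: ProductMatch) -> str:
    """Hand the shopper off to the product page, which is where the
    transaction (and Amazon's role in the flow) happens."""
    return f"https://www.amazon.com/dp/{match.asin}"


frame = b"<jpeg bytes from the camera>"  # placeholder input
match = identify_product(frame)
if match and match.confidence > 0.8:
    print(purchase_link(match))
```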

Amazon is also increasingly chasing offline commerce in other ways to diversify revenue amidst a maturing core e-commerce business. Part of that involves physical-world commerce such as its Amazon Go stores and its landmark acquisition of Whole Foods, among other moves.

The latest is its experimental tech-fueled salon. Occupying 1,500+ square feet on Brushfield Street in London’s Spitalfields, it will serve as a test bed for Amazon’s “Point & Learn” technology. As it sounds, this technology reveals information when shoppers point at a given product.


Retail as a Service

Going deeper on Point & Learn: it employs optical and motion sensors to detect when a shopper points at a product. Information is then surfaced on flat-panel displays that flank the product, or through audio messaging. Shoppers can then scan QR codes to order items on Amazon.
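As a rough illustration, here’s what that loop could look like in code. Amazon hasn’t published Point & Learn’s design, so the event model, catalog structure, and output functions below are assumptions for the sake of the sketch.

```python
# Hypothetical sketch of the Point & Learn loop: sensors detect a
# pointing gesture at a shelf slot, then product info is surfaced on a
# flanking display or via audio, with a QR code for ordering on Amazon.
# Names and logic are illustrative; this is not Amazon's actual system.

from dataclasses import dataclass


@dataclass
class PointEvent:
    shelf_slot: str  # which product position the sensors say was pointed at


# Placeholder catalog mapping shelf slots to product info.
CATALOG = {
    "slot-a1": {
        "name": "Example Shampoo",
        "blurb": "Sulfate-free formula for color-treated hair.",
        "qr_url": "https://www.amazon.com/dp/B000EXAMPLE",
    },
}


def show_on_panel(name: str, blurb: str, qr_url: str) -> None:
    # A real deployment would drive the flat-panel display flanking the
    # product; printing stands in for that here.
    print(f"[PANEL] {name}\n{blurb}\nScan to order: {qr_url}")


def play_audio(message: str) -> None:
    print(f"[AUDIO] {message}")  # stand-in for text-to-speech output


def on_point(event: PointEvent, prefer_audio: bool = False) -> None:
    product = CATALOG.get(event.shelf_slot)
    if product is None:
        return  # pointed at an unstocked slot; do nothing
    if prefer_audio:
        play_audio(f"{product['name']}: {product['blurb']}")
    else:
        show_on_panel(product["name"], product["blurb"], product["qr_url"])


on_point(PointEvent(shelf_slot="slot-a1"))
```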

Doubling down on AR, the salon will feature Echo Look-esque smart mirrors that let shoppers virtually try on various cosmetics for a more informed purchase. Virtual try-on has been a leading AR use case, especially during retail lockdowns, as a way to guide online cosmetics purchases.

But anyone following Amazon’s moves over the past few years can tell that it isn’t interested in getting into the salon business. This is purely an experimental play for Point & Learn, just as Amazon Go stores exist primarily to incubate Amazon’s “Just Walk Out” technology.

This is all part of Amazon’s broader “retail-as-a-service” (RaaS) play. Following the AWS playbook, it builds technology to meet an internal need, then spins it out as a service for third parties. It’s a major expansion move for Amazon, and several parts of it will involve AR and computer vision.


Palm Payment

More evidence of the RaaS approach comes from Amazon One, the palm-reading point-of-sale (POS) payment system meant to speed up transactions at grocery stores. Amazon recently announced that it’s rolling out the technology at a handful of Whole Foods locations in the Seattle area (another incubation play).

This biometric system authenticates users with a palm scan. In some instances, users scan their palm when entering a store, then “just walk out” when done. Once scanned, the shopper is associated with their payment method and Prime status, both linked during first-time setup.

The other way the technology works is directly at the point of sale. Eschewing the palm-scan-upon-entrance above, shoppers simply scan their palm at the POS, just as they would insert a credit or debit card in a checkout aisle.

The second method has less friction and is a first step toward assimilating the technology. That goes for user habits and comfort levels, as well as retailer logistics. Though Amazon is aggressive with tech implementation, it knows the world isn’t ready to forgo checkout aisles altogether.
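For illustration, here is a minimal sketch of both flows. Amazon hasn’t disclosed how Amazon One derives or stores palm signatures, so the data model, signature strings, and matching step below are assumptions.

```python
# Hedged sketch of the two Amazon One flows described above: entry scan
# plus "just walk out," and a card-like scan at the point of sale.
# The palm-signature format and account model are placeholders.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Account:
    payment_method: str
    prime_member: bool


# First-use setup: a palm signature (however it's actually derived)
# gets linked to a payment method and Prime status.
ENROLLED = {
    "palm-sig-123": Account(payment_method="visa-ending-4242", prime_member=True),
}


def scan_palm() -> str:
    """Stand-in for the palm reader; returns a signature string."""
    return "palm-sig-123"


def entry_flow() -> Optional[Account]:
    """Mode 1: scan on entry, then 'just walk out.' The shopping session
    is tied to the account, and items are charged on exit."""
    return ENROLLED.get(scan_palm())


def pos_flow(total_cents: int) -> bool:
    """Mode 2: scan at the point of sale, like inserting a card."""
    account = ENROLLED.get(scan_palm())
    if account is None:
        return False  # unenrolled shopper falls back to regular payment
    print(f"Charging ${total_cents / 100:.2f} to {account.payment_method}")
    return True


pos_flow(total_cents=1299)
```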

Phygital Fusion

The timing of all the above retail innovations is notable. Both Point & Learn and Amazon One arrive as the retail world opens back up to potentially apprehensive shoppers. Much will depend on post-Covid demand for physical shopping and new in-store protocols.

As we’ve examined, retailers in the post-Covid world may encounter demand signals for touchless in-aisle interaction technologies. That will depend on several unknowns, such as consumers’ desire to get back to physical shopping, and guidance on behavior in public spaces.

If there is indeed demand for touchless technologies in retail environments, AR is a natural fit. It can overlay everything from product descriptions to brand spokespeople that come to life through your smartphone viewfinder. We’ll also see AR-adjacent RaaS tech like Point & Learn.

Speaking of adjacency, AR continues to broaden. Its definition is expanding beyond graphics overlaid on the physical world to include any physical-digital fusion, everything from Zoom backgrounds to situationally aware audio. Amazon will have a big part in pushing these boundaries.
