Because AR’s inherent function is to enhance the physical world, its relevance is often tied to specific locations. This is what we call geo-local AR: experiences whose value is rooted in place, such as informational overlays on storefronts and navigational waypoints.

If this sounds familiar, it’s the foundational principle behind the AR cloud – a conceptual framework for AR’s future. For those unfamiliar, the AR cloud is a data mesh anchored to the physical world that lets AR devices invoke the right content and experiences in the right places.

This concept may also sound familiar as it aligns with a buzzword that’s run rampant: the metaverse. Though the term is often used in the context of fully digital online experiences, it can also involve content that brings digital depth to real-world places: call it AR’s metaverse.

This is also what we call the metavearth, and it’s the topic of a recent report from our research arm, ARtillery Intelligence. Entitled Geolocal AR: The Metavearth Materializes, the report breaks down the drivers and dynamics of this emerging area and is the subject of our latest excerpt (below).

Geolocal AR: The Metavearth Materializes

Synchronous and Persistent

To pick up where we left off in the last report excerpt, the components of geo-local AR sit under a broader conceptual framework: the metaverse. The term is a bit vague, as its meaning has been diluted through overuse. But it does have some ties to geo-local AR.

Backing up, ‘metaverse’ describes digital worlds that host synchronous interaction. Mark Zuckerberg – intent on making Facebook a metaverse company – calls it an “embodied internet,” offering the connectivity, utility, and entertainment of the web, but fleshed out in 3D.

This is sometimes discussed in VR terms, where synchronous interaction takes place between place-shifted participants in shared digital domains. These include AltspaceVR, Rec Room, VRChat, Horizon Workrooms, and non-VR environments like Second Life.

But the metaverse concept also applies to AR. For example, companies like Niantic are building platforms for digital enhancements to the physical world that are synchronous (experienced together at the same time) and persistent (anchored to locations).
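To make those two properties concrete, here’s a minimal sketch of what a geo-anchored content record might look like. The schema is entirely hypothetical (no field names here come from Niantic or any other vendor): a fixed geographic pose supplies persistence, while a shared session channel supplies synchrony.

```typescript
// Hypothetical data model for a persistent, shared AR anchor.
// All names and fields are illustrative, not any vendor's actual schema.

interface GeoPose {
  latitude: number;   // WGS84 degrees
  longitude: number;  // WGS84 degrees
  altitude: number;   // meters above the ellipsoid
  heading: number;    // degrees clockwise from true north
}

interface ARAnchor {
  id: string;              // stable ID so the anchor persists across sessions
  pose: GeoPose;           // where the content is pinned in the real world
  contentUri: string;      // 3D asset or overlay to render at the anchor
  sessionChannel?: string; // pub/sub topic for synchronous, shared viewing
}

// Persistence: the same anchor resolves to the same spot tomorrow.
// Synchrony: everyone subscribed to sessionChannel sees updates live.
const storefrontCaption: ARAnchor = {
  id: "anchor-0042",
  pose: { latitude: 37.7749, longitude: -122.4194, altitude: 12, heading: 90 },
  contentUri: "https://example.com/assets/storefront-caption.glb",
  sessionChannel: "block-union-square",
};
```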

This is AR’s metaverse, though the definition and manifestation of that vision will continue to evolve over the next several years. Speaking of which, we must be realistic when future-gazing about the metaverse: some of these principles are years from a fully mature state.


Real-World Metaverse Stack

But though we’re years away from that vision, the components of the “real-world metaverse” are developing today. They include a sort of metaverse stack: devices, sensors, 5G connectivity, LiDAR spatial mapping, AR cloud rendering, and an app (or web) layer. All of these are at very early stages.
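As a rough illustration of how those layers might hand off to one another, consider the sketch below. Every function, type, and endpoint in it is hypothetical, standing in for the sensor, spatial-mapping, AR-cloud, and app layers respectively.

```typescript
// Illustrative flow through a "real-world metaverse" stack.
// All functions and the endpoint URL are invented for this sketch.

interface DevicePose {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
}
interface AnchoredContent { anchorId: string; assetUri: string; }

// Sensor layer: camera + LiDAR frames captured on-device (stubbed here).
function captureFrame(): Uint8Array {
  return new Uint8Array(0);
}

// Spatial-mapping layer: localize the device against a prebuilt spatial map.
function localize(frame: Uint8Array): DevicePose {
  return { position: [0, 0, 0], rotation: [0, 0, 0, 1] };
}

// AR-cloud layer: fetch content anchored near the resolved pose,
// typically over a 5G or Wi-Fi connection.
async function fetchNearbyContent(pose: DevicePose): Promise<AnchoredContent[]> {
  const [x, y, z] = pose.position;
  const res = await fetch(`https://example.com/ar-cloud/query?x=${x}&y=${y}&z=${z}`);
  return res.json();
}

// App layer: render whatever the AR cloud returned for this location.
async function renderScene(): Promise<void> {
  const pose = localize(captureFrame());
  const content = await fetchNearbyContent(pose);
  content.forEach((c) => console.log(`render ${c.assetUri} at ${c.anchorId}`));
}
```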

Even use cases are unknown, though everyone likes to speculate. Novel use cases often aren’t devised until new platforms seep into the developer mindset. Apps like Uber, which utilized the mobile form factor and 4G, weren’t imagined when the iPhone first launched.

Rather, it took time and acclimation before developers could start thinking natively. Only then could they build experiences that tap into the unique capabilities of a new platform. The same process will unfold for the metaverse, meaning we have ample innovation to look forward to.

Meanwhile, platforms like Niantic Lightship and Snap’s Custom Landmarkers are leading the way. Use cases could involve everything from artistic expression to utilities. The latter could hold potential killer apps such as “captions for the physical world” that help us live our daily lives.
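To illustrate the “captions for the physical world” idea, here’s a toy proximity lookup with invented caption data and a standard haversine distance helper. A real system would localize far more precisely than raw GPS coordinates allow, but the shape of the query is the same.

```typescript
// Toy "captions for the physical world": given the user's location,
// return the text captions anchored within a given radius.

interface Caption { text: string; lat: number; lng: number; }

// Haversine great-circle distance in meters between two lat/lng points.
function distanceMeters(aLat: number, aLng: number, bLat: number, bLng: number): number {
  const R = 6371e3; // mean Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLng = toRad(bLng - aLng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Filter the caption set down to whatever is anchored nearby.
function nearbyCaptions(captions: Caption[], lat: number, lng: number, radiusM = 50): Caption[] {
  return captions.filter((c) => distanceMeters(lat, lng, c.lat, c.lng) <= radiusM);
}

// Example: standing outside a cafe, surface its caption.
const captions: Caption[] = [
  { text: "Cafe: opens 7am, oat milk available", lat: 37.775, lng: -122.4195 },
];
console.log(nearbyCaptions(captions, 37.7749, -122.4194));
```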

We’ll pause there and circle back in the next report excerpt with more…
