Spatial computing – including AR, VR, and other immersive tech – continues to alter the ways that we work, play, and live. But there have been ups and downs, characteristic of hype cycles. The pendulum has swung towards over-investment, then towards market correction.

That leaves us now in a sort of middle ground of reset expectations and moderate growth. Among XR subsectors, those seeing the most traction include AR brand marketing and consumer VR. Meta continues to advance the latter with massive investments and loss-leader pricing.

Beneath user-facing products lies a spatial tech stack, involving a cast of supporting parts. We’re talking processing muscle (Qualcomm), experience creation (Adobe), and developer platforms (Snap). These picks and shovels are the engines of AR and VR growth.

So how is all of this coming together? Where are we in XR’s lifecycle? And where are there gaps in the value chain that signal opportunities? These questions are the focus of ARtillery Intelligence’s recent report, Reality Check: The State of Spatial Computing, which we’ve excerpted below.

Reality Check: The State of Spatial Computing

Purpose Built

In the last few installments of this series, we profiled AR devices such as Magic Leap 2 and Snap Spectacles. But beyond AR glasses themselves, there’s a tech stack forming. Like other computing categories, AR glasses need hardware, software, silicon, and all points between.

In fact, the need for integrated systems is even greater, given AR’s technical complexity. And one company that’s integral to this stack is Qualcomm. With its Snapdragon Spaces developer platform and purpose-built chips, it’s a key accelerant in AR glasses and VR headsets.

On the AR side, Qualcomm advanced this mission through the AR2 – the first chip built just for AR. It has also released a reference design for smartphone-connected AR glasses. Powered by the Snapdragon XR2 chipset and Wi-Fi 6/6E, it’s a north star for AR glasses manufacturers.

The design offers dual 1920 x 1080 micro-OLED displays that run at 90 frames per second with a 40-degree diagonal field of view. Three cameras enable six degrees of freedom (6DoF) tracking and hand tracking. The idea is to guide OEMs toward hardware that makes the best use of Qualcomm chips.
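For a sense of what those specs mean in practice, here’s a quick back-of-envelope calculation of angular resolution (our own, not from the report), dividing a single panel’s diagonal resolution by the 40-degree diagonal field of view:

```python
import math

# Reference-design display specs cited above: dual 1920 x 1080
# micro-OLED panels with a 40-degree diagonal field of view.
width_px, height_px = 1920, 1080
diagonal_fov_deg = 40

diagonal_px = math.hypot(width_px, height_px)       # ~2,203 pixels
pixels_per_degree = diagonal_px / diagonal_fov_deg  # ~55 ppd

print(f"{pixels_per_degree:.0f} pixels per degree")
```

That works out to roughly 55 pixels per degree, approaching the ~60 pixels per degree often cited as “retinal” resolution – in other words, the narrow field of view buys visual sharpness.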

But the biggest highlight in Qualcomm’s reference design is cutting the cord. It relies on wireless connectivity to the smartphone, representing the next evolution in untethered AR glasses. This is a key step to bring AR glasses to the next level of traction among enterprises and consumers.

Snapdragon AR2 is the First Chip Built Just for AR

AR Continuum

Stepping back, Qualcomm’s approach can be better understood by looking at the AR hardware continuum. There are standalone devices like Microsoft HoloLens 2, as well as tethered “viewers” that offload processing and other key functions to a compute host (e.g., a smartphone).

The latter is further subdivided, as Qualcomm’s XR lead Hugo Swart explained when the reference design was released. The first category is “simple viewers” like Nreal Light, which are basically display peripherals. In this case, most or all computing is handled by the corded host.

The second category is “smart viewers,” which handle some of these functions on the glasses themselves. In this approach, known as split processing, the compute host does the heavy lifting like processing and rendering, while the glasses handle other functions such as positional tracking.

This split processing approach has several advantages, including power efficiency and a richer UX. It’s what defined Qualcomm’s previous XR1 reference design. The XR2 design now offers all the benefits of split rendering but without the cord – a key evolutionary step for AR.
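To make that division of labor concrete, here’s a minimal sketch of a split processing frame loop (our own illustration, not Qualcomm’s Snapdragon Spaces API; all class and method names are hypothetical). The glasses handle positional tracking locally, while the smartphone host does the heavy rendering and streams each frame back:

```python
import time
from dataclasses import dataclass


@dataclass
class Pose:
    """A 6DoF head pose: position (x, y, z) plus orientation (yaw, pitch, roll)."""
    position: tuple
    orientation: tuple


class Glasses:
    """Lightweight on-device work: positional tracking and display."""

    def track_pose(self) -> Pose:
        # On real hardware, fused camera + IMU data (SLAM) would produce this.
        return Pose(position=(0.0, 1.6, 0.0), orientation=(0.0, 0.0, 0.0))

    def display(self, frame: bytes) -> None:
        print(f"Displaying {len(frame)}-byte frame")


class PhoneHost:
    """Heavy lifting on the compute host: scene processing and rendering."""

    def render(self, pose: Pose) -> bytes:
        # Placeholder for GPU rendering of the scene from the given viewpoint.
        return b"\x00" * 1024


def run(glasses: Glasses, host: PhoneHost, fps: int = 90, frames: int = 3) -> None:
    """Per display refresh: a pose goes up to the host, a rendered frame comes back."""
    for _ in range(frames):
        pose = glasses.track_pose()  # stays on-glasses: low latency, low power
        frame = host.render(pose)    # offloaded over the wireless link
        glasses.display(frame)
        time.sleep(1 / fps)


if __name__ == "__main__":
    run(Glasses(), PhoneHost())
```

The point the sketch highlights: only small pose packets travel upstream, while the power-hungry rendering stays on the phone, which is what lets the glasses shrink toward an all-day form factor.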

“While they’re not yet like my prescription glasses or my sunglasses,” said Swart, “they’re getting to a form factor that you can wear for longer. And without the cable, they are more comfortable for various use cases.”

We’ll pause there and circle back in the next report excerpt with more analysis of XR’s evolutionary path…
