One striking realization about spatial computing is that we’re almost seven years into the sector’s current stage. This traces back to Facebook’s Oculus acquisition in early 2014, which kicked off the current wave of excitement, including lots of ups and downs in the intervening years.

That excitement culminated in 2016, after the Oculus acquisition had time to set off a chain reaction of startup activity, tech-giant investment, and VC inflows for the “next computing platform.” But when technical and practical realities caught up with spatial computing, the sector began to retract.

Like past tech revolutions – most memorably, the dot-com boom and bust – spatial computing has followed a common pattern. Irrational exuberance is followed by retraction, market correction, and scorched earth. But then a reborn industry sprouts from those ashes and grows at a realistic pace.

That’s where we now sit in spatial computing’s lifecycle. It’s not the revolutionary platform shift touted circa 2016, nor a silver bullet for everything we do in life and work, as once hyped. But it will be transformative in narrower ways, within a targeted set of use cases and verticals.

This is the topic of ARtillery’s recent report, Spatial Computing: 2020 Lessons, 2021 Outlook. Key questions include: What did we learn in the past year? What are the projections for the coming year? And where does spatial computing — and its many subsegments — sit in its lifecycle?


Enablers & Accelerants

Picking up where we left off in the last installment of this series, there are several underlying technologies required for AR to operate the way we all envision. These include computer vision, object recognition, sensor fusion in our smartphones (and someday AR glasses), and the AR cloud.

But one of the biggest enablers and accelerants took a step forward recently: LiDAR. Though it’s currently limited to the iPhone 12 Pro and Pro Max, LiDAR will trickle down to the rest of the iPhone lineup in the coming years. As this happens, it could unlock AR’s next generation.

As background, LiDAR is short for light detection and ranging. It involves sensors that measure how long it takes emitted light to reach an object and bounce back. This is the state of the art in depth sensing, and it’s how autonomous vehicles achieve the computer vision needed to “see” the road.
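To make that principle concrete, here’s a minimal sketch in Swift of the underlying arithmetic: a pulse’s round-trip time converts to distance at the speed of light, halved for the return leg. The function name and example value are illustrative assumptions, not drawn from any real sensor API.

```swift
import Foundation

// Minimal sketch of the time-of-flight arithmetic behind LiDAR (illustrative only;
// real sensors fire many pulses to build a depth map, and this function name is hypothetical).
func depthInMeters(roundTripTimeSeconds: Double) -> Double {
    let speedOfLight = 299_792_458.0  // meters per second
    // The pulse travels out and back, so halve the round-trip distance.
    return speedOfLight * roundTripTimeSeconds / 2.0
}

// Example: a pulse returning after ~13.3 nanoseconds implies an object roughly 2 meters away.
print(depthInMeters(roundTripTimeSeconds: 13.3e-9))  // ≈ 1.99
```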

Apple integrated the technology in the iPad Pro last year, signaling that it would soon arrive at an iPhone near you. This aligns with Apple’s AR master plan but more immediately has photography benefits — a key focal point in the iPhone’s horse race against Samsung and Google flagships.

With smartphones maturing and each innovation cycle getting more rote and “incremental”, the camera has been one sexy component on which Apple has focused innovation and device marketing. That applies to AR and photography, but the latter is a much larger market today.

Meanwhile, smartphone cameras are all about innovating around space constraints and achieving DSLR quality with only millimeters of focal length. LiDAR now joins Apple’s multi-lens and software-fueled systems for better autofocus and “seeing in the dark” in low-light scenes.


Believable AR

Beyond LiDAR’s primary photography goals, what about its longer-term AR angle? As noted, it unlocks sharper and more accurately tracked AR experiences. Exceeding the capabilities of the RGB cameras in the iPhone’s last few generations, LiDAR will enable more functional AR.

This will manifest in the mostly unseen computational work that happens before AR graphics are shown, such as spatial mapping. LiDAR is better equipped to quickly scan room contours, which is the first step toward “believable,” dimensionally accurate AR in which graphics and physical objects convincingly occlude one another.
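For illustration, here’s a minimal Swift sketch of how an iOS app can opt into LiDAR-driven meshing and occlusion with ARKit and RealityKit. The function name and structure are assumptions for demonstration; the configuration options shown (scene reconstruction, scene depth, occlusion) are the relevant ARKit/RealityKit settings on LiDAR-equipped devices.

```swift
import ARKit
import RealityKit

// Illustrative sketch: opting an iOS app into LiDAR-driven meshing and occlusion.
// The function name and structure are assumptions; the configuration options are real ARKit/RealityKit APIs.
func startLiDARSession(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Live mesh of room contours (only available on LiDAR-equipped devices).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Per-pixel depth from the LiDAR scanner, useful for custom depth-aware effects.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    // Let real-world geometry hide virtual content behind it (the "believable" part).
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```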

Besides knowing that Apple is heading in this direction — based on LiDAR’s inherent alignment with its AR trajectory — the company has explicitly mentioned AR and LiDAR in the same breath. Perhaps more notably, today’s consumer AR leader, Snapchat, is already on top of it.

Specifically, Snap has announced plans to integrate LiDAR into Lens Studio, its AR creator platform. As it has stated publicly, LiDAR elevates Lens Studio’s capabilities, unlocking a new creative range for AR developers — a central principle of its broader AR strategy.

LiDAR will also benefit Snap’s AR efforts by upgrading both the underlying capability and user-friendliness. The former involves better object tracking, which translates to graphics that interact with real-world items in more realistic ways. It can especially boost these capabilities in low light.


Elevating AR

The above moves engender new use cases for Snapchat lenses. For one, they mean more indoor activations, such as augmenting your office or bedroom. They also bring lenses more meaningfully to the rear-facing camera to augment the world (versus selfie lenses) – a path Snap was already on.

As for user-friendliness, LiDAR can not only perform more accurate spatial mapping, but it can do so faster than standard RGB cameras. To activate AR experiences, users don’t have to wave their phones around – a more approachable UX that could appeal to a wider base of mainstream users.

And Snap isn’t alone. Niantic and other AR leaders are incorporating LiDAR to elevate their AR platforms, such as making Pokémon Go more world-immersive. Some are even integrating it with software like Unity’s physics engine for realistic AR that follows the laws of physics.
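As a rough analog of those physics-engine integrations (sketched here in Apple’s RealityKit rather than Unity, purely for illustration and not any specific vendor’s implementation), the LiDAR-reconstructed mesh can act as a collider so a virtual object falls and settles on real surfaces. Names and values below are assumptions.

```swift
import RealityKit

// Illustrative RealityKit sketch: treat the LiDAR-scanned mesh as a collider so a virtual
// ball falls, bounces, and settles on real-world surfaces. Assumes the session was configured
// with scene reconstruction, as in the earlier sketch.
func dropBall(in arView: ARView) {
    // Let the reconstructed environment participate in collisions and physics.
    arView.environment.sceneUnderstanding.options.formUnion([.physics, .collision])

    // A small dynamic sphere with a physics body, spawned half a meter in front of the world origin.
    let ball = ModelEntity(mesh: .generateSphere(radius: 0.05),
                           materials: [SimpleMaterial(color: .red, isMetallic: false)])
    ball.collision = CollisionComponent(shapes: [.generateSphere(radius: 0.05)])
    ball.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                            material: .default,
                                            mode: .dynamic)

    let anchor = AnchorEntity(world: [0, 0, -0.5])
    anchor.addChild(ball)
    arView.scene.addAnchor(anchor)
}
```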

These integrations will continue as developers get creative with LiDAR and combine it with machine learning for experiences that haven’t been imagined yet. Put another way, with the advent and assimilation of LiDAR, things could get a lot more interesting in AR.
