XR Talks is a series that features the best presentations and educational videos from the XR universe. It includes embedded video, narrative analysis and top takeaways. Speakers’ opinions are their own. For a full library of indexed and educational media, subscribe to ARtillery PRO.


The past month saw a cluster of tech-giant events with keynote unveilings for the latest batch of products. And given tech giants’ ongoing interest and investment in spatial computing, the field was well represented. So we’ve extracted the spatial highlights for this week’s XR Talks.

The most notable updates came from Google, Apple and Facebook, in order of recency. These include Google’s updates to its visual search efforts; Apple’s lidar-powered iPhone 12 Pro; and Facebook’s developments in Live Maps, experimental AR glasses … and of course Quest 2.

We’ll tackle these in the same order, including narrative takeaways, common themes, and time-stamped video embeds. The idea is to connect the dots and triangulate the future of spatial computing through the moves of tech giants that are paving the roads. Let’s dive in…

Google 

Starting with the most recent, Google’s Search On 2020 event last week (video below) featured rapid-fire announcements around the ongoing evolution of search. This included continued machine-learning (ML) integration into natural-language search and better inference of user intent.

But more to the theme of AR, the company likewise wants to make its visual search efforts more intelligent and ML-powered. This includes Google Lens’ expanded capability to identify real-world objects and symbols. For example, like Snapchat, Google Lens can now solve math problems.

For more commercial use cases, Lens can now identify a wider range of products as well as the relationships between different products. For example, it has better semantic understanding of style items, can recommend matching items, and can send you to the right place to buy them.

Lastly, Live View AR navigation can now identify storefronts along a navigated route. Google has teased this feature for years in concept videos, and it’s now available. It pulls listings data from Google My Business (GMB) to show store attributes like hours of operation, a feature set that will likely expand.

See the keynote below, or start right at the AR and visual search segment here.

Apple 

Apple’s recent iPhone-centric hardware event likewise had rapid-fire announcements from Cupertino. As you’ve probably seen, this includes the iPhone 12 and all its variations. But the headline that most relates to AR is lidar, which is integrated into the iPhone 12 Pro and Pro Max.

Lidar will unlock more robust AR through faster and more comprehensive spatial mapping. Used by autonomous vehicles to “see” the road, lidar is the state of the art in spatial mapping, replacing the RGB-based depth sensing of previous (and lower-end) iPhones.

This will manifest in the mostly unseen computational work that happens before AR graphics are shown. For example, lidar is better equipped to quickly scan room contours, which is the first step toward “believable” and dimensionally accurate AR that occludes physical objects.

Many smartphones can do this with the RGB camera, as noted. But experiences are uneven, dependent on lighting conditions, and require waving the phone around to capture enough scan points. Lidar is a more capable and purpose-built technology that could catapult user-friendly AR.
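For developers, this capability surfaces through ARKit’s scene-reconstruction and scene-depth APIs on lidar-equipped devices. Below is a minimal sketch, using Apple’s publicly documented ARKit and RealityKit calls (our illustration, not code shown at the event), of how an app might opt into lidar meshing and real-world occlusion; the function name is ours.

```swift
import ARKit
import RealityKit

/// Illustrative sketch: opt into lidar-based scene meshing and occlusion
/// on supported devices (e.g. iPhone 12 Pro). Function and naming are ours.
func enableLidarSceneUnderstanding(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a live triangle mesh of room contours;
    // it's only supported on devices with a lidar scanner.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Per-pixel depth from the lidar sensor, which powers occlusion.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    arView.session.run(config)

    // Hide virtual content behind real-world geometry so AR objects
    // appear dimensionally accurate and "believably" placed.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```

On phones without lidar, the same session falls back to camera-based tracking without the mesh, which is part of why those experiences remain more lighting-dependent.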

See the keynote below, or start right at the lidar segment here.

Facebook

Last in order of recency — but not in spatial comprehensiveness — Facebook Connect 7 late last month had the most AR and VR focus of all the above events. This stands to reason as the event is expressly devoted to Facebook’s spatial initiatives, rather than just including them.

So what stood out? Getting the obvious out of the way, Quest 2 was launched with much fanfare, and subsequent praise from reviewers. After rumors about a potential Quest “lite” or Quest “pro”, we ended up getting the best of both: a pro-level device with a lite price tag ($100 less).

In terms of AR, Mark Zuckerberg announced that Facebook’s V1 smart glasses will arrive in 2021. Facebook is working with Ray-Ban maker EssilorLuxottica, signaling a design path that, as we project for Apple, will prioritize wearability and social sensibilities over advanced optics.

Also under the AR umbrella is Project Aria, Facebook’s field research program for AR glasses. Devoid of optical displays, these research frames carry cameras and sensors so Facebook can gain insight into the social dynamics and ethics of AR glasses as they’re worn in public.

See the keynote below. Also see our separate writeup on Facebook Reality Labs Chief Scientist Michael Abrash and his technically in-depth keynote segment.

What Does it All Mean?

Connecting the dots on these keynotes, several themes emerge. For one, they validate ongoing commitment to and investment in spatial computing. Each of these tech giants, and others in the “big five,” continues to see spatial computing as a way to future-proof itself.

But the “how” is the most important part. They’re each developing different flavors of spatial computing, but the common thread is that those efforts support or pave a future path for their core businesses: search for Google, hardware for Apple, and social connection for Facebook.

This goes back to our “follow the money” construct, which helps us triangulate tech giants’ spatial computing road maps. Knowing what they’re motivated by can help us extrapolate where they’ll end up, with the thesis that protecting tens of billions in revenue is a strong motivator.

We’ll be back next week with more XR talks from around the spatial spectrum.
