News

ARCore Depth API Takes Android AR Experiences To A Whole New Level

Android’s new depth tool allows developers to create a full depth map with a single RGB camera.

Look out ARKit, ARCore is catching up. Earlier today, Google announced the release of the ARCore Depth API, a new tool that allows developers to access ARCore’s depth-from-motion algorithms to create more realistic AR experiences. Simply put, the Depth API lets creators generate a depth map of their environment using a single RGB camera, such as the ones featured on most modern smart devices.

The tool captures multiple images from different angles as you move through your physical environment, then compares those images to estimate the distance between your device and real-world objects.
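For developers wondering what this looks like in code, here is a minimal Kotlin sketch of opting a session into depth and reading one value from the resulting depth map. It assumes the Java/Kotlin surface ARCore exposes for depth (Config.DepthMode, Frame.acquireDepthImage() returning an Android DEPTH16 image); treat it as an illustration rather than Google’s reference implementation.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import java.nio.ByteOrder

// Opt a session into depth-from-motion if the device supports it.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Read the estimated distance (in millimeters) at the center of the latest depth map.
fun centerDepthMillimeters(frame: Frame): Int =
    frame.acquireDepthImage().use { depthImage ->
        val plane = depthImage.planes[0]
        val depthBuffer = plane.buffer.order(ByteOrder.nativeOrder()).asShortBuffer()
        val centerX = depthImage.width / 2
        val centerY = depthImage.height / 2
        // DEPTH16 stores two bytes per pixel; rowStride is reported in bytes.
        val index = centerY * (plane.rowStride / 2) + centerX
        depthBuffer.get(index).toInt() and 0xFFFF // interpret as unsigned millimeters
    }
```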

Image Credit: Google

How exactly is this useful? Well, the ability to track depth opens up new possibilities for occlusion, meaning digital objects rendered in AR can now appear both in front of and behind real-world objects.
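Under the hood, the test is simple: for each pixel of a virtual object, compare its distance from the camera with the real-world distance the depth map reports at that same pixel, and only draw it if it is closer. The Kotlin sketch below illustrates the idea on the CPU, with a soft blend near the boundary; real renderers do this per fragment on the GPU, and all names here are hypothetical.

```kotlin
// Hypothetical per-pixel occlusion test: both depths are distances from the
// camera, in millimeters, at the same screen pixel.
fun isVirtualPixelVisible(realDepthMm: Int, virtualDepthMm: Int): Boolean =
    virtualDepthMm <= realDepthMm // nothing real sits in front of the virtual pixel

// Soft alpha near the occlusion boundary to avoid a hard cut-out edge.
// featherMm controls how wide the blend region is.
fun occlusionAlpha(realDepthMm: Int, virtualDepthMm: Int, featherMm: Float = 50f): Float {
    val diff = (realDepthMm - virtualDepthMm).toFloat() // > 0 means the virtual pixel is in front
    return ((diff + featherMm) / (2f * featherMm)).coerceIn(0f, 1f)
}
```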

The team states that occlusion will be available in Scene Viewer, the developer tool that powers Google Search’s AR capabilities. Through a collaboration with home decor specialists Houzz, the Depth API will also be featured in the company’s “View My Room” experience, which lets you preview 3D furniture in your own space before you buy.

Imagine this: you’re enjoying a fun afternoon of Pokemon Go at your local park when you get a notification that a wild Scyther is roaming somewhere in your vicinity. You open AR mode in the app and scan the area with your camera; as you pan past a large tree, you spot a green tail protruding from behind the trunk.

As you approach, the bladed monster emerges from behind the foliage. Without occlusion, that same Scyther would have rendered awkwardly on top of the tree. Instead, depth sensing delivers a more immersive experience by blending the digital character naturally into the real-world environment.

Image Credit: Google

“Using the ARCore Depth API, people can see a more realistic preview of the products they’re about to buy, visualizing our 3D models right next to the existing furniture in a room,” states Sally Huang, Visual Technologies Lead at Houzz, in an official release. “Doing this gives our users much more confidence in their purchasing decisions.”

In addition to occlusion, the ARCore team has been experimenting with several improvements that will soon enable more realistic physics, surface interactions, and path planning.
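Those features all rest on the same primitive: turning a depth pixel back into a 3D point the app can reason about. Below is a hedged Kotlin sketch of that back-projection using a standard pinhole camera model; the intrinsics (fx, fy, cx, cy) would come from the device camera (ARCore exposes intrinsics on its Camera object), and the helper itself is made up for illustration, not part of any SDK.

```kotlin
// Back-project a depth-map pixel (u, v) with a depth value in millimeters into
// a 3D point in camera space using a pinhole model. fx and fy are the focal
// lengths in pixels; cx and cy are the principal point. Illustrative helper only.
fun depthPixelToCameraPoint(
    u: Int,
    v: Int,
    depthMm: Int,
    fx: Float,
    fy: Float,
    cx: Float,
    cy: Float
): FloatArray {
    val z = depthMm / 1000f   // convert millimeters to meters
    val x = (u - cx) * z / fx // horizontal offset from the optical axis
    val y = (v - cy) * z / fy // vertical offset from the optical axis
    return floatArrayOf(x, y, z)
}
```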

The ARCore team promises a steady stream of improvements as it continues to refine the Depth API, including potential support for time-of-flight (ToF) sensors as well as new depth-measuring software and devices.

The release of the ARCore Depth API follows on the heels of ARCore’s Environmental HDR update, which added dynamic real-world lighting to AR objects and environments.

Feature Image Credit: Google

About the Scout

Former Writer (Kyle Melnick)
