
ARCore phones can now detect depth with a single camera

ARCore is Google’s platform for augmented reality, available to Android and iOS apps. To enable more immersive experiences, the new ARCore Depth API lets any compatible phone create depth maps from a single camera.

The ARCore Depth API lets developers use Google’s depth-from-motion algorithms to create a depth map with a single RGB camera. The map is built by capturing multiple images from different angles as you move your phone and comparing them to estimate the distance to every pixel.
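For context, here is roughly what that looks like to a developer. The sketch below assumes the publicly documented ARCore Android SDK in Java: it turns on the depth mode when the device supports it and reads one millimeter value out of the DEPTH16 image the API returns. The class name and the setup around it are illustrative, not a complete app.

```java
// A minimal sketch of reading the Depth API's single-camera depth map
// (ARCore Android SDK, Java). Assumes a configured Session and a Frame
// obtained from a normal update() call.
import android.media.Image;
import java.nio.ByteOrder;

import com.google.ar.core.Config;
import com.google.ar.core.Frame;
import com.google.ar.core.Session;
import com.google.ar.core.exceptions.NotYetAvailableException;

public final class DepthApiSketch {

  // Enable depth-from-motion if this device supports it.
  public static void enableDepth(Session session) {
    Config config = session.getConfig();
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
      config.setDepthMode(Config.DepthMode.AUTOMATIC);
    }
    session.configure(config);
  }

  // Distance in meters to the real-world surface at pixel (x, y) of the
  // latest depth image. DEPTH16 stores one 16-bit millimeter value per pixel.
  public static float depthMetersAt(Frame frame, int x, int y)
      throws NotYetAvailableException {
    try (Image depthImage = frame.acquireDepthImage()) {
      Image.Plane plane = depthImage.getPlanes()[0];
      int byteIndex = y * plane.getRowStride() + x * plane.getPixelStride();
      int millimeters = Short.toUnsignedInt(
          plane.getBuffer().order(ByteOrder.nativeOrder()).getShort(byteIndex));
      return millimeters / 1000.0f;
    }
  }
}
```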

One principle that makes AR more realistic is occlusion, or the “ability for digital objects to accurately appear in front of or behind real-world objects.” It lets applications make sure objects are not just floating in space or placed in a physically impossible position. This is particularly useful for apps that let you preview furniture in your living room, since the placement looks far more realistic.
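Conceptually, occlusion comes down to a per-pixel depth comparison. The helper below is a hypothetical illustration rather than ARCore’s actual rendering path: a virtual pixel is hidden wherever the depth map says the real surface is closer to the camera than the virtual object.

```java
// Hypothetical per-pixel occlusion test (illustrative, not ARCore's renderer):
// the real world occludes the virtual object wherever the real surface is
// nearer to the camera than the virtual surface at that pixel.
public final class OcclusionSketch {
  public static boolean virtualPixelHidden(float realDepthMeters, float virtualDepthMeters) {
    // A depth of 0 means "no estimate at this pixel", so never occlude on it.
    return realDepthMeters > 0f && realDepthMeters < virtualDepthMeters;
  }
}
```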

The example below is an updated version of AR animals in Google Search. The cat — with its hind legs hidden — appears behind the furniture instead of just standing in front of wherever you point the camera. It’s treating the background as being 3D with depth, not just as a flat surface. This more realistic experience will begin rolling out to some of the 200 million ARCore-enabled Android devices with the Google app today.

[Comparison images: occlusion off vs. occlusion on]

“Having a 3D understanding of the world” makes a number of other use cases possible. Google has explored realistic physics, path planning, and surface interaction built on ARCore’s new depth capability.
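Those use cases all build on the same primitive: turning depth pixels back into 3D points the app can reason about. The sketch below uses the standard pinhole back-projection; the intrinsics getters exist in the ARCore SDK, but the assumption that the pixel coordinates are already scaled to the resolution those intrinsics describe is mine.

```java
// A hedged sketch of the geometry behind physics, path planning, and surface
// interaction: back-projecting a depth pixel into a 3D point in camera space.
// Assumes (x, y) is expressed at the same resolution as the image intrinsics.
import com.google.ar.core.Camera;
import com.google.ar.core.CameraIntrinsics;

public final class DepthToPointSketch {
  /** Returns {X, Y, Z} in camera space (meters) for pixel (x, y) at depthMeters. */
  public static float[] toCameraSpace(Camera camera, int x, int y, float depthMeters) {
    CameraIntrinsics intrinsics = camera.getImageIntrinsics();
    float[] focal = intrinsics.getFocalLength();         // {fx, fy} in pixels
    float[] principal = intrinsics.getPrincipalPoint();  // {cx, cy} in pixels
    float pointX = (x - principal[0]) / focal[0] * depthMeters;
    float pointY = (y - principal[1]) / focal[1] * depthMeters;
    return new float[] {pointX, pointY, depthMeters};
  }
}
```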

This single-lens approach lowers the barrier to the technology by not requiring specialized cameras and sensors, though the Depth API will only improve as phone hardware does. Google is inviting developers to collaborate on the new ARCore Depth API starting today.

For example, the addition of depth sensors, like time-of-flight (ToF) sensors, to new devices will help create more detailed depth maps to improve existing capabilities like occlusion, and unlock new capabilities such as dynamic occlusion — the ability to occlude behind moving objects.
