Apple today introduced its latest lineup of smartphones, including the iPhone 12 Pro and iPhone 12 Pro Max, both of which are equipped with a LiDAR scanner that will bolster the phones’ AR capabilities.

Following the iPad Pro introduced earlier this year, Apple is now bringing a LiDAR scanner to its high-end smartphones, the new iPhone 12 Pro and 12 Pro Max.

LiDAR is a so-called ‘time of flight’ depth sensor that measures how long it takes light to bounce off objects in the scene and return to the sensor. With precise timing, that information is converted into a depth value for each point. With rich depth information, augmented reality experiences can be faster and more accurate.
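
The arithmetic behind time of flight is simple: halve the round-trip time and multiply by the speed of light. Here’s a quick illustrative sketch in Swift (the numbers are made up for the example, not Apple’s figures):

    // Time-of-flight: light travels to the object and back,
    // so the one-way distance is half the round trip.
    let speedOfLight = 299_792_458.0   // meters per second

    func distance(fromRoundTripTime t: Double) -> Double {
        return speedOfLight * t / 2.0
    }

    // Example: a ~13.3 nanosecond round trip corresponds to roughly 2 meters.
    print(distance(fromRoundTripTime: 13.3e-9))   // ≈ 1.99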

Image courtesy Apple

While existing iPhones are already capable of pretty good AR tracking, the current approach relies on computer-vision techniques like SLAM, which tracks points in the scene over time to infer depth. Typically this means the system needs a few seconds and some camera movement before it can establish its frame of reference and begin to assess the depth of the scene.

Apple says that LiDAR in the iPhone 12 Pro and 12 Pro Max means the phones will be capable of “instant AR.” That’s because LiDAR captures depth information in the equivalent of a ‘single photo’, without any phone movement or the need to compare images across time.
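
For developers, that per-frame depth is surfaced through ARKit’s scene-depth frame semantics on LiDAR-equipped devices. A minimal Swift sketch of opting in (session delegates and error handling omitted):

    import ARKit

    // Opt into per-frame LiDAR depth (LiDAR-equipped devices only).
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    let session = ARSession()
    session.run(configuration)

    // Each ARFrame the session delivers now carries a depth map captured
    // alongside the color image -- no warm-up movement required.
    // frame.sceneDepth?.depthMap is a CVPixelBuffer of 32-bit float meters.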

Image courtesy Apple

One way to think about it is in terms of the pixels in a photograph. When you take a picture, every pixel captures color and brightness information. In a ‘LiDAR snapshot’, every pixel instead captures a distance value. So rather than needing to wave your phone around for a few seconds before an AR app can establish accurate tracking, tracking can start immediately.
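
To make the ‘depth pixel’ idea concrete, here’s a rough Swift sketch of reading a single distance value out of an ARKit depth map; it assumes the 32-bit-float-per-pixel layout ARKit documents and skips bounds and confidence checks:

    import ARKit
    import CoreVideo

    // Read the distance (in meters) at pixel (x, y) of a frame's depth map.
    func depthInMeters(at x: Int, _ y: Int, in frame: ARFrame) -> Float? {
        guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
        let rowStart = base.advanced(by: y * bytesPerRow)
        return rowStart.assumingMemoryBound(to: Float32.self)[x]
    }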

Of course, you can also compare LiDAR depth data over time so that instead of a simple snapshot of depth you can build an entire depth-map of the scene. With LiDAR, you can ‘scan’ a space to create an accurate 3D map which can be very useful for augmented reality experiences. Building such 3D maps was possible before, but the increased depth accuracy of LiDAR will make them faster to build and more accurate.
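
This kind of scanning is what ARKit’s scene reconstruction option does on LiDAR devices, handing apps a mesh of the environment as ARMeshAnchor objects. A minimal sketch of enabling it (mesh processing and delegate wiring omitted):

    import ARKit

    // Ask ARKit to build a 3D mesh of the environment from LiDAR depth.
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    let session = ARSession()
    session.run(configuration)

    // As the user moves, the session adds and updates ARMeshAnchor objects,
    // each carrying geometry for a chunk of the reconstructed scene.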


You can be sure that this same tech will find its way into Apple’s upcoming AR glasses. Seeing the sensor come to the company’s latest iPhones means that Apple is one step closer to shrinking the tech and making it power efficient enough to fit into a head-worn device. We also wouldn’t be surprised to see other companies in AR and VR begin building LiDAR sensors into their own devices.

Apple’s iPhone 12 Pro is priced at $1,000 and launches on October 23rd, while the larger iPhone 12 Pro Max is priced at $1,100 with a release date of November 13th. The company’s other newly introduced phones, the iPhone 12 and iPhone 12 mini, do not include the LiDAR sensor.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Very cool!

  • wheeler

    Will this technology, deployed at this scale, reduce the cost of LiDAR for other applications as well?

  • knuckles625

    Hey, it’s Project Tango 4 years later at twice the price! But in all seriousness, as usual Google was premature and abandoned it after providing little support. Apple will probably stick with it, drive it with developers, and hit mass adoption. Wonder when Apple Daydream comes out…

    • silvaring

      HoloLens also uses infrared projection for SLAM and hand tracking. I think its main benefit for VR will be hand tracking and helping with depth mapping for home use (e.g. 3D scanning home objects / components etc). It’s weird to talk about this stuff and know it’s happening now; I used to envision it as some weird far-future technology just 10 years ago…. exciting times.

  • nejihiashi88

    I would love to see their processor on a VR headset; it is far more advanced than Snapdragon. And the LiDAR, I think, is excellent for VR. Hope they don’t miss the train.

  • Bumpy

    I don’t see much gain having it on a phone, but with mobile VR or AR glasses sign me up.

  • deHavilland

    We’ll see photos where you define depth of field at any time after shooting them (by blurring the rest according to its depth, thus optically correct). But you may also break optical laws and define 2 or 3 depths of field. And this is just for starters…

    I never was a big fan of Apple, but must concede it always did a lot for artists and creativity.

    • silvaring

      Agreed, it’s very nice they are pushing this kind of tech into the prosumer market. I don’t know if it was the compression on the stream, though, but the videos they showed off with the Pro / Pro Max looked quite messy in motion.

  • guest

    Can anyone tell me how many of these things can be used in the same room, pointing in the same direction?

    • a

      No limit. It’s like asking how many cameras can be operated in the same room.

  • batmobil

    Sounds nice on paper, but it doesn’t work with reflective materials. Metal, shiny plastic, windows, mirrors, varnish, edges… It’s all going to mess up the depth measurements, making them not just inaccurate but completely off. Also, the sensor has very limited reach.