What Is the iPhone 12 Pro LiDAR Scanner and What Can You Do with It?


One of the more intriguing new features of the iPhone 12 Pro is its LiDAR scanner. Integrated into the phone's camera array, the LiDAR scanner looks like just another small dot. Its unassuming appearance is quite at odds with the way Apple hyped this new feature.

What exactly is LiDAR and what does this sensor do? Is it a real game-changer for the iPhone 12 Pro? Are there apps that use the LiDAR sensor? What does the sensor do that other older features cannot?

What is LiDAR and how does it work?

Light Detection and Ranging (LiDAR) is a technology that has been around for several decades. It was initially developed as a mapping tool by the military but has since found its way into more consumer-level goods such as drones and self-driving cars.

In a nutshell, LiDAR is a technology that allows a device to create a 3D map of its environment. This is achieved by emitting continuous pulses of non-visible light from a point source. These pulses of light then bounce off the objects in the area. Based on how quickly they are detected by a sensor, a LiDAR module can deduce the location and dimensions of any “obstacle” in the area being surveyed.
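The math behind deducing location from "how quickly" a pulse returns is simple time-of-flight arithmetic: light travels at a known speed, so half the measured round-trip time gives the one-way distance. A minimal sketch of that calculation (the function name and sample values are illustrative, not anything from Apple's implementation):

```python
# Time-of-flight ranging: a pulse travels to the object and back,
# so the one-way distance is half the round trip.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a measured pulse round-trip time into a distance in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after roughly 33 nanoseconds came from about 5 meters away.
print(round(distance_from_round_trip(33.356e-9), 2))
```

Repeating this across many pulses in many directions is what builds up the map of the surveyed area.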

The result of LiDAR data processing is a point cloud representing real objects in 3D space. On its own, LiDAR cannot collect surface data such as colors and lights. However, image data from cameras can be overlaid on LiDAR data to create “true” 3D images.
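A point cloud is, at heart, just a list of 3D coordinates, and "overlaying" camera data amounts to attaching a color to each point. A toy illustration of that data structure (the class and field names are hypothetical, purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class ColoredPoint:
    # Position in meters, relative to the sensor.
    x: float
    y: float
    z: float
    # RGB color sampled from the camera image at the matching pixel.
    r: int = 0
    g: int = 0
    b: int = 0

# A bare LiDAR point cloud: geometry only, no surface detail.
cloud = [ColoredPoint(0.1, 0.20, 1.5), ColoredPoint(0.1, 0.25, 1.5)]

# A "true" 3D image: camera colors projected onto the same points.
cloud[0].r, cloud[0].g, cloud[0].b = 200, 180, 160
```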

Most uses of LiDAR have focused on its ability to create highly accurate and detailed maps. Compared to other mapping methods, LiDAR is faster, generally more accurate, and has the extra benefit of not relying on visible light. However, LiDAR sensors are typically also a lot more expensive compared to alternative technologies.

This isn’t the first time that Apple has integrated LiDAR sensors into their products. Earlier in 2020, Apple launched a new iPad Pro with upgraded cameras and a LiDAR sensor, among several other new features. With LiDAR making its way to the iPhone, the technology will now reach a much larger user base.

How LiDAR works in the iPhone 12 Pro

Obviously, there are a few differences between the large LiDAR sensors used for aerial mapping and the tiny one embedded in the iPhone’s camera array. Because it has no moving parts, the iPhone’s LiDAR sensor has a limited field of view, although that is easily addressed by simply moving the phone around. It still has benefits inherent to LiDAR technology, such as enhanced range and the ability to capture ‘occluded’ spaces (we’ll get to that later).

A distinct characteristic of LiDAR is that it’s a very hardware-intensive process. To collect data, a LiDAR module fires off many pulses of light simultaneously, which bounce back within nanoseconds. This has to be done quickly and repeatedly to create the dense point cloud the device aims for. It’s easy to see how much power it takes to process such a large data set.

According to Apple, data from the LiDAR sensor isn’t processed independently. Instead, it is combined with data from the camera and the phone’s motion sensors. This likely augments the point cloud with surface data and helps determine the position of objects in 3D space.

Fortunately, the iPhone 12’s A14 Bionic processor is more than up to the task. According to the iPhone 12 Pro press release, all the data from the LiDAR sensor, camera, and motion sensors is enhanced using “computer vision algorithms” to add detail to every scene. In short, a lot is going on under the hood to make the iPhone 12’s LiDAR sensor work as well as it does.

What you can do with the LiDAR sensor

All this begs the question: what are you going to do with a LiDAR sensor? Unfortunately, there aren’t a lot of applications to get excited about just yet. After all, we’re less than a year from when the iPad Pro first came out with a LiDAR sensor.

The concept of “occlusion” is one of the new things the iPhone 12 Pro can do thanks to LiDAR technology. This means being able to place virtual objects behind real objects, something that was very difficult with purely camera-based AR apps. How does the iPhone 12 Pro take advantage of this concept? Let’s look at what we have right now:
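Conceptually, occlusion comes down to a per-pixel depth test: a virtual object is only drawn where it sits closer to the camera than the real surface the LiDAR sensor measured at that pixel. A minimal sketch of the idea (made-up depth values, not Apple's actual rendering pipeline):

```python
def draw_virtual_pixel(virtual_depth_m: float, real_depth_m: float) -> bool:
    """Draw the virtual object's pixel only if it is in front of the
    real surface at that pixel; otherwise the real object hides it."""
    return virtual_depth_m < real_depth_m

# A virtual lamp 2 m away is hidden by a couch 1.5 m away...
print(draw_virtual_pixel(2.0, 1.5))  # False: the couch occludes the lamp
# ...but visible against a wall 4 m away.
print(draw_virtual_pixel(2.0, 4.0))  # True
```

Without LiDAR's depth map, the app has no reliable `real_depth_m` to test against, which is why camera-only AR tends to paste virtual objects on top of everything.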

AR Gaming


Using LiDAR sensors for gaming may seem like a strange concept, but only if you don’t consider the potential of AR gaming. AR stands for augmented reality, a term that refers to combining virtual objects and environments with the real world. LiDAR sensors come into the equation because they can create a “virtual” version of real-world objects.

Apple demonstrated this concept with Hot Lava, which is basically a 3D platformer spin on “the floor is lava.” While the game can be played normally on your screen, the LiDAR sensors make it possible to turn your own living room into an entire Hot Lava course. The obstacle course includes objects both real and virtual.

Another game that uses LiDAR sensors to build an AR environment is RC Cars. As the name implies, the game has you driving miniature cars around an environment modeled on your surroundings. The car interacts with and bumps into your actual walls and furniture. To spice things up, you can augment the AR environment with virtual ramps, obstacles, and traffic cones.

There are a handful of other AR games that use the iPhone’s LiDAR sensors, but at best you have a selection of around ten titles. Quite disappointing, but it also means there is plenty of room for game development.

Interior design


The majority of the apps that use the iPhone’s LiDAR sensors involve some sort of interior design. This is a field where LiDAR sensors could shine over the next few years.

Canvas is the most basic AR app that uses the LiDAR sensors. First launched for the new iPad Pro, Canvas can create a 3D model of any room as you walk around it. Once the 3D model has been built, you can rotate it, zoom in on any of the details, or even look at it from a top-down view. Need a floor plan for your living room? Just let Canvas build the model and manipulate it on your phone.

AR Home Design is for more general interior decorating. Starting with a 3D model of an interior space, AR Home Design lets users swap in different paint, tile, and wallpaper options virtually. Since LiDAR creates actual solid models, you can see how paint or wallpaper wraps around a wall rather than simply being laid over it.

The more brand-specific Ikea Place allows you to virtually place Ikea furniture in your living space. With LiDAR’s occlusion ability, you can even place a virtual lamp behind a couch or a chair under a table. The virtual furniture is placed to scale, aided by LiDAR’s accurate measurement capabilities.

Measurements

Speaking of measurement, LiDAR sensors on your phone could make it possible for you to never use a tape measure again. Using the suitably named Measure app, you can just scan any object with your iPhone 12 Pro, and it will provide accurate measurements for you. Older iPhones have been able to do this but only using a photogrammetry-based technique. LiDAR-based measurements are significantly more accurate.
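Once a scene exists as points in 3D space, measuring something reduces to computing the straight-line distance between two of those points. A quick sketch (the coordinates here are invented for illustration):

```python
import math

def measure(p1: tuple, p2: tuple) -> float:
    """Straight-line distance in meters between two scanned 3D points."""
    return math.dist(p1, p2)

# Two ends of a table edge, in meters relative to the phone.
corner_a = (0.0, 0.0, 1.0)
corner_b = (1.2, 0.0, 1.0)
print(measure(corner_a, corner_b))  # a 1.2 m edge
```

The accuracy of the result depends entirely on how accurately the two endpoints were located, which is where LiDAR's direct depth measurement beats estimating depth from photos.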

3D scanning


One of the most exciting things to explore with the iPhone’s LiDAR scanner is 3D scanning. This means taking a real-world object, scanning it with the sensors, and obtaining a 3D model of the object. This 3D model can then be manipulated and reproduced using a 3D printer.

The most popular app for 3D scanning right now is Polycam. The app is specially designed to make use of the LiDAR sensors of the iPad Pro and iPhone 12 Pro. 3D models can be measured or cleaned up in the app itself. If you’re a 3D modeling professional, you can also export the models in standard formats such as OBJ or STL.

Better low-light focus

A good reason for the LiDAR sensor sitting right beside the iPhone’s cameras is that it can also help with image quality. Take note that LiDAR works even in the absence of visible light. When capturing photos in low-light conditions, the LiDAR sensor can jump in with its depth-sensing capabilities. This helps the camera focus faster and more accurately.

With the LiDAR camera, you can now shoot portrait-style photos in low-light with good separation between the subject and the background.

Again, app developers have probably only skimmed the surface of what is possible with the iPhone’s LiDAR sensors. A few more years down the line and we may start seeing apps that we never imagined could exist. That has been par for the course considering the rate at which technology has been evolving recently.

Final thoughts

While we don’t imagine most people upgrading to the iPhone 12 Pro solely because of the LiDAR sensor, it’s still an amazing feature. This is just the latest example of a technology once exclusive to professionals becoming accessible to mainstream tech users.

With more people now holding LiDAR sensors in their hands, we expect more developers to try and take advantage of this relatively novel technology. Will LiDAR sensors eventually become staple features of smartphones? This seems very likely although the added cost is still a hurdle that phone makers need to overcome.