The new iPhone 12 Pro was revealed at today’s Apple event. As expected, it includes a suite of features and upgrades over previous models, including a new LiDAR scanner. This technology is meant to revolutionize photography and augmented reality, but how it works is still a mystery to many. What is LiDAR, and what sort of benefits can you expect to see from it? That’s what we’re here to discuss.
What is the Apple iPhone 12 Pro LiDAR scanner?
The LiDAR scanner is a new feature introduced on the latest iPhone 12 Pro and iPhone 12 Pro Max models. LiDAR stands for Light Detection and Ranging, a technology that measures distances by bouncing laser light off objects and timing the reflections. On the iPhone 12 Pro, LiDAR is used to take better photos and to improve the performance of AR software.
How this system works is a bit complicated to explain. In a nutshell, the scanner illuminates the environment with pulses of laser light, measures how long each pulse takes to bounce back, and converts those round-trip times into distances. Not only does this give the iPhone 12 Pro's camera array a better sense of its surroundings, it also bolsters the performance of ARKit-based apps.
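The time-of-flight principle described above boils down to simple arithmetic: distance is the speed of light times the round-trip time, divided by two. A minimal sketch (the function name and the example timing are illustrative, not Apple's actual implementation):

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a laser pulse's round-trip time into a one-way distance in metres."""
    # The pulse travels to the object and back, so halve the total path length.
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse returning after ~33.3 nanoseconds hit an object roughly 5 metres away.
print(round(distance_from_round_trip(33.3e-9), 2))  # → 4.99
```

Nanosecond-scale timing like this is why the scanner needs dedicated hardware rather than an ordinary camera sensor.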
Since augmented reality experiences rely on scanning the local environment, this scanner will be used in conjunction with other sensors to basically get a feel for the room. Cameras will be able to use the data to get a better sense of depth, which will theoretically increase low-light performance in photographs. The data scanned will also have a significant impact on augmented reality projections, since the software will better understand the scene on which it will overlay graphics.
For most users, the biggest benefit will be in photography. Apple is making a big push for better low-light captures with the iPhone 12 Pro, and LiDAR plays an important part in that process. However, augmented reality use is also growing, and the LiDAR scanner means developers will have, in Apple's words, "endless opportunities" to take advantage of the new technology.