This article is based on an article from the Japanese edition of Engadget and was created using the translation tool DeepL.
At an event on October 13, Apple announced all four models of the iPhone 12. Two of them, the higher-end iPhone 12 Pro and iPhone 12 Pro Max, are the first iPhones to feature a LiDAR scanner; it is not available on any other model.
Located alongside the rear cameras, the LiDAR scanner, which uses infrared light to measure depth, also appears on the fourth-generation iPad Pro.
One benefit is that the depth, or three-dimensional shape, of whatever you point the camera at can be determined with high accuracy, making AR apps that superimpose virtual objects on the real world faster and more precise.
It's also very useful for photography: autofocus in dark places is six times faster than with conventional methods, and Night mode portraits can produce a beautifully blurred, bokeh-rich background even when shooting in the dark.
What is a LiDAR scanner?
LiDAR stands for "Light Detection and Ranging." In short, it's radar that uses light instead of radio waves.
In the iPhone's case, it uses the ToF (Time of Flight) method: the device emits infrared light over an area in front of it and measures, in nanoseconds, the time the light takes to reflect off an object and return. From that round-trip time it computes the depth at each point, producing a three-dimensional depth map of the area in front of the device.
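The ToF arithmetic itself is simple: the depth is half the distance light travels during the measured round trip. A minimal sketch of that principle (an illustration, not Apple's implementation):

```python
# Illustrative ToF (Time of Flight) depth calculation: depth is half the
# round-trip distance travelled by a light pulse at the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def tof_depth_m(round_trip_ns: float) -> float:
    """Convert a round-trip time in nanoseconds to depth in metres."""
    round_trip_s = round_trip_ns * 1e-9
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2

# A pulse that returns after ~20 ns came from a surface about 3 m away,
# which shows why nanosecond-scale timing is required for room-scale depth.
print(round(tof_depth_m(20.0), 2))  # prints 3.0
```

The nanosecond figures in the article follow directly from this: at roughly 30 cm of depth per nanosecond of round trip, timing precision sets depth precision.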
(As an aside, the Kinect V2 sensor that was bundled with Microsoft's Xbox One console, before being dropped in a policy change, used the same infrared-pulse ToF method.)
The first exciting consequence of knowing the three-dimensional shape of what's in front of you is in AR applications, which superimpose virtual objects on the real world. Conventional AR relies on complex calculations that estimate floors and vertical surfaces by combining two-dimensional images from a regular camera with input from the smartphone's motion sensors.
When you summon a Pokémon into a room in Pokémon Go and photograph it, or measure a length or area with the iPhone's built-in Measure app, you first have to perform a mysterious ritual such as "move your iPhone in a slow circle" or "move to a brighter place," because detecting floors and other reference surfaces from camera images alone is difficult.
The fourth-generation iPad Pro, which already has a LiDAR scanner, recognizes depth almost instantly, so it can start overlaying AR content or measuring as soon as you point the camera.
Also, while traditional AR apps can recognize floors and vertical surfaces, they're not very good at recognizing furniture and other three-dimensional objects. Virtual objects therefore sink into real ones or appear in front of things they should be behind, which instantly makes the scene look fake.
Real objects come in all sorts of irregular shapes, and judging from still images or video alone whether a surface's appearance reflects its actual color and pattern or is merely a trick of the lighting is very difficult to do accurately and quickly. LiDAR, by contrast, senses the shape of an object directly, independent of the camera image (though it relies on infrared reflections, so some materials and lighting conditions are less than ideal).
If LiDAR can recognize 3D shapes with high accuracy, more realistic effects become possible, such as an AR character peeking out from behind a real object or climbing onto furniture.
In staple AR applications such as room redecoration, interior design, and furniture previews, you can not only place virtual furniture to scale but also accurately measure the distance between furniture and walls, previewing their spatial relationship naturally.
Enhanced camera features such as shooting in the dark
Like the "Pro" models of previous years, this year's iPhone 12 Pro and 12 Pro Max are marketed as more capable, feature-rich cameras than the standard iPhone 12.
According to Apple, LiDAR makes autofocus on the iPhone 12 Pro and 12 Pro Max six times faster in low light, letting you shoot more quickly in the dark with fewer out-of-focus misses.
In addition, the ability to capture depth maps enables Night mode portraits, which were previously unsupported. Even in night scenes or darkened rooms, you can take photos with a sharply focused foreground subject and an appropriately blurred background.
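A depth map is what lets the camera separate subject from background in the dark, where image-based segmentation struggles. A toy sketch of depth-based background blur (purely illustrative; Apple's actual pipeline is far more sophisticated):

```python
# Illustrative depth-based portrait blur: pixels behind a subject-depth
# threshold are averaged with their neighbours, the foreground stays sharp.
# The 1D pixel row and 2 m threshold are assumptions for the example.

def portrait_blur(pixels, depth, subject_max_depth_m=2.0):
    """pixels: 1D list of brightness values; depth: per-pixel metres."""
    out = []
    for i, p in enumerate(pixels):
        if depth[i] <= subject_max_depth_m:
            out.append(p)  # foreground subject: keep sharp
        else:
            # background: 3-tap box blur with edge-clamped neighbours
            left = pixels[max(i - 1, 0)]
            right = pixels[min(i + 1, len(pixels) - 1)]
            out.append((left + p + right) / 3)
    return out

# Subject pixels at 1.5 m stay untouched; the 4 m background is smoothed.
print(portrait_blur([10, 200, 30, 90], [1.5, 1.5, 4.0, 4.0]))
```

The key point is that the foreground/background decision comes from measured depth rather than from guessing the subject outline in the image, which is why LiDAR makes this workable in the dark.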
The larger iPhone has traditionally offered higher camera performance than the regular model, and the iPhone 12 Pro Max continues the pattern with a 47% larger image sensor and an 87% improvement in low-light shooting performance.
It also features a new sensor-shift stabilization system of the kind used in single-lens reflex cameras, optical image stabilization enhanced to 5,000 adjustments per second, a 2.5x optical telephoto lens (a 5x optical zoom range from the ultra-wide), and a newly designed, brighter seven-element f/1.6 wide-angle (standard) lens. These features reinforce the Pro model's positioning as the "photographer's iPhone" and a "high-performance camera you can always carry."
Both the Pro and Pro Max are equipped with LiDAR, and both benefit from faster autofocus and Night mode portraits.
Camera features exclusive to the Pro Max include a wide-angle (standard) camera with a larger 1.7 μm pixel sensor (1.4 μm on the Pro), sensor-shift OIS, and a 2.5x telephoto (2x on the Pro).
Deep Fusion, which brings out fine detail in low-light areas, used to be a feature whose activation conditions users could never know; you just chanted "it just works" and hoped. With the iPhone 12 Pro, it now works on all cameras, from ultra-wide to telephoto.
Speaking of camera performance, computing power, now just as important as optics, is also greatly improved in the iPhone 12 generation: the Neural Engine has more cores and is 80 percent faster than the previous generation, and there is a new image signal processor.
Deep Fusion and other computational-photography features, which produce the final image through Neural Engine and image-processor calculations, have never been a good match for the RAW format, which outputs raw sensor data for development in other apps. However, Apple has announced that it will soon offer the Apple ProRAW format, which combines RAW data with information from the image-processing pipeline.
It's not just the iPhone: whenever a new device appears, some describe it as neither new nor innovative, as having lost sight of its "essence" and gotten stuck, as changing things that didn't need changing, or argue that user experience should matter more than performance. Yet in cameras and AR capabilities, in devices that capture and understand the real world outside the smartphone, there still seems to be plenty of room for progress.
The Japanese edition of Engadget does not guarantee the accuracy or reliability of this article.