

With four models announced at the same time - the iPhone 12 mini, 12, 12 Pro and 12 Pro Max - this year's iPhones were in some ways just as advanced as we imagined, but in other ways they went beyond our expectations. What I personally didn't expect was a further leap in camera quality.

iPhone 12

In the iPhone 11 generation, built-in camera quality improved significantly thanks to the enhanced capabilities of the A13 Bionic, but the iPhone 12 generation improves image quality so much that the camera that made the iPhone 11 a best-seller now looks dated.

The HDR video recording feature, billed as Dolby Vision compatible, is also superb. Viewed on the OLED display, which boosts to as much as 1,200 nits, the footage is strikingly realistic, as if you were looking at the real world through a window.

No other HDR video capture system is as complete and easy to use as this one. And because Apple has committed to HDR video support across its products (iPad, Apple TV, Mac and so on), HDR footage captured on the new iPhone can be enjoyed on each device to the extent of that device's capabilities.

That said, it isn't all praise.

iPhone 12 Pro

The LiDAR scanner, found only on the 12 Pro and 12 Pro Max, is used in far fewer situations than I had imagined. It makes shooting in the dark somewhat easier, but it contributes nothing to the experience of shooting in good light.

Let's take a look at the iPhone 12 and iPhone 12 Pro's built-in cameras, with some actual sample shots.

The difference between the 12 and 12 Pro isn't just the presence of a telephoto camera

Most of the samples here were shot with the 12 Pro. The camera differences between the 12 and 12 Pro come down to the telephoto camera and LiDAR; unlike the 12 Pro Max, with its larger wide-angle camera sensor, there are no hardware differences between them that directly affect image quality.

Even so, the two are not exactly the same.

The difference lies mainly in how much computational photography each phone can perform. The iPhone 12 has 4GB of RAM, while the iPhone 12 Pro has 6GB; those extra 2GB leave more room in memory for the data coming off the camera and the analysis layers processed by the A14 Bionic.

At present there are no restrictions on still photography on either phone, but there are for video recording: the iPhone 12 cannot shoot 4K/60p HDR (it tops out at 30p). Presumably the iPhone 12 cannot secure enough buffer to pull the data off the sensor and process it properly in HDR.

The Apple ProRAW format, due to be added before the end of the year, is likewise exclusive to the 12 Pro (and 12 Pro Max). The format combines the camera RAW with the results of the A14 Bionic's computational photography analysis, stored as layer information.

Buffering both the RAW data and the analysis layers and writing them into a single file presumably requires a large amount of memory, so this difference in functionality, too, likely comes down to the amount of RAM on board. In normal app use the difference should be imperceptible, or nearly so, because a certain amount of memory has to be kept reserved for the camera anyway (given that the camera function may be running at any time).

Despite these differences, though, the gap between the two is generally well within the range of nothing to worry about.

The iPhone 12 series is compatible with the new MagSafe charging accessory.

LiDAR works well in the dark, but isn't used effectively in bright light

I'll be praising the camera quality of the 12 and 12 Pro after this, but before I do, I have one disappointment to report.

With LiDAR coming to the 12 Pro, there was one thing I was hoping for. I expected that LiDAR scanning would allow a degree of subject-background separation and (low-resolution) subject shape recognition, providing a more detailed depth map similar to what the Face ID camera produces, which in turn would permit a shorter minimum focus distance in portrait mode and more accurate background separation.

However, at least in the current version, nothing has changed when shooting in bright light from the previous method of using parallax information from the three cameras (there is a LiDAR-based portrait mode for dark scenes).

As a result, the minimum shooting distance in portrait mode is unchanged: you must stand at least as far back as on the iPhone 11 generation for it to work. Background separation doesn't use LiDAR information either, and the familiar failure cases, such as the rims of glasses and the stems of wine glasses dissolving, remain. That said, the accuracy of the separation itself has improved in line with the greater capabilities of the Neural Engine and ISP.

In the example above, the center edge of the wine bottle, which should be at the same distance, is blurred, and the rendering of the wine glass is unnatural. The erroneous blur on the bottle may be the result of misreading the wrinkles in the label. Background separation through machine learning and neural network processing still struggles to cope with unexpected images like these.

On the other hand, LiDAR information can be used for portraits in dark places, so the scope of LiDAR's use may still expand in the future. For now, though, you should assume only that the accuracy of portrait mode has improved in line with the added computing power.

White balance, tone mapping and dark tones are the three major improvements

When I started using the iPhone 12 Pro's camera, the first thing I noticed was how accurate the white balance is. White balance can make a huge difference to the look of a photo. Outside a studio you are almost always shooting under a complex mix of light sources, yet the white balance lands just right, with a pleasing warmth in every scene.

The following shot, for example, was taken in the evening, in the shaded valley between buildings at sunset; compare it with a photo taken on the iPhone 11 Pro Max and the difference in white balance is obvious.

iPhone 12 Pro
iPhone 11 Pro Max

I shot in the shade in a few other places as well, and while the shade itself is rendered faithfully, skin tones come out as pleasant, healthy colors.

I'm also impressed by the richness of tone and how well color carries into areas that fall into shadow, such as the contours of the face and the shaded parts of the neck. In the photos taken with the iPhone 11 Pro Max, not only are the shadows pulled down hard, but the colors fade rapidly, producing a picture that emphasizes contrast.

Next, let's take a look at a landscape photo at an ultra-wide angle.

iPhone 12 Pro
iPhone 11 Pro Max

The sky is the thing to watch here. For two generations now, the iPhone has had a feature called Smart HDR, which uses information from frames captured at different exposures and adjusts the tone map for an appropriate rendering of each subject. The processing is driven by machine learning and tailored to each subject; in the photo taken with the iPhone 12 Pro, the sky was recognized as sky, producing a rich, vivid result.

What sets this apart from other common approaches is that it is not a compositing "fill-in" process. It is all about how information is extracted and tone-mapped from RAW data at different exposures, so both the sky and the shadows are rendered properly even in scenes with a large contrast range. In effect, the A14 Bionic automates what photographers used to do by hand: developing two differently exposed sets of RAW data while making adjustments.

Finally, here's a photo I took looking up at the sky while capturing the model at an ultra-wide angle.

The brick wall in the shade, the figure of the model, the sunlit cityscape and the sky: the brightness ranges of these areas must differ enormously, yet all of them are well rendered, and the result is convincing and three-dimensional. You could say this is a photograph that compositing alone could not produce.

Making more aggressive use of the range between light and dark

The overall impression of color is good, perhaps thanks to the improved white balance and shadow gradation, and highlights and point light sources are rendered with a stretch that creates a strong sense of brilliance.

This is especially true of the food photos. The beautiful restaurant lighting plays a part, but the reflections on the dishes and the luster of the food are well expressed and the colors are rich, giving the photos a freshness that comes through clearly.

That this is not just blanket automatic retouching is evident from how properly the photo of the steaming soup is rendered.

While these subtle shimmering effects are achieved, areas that should be depicted with smooth tones are smoothly graduated. In the photo inside the café mentioned above, there are differences not only in shadow tone and white balance but also in the rendering of fine point light sources, such as the lighting and the glint of the glasses, which gives the photo a crisp overall impression.

I think this is because the camera makes aggressive use of the full range between light and dark to render local contrast clearly for each subject. It appears to be a finely discriminating, case-by-case process rather than uniform processing applied across the entire image.

Please take a look at the other examples shown here as well.

There are some selfies mixed in as well; the front camera has exactly the same specifications as the iPhone 11's, yet the image quality looks clearly better, a result of the A14 Bionic's computational photography.

Easy HDR video recording actually surprised me more than the still images

In fact, the thing that surprised me the most this time was the HDR video recording.

Normally, HDR video is shot in RAW or a log format, and the tones are assigned later in editing (a process called grading). The A14 Bionic performs this grading automatically, and at the launch it was announced as supporting Dolby Vision.

We've only been testing it for a short time, so we don't have the complete picture, but it handles HDR grading so well that users never have to think about HDR at all. Simply shoot, and it records in HDR automatically, converting to SDR (standard dynamic range) when needed for transfer.
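As an aside, third-party camera apps should be able to opt into the same 10-bit HLG recording through AVFoundation. Here is a minimal Swift sketch of my own, not Apple sample code; the function name is mine, and it assumes an already-configured capture session with automatic wide-color configuration disabled:

    import AVFoundation

    // Minimal sketch: opt a camera into the 10-bit HLG/BT.2020 color space
    // that underpins the iPhone 12's HDR capture (iOS 14.1 and later).
    // Assumes the session's automaticallyConfiguresCaptureDeviceForWideColor
    // has been set to false so activeColorSpace can be set by hand.
    func enableHLGCapture(on device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        // Pick the first format that advertises HLG/BT.2020 support.
        guard let hdrFormat = device.formats.first(where: {
            $0.supportedColorSpaces.contains(.HLG_BT2020)
        }) else { return }

        device.activeFormat = hdrFormat
        device.activeColorSpace = .HLG_BT2020
    }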

The iPhone 12 and 12 Pro's displays are HDR-capable, able to boost to 1,200 nits, so as long as you watch your footage on the device itself you enjoy the rich colors and realism of HDR. The iPhone 11 Pro can also display HDR-like video, though not to this extent, and the same goes for the iPad Pro. Playback depends on the capabilities of each display, but the data is properly synced between Apple devices via iCloud, preserving the integrity of the video.

So, can you use this HDR footage in your own HDR video productions? My conclusion is that you can, but the data you get is not a Dolby Vision video stream.

Dolby Vision is a stream that superimposes HDR data on top of SDR data, but when I transferred a clip to my Mac via AirDrop, the video was 10-bit hybrid log-gamma (HLG) compressed with HEVC, with the BT.2020 color gamut used by the 4K standard. Dolby Vision uses 12 bits and the PQ curve, so this is a completely different kind of file.
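You can verify this yourself by reading the color metadata off the transferred file. Below is a small Swift sketch of my own (the file name is hypothetical); on the clips described here it should report ITU_R_2100_HLG for the transfer function and ITU_R_2020 for the primaries:

    import AVFoundation
    import CoreMedia

    // Sketch: read the color metadata of a clip AirDropped from the iPhone.
    // The file name is hypothetical.
    let asset = AVAsset(url: URL(fileURLWithPath: "IMG_0001.mov"))
    guard let track = asset.tracks(withMediaType: .video).first,
          let desc = track.formatDescriptions.first else {
        fatalError("no video track found")
    }

    let format = desc as! CMFormatDescription
    let transfer = CMFormatDescriptionGetExtension(
        format, extensionKey: kCMFormatDescriptionExtension_TransferFunction)
    let primaries = CMFormatDescriptionGetExtension(
        format, extensionKey: kCMFormatDescriptionExtension_ColorPrimaries)

    // Expected on these clips: ITU_R_2100_HLG / ITU_R_2020 (HLG, BT.2020).
    print("transfer function:", transfer ?? "unknown")
    print("color primaries:", primaries ?? "unknown")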

But since HLG is an industry-standard format, you can load the files as-is into any application that supports HDR video editing.

I won't go into the nitty-gritty details here, but if you use the built-in YouTube sharing options, the HDR information is dropped. The same goes for iMovie for iOS: even when the video appears in HDR on the editing screen, the HDR information is lost on YouTube and the tone map looks slightly off.

To upload an HDR video to YouTube as proper HDR, you need to export it in a format that preserves HDR, such as Apple ProRes, and upload that file to YouTube directly.
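On a Mac, one way to do this is to re-encode the clip with AVFoundation's standard ProRes 422 export preset. A rough Swift sketch, with hypothetical file paths:

    import AVFoundation

    // Sketch: re-encode the HLG clip to Apple ProRes so the HDR information
    // survives the upload to YouTube. File paths are hypothetical.
    let source = AVAsset(url: URL(fileURLWithPath: "IMG_0001.mov"))
    guard let export = AVAssetExportSession(
        asset: source,
        presetName: AVAssetExportPresetAppleProRes422LPCM
    ) else {
        fatalError("ProRes export preset unavailable on this platform")
    }

    export.outputURL = URL(fileURLWithPath: "IMG_0001_prores.mov")
    export.outputFileType = .mov
    export.exportAsynchronously {
        // In a command-line tool, keep the run loop alive until this fires.
        print("export finished:", export.status == .completed)
    }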

Image quality in the dark is good, and so is the microphone

For the video example, we had a model who says she is aiming to become a radio personality speak while we filmed. Although it was shot simply, handheld, the microphone impressed me with how it handled ambient noise and balanced the tone of her voice. The HDR effect is also evident in point light sources such as the lights in the background.

The video of DJing at a club holds up despite fairly low light. The colors of the lighting come through beautifully in this HDR video; convert it to SDR and it falls apart in no time.

An SDR version is also shown separately, but if your device doesn't support HDR playback the HDR clip won't look right, so check it on an HDR-capable TV, or an HDR-capable smartphone, tablet or computer.

Extra: one last remaining mystery

One thing that was never made clear during my reporting, and that I'm personally curious about, is that still images may also be recorded in HDR.

The iPhone 12 and 12 Pro's display boosts brightness only when showing HDR content, and during HDR video playback you can see a momentary delay before the high-brightness mode kicks in, so you'll notice it right away once you have the device in hand.

However, this same momentarily delayed boost of bright areas can also be seen when displaying still images. I'm not sure whether it's simply there to make photos look better, or whether the photos are actually recorded in HDR.

Incidentally, files transferred via AirDrop are perfectly ordinary HEIF files, not explicitly HDR (HEIF is HDR10-compatible and can store HDR photos).

The Photos app shows no "HDR" indication either (HDR videos are labeled HDR there), so I don't think these are HDR photos; but partly because of the differences in picture rendering mentioned above, I can't explain why the display behaves this way.

Gallery: iPhone 12 Pro sample shots | 22 photos (Image Credit: Engadget Japan)

Related Articles:

iPhone 12 and 12 Pro hands-on review: the first 5G iPhones will change our lives dramatically

The first thing you should try when you get your iPhone 12 or 12 Pro

Feeling the difference between the iPhone 12 and 12 Pro. Which to choose: price or potential


Cooperation for portrait photography

Seiko Hashimoto: Twitter / Instagram / YouTube

Cooperation for DJ booth filming

VRAIMENT: Tatsuya Shibasaki


This article is based on an article from the Japanese edition of Engadget and was created using the translation tool DeepL. The Japanese edition of Engadget does not guarantee the accuracy or reliability of this article.