YouTuber Marques Brownlee, also known as MKBHD, shared the results of his 2022 Smartphone Awards last month. Although the iPhone 14 Pro won the Best Camera System category, he pointed out some flaws in photos taken with Apple’s latest smartphone. Now MKBHD is back with a video detailing why some iPhone photos are getting worse. The answer: post-processing.
Before announcing the 2022 Smartphone Awards results, MKBHD also shared the results of his blind camera test, in which Google’s Pixel 6A took first place and the Pixel 7 Pro came in second. That led the YouTuber, and many other people, to wonder what’s going on with iPhone photos.
Image post-processing is becoming exaggerated
To take a good picture, you need a good sensor capable of capturing as much light and detail as possible. However, since the camera sensors in smartphones are tiny compared to those in DSLRs, phone manufacturers introduce new tricks every year to improve their images with post-processing.
Pretty much any modern smartphone uses a combination of hardware and software to adjust images after they’ve been taken in an attempt to make them look better and compensate for the lack of a large sensor. This includes things like reducing the noise level, adjusting the white balance, and increasing the brightness to show more detail in dark scenes.
But in recent years, Apple and other companies have taken this to the next level. On the iPhone, Smart HDR captures multiple frames at different exposures and merges them into one photo, letting the phone pick the best parts of each frame. But when there’s too much post-processing going on, the resulting images can look unrealistic, and that’s what has been happening with the iPhone camera.
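To make that idea more concrete, here’s a deliberately simplified sketch of how multi-exposure fusion can work for a single pixel. This is not Apple’s actual Smart HDR pipeline; the weighting curve, the sigma value, and the sample values are assumptions chosen purely for illustration.

```swift
import Foundation

// Toy illustration only (not Apple's Smart HDR): fuse several exposures of the
// same pixel by weighting each sample by how close it sits to mid-gray,
// i.e. how "well exposed" it is. Values are normalized luminance in 0...1.
func fuseExposures(_ exposures: [Double]) -> Double {
    let sigma = 0.2 // hypothetical spread for the weighting curve
    let weights = exposures.map { sample in
        exp(-pow(sample - 0.5, 2.0) / (2.0 * sigma * sigma))
    }
    let totalWeight = weights.reduce(0, +)
    guard totalWeight > 0 else { return exposures.first ?? 0 }
    // Weighted average: dark frames contribute in the highlights,
    // bright frames contribute in the shadows.
    return zip(exposures, weights).map { $0 * $1 }.reduce(0, +) / totalWeight
}

// The same pixel captured in an underexposed, normal, and overexposed frame.
print(fuseExposures([0.12, 0.48, 0.95])) // lands near the well-exposed middle frame
```

The real pipeline is far more sophisticated and makes these decisions per region rather than per pixel, which is exactly where the judgment calls MKBHD complains about come in.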
As MKBHD points out, most phones handle favorable scenarios well, such as a clear sky or a subject in front of a plain background. But when a scene mixes different colors and textures, the post-processing has to be smart enough to choose the right treatment for each of those elements.
The thing is, while companies like Google handle this well, Apple definitely doesn’t. As the YouTuber shows, the iPhone 14 Pro consistently tries to lift the shadows, especially on people’s faces, which makes photos look artificial. The iPhone also oversharpens photos compared to other smartphones, and MKBHD complains that his skin tone looks noticeably different on the iPhone camera.
Apple is ruining the iPhone camera with all these smart features
Even though the iPhone has great camera hardware, it’s being undermined by smart features like Smart HDR that Apple has introduced in recent years. Every year, the company adds even more steps to the camera’s post-processing, but instead of making photos better, they often just make them look more unnatural.
Sebastiaan de With, developer of the popular camera app Halide, also pointed out multiple flaws in Smart HDR in his iPhone 14 Pro camera review. For example, whenever there’s a very bright background, the iPhone tries to boost the brightness of the people in the photo as well, making them look unnaturally bright. “I have honestly never seen it make for a better photo. The result is simply jarring,” he said.
This effect is part of Apple’s Smart HDR, which ‘segments’ human subjects in photos and boosts their brightness significantly when backlit post-capture.
We’ve illustrated the subject detection and a likely ‘how it looked’ to the camera:
(This does not occur when capturing RAW) https://t.co/5APCtqKu7t pic.twitter.com/nKjaYQgVnc
— Halide (@halidecamera) September 20, 2022
In another example, the iPhone camera introduces a lot of “bizarre artifacts” in selfies taken in very low light while trying to save the image, which ends up producing an “absurd watercolor-like mess” instead of a regular dark photo with a lot of noise.
Personally, I’ve also noticed Smart HDR ruining some of my photos, which come out too sharp and with exaggerated colors. Many iPhone users on Reddit seem to agree.
iOS feature request: An option to turn off Smart HDR. Sometimes it just ruins the photos (in this case, it destroyed the sky compared to the Live Photo without the same processing). pic.twitter.com/Zb4cPS6qO4
— Filipe Espósito (@filipeesposito) October 5, 2022
Apple should give users the option to take natural photos
For years, iPhone users made fun of other smartphones because their photos looked too artificial. Now we’ve reached the point where iPhone photos themselves look unnatural. While I hope the company improves Smart HDR, I’d prefer an option to reduce or completely turn off image post-processing in the iPhone camera.
You can, of course, take a RAW photo using apps like Halide (it’s worth noting that ProRAW photos are still post-processed), but then you’ll have a much larger image file just to get a more natural result.
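For developers curious about how apps like Halide get at less-processed output, here’s a minimal sketch of opting into Apple ProRAW with AVFoundation. It assumes an already configured AVCaptureSession and photo output, and it omits the capture delegate and error handling; treat it as an outline rather than a drop-in implementation.

```swift
import AVFoundation

// Minimal sketch: ask AVCapturePhotoOutput for Apple ProRAW settings so the
// capture keeps sensor data in a DNG instead of a fully processed HEIC/JPEG.
// Assumes the session is already configured; the capture delegate is omitted.
@available(iOS 14.3, *)
func makeProRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // ProRAW requires supported hardware (iPhone 12 Pro and later).
    guard photoOutput.isAppleProRAWSupported else { return nil }

    // In a real app this is typically enabled while configuring the session.
    photoOutput.isAppleProRAWEnabled = true

    // Pick one of the ProRAW pixel formats the output advertises.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
    }) else { return nil }

    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```

A non-ProRAW Bayer RAW format (one that fails the isAppleProRAWPixelFormat check) skips even more of the pipeline, which is closer to what the Halide tweet above means by “capturing RAW.”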
What about you? Has the iPhone camera’s exaggerated post-processing been ruining your photos too? Let us know in the comments section below.