YouTuber Marques Brownlee, also known as MKBHD, shared the results of his 2022 Smartphone Awards last month. And although the iPhone 14 Pro won in the category of best camera system, the YouTuber pointed out some shortcomings regarding the photos taken with Apple’s latest smartphone. Now, MKBHD is back with a video in which he explains why some iPhone photos get worse—and the answer is: post-processing.
Ahead of the 2022 Smartphone Awards results, MKBHD also shared the results of his blind camera test. In that one, Google’s Pixel 6A took the top spot, while the Pixel 7 Pro came in second. This left the YouTuber and many others wondering what’s going wrong with photos taken with the iPhone.
Image post-processing becomes exaggerated
To take a good photo, it is important to have a good sensor capable of capturing as much light and detail as possible. However, since the camera sensors in smartphones are very small compared to DSLRs, every year phone manufacturers introduce new tricks to improve those images with post-processing.
Almost every modern smartphone uses a combination of hardware and software to adjust images after they’ve been taken in an attempt to make them look better and compensate for the lack of a large sensor. This includes things like reducing the noise level, adjusting the white balance, and increasing the brightness to show more detail in dark scenes.
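To make these steps concrete, here’s a minimal sketch of such a pipeline in Python with NumPy. This is purely illustrative and not Apple’s actual algorithms: the denoise, white-balance, and shadow-lift methods are simple textbook stand-ins for the far more sophisticated processing phones actually run.

```python
import numpy as np

def postprocess(img):
    """Toy post-processing pipeline (illustrative only, NOT Apple's
    actual algorithms): denoise, gray-world white balance, shadow lift.
    `img` is an H x W x 3 float array with values in [0, 1]."""
    # 1. Noise reduction: a simple 3x3 box blur on each channel.
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    denoised = sum(
        padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    # 2. White balance (gray-world assumption): scale each channel so
    #    its mean matches the overall mean, neutralizing color casts.
    means = denoised.reshape(-1, 3).mean(axis=0)
    balanced = np.clip(denoised * (means.mean() / means), 0.0, 1.0)
    # 3. Brightness: gamma < 1 lifts shadows to reveal dark detail.
    return balanced ** 0.7

# A flat dark-gray image comes out visibly brighter after processing.
dark = np.full((4, 4, 3), 0.1)
out = postprocess(dark)
```

Each stage here is deliberately crude; the point is only that several independent adjustments are stacked on top of the raw sensor data, and each one moves the result further from what the sensor actually saw.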
But in recent years, Apple and other companies have taken this to the next level. On the iPhone, Smart HDR combines multiple photos with different settings into one. This allows the phone to select the best aspects of each to produce a better photo. But when a lot of post-processing is done, these images can look unrealistic. And that’s what’s happening with the iPhone’s camera.
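The multi-frame merge idea behind features like Smart HDR can be sketched as a simple exposure fusion: weight each pixel in each bracketed frame by how well-exposed it is, then blend. This is a crude, assumed stand-in written in Python/NumPy; Apple’s real pipeline is proprietary and far more complex (semantic segmentation, tone mapping, and more).

```python
import numpy as np

def fuse_exposures(frames):
    """Merge differently exposed frames of the same scene, favoring
    well-exposed pixels in each frame. A toy stand-in for the kind of
    multi-frame merge Smart HDR performs (the real details are Apple's).
    `frames` is a list of H x W x 3 float arrays with values in [0, 1]."""
    stack = np.stack(frames)  # shape: (N, H, W, 3)
    # Weight each pixel by closeness to mid-gray (0.5): blown highlights
    # and crushed shadows both get low weight.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)

# An underexposed and an overexposed frame blend into a mid-toned result.
under = np.full((2, 2, 3), 0.05)
over = np.full((2, 2, 3), 0.95)
fused = fuse_exposures([under, over])
```

Even this toy version shows where things can go wrong: the blend is decided per pixel by a heuristic, so a scene with mixed colors and textures can end up with choices no human would make.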
As MKBHD pointed out, most phones do well in favorable scenarios, such as a clear sky or an object against a clear background. But when you have different colors and textures in the same scene, post-processing needs to be smart enough to figure out what will be the best setting for all those elements.
But the thing is, while companies like Google are doing it the right way, Apple definitely isn’t. As the YouTuber points out, the iPhone 14 Pro always tries to lighten the shadows, especially on people’s faces, making the photo look very artificial. The iPhone also exaggerates the sharpness of photos compared to other smartphones. MKBHD even complains that his skin tone looks quite different on the iPhone camera.
Apple ruins the iPhone camera with all these smart features
Even if the iPhone has great camera hardware, it’s ruined by all the smart features like Smart HDR that Apple has been introducing in recent years. Every year, the company adds even more steps to camera post-processing. But instead of making the photos better, they just make them look more unnatural.
In his iPhone 14 Pro camera review, Sebastiaan de With, developer of the popular Halide camera app, also pointed out a number of flaws in Smart HDR. For example, whenever there is a very bright background, the iPhone also increases the brightness of the people in the photo, making them look washed out. “Honestly, I’ve never seen it take a better picture. The result is simply staggering,” he said.
This effect is part of Apple’s Smart HDR, which “segments” human subjects in photos and significantly increases their brightness when backlit after capture.
We’ve illustrated the object recognition and what it likely “looked like” to the camera:
(This doesn’t happen when shooting RAW) pic.twitter.com/nKjaYQgVnc
— Halide (@halidecamera) September 20, 2022
In another example, the iPhone’s camera introduces a lot of “weird artifacts” into selfies taken in very low light while trying to salvage the image, ending up with an “absurd watercolor-like mess” instead of a normally dark photo with a lot of noise.
Personally, I also notice how Smart HDR ruins some of my photos, which often come out too sharp and with exaggerated colors. On Reddit, many iPhone users seem to agree.
iOS Feature Request: Option to turn off Smart HDR. Sometimes it just ruins the photos (in this case it destroys the sky compared to Live Photo without the same processing). pic.twitter.com/Zb4cPS6qO4
— Filipe Esposito (@filipeesposito) October 5, 2022
Apple should give users the ability to take natural photos
For years, iPhone users scoffed at other smartphones because their photos looked too artificial. We have now reached the point where iPhone photos look very unnatural. While I hope the company improves Smart HDR, I would prefer an option to reduce or completely turn off image post-processing in the iPhone camera.
You can of course take a RAW photo using apps like Halide (though it should be noted that ProRAW photos are still processed), but then you’ll end up with a much larger image file just to get a more natural result.
What about you? Is the excessive post-processing of the iPhone camera ruining your photos too? Let us know in the comments section below.
FTC: We use automatic affiliate links that earn revenue.
Check out 9to5Mac on YouTube for more Apple news: