Mobile phones are taking steps in the wrong direction
Mobile photography is at a crossroads. Most users are clear about what they want: photos bursting with brightness, color and contrast, even if they are not realistic. That vibrant look is ideal both for social media and for the memories we keep on our phones, with no editing required.
But this philosophy clashes with the “Pro” and “Ultra” branding. What we see so much of today are cameras that shoot RAW, record in LOG, offer pro modes and multiple lenses, all trying to convince us that we no longer need a professional camera.
Recently, for work reasons, I upgraded my old Canon 77D to a new photo and video camera, one that ships with artificial intelligence, noise-reduction algorithms, RAW shooting, Dual Pixel technology… many of the features that mobile photography enthusiasts usually criticize for the lack of realism they end up producing. My conclusion was clear: software can be used to improve results without destroying the photograph.
I want to make one thing clear: I do not and will not expect a phone to perform like a professional camera. But I do think it needs saying that if phones want to market themselves as “Professional” photographic tools, they should not head in the opposite direction to professional cameras.
The problem with modern mobile photography

For some reason, it seems almost impossible to get a realistic, good-quality photo out of a phone. Or rather, that is what manufacturers want us to believe, because from a technical standpoint it simply is not true. One of the biggest problems today is the watercolor effect.
I can count on one hand the modern phones that do not turn into a mess when you zoom in. It is not that the sensor or the lenses are bad, or that they lack the power to deliver a good level of detail. It is, plain and simple, bad processing. In the quest to eliminate every trace of noise and boost contrast, the photographs end up looking painted over.
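To make that trade-off concrete, here is a minimal sketch in Python with OpenCV (the input file name is hypothetical): the same non-local-means denoiser is run at a mild and at an aggressive strength, and it is the aggressive pass that smears fine texture into that painted, watercolor look.

```python
# Minimal sketch, assuming OpenCV (opencv-python) and a hypothetical
# "night_crop.jpg" containing a 100% crop of a noisy shot.
import cv2

crop = cv2.imread("night_crop.jpg")

# Mild pass: removes most chroma noise while fine texture largely survives.
mild = cv2.fastNlMeansDenoisingColored(crop, None, 3, 3, 7, 21)

# Aggressive pass: no visible noise left, but edges and texture are
# flattened into the "watercolor" look described above.
aggressive = cv2.fastNlMeansDenoisingColored(crop, None, 25, 25, 7, 21)

cv2.imwrite("crop_mild.jpg", mild)
cv2.imwrite("crop_aggressive.jpg", aggressive)
```

Comparing the two outputs side by side shows that the problem is not noise reduction itself, but how hard it is pushed.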
This would not matter so much if third-party camera apps did not make it obvious how much damage manufacturers do during processing. We could have far more natural-looking photos, but we never get them.
Skin tones in colors that do not exist, unnatural yellowish casts, overly pronounced outlines. These are problems that cut across every price range and that we have been complaining about for generations of phones. Yes, phones take great photos, but we could get results very close to those of professional cameras if the processing were not so aggressive.
How modern cameras solve this problem

You might think that with so much artificial intelligence, so much image segmentation and so much software involved, mobile photography is doomed to look artificial. That is not true at all, and modern dedicated cameras are the proof.
The first myth to dispel is that noise reduction is inherently a problem.
Dedicated cameras, both entry-level and semi-pro, are a great example of how to solve this. It is something they have been doing for generations, and thanks to newer processors it keeps getting better. Here is a 100% crop: there is no noise, yet not the slightest trace of the watercolor effect.
“Sure, with a dedicated camera anyone can do that!” Fair enough: let’s try the same thing with one of the weakest cameras in a current flagship, the ultra-wide on the iPhone 16 Pro.
Beyond the brightness, look at the processing. The edges in the image on the right are artificially over-sharpened. With the first image, simply raising the exposure a little would have produced a more than usable photo. The problem? To achieve the extreme brightness modern phone cameras aim for (we all want HDR), they do not take one photo, they take many, and the final result is a blend of all of them. Even so, the processing problems on phones could be solved with a very simple rule: process less.
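As a rough illustration of what “many photos blended into one” means, here is a minimal sketch in Python with OpenCV (the frame names are hypothetical): Mertens exposure fusion merges a bracketed burst into a single bright image, a simplified stand-in for the proprietary multi-frame pipelines phones actually run.

```python
# Minimal sketch, assuming OpenCV and three hypothetical bracketed frames
# (underexposed, normal, overexposed) of the same scene.
import cv2

frames = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion: blend the frames into one well-exposed result.
fused = cv2.createMergeMertens().process(frames)  # float image in [0, 1]
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

The blending itself is not what ruins the photo; it is the extra sharpening and contrast layered on top of the merged frame.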
Another area where dedicated cameras do better than phones is how they apply artificial intelligence. Instead of focusing on altering the final photo, their AI processors are used to track subjects, to recognize the human eye for more accurate focusing (something Sony’s phones inherited from the Alpha division, and that Xiaomi also does), or simply to improve scene recognition.
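As a toy example of “detection for focusing rather than for editing”, here is a minimal sketch in Python with OpenCV; the file name is hypothetical, and a classic Haar cascade stands in for the neural eye detectors real cameras and phones use. The point is that the detector only produces coordinates an autofocus system could act on, while the image pixels are left untouched.

```python
# Minimal sketch, assuming OpenCV and a hypothetical "portrait.jpg".
# A Haar cascade is used here only as a simple stand-in for neural
# eye detection: it reports where eyes are, it does not alter the photo.
import cv2

image = cv2.imread("portrait.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in eyes:
    cx, cy = x + w // 2, y + h // 2
    print(f"Autofocus candidate at ({cx}, {cy})")  # coordinates for AF, not edits
```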
In other words, they use AI to improve what already works rather than to produce increasingly unrealistic photographs. The cameras in our phones have been stagnant for several years now, and without a change in processing philosophy it will only get harder to keep moving forward.
Image | Canon and Xataka
In Xataka | How to Take Better Photos on a Cheap Cell Phone: Seven Tricks That Compensate for Tech Limitations