Best of Android 2017: Which camera IS the best?
One of the most important parts of any smartphone is its camera, but there are two ways to judge one: by whether it is technically accurate, or by whether its photos simply look good. The most accurate smartphone cameras often aren't the ones that produce the best-looking pictures, so how do you decide which is the best smartphone camera?
For Best of Android 2017, we introduced an all-new method of testing smartphone cameras objectively, but we also wanted to see which smartphone camera looks the best. To do so, we split our camera coverage into two parts. Below, we'll dig into which camera is technically the best, based on all of our data. If you're interested in which one looks the best, check out our 10 phone camera shootout post and vote in our poll.
What we tested
Given that modern photography is digital, objectively assessing image quality should be pretty straightforward, right? Wrong.
As recent controversies with scoring objective image data have highlighted, it’s very tough to come up with scores that mean something to the average consumer. To non-enthusiasts, taking a dive into test results can be boring and stressful, and nobody wants that. While the data we collected is much more comprehensive and complicated than what we’re showing here, we picked a few different basic measurements to compare the cameras of our candidate smartphones. Don’t worry, we’ll guide you through what we found without making it any more complicated. No scores out of ten, no hiding measurements behind skewed graphs, just the data and expert analysis from yours truly.
Bear in mind that photography is also an art form; what looks great often isn't objectively great. For example, Instagram filters and plenty of Lightroom presets deliberately add the imperfections and characteristics of "bad" cameras for artistic reasons. To most people, "perfectly" processed photos will look drab, a bit soft, and somewhat lifeless. I went easy on phones where a result suffered due to either a shortcoming of smartphone cameras as a whole, or a limit of human perception.
While we could rip into the hundreds of pages of esoteric results, we don’t really need to go beyond camera sharpness, color performance, noise performance, and video performance.
How we tested
Testing a camera unit objectively means removing as many variables as possible. In short: we had to build a lab specifically for this purpose. If you'd like to know more about that process, you can delve into all the nerdy details here.
With our perfectly blacked-out testing lab built, we then needed the right equipment. For this, we partnered with imaging specialists Imatest in Boulder, CO. I've used their systems in the past for other outlets, and their off-the-shelf solution gives users a time-tested way to get rock-solid objective camera test results. We want to be as accurate as possible, so instead of banging our heads against the wall writing our own wrapper for a MATLAB analysis, we got the right software for the job.
Our data is collected from only a handful of shots of test charts. Here’s a quick rundown:
1. The X-Rite ColorChecker is a 24-patch chart containing a 6-patch greyscale and an 18-patch color range. From this chart, we can measure color error (ΔC 00, saturation corrected), color saturation, white balance, shot noise, and more (see the color-error sketch after this list).
2. The SFRPlus chart is a multi-region slanted-edge resolution chart, capable of revealing all sorts of fun performance data. This is how we test the sharpness capabilities of our cameras, but it also allows us to quantify distortion, lens defects, chromatic aberration, and more (see the simplified sharpness sketch after this list). We store all of this data, though we're only covering sharpness here. If it's warranted later on, we can dredge up our other findings.
3. The DSCLabs Megatrumpet 12 is a chart designed to test the video sharpness capabilities of any 4K-capable image sensor. When the camera pans during recording, the incredibly fine lines disappear, leaving only a blotchy grey area. This gives us a fairly reliable quantification of how much detail a given camera can resolve, in a unit called line pairs per picture height (LP/PH); for reference, a 2160-pixel-tall 4K frame can't exceed 1080 LP/PH.
4. A randomly generated spilled-coins chart, created with Imatest's chart generator, lets us expose the weaknesses of the noise reduction algorithms present on all consumer cameras. Ever notice how photos you take in low light look blotchy and strange? That's the noise reduction feature getting confused about what's noise and what's detail. With lots of hard, round edges and bright colors, this chart shows how a camera is likely to remove detail in the name of noise reduction (a toy illustration follows below).
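To make the color-error number a little more concrete, here's a minimal sketch of the kind of calculation involved, written in Python. The patch values are placeholders rather than anything from our tests, and while Imatest reports a saturation-corrected ΔC 00, the example uses the closely related CIEDE2000 ΔE via scikit-image to keep things simple.

```python
# A minimal sketch of chart-based color-error measurement, assuming you have
# already located and averaged the color patches of a ColorChecker in a photo.
# The measured sRGB values and reference Lab values below are illustrative
# placeholders, not data from our tests.
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

# Hypothetical measured sRGB patch averages (0-1 range) for the first three
# ColorChecker patches, sampled from a test shot.
measured_rgb = np.array([
    [0.45, 0.32, 0.27],   # dark skin
    [0.77, 0.58, 0.50],   # light skin
    [0.36, 0.48, 0.61],   # blue sky
])

# Published reference values for the same patches, in CIELAB (illustrative).
reference_lab = np.array([
    [37.99,  13.56,  14.06],
    [65.71,  18.13,  17.81],
    [49.93,  -4.88, -21.93],
])

# Convert the camera's sRGB output to Lab so both sides share a color space.
measured_lab = rgb2lab(measured_rgb.reshape(1, -1, 3)).reshape(-1, 3)

# Per-patch CIEDE2000 color error; the mean is a rough "color accuracy" figure.
delta_e = deltaE_ciede2000(reference_lab, measured_lab)
print("Per-patch dE00:", np.round(delta_e, 2))
print("Mean dE00:", round(float(delta_e.mean()), 2))
```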
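Sharpness testing with SFRPlus boils down to measuring how abruptly a slanted black-to-white edge transitions in the captured image, usually summarized as MTF50. The sketch below is a heavily simplified version: it skips the sub-pixel edge projection that makes the real ISO 12233 slanted-edge method robust, and the synthetic edge exists only so the snippet runs on its own.

```python
# A stripped-down sketch of the slanted-edge idea behind SFR/MTF testing,
# assuming `edge_crop` is a grayscale crop (float, 0-1) containing a single
# near-vertical dark-to-light edge. Real tools build a 4x-oversampled edge
# profile along the edge angle; this version simply averages rows, so treat
# the numbers as illustrative only.
import numpy as np

def mtf50_from_edge(edge_crop: np.ndarray) -> float:
    """Return the approximate MTF50 frequency in cycles/pixel for an edge crop."""
    # Edge spread function: average the edge profile over all rows.
    esf = edge_crop.mean(axis=0)

    # Line spread function: derivative of the ESF, lightly windowed to tame
    # FFT leakage from the finite crop.
    lsf = np.gradient(esf)
    lsf *= np.hanning(lsf.size)

    # MTF: magnitude of the LSF's Fourier transform, normalized to 1 at DC.
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(lsf.size)  # cycles per pixel

    # MTF50: the first frequency where contrast falls to half.
    below = np.where(mtf < 0.5)[0]
    return float(freqs[below[0]]) if below.size else float(freqs[-1])

# Hypothetical usage with a synthetic soft edge:
x = np.linspace(-3, 3, 64)
synthetic_edge = np.tile(1 / (1 + np.exp(-4 * x)), (32, 1))  # sigmoid edge
print("MTF50 ~", round(mtf50_from_edge(synthetic_edge), 3), "cycles/pixel")
```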
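And to show why the spilled-coins style of testing matters, here's a toy demonstration of noise reduction destroying texture. Everything in it (the synthetic texture, the noise level, the non-local-means settings) is an arbitrary stand-in for a phone's actual processing pipeline.

```python
# A toy illustration of why noise reduction eats fine detail, in the spirit of
# the spilled-coins texture chart: synthesize a random "texture", add noise,
# denoise it, and compare how much high-frequency content survives.
import numpy as np
from skimage.filters import gaussian, laplace
from skimage.restoration import denoise_nl_means

rng = np.random.default_rng(0)

# Random blob texture: smoothed noise, rescaled to 0-1.
texture = gaussian(rng.random((256, 256)), sigma=2)
texture = (texture - texture.min()) / (texture.max() - texture.min())

# Simulate a dim-light capture with additive noise.
noisy = np.clip(texture + rng.normal(scale=0.08, size=texture.shape), 0, 1)

# Aggressive non-local-means denoising, standing in for a phone's NR pass.
denoised = denoise_nl_means(noisy, h=0.12, patch_size=5, patch_distance=6)

def detail_energy(img: np.ndarray) -> float:
    """Std-dev of the Laplacian: a crude proxy for fine texture detail."""
    return float(laplace(img).std())

print("detail (clean):   ", round(detail_energy(texture), 4))
print("detail (noisy):   ", round(detail_energy(noisy), 4))
print("detail (denoised):", round(detail_energy(denoised), 4))
# If the NR pass is aggressive enough, the denoised figure lands below the
# clean original, i.e. real texture was removed along with the noise.
```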
Results
After testing our candidate phones, it was striking how similar performance was across most cameras. It’s possible this is because a lot of the image sensors in mobile devices are manufactured by Sony, but it could also have a lot to do with the fact that there are very clear limitations to image sensors that small. Sure, image processing has come a long way—it’s incredible that these units can even take a picture, really—but many of the variations in performance seem to have a lot more to do with software than hardware.