DxOMark scores shouldn’t be your definitive camera rating system

Over the last few years, flagship devices have pushed the boundaries of what a mobile camera can offer. Every year sensors have improved, and the addition of elements like better image stabilization and dual cameras has brought new shooting possibilities to already robust offerings.

Keeping track of these upgrades, many of which are tough to appreciate without a trained eye, has gotten trickier every year. It has fallen to rating scores, provided primarily by DxOMark, to quantify these improvements.

Those keeping an eye on the scores may have noticed that every major flagship from the big manufacturers seems to set a new high score, mirroring the steady improvement in camera hardware. The iPhone 8 and Galaxy Note 8 topped the DxOMark charts not long ago, but have already been surpassed by the Google Pixel 2. If you’re in the market for the latest and greatest smartphone camera, consulting the rankings doesn’t immediately seem like a bad way to go.

DxOMark’s testing process is very robust. It tests a camera’s exposure, color, texture, noise, artifacts, zoom, and other attributes. Scores are given for each element tested, along with an overall score for each camera. It’s these overall scores that get shared by the likes of Google, Samsung, and Apple to show that their phone cameras are the newest and best options.

However, glossing over these finer details in favor of a single headline score creates a problem when picking out the best hardware. DxOMark’s weights for the final scores can be quite controversial, especially following the introduction of a new zoom score, which benefits some telephoto phones, and a more subjective bokeh test. After all, who’s to say how noise should be weighted against zoom when coming up with the overall score?

Despite its robust testing methodology, DxOMark's weightings for its final smartphone camera score can be quite controversial.

The new Pixel 2 scores 99 for photo quality, while the Galaxy Note 8 obtained a 100. However, once you throw in video scores, the Pixel 2 comes out on top overall with a 98 versus the Note 8’s 94. Most people are only going to pay attention to that headline number, yet the Note 8 appears to be the slightly better phone for still photography.
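
To make the weighting question concrete, here’s a minimal sketch in Python of how a blended headline score behaves. DxOMark doesn’t publish its exact formula, so the weights and the video sub-scores below are hypothetical; only the photo figures come from the published results. The point is simply that shifting the blend can flip which phone tops the chart without either camera changing at all.

```python
# A minimal sketch of how weighting alone can decide a headline score.
# The weights and video sub-scores are hypothetical, chosen for illustration;
# only the photo sub-scores (99 and 100) come from the published results.

def overall(photo: float, video: float, photo_weight: float) -> float:
    """Blend photo and video sub-scores into a single headline number."""
    return photo_weight * photo + (1.0 - photo_weight) * video

pixel_2 = {"photo": 99.0, "video": 96.0}   # video value assumed for illustration
note_8 = {"photo": 100.0, "video": 93.0}   # video value assumed for illustration

for w in (0.5, 0.8):
    p2 = overall(pixel_2["photo"], pixel_2["video"], w)
    n8 = overall(note_8["photo"], note_8["video"], w)
    leader = "Pixel 2" if p2 > n8 else "Note 8"
    print(f"photo weighted at {w:.0%}: Pixel 2 = {p2:.1f}, Note 8 = {n8:.1f} -> {leader} leads")
```

With an even split the Pixel 2 comes out ahead, but tilt the blend far enough toward stills and the Note 8 takes the lead, purely as a consequence of the chosen weights.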

Either way, the increasing reliance on one company’s score to judge camera quality is a little problematic for the industry, especially once you realize that DxOMark isn’t just in the business of ranking phone cameras.

What does DxOMark do?

DxO Labs, the company which runs the DxOMark testing suite, is primarily a consultancy company. In other words, the company charges fees to advise camera hardware companies on how to improve their products based on its own analysis and expertise in the camera industry.

No review site is guaranteed to be free from bias, but DxO’s business revolves around attracting big companies to make use of its expertise, which adds a lot of baggage to its reviews. Ranking test results in a way that encourages consumers to buy certain phones over others complicates matters further.

The company claims to run an independent test, but is that really possible when it offers for-profit consultancy, too? There’s no reason to believe DxOMark is in any way rigging results — after all, the company’s business model depends on its reputation, and its results tend to roughly fit the broader consensus on camera hardware. However, manufacturers that tune their cameras against the testing suite are likely to score higher than those that don’t. So does this diminish the value of the test itself?

The new Google Pixel 2 has nudged ahead of the Note 8 and iPhone 8 in DxOMark’s rankings, with the score announced on the day of Google’s unveiling. LG’s V30 hasn’t been rated.

The pay-to-win problem

In addition to consulting with other companies, DxO Labs also sells its DxO Analyzer solution for testing and measuring cameras. Obtaining a license to use the suite is expensive, especially once you factor in the costs of installation and the training needed to familiarize companies with its functions. There’s nothing wrong with this in principle, but one would assume that a company, say a smartphone manufacturer, that refines its camera hardware using DxO Analyzer will score highly when DxOMark comes to test the final product.

There is nothing inherently wrong with a company paying for a service that results in better cameras in its smartphones. Helping to create superior photography is in everyone’s interest. However, the media relies on DxOMark scores to judge camera quality, which gives the company a lot of influence over not just industry imaging quality, but also how consumers perceive smartphone products. Manufacturers that pay to work closely with DxOMark will likely score more highly in its tests, and those scores are then quoted by many other review sites. That creates pressure on smartphone OEMs to pay for DxO’s services simply to stay near the top of the pack.

The company proudly notes that “all top ten DSC manufacturers and all top smartphone and camera module makers are DxO Analyzer customers.”

DxO stakes its reputation on helping companies improve image quality and would have a lot to lose by fiddling its scores.

That appears to be the situation we’re looking at, as many of the biggest brands in the smartphone and professional camera markets are customers of DxO. HTC, Huawei, Samsung, and Foxconn are all DxO Analyzer customers, but we don’t know if other companies are also paying for consultancy. These companies look to be getting their money’s worth, with each new generation achieving a higher score than the last. But perhaps most importantly, can we be sure that these latest products really offer tangible improvements to consumers?

As we outlined at the beginning, the structure of the weighting used in DxOMark’s latest smartphone scoring system is highly debatable. Phones can score additional points for minor or niche use cases such as software bokeh, zoom, or video, while wide-angle, RAW, or monochrome capabilities aren’t considered in the final score. The scoring is problematic because some smartphone camera designs can rack up points more easily than others.

This leads us to perhaps the biggest issue of all with the industry’s reliance on DxOMark. If companies are shaping their camera development around these tests, DxOMark is thereby partly shaping the development trajectory of smartphone products. However, as the tests aren’t entirely comprehensive and weigh certain features ahead of others, we’re seeing more importance placed on features that consumers may not care so much about.

Closing thoughts

All of the above considered, we should definitely take DxOMark’s scores with a pinch of salt. A company working closely with smartphone manufacturers to improve picture quality is surely a good thing for consumers, and DxO clearly knows what it’s talking about when it comes to camera quality. However, it’s important to recognize the potential for bias from a company that needs to sell services to camera developers while also ranking the products of companies it works closely with against those it doesn’t. All the more so when the tests aren’t completely comprehensive or evenly weighted across all possible features.

Do the Pixel 2, Galaxy Note 8, and iPhone 8 all feature best-in-class cameras? Absolutely. Is DxOMark’s ranking system reflective of camera quality? Probably, depending on how the results are weighted. If some OEMs working closely with the test provider are benefiting from better scores, that’s not inherently bad if it’s producing better cameras. But if we want more transparent testing and results for smartphone camera quality, consumers, reviewers, and those in the industry should consult a wider range of sources.

