Opinion – DXOMark’s camera scoring makes ZERO sense!

The RED Helium 8K (approx. APS-H size sensor) just leapt to the top of the DXOMark charts with a score of 108.

DXOMark is a lab test that gauges the RAW stills performance of camera sensors.

Personally, I find DXOMark’s testing methodology a bit strange.

First of all, they mix cinema cameras and stills cameras on the same chart, yet make no mention of any non-RED cinema camera.

Why not test other RAW-shooting cinema cameras, such as those from Blackmagic? Where are the Sony F65, Arri Alexa and Canon C500?

It’s well known RED is interested in courting the high-end stills and medium format market, and it strikes me as odd that they are the only cinema camera maker to appear on the best known stills benchmark chart.

Then there’s the ranking scores themselves. I decided to take a look at the overall chart to see who and what ranks where.

In the top 50 (view the full chart at DXOMark), the Samsung NX500 is placed higher than the full frame Canon 5DS, the Phase One P40+ medium format digital back and the Sony A7S II. What the hell!?

  1. Sony A7R II (98)
  2. Nikon D810 (97)
  3. Sony RX1R II (97)
  4. Pentax K1 (96)
  5. Nikon D800E (96)
  6. Sony A7R (95)
  7. Nikon D800 (95)
  8. Nikon D600 (94) – anomaly 3
  9. Nikon D610 (94)
  10. Nikon D750 (93)
  11. Sony RX1 (93)
  12. Sony A99 II (92)
  13. Phase One IQ180 (91)
  14. Sony RX1R (91)
  15. Canon 5D Mark IV (91)
  16. Sony A7 (90)
  17. Sony A7 II (90)
  18. Nikon Df (89)
  19. Nikon D4 (89)
  20. Phase One P65 Plus (89)
  21. Nikon D4s (89)
  22. Sony A99 (89)
  23. Leica SL (88)
  24. Canon 1D X Mark II (88)
  25. Nikon D3X (88) – anomaly 2
  26. Nikon D5 (88)
  27. Sony A7S (87)
  28. Nikon D7200 (87)
  29. Samsung NX500 (87) – anomaly 1a
  30. Canon 5DS (87)
  31. Phase One P40 Plus (87)
  32. Canon 5DS R (86)
  33. Nikon D3400 (86)
  34. Sony A7S II (85)
  35. Sony A6300 (85)
  36. Leica Q (85)
  37. Sony A6500 (85)
  38. DxO ONE SuperRAW Plus (85) – anomaly 4
  39. Nikon D5500 (84)
  40. Leica M Typ 240 (84)
  41. Nikon D500 (84)
  42. Nikon D5200 (84)
  43. Nikon D7100 (83)
  44. Nikon D5300 (83)
  45. Samsung NX1 (83) – anomaly 1b
  46. Pentax 645D (82)
  47. Nikon D3s (82)
  48. Pentax K-5 IIs (82)
  49. Sony A6000 (82)
  50. Sony A77 II (82)

Dig further into the top 50 and you will find quite a few anomalies.

Whilst the cheap Samsung NX500 sits at 29, the higher-end Samsung NX1 is way down at 45 despite having exactly the same sensor.

Apparently, the Nikon D3400 produces as good an image as the Canon 5DS R, as they share exactly the same score (86).

The DxO ONE add-on for smartphones ranks just 1 point below the Canon 5DS R and above the Leica M Typ 240. Really!?

DXOMark have somehow even managed to have the 8-year-old Nikon D3X end up above the current flagship D5.

I am also confused as to how the Nikon D600 ends up at position 8 in the chart, a full 16 places ahead of the Canon 1D X Mark II.

In my opinion, these scores are clearly heavily flawed if taken at face value.

Then there’s the sub-scoring. Sort the chart by the Sports score and DXOMark will claim the Sony A7S is your top sports camera. Obviously you wouldn’t use the A7S for sports: the raw buffer isn’t big enough, the burst rate isn’t fast enough and neither is the autofocus. Far too many people attach great significance to DXOMark and to minor differences in sensor performance.

The Sports score on DXOMark is actually a measure of low light performance, but not of the amount of noise at a per-pixel level; the measurement is taken after downsampling the RAW stills to a lower resolution. How accurate is this? Well, it certainly ends up with the A7R II on a score of 3434, well ahead of the latest low-light optimised A7S II on 2993. We all know the A7S II has less noise than the old model too, yet the original A7S is on 3702! And is the 12MP Nikon D3s / D700 sensor from 2009 still competitive today versus the latest sensors for low light? Nope. In my experience it is not even better than a 5D Mark III at high ISOs!
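To see why a downsampled measurement flatters high-megapixel sensors, here is a minimal Python sketch (my own illustration of the principle, not DXOMark’s actual pipeline): averaging pixels together during downsampling suppresses random noise, so a noisy high-resolution file can measure “cleaner” than a lower-resolution one once both are normalised to the same output size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated flat grey patch: constant signal plus Gaussian noise,
# standing in for a RAW frame from a high-megapixel sensor.
signal, noise_sigma = 100.0, 10.0
native = rng.normal(signal, noise_sigma, size=(2000, 2000))

# Downsample by averaging 2x2 blocks of pixels. Averaging n samples
# cuts random noise by sqrt(n), so 2x2 binning roughly halves it.
binned = native.reshape(1000, 2, 1000, 2).mean(axis=(1, 3))

snr_native = signal / native.std()
snr_binned = signal / binned.std()
print(f"per-pixel SNR:         {snr_native:.1f}")
print(f"SNR after 2x2 binning: {snr_binned:.1f}")
```

The signal-to-noise ratio roughly doubles, which is why normalising everything to one print size rewards sheer resolution as much as genuine per-pixel cleanliness.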

As for dynamic range, RED officially claimed almost 17 stops for the Helium sensor. For your fifty nine thousand dollars, DXOMark claims the Helium has only a 1.5 stop dynamic range advantage over the $3000 Sony A7R II. That didn’t stop the internet going wild over its chart-topping score though.
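The back-of-envelope maths is easy, since each stop is a doubling. The 13.9 EV figure below is DXOMark’s published landscape score for the A7R II as I recall it, so treat it as approximate:

```python
# A stop of dynamic range is a doubling of usable signal range,
# so differences in stops map to linear ratios as 2**stops.
red_claim_stops = 17.0       # RED's official claim for the Helium
dxo_advantage_stops = 1.5    # DXOMark's measured edge over the A7R II
a7r2_stops = 13.9            # A7R II landscape score in EV (approximate)

implied_helium = a7r2_stops + dxo_advantage_stops
shortfall = red_claim_stops - implied_helium
print(f"implied Helium DR: {implied_helium:.1f} EV")
print(f"shortfall vs claim: {shortfall:.1f} stops (~{2**shortfall:.1f}x linear)")
```

On those numbers the Helium measures around 15.4 EV, roughly a stop and a half and a ~3x linear range short of the marketing claim.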

DXOMark claims to measure RAW sensor performance, but some sensors process the RAW data in special ways. The Arri Alexa, for example, has a dual gain architecture that boosts dynamic range by reading out two signals per pixel, one at low gain and one at high. Does this count as RAW sensor performance or image processing? Similarly, if RED’s low light score has been achieved with LSI processing, does it count as sensor performance or image processing performance?
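To make the dual gain idea concrete, here is a toy model in Python. This is my own sketch of the general technique, not Arri’s actual implementation, which merges the two signals on-sensor and is far more sophisticated:

```python
import numpy as np

def merge_dual_gain(low_gain, high_gain, clip_level, gain_ratio):
    """Merge two readouts of the same pixels: use the cleaner high-gain
    sample where it hasn't clipped, fall back to low gain in highlights.
    A toy model of a dual gain architecture, not Arri's real pipeline."""
    scaled_high = high_gain / gain_ratio   # bring both gains onto one scale
    use_high = high_gain < clip_level      # high-gain sample still valid?
    return np.where(use_high, scaled_high, low_gain)

# Hypothetical scene values spanning more range than one readout holds.
scene = np.array([10.0, 100.0, 1000.0, 4000.0])
gain_ratio = 4.0
clip = 4095.0                                # 12-bit ADC full scale
low = np.clip(scene, 0, clip)                # keeps the highlights
high = np.clip(scene * gain_ratio, 0, clip)  # cleaner shadows, clips sooner
merged = merge_dual_gain(low, high, clip, gain_ratio)
print(merged)
```

The merged signal recovers the full scene range: shadows come from the low-noise high-gain path, highlights from the low-gain path. Whether you call that “sensor performance” or “processing” is exactly the question.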

The pixel pitch on the Helium 8K sensor is approximately 3µm, which compares to 6µm on most full frame cameras and 8.4µm on the Sony A7S… yet the low light score given to the RED camera on DXOMark is astronomical: the highest ever recorded, at 4210. The Panasonic GH4 has a larger pixel pitch of 3.6µm but scores just 791 on the DXOMark chart for low light. How did DXOMark and RED achieve that Helium score with such densely packed pixels on a crop sensor?
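Since photon capture scales roughly with pixel area, squaring the pitch ratios gives a crude per-pixel light-gathering comparison. This sketch ignores microlenses, fill factor, quantum efficiency and readout noise, so it is only a first-order sanity check:

```python
# Per-pixel light gathering scales roughly with pixel area (pitch squared).
pitches_um = {
    "RED Helium 8K": 3.0,
    "typical full frame": 6.0,
    "Sony A7S": 8.4,
    "Panasonic GH4": 3.6,
}

helium_area = pitches_um["RED Helium 8K"] ** 2
for name, pitch in pitches_um.items():
    ratio = pitch ** 2 / helium_area
    print(f"{name}: {pitch}µm pitch, {ratio:.2f}x the light per pixel of the Helium")
```

Each A7S pixel collects nearly 8x the light of a Helium pixel, and even the GH4’s pixels are over 40% larger by area, which is what makes the Helium’s record low light score so surprising.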


Finally, when it comes to colour, nothing really stands out about Canon or the medium format backs in the chart, with Sony continually ranked higher for portraits. Yet we all know Canon’s skintones are more pleasing, even in RAW files, and medium format remains the benchmark for professional portrait shooters.

In my opinion DXOMark have some explaining to do…