Originally Posted by randal_r
This issue goes deeper than OLEDs vs LCDs. One has to also take into account the color decoder, various internal circuitry, backlighting methods, etc. This could be a whole thread in itself.
Not really. All of that should already be accounted for in measuring the spectrum of light emitted by the display. If, on the other hand, the particular display's primary spectra happen to emphasize a discrepancy between the CIE 1931 standard observer and real-life observers, then it's all about the CMF being used.
The 1931 observer does have some acknowledged technical flaws in the S-cone (blue) region, and these were largely corrected in the 1964 10-degree observer CMF measurements - but that is a 10-degree observer, not 2-degree, so it is probably not appropriate for imagery. In spite of the flaw in the 1931 CMF, the error was minor enough in practical terms that it was better to stick with it as a standard than to attempt to switch everyone to something new. A display technology aiming at a wider gamut may well have a blue primary that lands right in the region where the 1931 CMF has the largest error, hence a visual discrepancy between what is measured as white and what most people will actually see as white. So you either need to switch to a better CMF, or add some correction factor to the 1931 XYZ values.
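To make the point concrete, here's a rough sketch of why a narrow-band blue primary can land on the CMF error: integrating the same spectral power distribution against two slightly different observer curves gives different XYZ values. The Gaussian curves below are crude stand-ins I made up for illustration, not the real CIE tables, which you'd use in practice (380-780 nm tabulated data).

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

wl = np.arange(380, 781, 1.0)  # wavelength grid in nm

# Toy stand-in for a 2-degree observer (NOT the real CIE 1931 tables)
cmf_a = np.stack([
    gaussian(wl, 600, 40) + 0.35 * gaussian(wl, 445, 25),  # x-bar
    gaussian(wl, 555, 45),                                 # y-bar
    1.8 * gaussian(wl, 450, 25),                           # z-bar
])

# Toy alternative observer with a slightly shifted blue response,
# mimicking a corrected CMF in the short-wavelength region
cmf_b = np.stack([
    gaussian(wl, 600, 40) + 0.35 * gaussian(wl, 442, 27),
    gaussian(wl, 555, 45),
    1.8 * gaussian(wl, 447, 27),
])

def spectrum_to_XYZ(spd, cmf):
    """Sum the SPD weighted by each CMF curve (1 nm steps, so the
    sum approximates the integral)."""
    return cmf @ spd

# Narrow-band emitter like a wide-gamut display's blue primary
blue_primary = gaussian(wl, 455, 8)

xyz_a = spectrum_to_XYZ(blue_primary, cmf_a)
xyz_b = spectrum_to_XYZ(blue_primary, cmf_b)
# Same light, noticeably different measured XYZ between observers
```

A broadband source averages over the region where the curves disagree, so the two observers land close together; a narrow spike sitting right on the disagreement does not, which is exactly the wide-gamut blue primary situation described above.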
If you have a spectrometer, you can use a different CMF. If you have a colorimeter whose calibration is computed from the instrument's sensitivity curves (such as the i1 Display Pro or Spyder 4), then you can also use a different CMF. If you have a colorimeter that only has calibration matrices, then you either need to create a new calibration matrix against a spectrometer using a different CMF, or you need some sort of correction matrix to use on top of the standard calibration matrix for that type of display.
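For the last case, the usual approach is a 3x3 correction matrix fitted against a reference instrument. A minimal sketch, assuming you've measured the display's R, G, B primaries and white with both instruments (all the numbers below are made up for illustration):

```python
import numpy as np

# Hypothetical paired XYZ readings of R, G, B, and white (one column
# each). "spectro" is the reference instrument using the preferred CMF;
# "colorimeter" is the reading through its standard calibration matrix.
spectro = np.array([[41.2, 35.8, 18.1, 95.0],
                    [21.3, 71.5,  7.2, 100.0],
                    [ 1.9, 11.0, 95.3, 108.9]])
colorimeter = np.array([[40.1, 36.2, 19.5, 95.7],
                        [21.0, 70.9,  7.9, 99.8],
                        [ 2.3, 10.6, 91.8, 104.6]])

# Least-squares 3x3 correction: find M so that spectro ~= M @ colorimeter
M = spectro @ np.linalg.pinv(colorimeter)

def corrected(xyz):
    """Apply the correction on top of the colorimeter's own matrix."""
    return M @ np.asarray(xyz)
```

This is the idea behind display-type correction matrices (ArgyllCMS calls these CCMX files): the colorimeter's standard matrix stays in place, and the fitted matrix absorbs the residual error for that particular display's primary spectra.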