Originally Posted by rowland.johnson
I'm sure this has been asked and answered before. Maybe someone can point me to the right place. My question is why does a probe need to be calibrated to the display type. If the probe is measuring the quantity and wavelength of photons why does it matter how they were produced?
Tristimulus colorimeters read color by employing photosensitive diodes that have filters between them and the light source. These filters simulate the response of the human eye. The problem is that this response is quite complex, so it cannot be mimicked precisely by filters, at least ones that are reasonably affordable. To solve this problem, manufacturers build calibration tables into the probes to improve their accuracy. These tables are based on readings from a spectroradiometer, which does not use filters but instead reads the spectrum of the light directly. Color is then calculated mathematically from this spectral data using the color matching functions established by the CIE in 1931.
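To make the math concrete, here is a minimal sketch of how tristimulus values are computed from spectral data. The wavelength samples, spectrum, and color-matching-function values below are illustrative toy numbers, not the real CIE 1931 tables, and the coarse 50nm sampling is far coarser than any real instrument.

```python
# Sketch: computing CIE XYZ tristimulus values from a measured spectrum.
# All numbers are illustrative stand-ins, not real CIE 1931 table values.

# Wavelength samples (nm) and a hypothetical display spectrum
wavelengths = [450, 500, 550, 600, 650]
spd = [0.8, 0.3, 1.0, 0.9, 0.4]  # spectral power distribution

# Toy stand-ins for the CIE 1931 color matching functions x-bar, y-bar, z-bar
x_bar = [0.34, 0.00, 0.43, 1.06, 0.28]
y_bar = [0.04, 0.32, 0.99, 0.63, 0.11]
z_bar = [1.77, 0.27, 0.01, 0.00, 0.00]

def tristimulus(spd, cmf, step=50):
    """Approximate the integral of SPD x CMF over wavelength."""
    return sum(p * c for p, c in zip(spd, cmf)) * step

X = tristimulus(spd, x_bar)
Y = tristimulus(spd, y_bar)
Z = tristimulus(spd, z_bar)

# Chromaticity coordinates (x, y) follow directly from XYZ
x = X / (X + Y + Z)
y = Y / (X + Y + Z)
print(f"XYZ = ({X:.1f}, {Y:.1f}, {Z:.1f}), xy = ({x:.4f}, {y:.4f})")
```

A spectroradiometer effectively performs these sums from a finely sampled spectrum; a filter-based colorimeter tries to perform the same weighting optically, with filters standing in for the color matching functions.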
The Chroma 5 PRO builds on this strategy by creating unit-specific calibrations for colorimeters, derived from a variety of display types rather than just generic CRT and LCD. Also, we use a 5nm spectroradiometer, which is more accurate than the standard 8nm spectroradiometer used by manufacturers.
To understand why a filter-based colorimeter requires different calibrations for different devices, consider two displays: Display 1 and Display 2.
Assume that because of different design parameters, the light coming from each of these has a unique spectrographic profile. A spectroradiometer--if it is sensitive enough--will discern the difference between them, while a filter-based device whose internal calibration table is based on a third display with its own spectrographic properties may not.
If you are interested in learning more about this, see the following X-Rite white paper: http://www.chromapure.com/xrite/Xrit...ionDevices.pdf