I have a medium form of protanopia (a color deficiency for red in my visual system) and as such should be more sensitive to luminance errors, but less sensitive to chromaticity errors in the red part of the spectrum.
I have to put this up front as the following is based on my visual impressions of images vs. measured color errors.
What I have found, though, is that my visual recollection of what's on the screen corresponds more closely to what is measurable with the CIE76 color error formula than with the CIE94 or CIE2000 formulas.
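(For reference, and so we are talking about the same thing: as far as I understand it, CIE76 is simply the Euclidean distance between two Lab values, with no weighting at all. A minimal sketch in Python:)

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76: plain Euclidean distance between two Lab values."""
    dL = lab1[0] - lab2[0]
    da = lab1[1] - lab2[1]
    db = lab1[2] - lab2[2]
    return math.sqrt(dL**2 + da**2 + db**2)
```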
I've tested this with two different "near perfect" panels (saturation tracking always significantly below dE 3): a Sony 42W650A and a 55W905A.
On the W650A I see a VERY noticeable pink tint on certain pastel colors - and sure enough, once I switch the saturation tracking to CIE76, cyan (dE 5) and magenta (dE 4) break out at >85% saturation. On the same device there is also a more significant color error hidden at red 95% saturation, which does not show when you only look at the usual 5 steps per color.
With CIE2000, all errors (except red at 95%, at roughly dE 4) are below dE 3.
On the W905A it is even more apparent, as the device - according to CIE2000 - has close to perfect greyscale, color and saturation tracking across the board, with all dEs (even in the color checker) near or below dE 2. On that device I see a slight green tint in critical viewing - and sure enough, once CIE76 is applied, it shows green, magenta and, to a lesser extent, yellow breaking out at >90% saturation. On the same device there is also a more significant color error hidden at green 87%, which again doesn't show up when you look at the usual 5 steps per color. This time the error is also below dE 3 if you measure it in CIE2000, but it breaks out to dE 4.2 to 5 (stepping through the color slider one step at a time) in CIE76.
Reading up on what changed between CIE76 and CIE94, ColorWiki notes that a "fudge factor" was introduced at that point, specified as "RIT/DuPont tolerance data derived from automotive paint experiments".
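As far as I understand the published CIE94 definition, that tolerance data is where the chroma-dependent weights (the graphic-arts constants K1 = 0.045 and K2 = 0.015) come from: the chroma and hue errors get divided by S_C = 1 + K1*C and S_H = 1 + K2*C, so the same a*/b* offset produces a much smaller dE94 on a saturated reference color than on a pastel one. A rough sketch of the effect (my own toy numbers, not measurements):

```python
import math

def delta_e_94(lab_ref, lab_meas, K1=0.045, K2=0.015):
    """CIE94 with graphic-arts weights; the reference color defines C1."""
    L1, a1, b1 = lab_ref
    L2, a2, b2 = lab_meas
    dL = L1 - L2
    C1 = math.hypot(a1, b1)
    C2 = math.hypot(a2, b2)
    dC = C1 - C2
    da, db = a1 - a2, b1 - b2
    dH_sq = max(da**2 + db**2 - dC**2, 0.0)   # guard against rounding
    S_C = 1 + K1 * C1
    S_H = 1 + K2 * C1
    return math.sqrt(dL**2 + (dC / S_C)**2 + dH_sq / S_H**2)

# the same 3-unit a* offset, once on a pastel, once on a saturated color
pairs = [((70, 5, 5), (70, 8, 5)),     # pastel, C* ~ 7
         ((40, 70, 0), (40, 73, 0))]   # saturated, C* = 70
for ref, meas in pairs:
    d76 = math.sqrt(sum((x - y)**2 for x, y in zip(ref, meas)))
    print(f"C*={math.hypot(ref[1], ref[2]):5.1f}  dE76={d76:.2f}  dE94={delta_e_94(ref, meas):.2f}")
```

If I ran the numbers right, the saturated pair drops from dE76 = 3.0 to a dE94 well below 1, while the pastel pair stays in the same ballpark (around dE 2.5) - and high saturation is exactly the region (>85-90%) where I see errors break out under CIE76.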
Does anyone have more information and/or data on how this changed the "usual" color targets in the CIE triangle?
I'm asking because I can see that the difference introduced at that stage was significant and resulted in - from my visual point of view - MASKING certain color errors that would have been measurable otherwise.
The different weighting of L* ("lightness") introduced in dE2000 has far less of a measurable impact than the changes that took place between CIE76 and CIE94.
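For completeness, this is the lightness weighting that dE2000 added on top (CIE94 keeps S_L = 1), as I read the published CIEDE2000 formula - it sits at 1 around mid lightness and only grows toward the dark and bright ends of the scale:

```python
import math

def s_l_ciede2000(L_ref, L_meas):
    """Lightness weighting function of CIEDE2000 (CIE94 uses S_L = 1)."""
    L_bar = (L_ref + L_meas) / 2
    return 1 + (0.015 * (L_bar - 50)**2) / math.sqrt(20 + (L_bar - 50)**2)

for L in (10, 30, 50, 70, 90):
    print(L, round(s_l_ciede2000(L, L), 3))   # ~1.6, ~1.3, 1.0, ~1.3, ~1.6
```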
Regardless of what "RIT/DuPont tolerance data derived from automotive paint experiments" turns out to be, and keeping in mind that I personally see colors slightly differently than the majority of the population -
I find it more and more common, when comparing reviews against critical viewing impressions, that it is not just me: other people also notice certain unmistakable color errors (after calibration) that don't show up in the "normal" measurements provided by review sites, for example.
On the one hand there are faults hidden "between the lines" (5 saturation measurements per primary/secondary color are just not enough); on the other hand I visually recognize color tints that, strangely enough, correspond with what the display measures under CIE76 - in two separate cases.
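(A rough sketch of what I mean by "between the lines": sweep a color in fine saturation steps and flag every point where CIE76 breaks a tolerance while CIE2000 stays under it. This assumes the colour-science Python package and your own reference/measured Lab pairs - the inputs here are placeholders for a meter readout, not real data:)

```python
import colour  # colour-science package, assumed to be installed

def find_masked_errors(ref_lab, meas_lab, tol=3.0):
    """Return points where dE76 exceeds the tolerance but dE2000 does not."""
    masked = []
    for i, (ref, meas) in enumerate(zip(ref_lab, meas_lab)):
        de76 = colour.delta_E(ref, meas, method='CIE 1976')
        de00 = colour.delta_E(ref, meas, method='CIE 2000')
        if de76 > tol >= de00:
            masked.append((i, float(de76), float(de00)))
    return masked

# ref_lab / meas_lab: Lab targets and meter readings for e.g. green in
# 1% saturation steps - fill these in from your own measurement run.
```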
If you have any impressions of your own on the difference between CIE76 and CIE94/2000 in practice, or if there is anyone who can "illuminate" what's behind the "RIT/DuPont tolerance data derived from automotive paint experiments" fudge factor data - I would appreciate it.
For now, just take it as a personal experience report from one guy who saw a green color tint in critical viewing on an avg dE 0.80 (CCSG) device - with a masked green error of dE 5 at 87% and around dE 4 at 100% when measured in CIE76, both of which totally vanished (below dE 2) from the measurement results when measured in CIE2000.