Let's be really specific about what we're talking about.
Assume a 2.2 gamma, a Rec. 709 gamut, and a measured white balance of x0.319, y0.329 at every video level.
Let's also assume no gamma errors.
Using CIELUV as the color difference method, if you include the actual luminance of each measured patch in the calculation, the dE value for this same shade of gray will vary as the video level varies, as the sketch below illustrates.
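Here is a minimal sketch of that calculation in Python. Only the 2.2 gamma, the D65 target (x0.3127, y0.329), and the measured chromaticity (x0.319, y0.329) come from the assumptions above; the helper names (xy_to_uv, lstar, de_uv_actual) are my own.

```python
import math

def xy_to_uv(x, y):
    """CIE 1931 xy chromaticity -> CIE 1976 u'v' coordinates."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

def lstar(Y):
    """CIE L* from relative luminance Y (reference white Y = 1.0)."""
    if Y > (6.0 / 29.0) ** 3:
        return 116.0 * Y ** (1.0 / 3.0) - 16.0
    return (29.0 / 3.0) ** 3 * Y  # linear segment for very dark patches

# D65 target and the measured gray from the example above
u_ref, v_ref = xy_to_uv(0.3127, 0.3290)
u_mea, v_mea = xy_to_uv(0.3190, 0.3290)

def de_uv_actual(level, gamma=2.2):
    """CIELUV dE weighted by the patch's actual luminance.

    `level` is the video level (0.0-1.0); patch luminance follows the
    assumed pure 2.2 power law. dL* = 0, since only the chromaticity
    is off target, so the chromaticity miss is simply scaled by L*.
    """
    L = lstar(level ** gamma)
    return 13.0 * L * math.hypot(u_mea - u_ref, v_mea - v_ref)

for pct in (10, 30, 50, 70, 100):
    print(f"{pct:3d}% -> dE {de_uv_actual(pct / 100.0):4.1f}")
```

Under these assumptions the loop prints roughly 0.3 at 10%, 1.9 at 30%, 3.1 at 50%, 4.3 at 70%, and 5.8 at 100%: the same chromaticity error, weighted down by the darkness of the patch.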
By any standard I am aware of, the values from 70% up indicate unacceptable white balance error. By what I consider a reasonable standard, 30% and below fall within tolerance; in fact, at 10% the color difference is close to unmeasurable. Everything between 30% and 70% is a jump ball, depending on how demanding one's tolerances are.
But here's the thing: no competent professional would EVER judge a white balance error of x0.319, y0.329 at ANY video level between 10% and 100% as acceptable. It is much too red, and easily visibly so. So what possible value is there in employing dE in a way that suggests otherwise? I really don't get it. Since x0.3127, y0.329 is the target, why not use dE in a way that reflects this? Doing so means that x0.319, y0.329 has a dE of 5.8 regardless of the video level being measured (see the calculation below). What's the argument against this?
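Continuing the sketch above, one way to implement this is to evaluate the same chromaticity miss at the white point's L* of 100, so the number reflects only the chromaticity error and no longer depends on the video level:

```python
# Same chromaticity miss, always evaluated at the white point's L* = 100,
# so the result is constant across the grayscale:
de_fixed = 13.0 * 100.0 * math.hypot(u_mea - u_ref, v_mea - v_ref)
print(f"dE {de_fixed:.1f}")  # ~5.8 at every video level
```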
ChromaPure Software/AccuPel Video Signal Generators