After burning the IRE-Window test patterns to HD-DVD and testing them, I'm wondering whether they're accurate for HD-DVD calibration.
My system consists of a Brillian 6580iFB LCoS HDTV, a Lumagen Vision ProHDP video processor, and an XBOX-360 w/ HD-DVD option (plus PS3, Denon 3910, etc.). Since the Lumagen gives much greater control over calibration and can output 1080p to the HDTV, I have the XBOX-360 HD-DVD output going into the Lumagen, which deinterlaces and outputs 1080p. A very nice feature of the Lumagen is the ability to read the IRE value of the frame buffer at the center of the screen. This information is invaluable for HDTV calibration.
On the calibration front, I own all of the necessary ISF calibration gear -- signal generator, colorimeter, and analysis software. It's not bottom-of-the-line junk; this stuff cost more than my TV, Lumagen, Denon, PS3, and XBOX-360 combined. The XBOX-360 component input to the Lumagen is already calibrated with a perfectly flat grayscale and a 2.4 gamma, but I want to calibrate to actual source material -- such as an HD-DVD calibration disc. This paragraph is really just background and has no bearing on my actual question -- so keep that in mind.
Now, enter this thread, and the 1080 test patterns at w6rz.net. Clarence was kind enough to show us how to create an HD-DVD disc that the XBOX-360 will play, so I used that exact method to build a calibration disc from the w6rz.net IRE-Window test patterns. After following the disc-creation procedure verbatim, I had a perfectly playable HD-DVD in the XBOX-360 HD-DVD player.
My first (and only) test was to compare these IRE-Window patterns with the values reported by the Lumagen video processor. For reference, the GetGray DVD measures perfectly through the Lumagen in all windows from 0-100 IRE. I would expect these test patterns to yield similarly perfect results, but they didn't -- not even close.
Displaying the test patterns and using the Lumagen CTMP command to read the IRE value at the center of each window showed readings that were *FAR* off the expected values. Here's what I read:
W6RZ (IRE) / Measured (IRE)
100 / 91
90 / 81.5
80 / 72.5
70 / 64.5
60 / 54
50 / 45
40 / 36.5
30 / 27
20 / 17.5
10 / 8
0 / 0
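As a quick sanity check on those numbers, here's a trivial Python sketch -- just arithmetic on the table above, nothing more -- that computes the measured-to-nominal ratio for each window:

```python
# Ratio check on the table above: is the error a constant scale factor?
# (Pairs copied straight from my measurements; 0 IRE omitted to avoid
# dividing by zero.)
pairs = [(100, 91), (90, 81.5), (80, 72.5), (70, 64.5), (60, 54),
         (50, 45), (40, 36.5), (30, 27), (20, 17.5), (10, 8)]

for nominal, measured in pairs:
    print(f"{nominal:3d} IRE -> {measured:5.1f} measured  "
          f"(ratio {measured / nominal:.3f})")

# The ratios cluster around 0.90 (the 10 and 20 IRE windows drift a bit
# lower), i.e. everything reads roughly 10% low across the range.
```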
As the ratios show, the error is nearly linear -- essentially a constant scale factor of roughly 0.9. So my question concerns how these IRE-Window files were created, and whether they are accurate. Were they created with PC Levels rather than Video Levels? Lumagen tells me that kind of difference could certainly explain the readings.
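To make the PC-vs-Video Levels question concrete, here's a rough Python sketch of the two standard 8-bit conventions and what a video-levels reader would report if they got mixed up somewhere in the chain. To be clear, the two candidate mappings below are my assumption of the usual suspects, not a claim about how the w6rz.net files were actually authored:

```python
# Two standard 8-bit luma conventions (my assumption of the candidates;
# not a statement about how the w6rz.net patterns were actually made):
#   Video levels (Rec. 601/709): 0 IRE -> code 16, 100 IRE -> code 235
#   PC levels (full range):      0 IRE -> code 0,  100 IRE -> code 255

def video_code(ire):
    """8-bit luma code for a given IRE under video (16-235) levels."""
    return 16 + ire / 100 * 219

def pc_code(ire):
    """8-bit luma code for a given IRE under PC (0-255) levels."""
    return ire / 100 * 255

def read_as_video(code):
    """IRE a video-levels reader (like the Lumagen) reports for a code."""
    return (code - 16) / 219 * 100

# Case 1: pattern authored at PC levels, read as video levels.
# Case 2: pattern authored at video levels, then re-squeezed by a stage
#         that wrongly assumed PC levels (0-255 compressed into 16-235).
for ire in (0, 50, 100):
    case1 = read_as_video(pc_code(ire))
    case2 = read_as_video(16 + video_code(ire) * 219 / 255)
    print(f"{ire:3d} IRE reads as: case 1 = {case1:6.1f}, case 2 = {case2:5.1f}")
```

Neither case matches my table exactly -- case 2 reads 100 IRE as about 92, close to my 91, but it would also lift 0 IRE to about 6, which I don't see -- yet both produce exactly this kind of roughly-linear error, so a levels mix-up somewhere in the creation or playback chain still seems like the most plausible suspect.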
Any ideas?