I took some measurements this morning; results are below. Unfortunately, the LCD monitor I used to test resolution would not accept a 1080i input (only 720p), so even though it has a native resolution of 1920x1200, the test-pattern resolution measurement is inconclusive. I measured vertical resolution at 720 lines (as expected) and horizontal resolution at 680 TV lines per picture height, i.e. 680 x 1.78 ≈ 1210 lines across a 16:9 display. This is somewhat lower than expected (1280) but probably within the accuracy of this method.
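For anyone who wants to check that conversion: a horizontal reading in TV lines per picture height scales to total lines across the frame by the aspect ratio. A minimal sketch (the function name is just mine; 680 is the measurement from above):

```python
# Convert horizontal resolution measured in TV lines per picture
# height (TVL/ph) to total resolvable lines across a 16:9 frame.
def tvl_to_horizontal_lines(tvl_per_ph, aspect=16 / 9):
    return tvl_per_ph * aspect

print(round(tvl_to_horizontal_lines(680)))  # ~1209, vs. 1280 expected for 720p
```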
The grayscale was measured from the patches shown in the last test pattern, using a spot meter (i1Pro) in contact with my display. I compared the HDnet patterns from the moto6416 box via the HDMI input to the getgray DVD patterns via the same input.
The correlated color temperature of the HDnet patterns agrees within the measurement scatter, so these patterns can definitely be used for CCT calibration. (Solid line = HDnet, dotted line = getgray.)
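For reference, CCT can be estimated from a measured CIE xy chromaticity with McCamy's approximation. This is a generic sketch, not necessarily the exact computation the i1Pro software performs:

```python
def mccamy_cct(x, y):
    """Approximate correlated color temperature (K) from CIE 1931 xy
    chromaticity using McCamy's cubic formula (reasonable near the
    blackbody locus, roughly 2000-12500 K)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(mccamy_cct(0.3127, 0.3290))  # D65 white point -> ~6500 K
```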
For the gamma measurements I set the contrast so that the 100% HDnet pattern had the same luminance as the 100% getgray pattern. Overall, gamma was lower with the HDnet patterns than with the getgray patterns.
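The gamma at each gray step follows directly from the measured luminance relative to white. A minimal sketch (the sample luminance value is hypothetical, not one of my measurements):

```python
import math

def point_gamma(stimulus, y_rel):
    """Effective gamma at one gray step, from y_rel = stimulus ** gamma,
    where stimulus is the nominal video level on a 0-1 scale and y_rel
    is the measured luminance relative to 100% white."""
    return math.log(y_rel) / math.log(stimulus)

# e.g. a 50% window measuring 21.8% of white implies gamma ~2.2
print(point_gamma(0.50, 0.218))
```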
I can get the gamma responses to agree if I shift the HDnet stimulus scale by a flat 3.4%.
I see three possibilities for this difference:
1. The HDnet patterns have a 3.4% scale shift (i.e. 10% is really 13.4%)
2. The HDnet patterns use a slightly different gamma encoding function compared to the getgray DVD.
3. A difference in MPEG decoding between the DVD player and the moto box.
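Possibility 1 would at least explain the direction of the discrepancy: a flat offset makes every gray step brighter than its nominal level, which reads as a lower apparent gamma. A sketch assuming an ideal gamma-2.2 display (and ignoring the small renormalization at 100%):

```python
import math

TRUE_GAMMA = 2.2
SHIFT = 0.034  # flat 3.4% stimulus offset, per possibility 1

for nominal in (0.10, 0.30, 0.50, 0.80):
    actual = nominal + SHIFT              # what the pattern really encodes
    y = actual ** TRUE_GAMMA              # luminance the meter would see
    apparent = math.log(y) / math.log(nominal)
    print(f"{nominal:.0%}: apparent gamma {apparent:.2f}")  # all below 2.2
```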
There is no way I know of to distinguish between these possibilities at the user end.
The color patch measurements were interesting, showing a significantly smaller gamut. Getgray results are on the left, HDnet on the right. The reference gamut shown is Rec. 709; the HDnet gamut is closer to a SMPTE-C gamut. The green luminance as a percentage of white (lightness) was also much lower (-36%) than it should be.
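To put the gamut difference in numbers, the xy chromaticity triangles of the two standard gamuts can be compared directly. A sketch using the published Rec. 709 and SMPTE-C primaries (not my measured primaries):

```python
def triangle_area(p1, p2, p3):
    """Area of a chromaticity triangle via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE xy primaries, in R, G, B order
REC709  = ((0.640, 0.330), (0.300, 0.600), (0.150, 0.060))
SMPTE_C = ((0.630, 0.340), (0.310, 0.595), (0.155, 0.070))

a709 = triangle_area(*REC709)
asmpte = triangle_area(*SMPTE_C)
print(f"SMPTE-C covers {asmpte / a709:.1%} of the Rec. 709 triangle area")
```

Note that the triangle area only captures the chromaticity difference; it does not by itself account for the -36% green lightness error.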