Originally Posted by Jeroen1000
Just for informational purposes (you seem to own every piece of kit worth having): normally, every Y'CbCr source should send something that decodes to 235,235,235, and with RGB no decoding is required, so all sources should end up at the same Y when measured with the same meter. A 4 cd/m² difference between sources is too much to be a meter issue. I've also checked that the TV handles RGB and Y'CbCr equally well and verified the correct video levels, but I'll check again with the Accupel.
Anyway, I'm wondering: how does the HDMI analyzer spot the error? Will you read pixel values like 232,232,232?
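To illustrate the point about 235,235,235: with correctly behaving sources the transport format shouldn't matter, because limited-range Y'CbCr white (235, 128, 128) decodes straight back to RGB 235,235,235 since the chroma terms vanish. A minimal sketch in Python (my own, assuming BT.709 coefficients and 8-bit limited range on both sides; not the internal math of any particular analyzer):

```python
# Sketch: BT.709 limited-range Y'CbCr -> limited-range R'G'B'.
# For reference white (Cb = Cr = 128) every chroma term is zero,
# so (235, 128, 128) must come out as (235, 235, 235).

KR, KB = 0.2126, 0.0722          # BT.709 luma coefficients
KG = 1.0 - KR - KB

def ycbcr_to_rgb_limited(y, cb, cr):
    """Decode 8-bit limited-range BT.709 Y'CbCr to limited-range R'G'B'."""
    s = 219.0 / 224.0            # rescale chroma excursion (224) to luma (219)
    r = y + s * 2 * (1 - KR) * (cr - 128)
    b = y + s * 2 * (1 - KB) * (cb - 128)
    g = y - s * (2 * (1 - KR) * KR * (cr - 128)
                 + 2 * (1 - KB) * KB * (cb - 128)) / KG
    return tuple(round(v) for v in (r, g, b))

print(ycbcr_to_rgb_limited(235, 128, 128))  # reference white -> (235, 235, 235)
print(ycbcr_to_rgb_limited(16, 128, 128))   # reference black -> (16, 16, 16)
```

So if a source measures a different Y for the same white, the error is in that source's decoding or level handling, not in the transport format itself.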
Regarding digital level testing: if you look at the links I posted with the player tests, what I'm referring to is the DVDO AVLab TPG Color Checker function (which displays the digital level of the selected pixel on screen).
The cursor box provides key information about the current color format as well as the specific color values for the currently selected pixel.
For more details:
About Bit Depth: Color Checker can recognize 8, 10, or 12 bits per pixel. Upon receiving a video stream with properly formatted InfoFrames and video information, the bit depth indicator will change to indicate the number of bits per pixel. The number of digits in the triplet values will also reflect this.
For 8 bits per pixel, the range will be 0-255 if displaying the value in decimal or 0-FF if displaying in hex.
About Decimal/Hex: For user convenience, the display can be adjusted to show the triplet values in either decimal or hex format. This is only a display setting and does not affect the video data.
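The "display only" behaviour described above is easy to mimic: the digit count follows the bit depth and the base is purely a presentation choice. A toy Python sketch (`format_triplet` is my own name, not a DVDO function):

```python
# Hypothetical helper mimicking the Color Checker readout: the number of
# digits per value tracks the bit depth (255/FF for 8-bit, 1023/3FF for
# 10-bit, 4095/FFF for 12-bit); the underlying values never change.

def format_triplet(values, bits=8, hexadecimal=False):
    if hexadecimal:
        width = (bits + 3) // 4          # hex digits needed: FF, 3FF, FFF
        return ",".join(f"{v:0{width}X}" for v in values)
    width = len(str((1 << bits) - 1))    # decimal digits: 255, 1023, 4095
    return ",".join(f"{v:0{width}d}" for v in values)

print(format_triplet((235, 235, 235)))                             # 235,235,235
print(format_triplet((235, 235, 235), hexadecimal=True))           # EB,EB,EB
print(format_triplet((940, 512, 512), bits=10, hexadecimal=True))  # 3AC,200,200
```

(940, 512, 512) is 10-bit limited-range white, i.e. the 10-bit counterpart of 235, 128, 128.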
About RGB/YUV: This indicates the current color space sent by the source. Note that this is derived from the InfoFrame and tells the sink (and the user) how to interpret the triplet information: an R in this field indicates RGB, and a Y indicates YCbCr.
About Triplet: The actual color triplet information always consists of three values, either R,G,B or Y,Cb,Cr.
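Putting the last two items together, a sink-side tool only needs the InfoFrame-derived indicator to know which names to attach to the three values. A toy sketch (names are my own, not DVDO's):

```python
# Label a raw triplet using the colour-format indicator derived from the
# AVI InfoFrame: "R" means interpret it as R,G,B; "Y" means Y,Cb,Cr.

CHANNEL_NAMES = {"R": ("R", "G", "B"), "Y": ("Y", "Cb", "Cr")}

def label_triplet(indicator, triplet):
    names = CHANNEL_NAMES[indicator]     # raises KeyError for anything else
    return " ".join(f"{n}={v}" for n, v in zip(names, triplet))

print(label_triplet("R", (235, 235, 235)))  # R=235 G=235 B=235
print(label_triplet("Y", (235, 128, 128)))  # Y=235 Cb=128 Cr=128
```

Note the same three numbers mean very different colors under the two interpretations, which is exactly why the indicator matters.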