Trying to understand black levels and gamuts
Hello, I am trying to educate myself and have started learning about TV calibration.
My normal TV-watching setup is an older set-top box with DVB-C tuners and a newer Android TV based set-top box, also with DVB-C tuners. Both are connected over HDMI to an AVR (Pioneer VSX-924), which in turn is connected over HDMI to the TV (Sony KDL-46W4750). Since I almost never use the TV's own tuners but instead watch through one of the set-top boxes via the AVR, my thinking was that the easiest way to calibrate my TV is to hook my laptop running HCFR to the AVR over HDMI and use HCFR's internal pattern generator. First I configured the laptop's HDMI output to limited-range RGB, which I could verify from the AVR's mobile app (it showed "RGB Limit / 24bit"). Then I set HCFR's GDI generator to full range and did the measurements.
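To make the two ranges concrete, here is a minimal sketch of the standard 8-bit scaling between full range (0–255) and limited/video range (16–235). Nothing here is specific to my devices or to HCFR; it is just the usual video-level math:

```python
# Full range (0-255) vs limited/video range (16-235) for 8-bit RGB.
# These constants are the standard video-level scaling, not anything
# device-specific.

def full_to_limited(code: int) -> int:
    """Map an 8-bit full-range value to limited (video) range."""
    return round(16 + code * 219 / 255)

def limited_to_full(code: int) -> int:
    """Inverse mapping; values below black / above white get clipped."""
    v = round((code - 16) * 255 / 219)
    return max(0, min(255, v))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
```

So when the GPU is set to limited-range output and the pattern generator draws full-range patterns, something along these lines happens on the way out of the laptop.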
The TV does not offer much control beyond brightness / contrast / colour temperature, but that is not important; my point was to educate myself on how all the bits and pieces come together, and out of curiosity to measure how well or badly my TV performs. As far as I can tell, I managed to get the black level just right, so the picture from both sources looks OK, and after copying the same settings from the TV's HDMI input configuration to its 'TV' (tuner) configuration, the picture when watching channels directly on the TV also matches the one coming from the set-top boxes. By an OK picture I mean that black looks black rather than grey, and the colours are not washed out.
But now there are a couple of things I cannot understand. Maybe I have misunderstood something:
The AVR's app shows that the older set-top box sends YCbCr 4:4:4 / 24-bit. If I have understood correctly, that is limited range, i.e. 16–235. This seems to be fine. For the Android set-top box, on the other hand, the app shows RGB Full / 24-bit. Since the black levels from both set-top boxes look the same to me, someone somewhere must be doing a conversion, right? It is possible to run Android applications on the Android box (which presumably render in full RGB range?) and I do not see black-level clipping in those. So the box must actually convert the TV broadcast to full-range RGB? And then the TV must auto-detect that it is a full-range signal, different from what the older box sends, even though both signals arrive at the same HDMI input on the TV. There is no manual limited/full range setting on the TV. Or am I completely lost here? Or are my eyes lying to me when I compare the picture from the two sources?
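If the Android box really does this conversion internally, I imagine it looks roughly like the sketch below: limited-range BT.709 YCbCr from the broadcast decoded to full-range RGB. The matrix coefficients are the standard BT.709 ones; this is only my guess at what the box does, not anything I have confirmed:

```python
# A plausible conversion a set-top box might do internally:
# limited-range BT.709 YCbCr -> full-range 8-bit RGB.
# Coefficients are the standard BT.709 decode matrix.

def ycbcr709_limited_to_rgb_full(y, cb, cr):
    yn = (y - 16) / 219.0      # normalise luma to 0..1
    pb = (cb - 128) / 224.0    # normalise chroma to -0.5..0.5
    pr = (cr - 128) / 224.0
    r = yn + 1.5748 * pr
    g = yn - 0.1873 * pb - 0.4681 * pr
    b = yn + 1.8556 * pb
    clip = lambda v: max(0, min(255, round(v * 255)))
    return clip(r), clip(g), clip(b)

# Video black (16,128,128) should land on full-range black,
# video white (235,128,128) on full-range white:
print(ycbcr709_limited_to_rgb_full(16, 128, 128))   # (0, 0, 0)
print(ycbcr709_limited_to_rgb_full(235, 128, 128))  # (255, 255, 255)
```

If that is what happens, the black level from both boxes would indeed end up looking the same on a TV that honours the signalled range.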
From the AVR's app I can see that the TV lists xvYCC601 and xvYCC709, and each of the set-top boxes reports one of those. So during the HDMI handshake devices can exchange information about the colour gamut used / accepted in the video signal? Do TVs really use that information?
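From what I have read, this signalling lives in the AVI InfoFrame that the source sends with the video (the sink's capabilities go the other way in the EDID). Assuming the AVR's app is reading those InfoFrame bits, the layout per CEA-861 is roughly this; the byte values in the example are made up:

```python
# How the colorimetry signalling in an HDMI AVI InfoFrame is laid
# out per CEA-861: C1:C0 in data byte 2, extended colorimetry
# EC2:EC0 in data byte 3. Example byte values below are invented.

def decode_colorimetry(data_byte2: int, data_byte3: int) -> str:
    c = (data_byte2 >> 6) & 0b11          # C1:C0, bits 7:6
    if c == 0b00:
        return "no data (sink picks a default)"
    if c == 0b01:
        return "SMPTE 170M / BT.601"
    if c == 0b10:
        return "BT.709"
    # c == 0b11 means "extended": look at EC2:EC0, bits 6:4 of byte 3
    ec = (data_byte3 >> 4) & 0b111
    return {0b000: "xvYCC601", 0b001: "xvYCC709"}.get(ec, "other extended")

print(decode_colorimetry(0b10_000000, 0))            # BT.709
print(decode_colorimetry(0b11_000000, 0b0001_0000))  # xvYCC709
```

The "no data" case might explain the STANDARD reading I describe below: if the source sets C1:C0 to 00, the TV has to guess.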
Again, if I have understood correctly, Full HD TV broadcasts should use BT.709 (xvYCC709). Yet I can see that my older set-top box sends xvYCC601, even though the broadcast is 1080i, as is the box's video output. Hmmmm. So why would it convert from 709 to 601? For the Android set-top box the AVR's app shows STANDARD, which I interpret as the box not sending that information, in which case the TV should use whichever one it thinks is right, and that would be 709 for an HD signal, right? So does that mean I can never get the colours exactly right for both sources, since they are using different gamuts? Or does the TV really use the gamut information (that the AVR is able to pick up) from the signal? Is there any way I could verify that?
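To convince myself that the 601/709 flag actually matters, I sketched what happens when the same YCbCr pixel is decoded with the two different matrices. The generic decode below uses only the luma coefficients Kr/Kb (601: 0.299/0.114, 709: 0.2126/0.0722); the test pixel is pure red as encoded with the BT.601 matrix:

```python
# Why the 601 vs 709 flag matters: decode one YCbCr pixel with both
# matrices and compare. A TV that guesses the wrong matrix shifts
# colours like this. Generic limited-range YCbCr -> full RGB decode.

def decode(y, cb, cr, kr, kb):
    kg = 1 - kr - kb
    yn = (y - 16) / 219.0
    pb = (cb - 128) / 224.0
    pr = (cr - 128) / 224.0
    r = yn + 2 * (1 - kr) * pr
    b = yn + 2 * (1 - kb) * pb
    g = (yn - kr * r - kb * b) / kg   # from Y = Kr*R + Kg*G + Kb*B
    clip = lambda v: max(0, min(255, round(v * 255)))
    return clip(r), clip(g), clip(b)

pixel = (81, 90, 240)  # pure red, encoded with the BT.601 matrix

print("decoded as 601:", decode(*pixel, kr=0.299, kb=0.114))
print("decoded as 709:", decode(*pixel, kr=0.2126, kb=0.0722))
```

Decoding with the matching (601) matrix gives back red; decoding the same bytes as 709 pushes extra green into it. Not a huge error, but it is exactly the kind of hue shift a gamut/matrix mismatch would cause, so it does seem worth knowing which one each box is really signalling.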
Can anyone help me with these questions?