Quote: Originally Posted by Dominic Chan
That's not true in most cases.
Let's say you calibrate two TVs both to 150 cd/m^2 at 100% (level 235); and let's say TV1 clips at 235 while TV2 continues to respond above 235.
The whites will look identical on TV2 and TV1; one won't look any more "dirty grey" than the other. The difference between the two is that when there is any WTW ("whiter-than-white") content such as specular highlights, TV2 will be able to show it as brighter than white, whereas TV1 will show WTW the same as white, so in a sense it will look more "grey".
The scenario you described would only arise if the TV needs to run "flat out" at maximum brightness, in which case reserving some headroom will lead to a drop in brightness at 100% white.
You know, I find the whole WTW and BTB thing puzzling, because their supposed existence goes against everything I as a computer nerd know (or think I know; I'm always open to new ideas and to fixing old ones) about the way SDR graphics are output (be it computer graphics or video; HDR is something completely new) and how digital displays work at a basic level.
There are two RGB range formats, the legacy TV one (16-235) and the PC one (0-255). They are both essentially the same; both have a peak white and a bottom black. The only difference is that the TV signal reaches its peak white at 235 whereas the PC one reaches it at 255. But side by side they are essentially the same: their reference whites are equally white and their blacks are equally black. The TV one just has fewer steps between the two extremes, which technically makes it more prone to banding, but I digress...
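To put that mapping in concrete terms, here is a minimal sketch in plain Python (nothing here but the standard 16-235 and 0-255 endpoints) of how a limited-range code value scales onto the full range:

def limited_to_full(code):
    # Scale 16-235 onto 0-255; anything outside the nominal range clips.
    scaled = (code - 16) * 255.0 / 219.0
    return max(0, min(255, round(scaled)))

print(limited_to_full(16))   # -> 0   (reference black)
print(limited_to_full(235))  # -> 255 (reference white)
print(limited_to_full(128))  # -> 130 (mid grey lands in roughly the same place)

Same black, same white, just 219 steps between them instead of 255.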
The point is that a TV-range signal is maxed out at both 16 and 235; there is NOTHING beyond those in any video file encoded to that format. The way calibration discs get around that is that they are actually 0-255 video files flagged as 16-235, or so I believe. I have not actually opened the files and checked, but that would make sense to me. So the information above 235 and below 16 is there in the file, but it should NOT be visible when everything in the signal chain is set up correctly for movie watching.
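As a toy illustration of that (assuming a 150-nit white, a 0.05-nit black, and a simple linear ramp instead of a real gamma curve), anything outside 16-235 simply collapses onto the endpoints once the display maps 16 to black and 235 to white:

def display_nits(code, black=0.05, white=150.0):
    # Clip to the nominal video range, then map to light output.
    # (A real display applies a gamma curve; a linear ramp keeps this simple.)
    clipped = max(16, min(235, code))
    return black + (clipped - 16) / 219.0 * (white - black)

for code in (10, 16, 128, 235, 240, 255):
    print(code, round(display_nits(code), 2))
# 10 and 16 both come out at 0.05 nits; 235, 240 and 255 all come out at 150 nits.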
Actually, if I am correct, Blu-ray movies are encoded in YCbCr, which is hard-limited to 16-235. Unlike with the calibration disc above, there simply is no data beyond that range: no specular highlights, nothing. You can pretty much test this out with your Blu-ray player. Set it to YCbCr output mode (which most Blu-ray players should have). For Blu-ray movies it sends the signal forward unaltered and lets the TV do the RGB conversion. But in the case of a calibration disc like AVSHD it actually converts it to YCbCr, and see what happens, or at least this is what has happened with every Samsung TV I have seen: the information above 235 and below 16 is hard-clipped away. No matter how you tweak your TV's brightness and contrast sliders, it is gone before it ever reaches your TV. I believe the 235 and 16 bars are called "Reference White" and "Reference Black" for a reason.
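For what it's worth, here is a rough sketch of the limited-range BT.709 YCbCr-to-RGB conversion a player or TV performs (the usual textbook coefficients; real devices may round or dither differently). It shows why a Y code above 235 or below 16 ends up at exactly the same 8-bit RGB value as reference white or black once the result is clamped:

def ycbcr_to_rgb(y, cb, cr):
    # Limited-range BT.709 conversion, clamped to 8-bit full-range RGB.
    r = 1.164 * (y - 16) + 1.793 * (cr - 128)
    g = 1.164 * (y - 16) - 0.213 * (cb - 128) - 0.533 * (cr - 128)
    b = 1.164 * (y - 16) + 2.112 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

print(ycbcr_to_rgb(235, 128, 128))  # reference white      -> (255, 255, 255)
print(ycbcr_to_rgb(245, 128, 128))  # "whiter than white"  -> also (255, 255, 255)
print(ycbcr_to_rgb(16, 128, 128))   # reference black      -> (0, 0, 0)
print(ycbcr_to_rgb(10, 128, 128))   # "blacker than black" -> also (0, 0, 0)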
So, in the case of your TV comparison, the way I see it is this: TV2 has its 235 reference white at 150 nits and, say, 0.05-nit blacks, resulting in a 3000:1 contrast ratio, and it peaks on WTW information at 170 nits (a number I just threw out for the sake of comparison) for 3400:1. But it is expecting a signal that never arrives in real-life content, and is therefore limited to a 3000:1 contrast ratio for every movie disc it shows. Whereas TV1 is, IMHO, correctly calibrated: push its 235 reference white to the same brightness as TV2's WTW peak, then turn the backlight down so the whites sit at 150 nits and the blacks are deepened as a result too. TV1 shows the movie accurately: it shows all the black and white detail that really exists on the disc, yet has the panel's capabilities maxed out at both the black and the white end. The 3400:1 contrast ratio is preserved, resulting in a punchier picture than TV2's with no detail loss.
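Spelling out the arithmetic with those (made-up) figures:

black = 0.05           # nits
tv2_ref_white = 150.0  # nits at code 235 on TV2
tv2_wtw_peak = 170.0   # nits TV2 reaches on codes above 235

print(tv2_ref_white / black)  # 3000.0 : 1, what real content can actually use
print(tv2_wtw_peak / black)   # 3400.0 : 1, only reachable if WTW signals exist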
What am I missing or getting wrong here? Because no matter how I look at it with the information I have, the whole BTB and WTW talk seems to me like nothing more than this: someone once had their RGB range mismatched somewhere in the signal chain, resulting in a washed-out picture; they turned down the brightness slider to get their blacks black again at 16, and then called the dirty whites they couldn't get rid of "having more detail" because the calibration disc said so, without really understanding what they were actually seeing.
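To sketch the mismatch scenario I mean (again with a hypothetical 0.05-nit black, 150-nit white and a linear ramp rather than a real gamma curve): a 16-235 signal shown on a device expecting 0-255 never reaches true black or peak white, so the whole image looks washed out.

def full_range_display_nits(code, black=0.05, white=150.0):
    # Display assumes 0 = black and 255 = white.
    return black + code / 255.0 * (white - black)

print(round(full_range_display_nits(16), 2))   # video "black" shows up around 9.5 nits
print(round(full_range_display_nits(235), 2))  # video "white" tops out around 138 nits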
Or maybe I am not understanding something? Help me out here.