Originally Posted by limitz
Since my particular TV clips white at contrast levels above 85 (86-100) in SDR mode, will it most likely clip white in HDR mode as well? I don't have a way to test white clipping while in HDR mode. I assume that since the TV defaults the contrast level to the max of 100 when it detects HDR content, white clipping may not occur.
What do you guys think, and what have your experiences been comparing SDR vs. HDR contrast levels (not Dynamic Contrast)?
A white test pattern shows pink when contrast is above 85 (RGB limited, 16-235).
I find the same in SDR.
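If anyone wants to double-check this without a disc, here's a quick sketch of the kind of near-white pattern I mean (my own throwaway Python/Pillow script, not the disc pattern): it draws bars at 8-bit code values 230-254 over a reference-white 235 field. The big assumption is that your player/HTPC passes these video-range (16-235) values to the TV untouched; if anything in the chain converts ranges, the codes won't land on the panel as written.

```python
# Quick sketch: "whiter-than-white" bars (8-bit codes 230-254) on a
# reference-white (235) field. Assumes the playback chain passes
# video-range 16-235 RGB through untouched.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1920, 1080
CODES = list(range(230, 255, 2))  # 230, 232, ... 254

img = Image.new("RGB", (WIDTH, HEIGHT), (235, 235, 235))  # 235 = reference white
draw = ImageDraw.Draw(img)

bar_w = WIDTH // len(CODES)
for i, code in enumerate(CODES):
    x0 = i * bar_w
    # Bars in the middle third of the screen; code-value labels just below.
    draw.rectangle([x0, HEIGHT // 3, x0 + bar_w - 1, 2 * HEIGHT // 3],
                   fill=(code, code, code))
    draw.text((x0 + 5, 2 * HEIGHT // 3 + 10), str(code), fill=(16, 16, 16))

img.save("white_clip_pattern.png")
```

If the set clips, the bars above 235 vanish into the background; if the brighter bars pick up a pink tint, that's presumably one channel clipping before the others, which would match the pink you see on the white pattern.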
For HDR, using the HDR test patterns on Sony 4K discs (key in 7669 at the movie menu to get the test pattern screens), there are some white screens with 10 vertical scales.
One screen has 100, 200, 300 ... 1000 nit scales. I can see all of those with any contrast setting.
The next screen has 1100, 1200, 1300 ... 2000. At 100 contrast, 1900 and 2000 blend together, with or without Dynamic Contrast. With contrast at 99 and Dynamic Contrast off, I can start to see a difference between 1900 and 2000; with Dynamic Contrast on, I have to turn contrast down to 94 to see a difference between them.

On the next screen, the nit scales go 2000, 2500, 3000, 4000 ... 6000 (500-nit steps from 2000 to 3000, then 1000-nit jumps the rest of the way). Even at a contrast of 100 I can see a difference in all scales up to 4000 (5000 and up blend together). With lower contrast, I can see a difference up to 5000 (6000 and up blend together). On these high-nit screens, lowering contrast starts to make the lower scales turn grey-pinkish; I assume that is just the tone mapping dealing with these extreme nit levels.
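For what it's worth, the reason 1900 and 2000 (and everything past 4000) sit so close together is just the shape of the PQ (SMPTE ST 2084) curve: the nit labels on those screens map to 10-bit code values that bunch up hard near the top. Here's a rough back-of-the-envelope sketch; the nits-to-code math is the standard ST 2084 inverse EOTF with 10-bit limited-range quantization, and what the TV's tone mapping then does with those codes is another story.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: nits -> 10-bit limited-range code value.
# Constants come straight from the ST 2084 spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_code(nits: float) -> int:
    y = (nits / 10000.0) ** m1
    v = ((c1 + c2 * y) / (1 + c3 * y)) ** m2  # normalized PQ signal, 0..1
    return round(876 * v + 64)                # 10-bit narrow (limited) range

# Values roughly matching the scales on those test screens
for nits in (100, 500, 1000, 1900, 2000, 4000, 5000, 6000, 10000):
    print(f"{nits:>5} nits -> code {pq_code(nits)}")

# 1900 and 2000 nits come out only about 5 codes apart (~784 vs ~789),
# and 4000-10000 nits all squeeze into the last ~80 codes, so a little
# clipping or tone-mapping compression is enough to make them blend.
```

That's also consistent with a one-click drop in contrast only making a visible difference on the very brightest scales.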
Most professional reviews stick with 100 contrast in HDR. Based on these findings, it seems I should run contrast at 99 to avoid clipping at very high nits, or at 94 if using Dynamic Contrast. Agree? Are there any downsides to not using 100 contrast?