Originally Posted by sgupta
It's interesting that XDR doesn't skew HDR/Dolby Vision in the same way. I guess from your previous explanation, it just allows the extra data that's there to fit in the widest possible range without needing to introduce any errors, whereas with SDR there just isn't enough data to do this without introducing skewing?
For Local Dimming, I'll stick to Medium for everything SDR I think. I take it High is generally recommended for HDR/Dolby Vision?
I think I'm getting there with settings... My old OLED is officially in its new home so there's no going back now. =oP
The sun, a lamppost and UI text in SDR may all look equally bright (100% white), but in reality their intensities differ by quite a margin. This is where HDR comes to the rescue: it can describe the varying luminosity levels with far greater precision than the SDR format.
XDR, when enabled in SDR mode, tries to restore the punch of highlights and light sources as they were at the time of capture (by a camera, or as rendered in a game). But because SDR content is already compressed to fit the more limited luminosity range of older TVs, the algorithm can only make an educated guess; how would it know the difference between a white lamppost and the sun? This is what I mean by XDR being incorrect: it brightens all the white parts by equal amounts, making UI text sear like the sun.
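To make the guessing problem concrete, here's a minimal sketch (with assumed nit values and a simplified gamma encode; this is not Apple's actual XDR algorithm) of why SDR encoding destroys the information a highlight-boosting algorithm would need:

```python
# Assumed illustration: SDR clips scene luminance to a fixed peak,
# so wildly different light sources end up at the same code value.

SDR_PEAK_NITS = 100.0  # classic SDR reference white

def encode_sdr(scene_nits: float) -> int:
    """Clip to the SDR range, then gamma-encode to an 8-bit code value."""
    clipped = min(scene_nits, SDR_PEAK_NITS)
    return round(255 * (clipped / SDR_PEAK_NITS) ** (1 / 2.4))

# Three very different real-world sources (nit values are assumptions)...
sun, lamppost, ui_text = 10_000.0, 1_000.0, 100.0

# ...all land on the same code value after SDR encoding:
print(encode_sdr(sun), encode_sdr(lamppost), encode_sdr(ui_text))
# prints "255 255 255"
```

Since all three arrive as code value 255, any highlight boost applied afterwards has no choice but to brighten them by the same amount, UI text included.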
When HDR is active, XDR acts almost like a secondary Brightness slider, but it really should be kept at High: it's no longer about trying to recover luminosity levels lost beyond the SDR range, but about properly mapping the TV's luminosity range to the HDR standard. It's good to know that HDR data values are specified as absolute: a real-world 100 nit light source, if captured properly and stored in the data as 100 nit, should light up on the TV screen at a measured 100 nits (with a light probe), as long as the TV can output the necessary luminosity. For this to map correctly you need to set XDR to High and most likely Brightness to max. In reality, though, not a lot of the content we consume tries to be accurate, and contrast and colors are often artistically tweaked. So if the image feels too bright, I would rather adjust the Brightness slider to my liking than lower XDR, to preserve the relationships between the HDR luminosity values.
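The "absolute" part is baked into how HDR signals are encoded. Here's a sketch of the SMPTE ST 2084 (PQ) inverse EOTF used by HDR10 and Dolby Vision; the constants are from the ST 2084 spec, and the point is that a given nit level always maps to the same signal value regardless of the display:

```python
# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance (0..10000 nit) -> PQ signal level (0..1)."""
    y = nits / 10_000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

# A properly mastered 100 nit source always encodes to the same
# signal level (~0.508), so a calibrated TV can reproduce it as
# exactly 100 nits when measured with a probe.
print(round(pq_encode(100.0), 3))
# prints "0.508"
```

This is the opposite of SDR, where signal values are relative and the final brightness depends on whatever the Brightness slider happens to be set to.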
Finally, the only reason I would lower XDR in HDR mode would perhaps be to reduce blooming a bit, but then you're kind of negating the whole point of buying an LCD for HDR and its superior brightness; barring burn-in concerns, you could have gone with an OLED instead.
Edit: Regarding Local Dimming, use whichever of High or Medium you prefer, as the difference is minor. Medium reins in some blooming at the expense of a little peak brightness, but its effect is very subtle.