Originally Posted by bull3964
It's not going to cancel things out.
Think of it this way.
You have a disc mastered to 4000 nits and you have a display that can hit 800 nits.
HDR Optimizer Off:
TV has to tone map everything between 801 and 4000 nits down.
HDR Optimizer On:
The player tone maps everything from 1001-4000 nits down into a 1000-nit signal; the TV then maps everything from 801-1000 nits down.
It's a simplistic way of looking at it, but that's the theory. Which one is better is anyone's guess at this point. There may not even be a visible difference, or it may only be in edge cases that you can see one.
This does make sense. I read somewhere that all OLED displays can handle up to 1000 nits with their own tone mapping, and everything beyond that just gets clipped. I suspect, though, that content between 801 and 1000 nits might show up quite differently depending on the panel, since we'd have two completely different tone mapping curves from two different companies in play: the TV's at 801-1000 nits and the player's at 1001-4000 nits. At this point, that's likely insignificant to 99.99% of the population. I think people (like myself) are just happy to see highlights again instead of having them blown out.
BTW, where are the nit values for the settings documented? I couldn't find them. This article seems to suggest the OLED setting is built for 800 nits.
However, 1000 just makes more sense to me, given that tone mapping up to 1000 nits is built into the TV.
500: Basic Luminance LCD or Projector
1000: Middle or High Luminance LCD
1500: Super High Luminance LCD
Is it confirmed that it's 1000 nits for OLED, the same as the Middle or High Luminance LCD setting? Or is the OLED setting somehow different?
I note my LG B7 hits well over 800 nits with a 2% window. The LG C8 hits well over 900.
Would my XBR X800E be categorized as a Basic Luminance LCD?