Originally Posted by morphinapg
Yes it is, depending on the TV. Think of it like lowering the "exposure" of the HDR image so that 100 nits gets mapped to 17.5 nits and 4000 nits gets mapped to 700 nits, which allows you to uncap the brighter highlight detail and color. You need a perfectly dark environment for your eyes to be able to adapt to an image that dark, and it would probably take a while to adjust to that dark of an image, but once your eyes adjust, the relative contrast of the HDR highlights will look nearly as brilliant as they would on a 4000 nit screen, plus there would be no ABL as there might have been before.
Remapping 4000 nits to 700 may be too dark of an image for most people, and it might even be too dark to adapt to, not sure on the threshold on that, but LG did recently modify their HDR Game mode on the TV I own, and this results in essentially half the exposure, meaning it rolls off above 2000 nits (original tone mapping rolled off above 1000 nits), even though the screen is a 650 nit panel. This results in an image less than a third as bright as designed (and therefore a lot of complaints), but if you watch it in a dark enough environment for your eyes to adjust, it can feel like you're watching a 2000 nit OLED with no ABL. It's quite impressive on games and some movies with brighter nit highlights like Batman v Superman for example.
For most people, who watch TV with some light in the room, this wouldn't work. It can only work if the ambient light is low enough to account for the lowered exposure on the image. Since HDR is designed with 5 nit ambient light in mind, you would need <0.875 nits of ambient light for the above 4000-to-700 nit solution to work right.
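The remapping described in the quote works out to a single constant scale factor applied in linear light, which is how all of the quoted numbers line up. A minimal sketch to check the arithmetic (assuming a purely linear scale in nits, which is what the quoted figures imply):

```python
# Exposure-style tone mapping as one constant linear-light scale factor.
# Assumption: the remap is linear in nits, as the quoted numbers imply.
def remap_nits(nits, peak_source=4000.0, peak_display=700.0):
    """Scale luminance so peak_source maps to peak_display."""
    return nits * (peak_display / peak_source)

print(remap_nits(100.0))   # ~17.5 nits, matching the quoted example
print(remap_nits(4000.0))  # ~700 nits
# HDR grading assumes about 5 nits of ambient light; the same factor
# gives the ambient level needed for the darker image to look correct:
print(remap_nits(5.0))     # ~0.875 nits, the threshold quoted above
```

The same factor applied to the assumed 5 nit grading environment is where the <0.875 nit ambient-light figure comes from.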
The one thing you are not taking into account is that most of the luminance range is well below 750 nits on HDR Blu Ray discs. The metadata/dynamic contrast is processed internally by the display, so all of that depends on how the display processes the metadata.
Now you can take this one step further and calibrate the grayscale/gamma using a modified version of LG's 2017 OLED release notes. http://www.lg.com/us/support/product...OLED%20TVs.pdf
Use Calman's Levels editor set to 8-bit RGB to convert the percentages, then use Calman's HDR10 workflow and LG's High/Low expert settings. Of course you will need a pattern generator to set the 2017 LG OLED to HDR mode and send the correct triplet window patterns, plus a well profiled color meter like a K10-A if you want to be exact.
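The percentage-to-code-value conversion the Levels editor does is straightforward to sketch. A rough illustration (the function name is mine, not Calman's; it assumes standard 8-bit quantization with video range 16-235 and full range 0-255):

```python
# Convert a percent stimulus level to an 8-bit RGB code value, the kind
# of conversion done in Calman's Levels editor. Names are illustrative.
def percent_to_8bit(pct, full_range=False):
    """Map 0-100% stimulus to 0-255 (full/PC) or 16-235 (video range)."""
    if full_range:
        return round(pct / 100.0 * 255)
    return round(16 + pct / 100.0 * (235 - 16))

print(percent_to_8bit(100))                   # 235, video-range white
print(percent_to_8bit(0))                     # 16, video-range black
print(percent_to_8bit(100, full_range=True))  # 255, full-range white
```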
If you would quote my whole post, it explains what the floor is on a 4000 nit Blu Ray movie and how all this works. As I said, it is 0.005 nits; the floor on most 1000 nit Blu Ray discs is 0.0005 nits.
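Those floor values are the mastering display's minimum luminance carried in the HDR10 metadata; the PQ signal itself is absolute, so any code value maps to a fixed luminance regardless of the disc. A sketch of the SMPTE ST 2084 (PQ) EOTF, using the constants from the published spec:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal (0-1) -> luminance in nits.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Decode a normalized PQ signal value to absolute luminance (nits)."""
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))  # 10000.0 nits, the PQ ceiling
print(pq_eotf(0.0))  # 0.0 nits
```

Near black the curve is extremely fine-grained, which is why floors like 0.005 or 0.0005 nits are meaningful distinctions at all.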
Whatever LG TV you have, it would not be able to come anywhere close to 2000 nits, so it is not possible for your TV to roll off above 2000 nits. If you are using some external processor and have it set to start rolling off at 2000 nits, then that is at least one of the issues you are seeing.
Also, as I said, even if you are in a light controlled room, the PQ with a 2017 LG OLED viewing a well mastered HDR Blu Ray will look at least as bright as an SDR Blu Ray. The only thing you will miss in a non-dark room is some shadow detail, assuming your display has a black level of 0.005 nits for most 4000 nit Blu Ray HDR discs, or 0.0005 nits for 1000 nit Blu Ray video.
For a 4000 nit Blu Ray video, the only thing that will probably be improperly displayed is something like a very bright flash.
As for game mode on an LG 2017 OLED, it isn't any brighter (higher nits) than you can set HDR cinema mode. Game mode simply uses a faster transfer function for video games.
Since you use video games: are you setting your source to PC range (0-255) or video range (16-235)?
If you are using PC range, that is also an issue when viewing Blu Ray HDR video discs, and is probably causing some of your poor PQ.
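The range mismatch is what produces crushed or washed-out blacks: the source quantizes to one range while the display decodes the other. A quick sketch of the two 8-bit mappings (standard conversions, not anything LG-specific):

```python
# A range mismatch crushes or lifts blacks: the source sends one
# quantization range while the display expects the other.
def full_to_video(code):
    """Map a full/PC-range (0-255) 8-bit code to video range (16-235)."""
    return round(16 + code * 219 / 255)

def video_to_full(code):
    """Map a video-range (16-235) code back to full range, clipping."""
    return min(255, max(0, round((code - 16) * 255 / 219)))

print(full_to_video(0))    # 16: full-range black lands at video black
print(full_to_video(255))  # 235: full-range white lands at video white
# If the display decodes the wrong range, black sits at code 16 and
# looks washed out, or everything below code 16 gets crushed.
```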
When it comes to streaming HDR video, you can't use that as a reference.
I hope this helps.