Originally Posted by ConnecTEDDD
The points (with 10-bit code) are not actually PQ-EOTF code-value points but gamma-based points; the PQ-EOTF tracking of the tone mapping is then added on top.
I noticed something strange.
Can we say that the 19th WB point signals which of the three tone curves the TV is using?
If that's true, then something weird is going on, at least with madVR passthrough on a PC.
In theory, the three tone curves and their corresponding 19th WB points:
- 0-1000 nits: 669 (C8), 669 (B8)
- 1001-4000 nits: 696 (C8), 692 (B8)
- 4001-10000 nits: 713 (C8), 705 (B8)
Now, the peak nits settings in the madVR pixel shader with "output HDR format" (see the sketch after this list):
- 669 (C8), 669 (B8): 100-760 nits
- 696 (C8), 692 (B8): 761-2500 nits
- 713 (C8), 705 (B8): 2501-10000 nits
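To make that mapping concrete, here is a minimal Python sketch of what seems to be happening (the function name and structure are my own for illustration, not anything from madVR or LG firmware):

```python
# Illustrative only: map the peak/mastering nits value in the metadata to the
# tone-curve bucket and the 19th WB point readings observed above.
def tone_curve_bucket(peak_nits: int) -> dict:
    if peak_nits <= 760:        # madVR 100-760 nits -> lightest curve
        return {"curve": "0-1000", "wb19_C8": 669, "wb19_B8": 669}
    elif peak_nits <= 2500:     # madVR 761-2500 nits -> middle curve
        return {"curve": "1001-4000", "wb19_C8": 696, "wb19_B8": 692}
    else:                       # madVR 2501-10000 nits -> strongest curve
        return {"curve": "4001-10000", "wb19_C8": 713, "wb19_B8": 705}

print(tone_curve_bucket(760))   # 19th WB point reads 669/669
print(tone_curve_bucket(4000))  # 19th WB point reads 713/705
```

Note that the switchover points observed with madVR (760/2500) don't appear to line up with the theoretical 1000/4000 boundaries.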
I also noticed (!) when using passthrough:
- the 19th WB point changes based only on the mastering luminance value and *not* on the MaxCLL value in the metadata
Manni checked the metadata with an HDFury and said the GPU output is correct.
That can only mean one thing: LG screwed this up as well ...
If the above is true, then the worst-case scenario is content like "Guardians of the Galaxy (2014)": mastering luminance 0.005/4000; MaxCLL/MaxFALL 577/512
- although MaxCLL is only 577 nits, the mastering luminance is 4000 nits, so LG uses the strongest tone curve and not the lightest (see the sketch below)
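For clarity, here is a hedged sketch of the selection behaviour this implies (the helper name and thresholds are assumptions based on the madVR observations above, not LG's actual code): the curve appears to be picked from the mastering display max luminance, while MaxCLL is ignored.

```python
# Illustrative only: which curve the TV appears to pick, based purely on
# mastering luminance; MaxCLL never enters the decision.
def curve_from_nits(nits: int) -> str:
    if nits <= 760:
        return "lightest"
    elif nits <= 2500:
        return "middle"
    return "strongest"

# Guardians of the Galaxy (2014): mastering luminance 4000 nits, MaxCLL 577 nits
print(curve_from_nits(4000))  # "strongest" -> what LG appears to use
print(curve_from_nits(577))   # "lightest"  -> what MaxCLL alone would suggest
```

If this is right, any 4000-nit-mastered disc gets the most aggressive tone curve even when no pixel in the content goes above ~600 nits.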
Can somebody double-check this, preferably with a standalone player and an HDFury device if you have one?