
1 - 6 of 6 Posts

Registered · 4 Posts · Discussion Starter · #1
Hi,

There is a topic I have searched for quite some time in standards documents, technical papers and so on, but I was not able to find a concrete answer to it:

What is the black level of an SDR TV supposed to be, if you want to stay as close as possible to the creator's intent?
This may seem like a rather dumb question at first, but let me explain how it arose.

1. Before OLED/MLED and so on, there was no display technology, not even the CRT, that had an absolute zero black level.
2. Standards documents define the luminance range for SDR as 0.01 nits to 100 nits (Rec.709 / gamma 2.4 / 100 cd/m²).

One could argue that in the mastering history of TV content, there shouldn't exist any material that was mastered with a zero-nit black level,
at least from the beginning of TV production up to the point where OLED, MLED, etc. came into play, ca. 2012.
Please bear in mind that a black level of 0.01 nits looks very different from a black level of zero nits.

So, are we supposed to set our black levels to 0.01 cd/m² for SDR, even on an OLED TV, and reserve a level of zero cd/m² only for newer
standards such as HDR?

sincerely
 

Registered · 664 Posts
The black level performance has absolutely nothing to do with HDR vs SDR.
The black level will be presented by, and within the capabilities of, the display, so an OLED will obviously present the SDR picture better than your 10-year-old LCD ever could.
When you calibrate for SDR, you calibrate so that data level 16 is absolute black, i.e. you should not see it, and data level 17 should be only barely visible.
Data levels for video (a.k.a. video levels) range from 16 to 235; these are the available data levels for 8-bit SDR content, associated with Rec.709/BT.709.
The black level does not change for HDR, but the data values do, because HDR is 10-bit, with a broader range of data levels that increases the gradation.
But the values are not lower, nor does that yield a 'blacker' black; it will, however, give you increased shadow detail, as there are more data levels available.
The nits are automatically lower on an OLED, since it is able to produce deeper blacks, but the black data value for 8-bit SDR is still 16.
Data level 16 is black, technically a 0-nit level; again, no difference between SDR and HDR. Hope this helps.
So if you calibrate an OLED for black at 16, you will get a much lower nit reading than from black at 16 on an LCD.
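To put some numbers on that, here is a minimal sketch (Python, assuming a BT.1886 EOTF with 100 cd/m² peak white and a configurable black level; the numbers are illustrative, not calibration targets) of how 8-bit video-level code values map to displayed luminance:

Code:
# Sketch: map 8-bit limited-range (video-level) code values to luminance
# via the BT.1886 EOTF. Assumes Lw = 100 cd/m2 peak white; lb is the
# display's black level (0.01 cd/m2 as an LCD-like example, 0.0 for an OLED).

def bt1886_luminance(code, lw=100.0, lb=0.01, bits=8):
    """Return luminance in cd/m2 for a limited-range code value."""
    lo, hi = 16 << (bits - 8), 235 << (bits - 8)      # video levels 16..235 (x4 for 10-bit)
    v = max(0.0, min(1.0, (code - lo) / (hi - lo)))   # normalized signal, clipped
    a = (lw ** (1 / 2.4) - lb ** (1 / 2.4)) ** 2.4    # BT.1886 gain
    b = lb ** (1 / 2.4) / (lw ** (1 / 2.4) - lb ** (1 / 2.4))  # BT.1886 black offset
    return a * max(v + b, 0.0) ** 2.4

for code in (16, 17, 20, 235):
    print(code, round(bt1886_luminance(code, lb=0.01), 4),   # 0.01 nit floor
                round(bt1886_luminance(code, lb=0.0), 4))    # zero floor (OLED-like)

With a 0.01 cd/m² floor, code 16 lands at roughly 0.01 nits and code 17 only slightly above it; with a zero floor, code 16 is true zero, while code 235 stays at about 100 nits either way.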

The black level will vary from movie to movie, depending on the mastering: some have raised blacks, some crushed, some perfect.
The best you can do is calibrate your display/TV for SDR so that data level 16 is as black as your display/TV is capable of rendering.
If you are disappointed by the black level performance of your display/TV, you need to replace it with a better one; the calibration itself does not differ.
 

Registered · 4 Posts · Discussion Starter · #3
Hi,

Thank you very much for your reply.
The statements you made are all technically correct.
Of course the black level should always be set as low as possible, and there is no doubt that SDR and HDR are mastered with a zero-nit black level in mind in today's facilities. All current mastering monitors, like the ones from Sony, etc., have this ability.

But my question was:
Should we set our SDR TV to a higher (brighter) black level, even on an OLED TV, given that nearly 70 years of TV content was produced on monitors (even mastering monitors) that definitely did NOT have a zero-nit black level, and therefore the image looked rather different to the creator on those monitors than it would on today's displays with perfect blacks?
No one doubts that it was always intended that value 16 be true black, but it simply was not, due to the technical limitations of the time (which covers most of the history of content creation).
There are even standards bodies that specify the SDR black level as 0.01 nits, even today.
A zero black level looks far better, but is it what resembles the look of the original intent during creation?

sincerely
 

Registered · 664 Posts
For those cases, I think you need to measure the content's black floor, if you believe your display/TV needs adjustment; I see no other way, sorry.
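If it helps, a minimal sketch of that measurement (Python with NumPy, assuming the frames are already decoded into 8-bit luma planes; the decoded_y_planes() helper in the usage line is hypothetical, you would feed it from ffmpeg or whatever decoder you use):

Code:
# Sketch: estimate a title's black floor by scanning decoded luma planes
# for the lowest code value that actually occurs. Assumes 8-bit video levels,
# so anything at or below 16 is nominal black; a floor of e.g. 20 means the
# grade never goes darker than a raised black.
import numpy as np

def black_floor(y_planes):
    """y_planes: iterable of 2-D uint8 arrays (one luma plane per frame)."""
    floor = 255
    for y in y_planes:
        floor = min(floor, int(y.min()))
    return floor

# Usage (decoded_y_planes() is a hypothetical frame-yielding helper):
# print("lowest luma code value:", black_floor(decoded_y_planes("movie.mkv")))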
 

Registered · 434 Posts
Hi,

1. Before OLED/MLED and so on, there was no display technology, not even the CRT, which had absolute zero black levels.
And that is what you got wrong. BT.470, BT.601, BT.709 and BT.2020 (BT.2020 without BT.2100, that is SDR) are all scene-referred. It does not matter that there was no such display tech; what matters is that there was such film and such ProPhoto RAW material. You are also forgetting printers, whose black can be made very good, even black-body simulator paints. And that is the difference from sRGB and Adobe RGB 1998, which are display-referred... In the linked chart, the sRGB standard (in violet) shows that when light intensity drops below 0.1 nit or rises above 100 nits, there is no more differentiation in the color value, while with HDR (ST.2084 PQ) https://docs.microsoft.com/en-us/windows/win32/direct3ddxgi/images/hdr-bright-and-dark.png it is 0.001 to 10000 (actually 0.0001; that is a typo there).

What you are also confusing is the availability of deeper blacks in HDR with "A BLACK" itself. Black is 16, 128, 128 in YCbCr, 16, 16, 16 in limited-range RGB and 0, 0, 0 in full-range RGB. Indeed, something like 1, 0, 1 may well be 0.1 nit, but there is no need for black to be 0.1; one can just use the deepest black possible. That may be needed to emulate black print. But yes, color spaces like Adobe RGB that accurately DEFINE the black point should be reproduced with that black point, see Wikipedia. CMYK too; read up on black point emulation in color management.
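For reference, a small sketch of that PQ curve (Python; the ST 2084 constants are the published ones, but treat the snippet as illustrative only):

Code:
# Sketch: the ST 2084 (PQ) EOTF, which maps a normalized signal value in
# [0, 1] to absolute luminance. The curve spans roughly 0.0001 to 10000 cd/m2,
# versus the ~0.01-to-100 cd/m2 range a gamma 2.4 SDR grade targets.
m1 = 2610 / 16384          # published PQ constants
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e):
    """e: normalized non-linear signal in [0, 1]; returns luminance in cd/m2."""
    ep = e ** (1 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

print(pq_eotf(0.0))    # 0.0     -> absolute black
print(pq_eotf(1.0))    # 10000.0 -> PQ peak
print(pq_eotf(0.01))   # ~0.002 cd/m2, deep shadow near the bottom of the curve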
 

Registered · 434 Posts
No one doubts that it was always intended that value 16 be true black,
Except for standards that define the black point, like Adobe RGB 1998. But yes, the BT standards were supposed to assume infinite contrast on a perfect-black display, since they are scene-referred...
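To illustrate that black point emulation idea, a minimal sketch (Python), assuming a simple luminance-only, ICC-style black point compensation; real color management does this on XYZ values, not a single luminance channel:

Code:
# Sketch: luminance-only black point compensation. Linearly rescales
# luminance so that the source black point (e.g. 0.01 cd/m2, an old mastering
# monitor) lands on the destination black point (e.g. 0.0 cd/m2 on an OLED),
# while white stays at white. Swap src/dst to go the other direction.

def bpc(l, src_black=0.01, dst_black=0.0, white=100.0):
    """Map luminance l (cd/m2) from the source range to the destination range."""
    scale = (white - dst_black) / (white - src_black)
    return dst_black + (l - src_black) * scale

print(bpc(0.01))   # source black -> 0.0 (destination black)
print(bpc(100.0))  # white stays at 100.0
print(bpc(1.0))    # shadows are pulled down very slightly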
 