Originally Posted by mrtickleuk
I know what pitch black means. To repeat, HDR is mastered and intended to be viewed in a dark, but not fully black, environment. That's a fact, not an opinion.
You can of course watch in fully black conditions if you prefer, or direct sunlight if you prefer. You can do whatever you like for your taste. I already made that clear.
Originally Posted by skschatzman
As mrtickleuk stated, pitch black and a dark room are not the same thing.
It is very important that we break this assumption as we go into the age of infinite contrast ratios and rapidly increasing luminance output. The HDR formats are designed to support up to 10,000 nits for the future to grow into; right now we are seeing peaks of 3,000 nits on some TVs.
As we get closer to 10,000 nits, watching content mastered at that intensity in a pitch-black room could pose a risk of serious or permanent eye damage. This is the main reason HDR is mastered in a dark room with low ambient lighting: it allows our eyes to adjust to sharp changes in contrast much more easily. Let's be careful not to spread false information.
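(Side note on the 10,000-nit figure quoted above: that ceiling comes from the SMPTE ST 2084 "PQ" transfer function used by HDR10 and Dolby Vision. A minimal Python sketch of the PQ EOTF, with the constants taken from the standard:)

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1]
# to absolute luminance in cd/m^2 (nits).

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    """Convert a normalized PQ signal value (0.0-1.0) to nits."""
    e = code ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y

for v in (0.5, 0.75, 0.9, 1.0):
    print(f"PQ code {v:.2f} -> {pq_eotf(v):8.1f} nits")
# Code value 1.0 maps to exactly 10,000 nits -- the format's hard ceiling.
```

Code value 1.0 lands at exactly 10,000 nits, which is the ceiling the quote refers to; today's ~3,000-nit panels sit well below the top of the curve.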
I do not mean any disrespect by adding the following information (especially since "that's a fact, not an opinion" was mentioned previously):
A piece of paper on a sunny day can be measured reflecting light at 1,000 nits. ISO standards require a luminance ratio of at least 10:1 between background and text, meaning that in the worst-case scenario, a sepia-toned page in a book with black text would probably measure around 10,000 nits in direct sun; in practice, with common "whiter" pages, this is much, much higher. The argument that "800+ nits would cause severe eye injury" is far from true. Yes, reading for hours under direct sunlight is not ideal, but 800-1,000 nits from a TV is relatively negligible compared to what our eyes are subjected to on a daily basis. Our eyes adapt to changes in light very easily, well before they become detrimental to our physiology.
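For anyone who wants to sanity-check the paper figures: a matte page behaves roughly like a Lambertian reflector, so its luminance is approximately L = ρE/π (reflectance times illuminance over pi). A quick sketch, where the lux and reflectance values are ballpark assumptions on my part, not measurements:

```python
import math

def paper_luminance_nits(illuminance_lux: float, reflectance: float) -> float:
    """Luminance of a matte (Lambertian) surface: L = rho * E / pi."""
    return reflectance * illuminance_lux / math.pi

# Ballpark daylight illuminance levels (assumptions, not measurements):
scenes = {
    "overcast day  (~5,000 lux)": 5_000,
    "bright shade (~15,000 lux)": 15_000,
    "direct sun  (~100,000 lux)": 100_000,
}
for name, lux in scenes.items():
    white = paper_luminance_nits(lux, 0.8)  # whitish page
    sepia = paper_luminance_nits(lux, 0.4)  # darker sepia page
    print(f"{name}: white page ~{white:,.0f} nits, sepia ~{sepia:,.0f} nits")
# Even well short of direct sun, an ordinary page reaches or exceeds
# the 800-1,000 nit peaks of today's HDR TVs.
```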
Staring at a full moon (about 2,500 nits) will not cause any damage at all. Looking at a typical low-power (20 W) tungsten filament light bulb is still very tolerable at around 10,000,000 nits. Part of what makes a bright sky beautiful is its luminance (about 7,000 nits; an illuminated white cloud can be around 10,000 nits), and what about the sun itself during a beautiful sunset (about 600,000 nits)? You can still look directly at the solar disk during a sunset without suffering any eye damage.
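To make the scale of those numbers concrete, here is the same list expressed as multiples of a TV's peak output (the luminance figures are the ones quoted above; the 800-nit peak is an assumption for illustration):

```python
TV_PEAK_NITS = 800  # assumed peak of a typical HDR TV today

everyday_sources = {          # figures quoted in the post above
    "full moon":               2_500,
    "blue sky":                7_000,
    "sunlit white cloud":     10_000,
    "sun at sunset":         600_000,
    "tungsten filament":  10_000_000,
}
for name, nits in everyday_sources.items():
    print(f"{name:>20}: {nits:>10,} nits  ({nits / TV_PEAK_NITS:,.0f}x TV peak)")
```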
So if a panel in our living room could display a beautiful sunset scene that makes us squint, or explosions of beautiful fireworks at night (roughly 20,000 to 1,000,000 nits against a very dark background in real life), it would most certainly be amazing and would NOT cause any eye strain. I honestly don't know whether this line of argument, that LEDs and QLEDs suck because their only selling point is high brightness that attracts gullible consumers, and that high nit values can actually hurt our health, comes from ignorance or just from a desire to fight. And again, I don't want to offend anyone; I just want people to have more information.
I for one am enjoying the perfect blacks of OLED. Dark scenes do look great! I do miss the higher nit values of brighter scenes from my QLED, especially when 90%+ of the screen is bright. I must admit that what brought me to share this info with the thread is the use of the word "intended". A filmmaker might want us to see a sunset at 50 nits (at the theater), and an image purist might want to replicate that with the OLED Light setting at 27/100. I'm fine with both of those positions; as has been said many times in this thread, we're free to do whatever we want, and picture quality, and most importantly its enjoyment, are completely subjective. But arguing that the luminance values of OLED are ideal and that anything brighter is dangerous...? Or that HDR is already too bright to be watched in a pitch-black room...?