Originally Posted by ray0414
You see, you are going off "theory" and opinion.
Your responses suggest you aren't approaching this with a very "scientific" mindset, which means that perhaps this is a fruitless conversation. But one more go....
No, I'm not simply going off "theory" and "opinion." I'm going off how display technology actually works: facts you can look up yourself, test yourself, and which are routinely tested by professionals.
You are aware that black levels are always touted as important by AV professionals, reviewers, and educated AVS members, right? That's why contrast ratio and black level measurements - the deepest black a display is capable of - have featured so often in display tests going back decades. Now... before UHD came along, what would be the point of measuring the very deepest black a display is capable of, if it weren't relevant to the black levels the display could produce with existing DVDs/Blu-Rays? Of course it's relevant. Reviewers don't like seeing high black level measurements on displays because those measurements tell you about the black levels you will see with actual content! They tell you what black level you will get when some portion of the image on a DVD or Blu-Ray is putting out 0 IRE, or "no" light.
That's why you can use EXISTING Blu-Ray test discs to MEASURE the black levels of your display! Blu-Ray can already produce the "blackest blacks" your display is capable of.
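To make this concrete, here's a minimal sketch of the arithmetic behind those reviews. The meter readings are made up for illustration, but the method is the standard one: measure a full-black (0 IRE) pattern and a full-white (100 IRE) pattern from a test disc, then divide.

```python
# Hypothetical meter readings in nits from Blu-Ray test patterns.
# These specific numbers are illustrative only, not from any real display.
black_nits = 0.05   # full-black (0 IRE) pattern -- the display's floor
white_nits = 120.0  # full-white (100 IRE) pattern

# On/off contrast ratio: peak white divided by deepest black.
contrast_ratio = white_nits / black_nits
print(f"Contrast ratio: {contrast_ratio:.0f}:1")  # 2400:1
```

The key point: that 0 IRE pattern already drives the display to the deepest black it can physically produce, so no new source format can measure "lower" on the same panel.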
If Blu-Ray can already reach the limits of black level that a display is even capable of...how is it you suggest that UHD can produce "even darker" black levels?
If your idea made sense, then all the technical calibration done to displays, based on how the technology is understood to work, would never have made sense. You are operating under a different paradigm than the people who actually build and calibrate our displays.
Can you not see the problem there?
Originally Posted by ray0414
Im going off real world testimony, and other DOZENS of people who have this real world testimony that agree that the black levels are better with HDR.
And have any of these been supported by measurements of the black level, showing it has actually deepened?
Do you not understand how easily our eyes are tricked with regard to contrast and light levels? Ever see any of the many contrast effects on our vision?
The classic checker-shadow illusion: wouldn't your eyes "tell you" that the A square is darker than the B square (even though it's not)?
Don't your eyes tell you that the right square is darker than the left square?
But they are not. We know very well from empirical experience like the above - not from some "funky theory" - that how we perceive tonal shades is heavily influenced by context. If we shift the contrast relationships around an area of a given brightness, its brightness will appear to change to our eyes. Brightening the bright areas of an image will do this, as will shifting gamma, which changes the brightness of the tones bordering the dark areas. As I said, when I go to a higher gamma, I often perceive my black levels going lower, even though I know they remain at the same actual measured level.
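The gamma point is easy to verify numerically. A sketch, using a simple power-law gamma (real display EOTFs are more complicated, but the principle holds): raising gamma pushes the near-black tones down, which can make blacks *look* deeper, while actual black stays exactly where it was.

```python
# Applying two display gammas to the same normalized (0..1) signal.
# Black (0.0) maps to 0.0 under any power-law gamma; only the tones
# above black shift, which is what changes *perceived* black level.
def apply_gamma(signal, gamma):
    return signal ** gamma

black = 0.00
shadow = 0.10  # a near-black tone

for g in (2.2, 2.4):
    print(f"gamma {g}: black -> {apply_gamma(black, g):.4f}, "
          f"shadow -> {apply_gamma(shadow, g):.4f}")
```

At gamma 2.4 the shadow tone comes out darker than at gamma 2.2, while black is 0.0 in both cases: the measured floor never moved.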
Originally Posted by ray0414
the question is whether its from the higher bitrate/encoding or HDR itself. I think it may actually be a combination.
See... you seem actually interested in the answer, so it doesn't help to start off on the wrong foot. You are starting with a conclusion - "UHD sources produce lower black levels than Blu-Ray" - that was arrived at via faulty methods of inference and dubious assumptions.
UHD can't make the *same display* produce lower black levels than it could with Blu-Ray. Good calibration already sets the black level of a display as low as it will go (with some caveats for different display technologies). In other words, the usual Rec. 709 calibration uses the lowest black level your display can manage as a base, and builds from there toward the brightest areas. What UHD does is allow more information to be encoded: you get better shadow detail (though not lower black levels), and you also get picture information that lets much brighter highlights maintain detail. So now you can calibrate the bright end of the display to extend much brighter than displays could go until now, without losing detail in the source.
So think of UHD as allowing higher brightness, not lower darkness. That's why the emphasis is always on "specular highlights" such as the glint of sun off metal, for a more realistic image.
(And, having seen UHD on some LCDs: as was to be expected, UHD clearly didn't help the LCDs magically achieve lower black levels. The brightness of highlights was certainly increased, but dark, low-APL scenes still showed less-than-stellar black levels... because those are set by the display, not the source.)