Originally Posted by GregLee
I didn't say I don't see highlights. I do see them, for Exodus at least. But I also see highlights for non-HDR UHD videos. The highlights I see for non-HDR UHD are just as bright as those for HDR-UHD videos, given that my user controls are optimally adjusted to give nice highlights for each type of video.
What I'm trying to say here, and evidently not being clear enough, is that this is exactly what should be expected. The extra brightness of HDR is supplied by the TV, not the video. It can get really bright because of the LEDs Samsung used -- it's a physical thing. The brightness is not pumped in through the video signal. It may be licensed by what is in the video, but since you can set Contrast, Backlight, Dynamic Contrast as you please, you can make highlights as bright as you want them, up to the limit of what your TV can supply (600 nits, for me, perhaps 1000 nits for those with JS9500s). That's regardless of whether the metadata in the source video says it's "in the standard".
Whether the highlights look natural or not is something else, and there, HDR source should make a difference. But not in how bright the highlights get.
The highlights you see on non-HDR content are probably created by Smart LED set to High, if you are using it, which is a pseudo-HDR method. With real HDR, the TV reads the metadata and tone-maps the picture to its own capabilities, so if your TV can reach higher nits, the metadata tells it how bright to display each scene. The TVs are much more complex than you think: in HDR mode the metadata drives the TV's settings. HDR uses a completely different gamma curve from traditional Rec. 709, and the metadata controls this. Look at the colour space settings in HDR mode and the numbers are all kinds of crazy; now watch a regular UHD movie and the colour space numbers are very tame in comparison. You are wrong: the metadata governs the gamma curve, the luminance/brightness, the black and white levels, and the colour gamut.
You seem to need some more understanding of HDR, what it brings, and how it works differently from anything we have had before. This is a giant step in PQ. Here are some articles and videos to help with the knowledge.
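To show just how different the HDR "gamma curve" is, here is a sketch (my own illustration, not from this thread) of the SMPTE ST 2084 PQ EOTF that HDR10 uses, next to a simple power-gamma SDR curve. The PQ constants are from the ST 2084 specification; the SDR function is a simplified BT.1886-style stand-in for comparison only.

```python
# Sketch: SMPTE ST 2084 (PQ) EOTF used by HDR10, vs. a simplified
# power-gamma SDR curve. PQ constants are from the ST 2084 spec.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(v: float) -> float:
    """Map a normalised PQ code value v in [0, 1] to absolute luminance in nits."""
    p = v ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def sdr_eotf(v: float, peak_nits: float = 100.0, gamma: float = 2.4) -> float:
    """Simplified relative SDR gamma, scaled to a nominal 100-nit display."""
    return peak_nits * v ** gamma

# Full-scale PQ code reaches 10,000 nits; full-scale SDR only 100 nits:
print(pq_eotf(1.0))    # 10000.0
print(sdr_eotf(1.0))   # 100.0
# Half-scale PQ code is only ~92 nits: PQ packs most code values into
# the darker range, which is why the "numbers look crazy" next to SDR.
print(pq_eotf(0.5))
```

The key point is that PQ encodes absolute luminance (up to 10,000 nits), while an SDR signal is relative to whatever the display's peak happens to be.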
WHAT DOES HDR STAND FOR?
As keen photographers amongst you will know, HDR is short for High Dynamic Range. Essentially, it refers to an image that displays a greater range of brightness and luminosity than “normal” pictures – so dark areas of the picture will look darker while, at the same time, bright areas will look brighter. You’ll also see more luminance detail in shadowy, darker areas of the picture.
The images above, provided by Sony, demonstrate the difference between standard and high dynamic range pictures.
Read more at http://www.stuff.tv/features/why-its...DWyB3Yo8koA.99
HDR TVs explained
You can think of HDR as the next step after 4K Ultra HD. At least that is how the industry is positioning it. 4K is “more pixels” – four times as many as HD – whereas HDR is “better pixels”. There is obviously much more to it than that.
HDR is short for high dynamic range, which implies that you are currently watching standard dynamic range. It is impossible to show you what it looks like – your monitor is not capable of HDR – but consider the simulated photo below (from Dolby), with HDR on the right side.
In essence, HDR is about brighter whites and deeper blacks, with more detail at each end. HDR is about reproducing the world around us on a display. Current displays are not capable of reproducing the world as it really is, because the world is more than just pixels; light is just as important. That might sound confusing, but we will get back to it.
Imagine being able to see bright sunlight reflections on metallic surfaces, or all the stars in the sky on a perfectly black canvas, or even have your TV reproduce the colors of the world around you, such as Coca-Cola red (your current TV cannot reproduce this color).
There is quite a bit of confusion around HDR and for good reason. There are several players in the industry that are trying to make HDR happen, and you might already have heard about Dolby Vision. There is also an open HDR standard that has been adopted by Blu-ray and other distribution channels. TV manufacturers have come up with even more names.
For example, Samsung calls its HDR-capable TVs "SUHD" and refers to the system that enables it as "Peak Illuminator". Panasonic refers to it simply as HDR but calls a panel that supports it a "Super Bright Panel". Sony refers to it as HDR, and to be sure you should look for "X-tended Dynamic Range" in the specification sheet. All of this is just marketing. Other players such as Dolby talk about Dolby Vision, which actually has more elements to it than just HDR.
Why does HDR/WCG need metadata?
HDR / WCG metadata per program
• In order to quantify the creative color volume used, it is necessary to have information on a per-program basis so that the correct conversion can be made between BT.2020 and BT.709 for both SDR & HDR content.
• A draft SMPTE standard ST2086 describes the Mastering metadata required for SDR & HDR.
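The BT.2020-to-BT.709 conversion the slide mentions is, at its core, a 3×3 matrix applied in linear light. Below is a minimal sketch of my own; the matrix values are the commonly published BT.2020-to-BT.709 primaries conversion, not figures from this slide deck.

```python
# Sketch: converting linear-light RGB from BT.2020 primaries to BT.709.
# Matrix values are the commonly published BT.2020 -> BT.709 conversion.
# Out-of-gamut results must still be clipped or gamut-mapped afterwards,
# which is exactly why per-program metadata helps.

BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def convert_2020_to_709(rgb):
    """Apply the 3x3 matrix to a linear (R, G, B) triple."""
    return tuple(sum(row[i] * rgb[i] for i in range(3))
                 for row in BT2020_TO_BT709)

# White maps to white (each row sums to ~1.0)...
print(convert_2020_to_709((1.0, 1.0, 1.0)))
# ...but a fully saturated BT.2020 green lands outside BT.709
# (negative red and blue components):
print(convert_2020_to_709((0.0, 1.0, 0.0)))  # (-0.5876, 1.1329, -0.1006)
```

The negative components for saturated colors are the whole problem: the matrix alone cannot decide how to bring them back into gamut, which is where the mastering metadata comes in.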
HDR / WCG metadata per scene
• In an outdoor scene, the full dynamic range of the mastering monitor is likely to be used, with an average brightness level typically 2~5x higher than in an indoor scene.
• How to map this to BT.709 100-nit content, where the average brightness of the indoor and outdoor scenes is approximately the same?
• Requires different mappings for each scene.
• Can this mapping be done only by analysis of the HDR content in the player or TV?
• The simple answer is no; this requires a trim pass by a colorist to ensure the creative intent is maintained.
• Metadata can be created during the course of this process to steer the player or TV, ensuring that the optimum mapping is performed on a scene-by-scene basis to maintain creative intent.
• Scene-based metadata is required to maintain creative intent when mapping from HDR to SDR.
• Additional metadata elements can also be included to optimize mapping performance.
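To make the scene-by-scene steering concrete, here is a toy sketch of my own (not from the slides) of how a per-scene peak-luminance value carried in metadata could shape a simple HDR-to-SDR tone curve. The curve is an extended Reinhard operator; real players and TVs use more sophisticated, often proprietary curves, but the principle is the same.

```python
# Toy illustration: per-scene metadata steering an HDR -> SDR tone map.
# The scene's peak luminance (from metadata) sets the white point of an
# extended Reinhard curve, so each scene gets its own mapping.

SDR_PEAK = 100.0  # nits, nominal SDR reference white

def tone_map(luminance_nits: float, scene_peak_nits: float) -> float:
    """Map an HDR luminance to SDR nits; the scene peak maps exactly to 100 nits."""
    x = luminance_nits / SDR_PEAK
    w = scene_peak_nits / SDR_PEAK        # normalised white point from metadata
    y = x * (1 + x / (w * w)) / (1 + x)   # extended Reinhard operator
    return y * SDR_PEAK

# The same 500-nit highlight is mapped differently depending on whether
# the scene metadata says the scene peaks at 1,000 or 4,000 nits:
print(round(tone_map(500.0, 1000.0), 1))   # 87.5
print(round(tone_map(500.0, 4000.0), 1))   # 83.6
# A scene analysed in isolation cannot know this; that is why the slides
# say a colorist's trim pass, captured as metadata, is required.
```

Note the design property that makes this curve convenient: whatever the scene peak is, it lands exactly on 100 nits, so highlights are compressed rather than clipped.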