Originally Posted by 3DMamper
Hey everyone. Just received my K8500. I watched Exodus and The Martian, both HDR movies.
Initially a bit disappointed. They are both fake 4K movies, so the resolution and sharpness are not actually better but worse than the best 4K streams I'm used to watching on Netflix, Amazon and YouTube.
HDR-wise, I can't actually see much difference between the HDR and the Blu-ray version.
What is obvious is the slightly washed-out dark details and blacks. That's definitely disappointing. So I would like to ask: how can I disable HDR? Having it forced upon us is annoying. I know HDR would be great on a 2016 OLED, but on my X940C the blacks suffer.
What I am impressed with in the UHD versions is the lack of motion artifacts in fights and high-speed motion, whereas the normal Blu-ray version can look really badly compressed in such scenes.
Another disappointment for me was that The Martian in fake 4K is nowhere near as good as watching it in 3D.
First off, if your blacks are suffering, this is usually indicative of attempting to play HDR content on an incompatible display. Since the 940C seems to be HDR compatible based on my quick Google search, I would speculate that there is something wrong in your setup: either settings on the player, settings on the display, or a combination of both. I'm not familiar with your set's HDR-related settings, sorry. I'm sure someone else can help with that.
I will also say that if you feel THE SAME CONTENT viewed on a 4K streaming service is sharper than its Ultra HD Blu-ray counterpart, I find that very surprising. You have to compare apples to apples; by this I mean the same movie under the same conditions with the same picture settings (calibrated if possible). Any comparison to other content is meaningless. Every film/video/documentary is created by different people who have varying degrees of skill and different views on how they want their content to look once released to consumers. Not all content is filmed and shot with the sole purpose of being very bright/high contrast/ultra sharp with everything in focus. You're right that there are a lot of high-quality streaming videos and movies that can be almost demo worthy (I like time-lapse photography). I'm sure if you could view this content on Ultra HD Blu-ray the difference would be significant.
The theory behind HDR and WCG is basic, and when done properly through the whole signal chain it will most definitely wow anyone who sees it.
My personal favorites right now are Life of Pi, The Lego Movie, and The Revenant.
Here are a couple of things to keep in mind about HDR (for comparison to SDR) if you would like to indulge me. Most of this is mentioned or covered in some way during the interview with Joe Kane.
1. HDR10 uses a new form of gamma, or Electro-Optical Transfer Function (EOTF), known as ST 2084. This EOTF was designed for DARK ROOM viewing. Most of the video information within HDR10 exists from 0-100 nits. Many people have become accustomed to viewing standard dynamic range material over the years with a less-than-desirable gamma setting and higher-than-recommended light output, in viewing conditions other than a dark room. If HDR in its current form is viewed in a moderate to bright room, the specular highlights will not be able to overcome ambient light enough to have the impact intended. Some owners use the dynamic contrast settings to overcome this. I believe this results in an inaccurate image, but I have no tests or data to prove this at the moment, so for now it's to each their own.
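If anyone is curious about the "most of the information lives in 0-100 nits" claim, it falls straight out of the math. Here's a quick Python sketch of the PQ curve using the constants published in SMPTE ST 2084 (just an illustration, not anything from a real video pipeline):

```python
import math

# SMPTE ST 2084 (PQ) constants, as published in the spec
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(v):
    """Normalized PQ code value (0-1) -> absolute luminance in nits."""
    vp = v ** (1 / M2)
    return 10000.0 * (max(vp - C1, 0.0) / (C2 - C3 * vp)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Absolute luminance in nits -> normalized PQ code value (0-1)."""
    yp = (nits / 10000.0) ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

print(round(pq_inverse_eotf(100), 3))  # -> 0.508
print(round(pq_eotf(1.0)))             # -> 10000
```

In other words, roughly half of the entire PQ code range is spent on 0-100 nits; everything from 100 up to the format's 10,000-nit ceiling is squeezed into the other half. That's why the curve was designed around dark-room viewing.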
2. The vast majority of colors found in nature exist within the Rec.709 color space; most of what's beyond it is man-made, with the biggest exception being parts of green (color volume, which relates to luminance, is a separate thing). How much you notice the expanded color is going to be very content specific and may be subtle. Not to mention, many people prefer setting their display to a "preference" picture vs. a "reference" picture, and being used to viewing oversaturated and inaccurate colors, they may notice a smaller difference.
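For a rough sense of how much bigger the wide gamut is on paper, you can compare the areas of the two gamut triangles using the primary chromaticities published in ITU-R BT.709 and BT.2020. (Caveat: comparing triangle areas in CIE 1931 xy is a crude measure and not perceptually uniform, so treat the number as a ballpark, not a claim about what you'll actually see.)

```python
# CIE 1931 xy chromaticities of the R, G, B primaries,
# from ITU-R BT.709 and ITU-R BT.2020
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(pts):
    """Shoelace formula for the area of a gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(REC2020) / triangle_area(REC709)
print(round(ratio, 2))  # -> 1.89
```

So the BT.2020 container is nearly twice the area of Rec.709 in xy terms, with most of the extra territory in the greens and cyans, which lines up with the "biggest exception being parts of green" point above. Whether the content actually uses that extra space is another matter.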
3. Luminance and Chromaticity (ST 2086) - Right now, luminance and tone mapping are done based on STATIC metadata. This means that the display takes into account the widest color gamut and the brightest scene within the content (the whole movie) and remaps based on that. The majority of the content will have a greater compression of dynamic range and color gamut than would be necessary based on the display's capabilities. HDR10 will one day be capable of scene-by-scene DYNAMIC metadata; Samsung recently showed a demo of a 2016 model display utilizing this feature, and Dolby Vision's system already does this. I don't know if this will require a hardware update, software update, or both. As always, it will be at the manufacturer's discretion if and when certain products receive updates.
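To make the static-vs-dynamic trade-off concrete, here's a toy Python sketch. Real displays use their own proprietary tone curves; I'm just using an extended Reinhard rolloff as a stand-in, and the 4000/600-nit numbers are made up for the example. The trade-off it shows is the real one, though: with static metadata the whole movie's peak sets the curve, so every scene pays for the brightest one.

```python
def tone_map(nits, metadata_max, display_peak=1000.0):
    """Toy tone mapper: extended-Reinhard rolloff where metadata_max
    lands exactly on display_peak. NOT any display's actual curve."""
    if metadata_max <= display_peak:
        return nits  # everything already fits; pass through untouched
    x = nits / display_peak
    w = metadata_max / display_peak
    return display_peak * x * (1 + x / w**2) / (1 + x)

# A 500-nit highlight on a 1000-nit display:
# STATIC: the movie's (hypothetical) 4000-nit peak drives the curve
print(tone_map(500, metadata_max=4000))  # -> 343.75 (dimmed)
# DYNAMIC: this scene only peaks at 600 nits, so nothing is compressed
print(tone_map(500, metadata_max=600))   # -> 500 (unchanged)
```

Same pixel, same display: the static curve dims it by roughly a third just because some other scene in the movie is bright, while per-scene metadata leaves it alone. That's the whole appeal of dynamic metadata in one example.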