Originally Posted by BigCoolJesus
Thanks for the response!
I guess where I am confusing myself is how WCG is implemented with UHD. I've always set my TVs to "Normal" for color range, and the same goes for my external devices, such as a Blu-ray player or Xbox One. But with UHD, the HDMI 2.0 spec, and all the new specifications for TVs, where does WCG come into play now? Do I have to manually turn it on in the TV for select material? Or will a UHD source with WCG-encoded content force it on?
I know, I must sound really lost and n00bish right now. Normally I grasp new concepts within a post or two of help. But still having trouble grasping where WCG falls into everything with UHD content now.
To add my two cents, after attempting some self-calibrations for UHD Blu-ray HDR and reading up on a lot of material here:
The consensus so far is that the TV automatically uses a wider color space than Rec.709 when playing UHD Blu-rays (which all carry HDR and a wider color gamut), and it also seems to map "okay" within the Rec.2020 container that UHD Blu-rays use. I'm not sure whether anyone knows if HDR programming on Amazon etc. uses a wider color gamut, but if it does, the TV should map to it automatically to an extent.
If you turn Wide Color ON in the TV settings, you will be forcing the TV to use colors in line with (and wider than) Rec.709, but NOT in line with Rec.2020 (the color container the UHD HDR standard uses). Colors will certainly explode when you turn it on while watching UHD Blu-rays, but that's not the intent of HDR. As you've probably read, most natural colors exist within the Rec.709 gamut; colors beyond it tend to be man-made. So colors in UHD Blu-rays shouldn't look like you've suddenly set Color to 100 on the TV. Some people may like this "wow" factor, and there's nothing wrong with that, but it's not the intent if you are after accuracy.
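For anyone curious just how much wider Rec.2020 actually is, here's a quick back-of-the-envelope sketch (my own, not anything official) that compares the triangle areas the two gamuts cover in the CIE 1931 xy chromaticity diagram, using the published primary coordinates from the BT.709 and BT.2020 specs. Triangle area in xy space is a crude proxy for gamut size, not a perceptually uniform measure, but it shows why forcing wide color onto Rec.709-graded content over-saturates everything:

```python
# CIE 1931 xy chromaticity coordinates of the R, G, B primaries,
# as published in ITU-R BT.709 and BT.2020.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(primaries):
    """Area of the gamut triangle in xy space (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709 = triangle_area(REC709)
a2020 = triangle_area(REC2020)
print(f"Rec.709 area:  {a709:.4f}")   # ~0.1121
print(f"Rec.2020 area: {a2020:.4f}")  # ~0.2119
print(f"Rec.2020 covers about {a2020 / a709:.1f}x the xy area of Rec.709")
```

In xy terms Rec.2020 is roughly 1.9x the area of Rec.709, so a display that simply stretches Rec.709 colors toward wide-gamut primaries is pushing every color well past where the grade intended it.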
Furthermore, I have not seen or heard of anyone getting an accurate (obviously clipped) Rec.2020 calibration/color mapping on this set with Wide Color enabled, unless perhaps by using an external color processor (and good luck finding an affordable one capable of HDR).
If you are looking for moderate accuracy and don't have equipment to generate HDR test patterns, I'd try doing a good standard Rec.709 calibration, keep the TV on that picture preset, then pop in a UHD Blu-ray and let the TV make its automatic HDR adjustments based on your prior calibration. See what you think then.
Even pros are still tackling proper calibration for HDR - so don't feel like a n00b.