Originally Posted by Manni01
I'm not sure you read properly, or maybe you replied before I edited my post:). HDMI 1.3 (and later) prevents the source from sending XvYCC information if the display is not compatible. It's part of the HDMI specs. So there cannot be incorrect hue.
What you get is just the rec709 "core". Again, this is my understanding. I'm not an expert, and if you don't know, I guess no one does:).
PS: I agree that KMO seems to know what he's talking about; see what he says in this post:
"The HDMI specification says that the source can't feed xvYCC to the TV, unless the TV declares that it knows about xvYCC".
"...an HDMI source is not permitted to send xvYCC data to a non-xvYCC display. The source is required to reduce it to standard 601/709".
There is no "core" and "extension". This concept applies to DTS, but not to xvYCC.
Please understand that xvYCC is a very tricky technical solution. It looks just like normal BT.709 content; unless you know what to look for, you would never know it's actually xvYCC. In practice that means a non-xvYCC-compatible source does not know whether content is xvYCC or not, and as a result it treats it as simple BT.709 content. So again (for the 3rd time): with a non-xvYCC-compatible player and display, all colors outside the valid BT.709 range are simply clipped, and because clipping changes the ratio between R, G and B (which is what defines a color's hue), this *WILL* (to the best of my knowledge) result in incorrect hues.
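To make the clipping point concrete, here's a quick throwaway Python sketch of my own (the BT.709 decode coefficients are the standard ones, but the YCbCr sample values are just made up for the example). It decodes a sample whose blue channel lands below 0 (that's where xvYCC hides its extra saturation), clips it the way a non-xvYCC display would, and compares the hue angle before and after. The hue angle is only a crude proxy for what you'd see on screen, but it shows that the clipped color is no longer the color that was encoded:

[code]
import colorsys

def bt709_ycbcr_to_rgb(y, cb, cr):
    # limited-range 8bit codes -> normalized R'G'B' (standard BT.709 matrix)
    ey  = (y  -  16) / 219.0
    ecb = (cb - 128) / 224.0
    ecr = (cr - 128) / 224.0
    r = ey + 1.5748 * ecr
    g = ey - 0.1873 * ecb - 0.4681 * ecr
    b = ey + 1.8556 * ecb
    return (r, g, b)

# made-up but legal YCbCr codes whose decoded blue lands below 0
rgb     = bt709_ycbcr_to_rgb(140, 40, 180)
clipped = tuple(min(1.0, max(0.0, c)) for c in rgb)

print("decoded R'G'B':", [round(c, 3) for c in rgb])      # roughly (0.93, 0.53, -0.16)
print("clipped R'G'B':", [round(c, 3) for c in clipped])  # roughly (0.93, 0.53, 0.00)
print("hue before clip:", round(colorsys.rgb_to_hsv(*rgb)[0] * 360, 1), "deg")
print("hue after clip :", round(colorsys.rgb_to_hsv(*clipped)[0] * 360, 1), "deg")
[/code]

The hue comes out around 38° before the clip and around 34° after it: a small shift in this made-up example, but the hue does change, and an xvYCC-aware chain would not have to clip at all.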
All the things you mention may apply if the source device is actually xvYCC aware. But that is not what I've been talking about all along. I've been talking about source devices and displays which are *not* xvYCC aware.
Originally Posted by R Harkness
I'm lost on this lust for beyond-8-bit and beyond-rec. 709 stuff.
I guess I just don't know enough about it. I know theoretically (more from audio) the advantages of expanding bit depth. I can think of "extended dynamic range" and "extended color space" as being good things. But in terms of the actual difference in an image, I'm not sure what I'd be seeing between a current-spec Blu-Ray and the 4K UHD Blu-Ray specs you guys would like to see.
I'm intrigued, but a bit baffled, therefore, about why increasing those specs would lead to a more obvious visible difference than the jump to 4K resolution. Would movies actually look that much different, and how?
Most consumers leave their color on crappy settings, often already exaggerating color, but can still easily see the difference between HD and SD resolution on their displays. If the difference you guys are talking about is more in the realm of something only "someone into calibrating displays" will notice, I don't see how it would drive sales at all. But, again, I don't know. Are there any illustrations or photos that would give an idea of the difference we would see?
There are 3 different things:
(1) A bigger gamut means that your display will be able to produce more life-like colors. You can get a hint of this right now if you switch your display from BT.709 to "wide". That way you get exaggerated colors, which is not good. However, you'll get some colors you'd never get with a properly calibrated BT.709 display. If content is actually recorded and encoded with a bigger gamut, the colors are not exaggerated anymore, but the color palette is bigger, and those things which have more saturated colors in real life will then actually make use of it. If I had to describe this in one short sentence, I'd say a bigger color gamut increases the "color palette" the movie can use to draw its images, without exaggerating anything. (The first sketch after point (3) below gives a rough idea of how much bigger the palette gets.)
(2) Higher bitdepth is sometimes referred to as increased dynamic range, but that is actually a different thing. A higher dynamic range means that you extend the color palette once again (see (1)), but instead of adding deeper color saturations, you add the ability to display brighter pixels. E.g. imagine looking at a sunset on your projector and not being able to look into the sun because it would blind you. So higher dynamic range means the content gets the ability to draw blindingly bright pixels, without losing the ability to show all the other (non-blindingly bright) pixel values. However, this is *NOT* what increased bitdepth is all about. Or rather: if you want higher dynamic range, you also want to increase the bitdepth, so you don't reintroduce banding. But these are really two separate things.
(3) Increased bitdepth means most of all that you can draw finer color and brightness nuances. Imagine a grayscale test image which in 1080p resolution shows a gray gradient from left to right. The leftmost pixel would be black, the rightmost pixel would be white, and the pixels in between would slowly go from dark to light gray. Now if you do the math: with 8bit video there are only 256 different brightness steps available (0-255), and since content is usually encoded in limited range, it's actually only 220 brightness steps (16-235). So if you draw a grayscale in 1080p resolution from the left side of the screen to the right side, every 1920/220 = 8.7 pixel columns will have the same value. Basically the leftmost 9 pixels will be black (RGB value 16,16,16), then the next 8-9 pixels will be RGB 17,17,17, and so on. If you have a big projection setup, this gradient is not really smooth to our eyes. We can see light banding in such a test pattern; banding means we can see the vertical lines between the different shades of gray in the grayscale. This is a limitation of 8bit encoding: you only have 220 different shades of gray, and that's not enough for our eyes. We can see the steps between those 220 grayscale steps.
There are ways to work around the issue. The easiest solution is to use some form of dithering, which hides/masks the steps and usually fools our eyes pretty well (the third sketch after this list illustrates the idea). However, adding dithering to the video signal before encoding makes compression more difficult, which reduces compression efficiency and raises the needed bitrate. And if you don't go crazy with the bitrate, the compression will actually remove some of the dithering again, which re-introduces the banding.
The proper solution is to increase the bitdepth. If you go to 10bit, you have 877 different shades of gray available (64-940). That's *MUCH* better than 220. There have been studies on how many steps the human eye can distinguish; from what I remember, some people can see up to 11-12 bits of grayscale steps. So maybe even 10bit is not enough yet, but it's a hell of a lot better than 8bit.
So what benefit does 10bit bring practically? It brings smoother images with no visible banding steps. And as a nice side effect, dithering isn't needed anymore, which means compression actually produces the same (or better) image quality compared to 8bit while needing less bitrate. (The second sketch after this list runs the 8bit vs. 10bit numbers.)
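To put some very rough numbers on all of this, here are a few quick throwaway Python sketches (my own illustrations, nothing official). The first one compares the size of the BT.709 color triangle with the BT.2020 triangle (the wide gamut being discussed for UHD) in the CIE xy diagram. The primary coordinates are the published ones; the triangle area is only a crude proxy for "how much bigger the palette is", not a perceptual measure:

[code]
def triangle_area(points):
    # shoelace formula for the area of a triangle given three (x, y) points
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# CIE 1931 xy coordinates of the R, G, B primaries from the two specs
bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, a2020 = triangle_area(bt709), triangle_area(bt2020)
print("BT.709 triangle :", round(a709, 4))
print("BT.2020 triangle:", round(a2020, 4))
print("ratio           :", round(a2020 / a709, 2))  # comes out around 1.9x in xy
[/code]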
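The second sketch just re-runs the gradient arithmetic from point (3): how many shades of gray limited-range 8bit and 10bit give you, and how wide each step of a full-width 1920-pixel gradient would be:

[code]
width = 1920  # pixel columns in a 1080p gradient

levels_8bit  = 235 - 16 + 1   # limited-range 8bit:  220 shades (codes 16-235)
levels_10bit = 940 - 64 + 1   # limited-range 10bit: 877 shades (codes 64-940)

print("8bit :", levels_8bit,  "shades, about", round(width / levels_8bit, 1),  "pixel columns per shade")
print("10bit:", levels_10bit, "shades, about", round(width / levels_10bit, 1), "pixel columns per shade")
[/code]

So 8bit gives steps almost 9 pixels wide, while 10bit gets them down to about 2 pixels, which is why the 10bit gradient looks smooth.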
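And the third sketch illustrates the dithering trick: take a very shallow gradient (only one code value of difference across the whole screen width) and quantize it once plainly and once with a little random noise added first. Plain rounding produces one hard step in the middle; with dither the 16s and 17s are mixed, so the local average still follows the ramp, which is what fools the eye:

[code]
import random
random.seed(1)  # fixed seed just so the numbers are repeatable

width = 1920
ramp  = [16 + i / (width - 1) for i in range(width)]             # shallow gradient: 16 -> 17

hard = [round(v) for v in ramp]                                   # plain 8bit rounding
dith = [round(v + random.uniform(-0.5, 0.5)) for v in ramp]       # rounding with dither noise

# average code value in each quarter of the screen
quarters = [slice(0, 480), slice(480, 960), slice(960, 1440), slice(1440, 1920)]
print("hard  :", [round(sum(hard[q]) / 480, 2) for q in quarters])   # roughly 16, 16, 17, 17
print("dither:", [round(sum(dith[q]) / 480, 2) for q in quarters])   # roughly 16.1, 16.4, 16.6, 16.9
[/code]

As said above, the downside is that this noise is expensive to compress, and if the encoder smooths it away, the banding comes right back.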
Originally Posted by mark haflich
By going to 10 bits, we add an additional 256 shades of gray.
Nope. Every added bit doubles the number of steps, so two extra bits quadruple them. Going from 8 to 10 bits raises the shades of gray from 220 (16-235) to 877 (64-940), not by just 256.