Originally Posted by jimim
I also thought the right end of the chain to a panel is always 4:2:2, so output at 4:2:2 would be the best.
What about Source Direct, like on the other players? Wouldn't the color space output at 4:2:0 then, vs the 4:2:2? Then the panel could do the conversion, if one wanted, vs the player?
When you output at 4:2:2, would it always be 12 bit then, 'cause the HDMI spec doesn't support anything lower? So could one possibly have issues with picture quality if their panel can't handle converting a 12 bit signal to its 10 bit output, since all panels right now are 10 bit?
With 4:2:2 you can select which Color Depth you want to use -- UP TO 12b. You aren't stuck at 12b. 12b is just the upper limit.
You can do the same with 4:4:4 for /24 content (such as UHD disc movies). But for /60 content (live concerts on regular Blu-ray, or SD-DVDs upscaled without DVD 24p Conversion) the HDMI spec won't allow 4K/60 4:4:4 higher than 8b. 10b and 12b would require more bandwidth than the HDMI spec allows.
You can also use 4:2:0 -- up to 12b -- but the HDMI timing specs only allow 4:2:0 for 4K/50 and 4K/60. It isn't legal for any other Resolution/Frame-rate combo. If you try to force 4:2:0 (at any bit depth) for 4K/24 output, or 1080p/60 output, or anything other than 4K/50 or 4K/60, the player will use 4:2:2 instead, since it can't put out a legal HDMI signal with 4:2:0.
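If you want to see roughly why the limits fall where they do, here's a back-of-the-envelope sketch. The figures are my assumptions, not from this thread: the standard CTA-861 4K/60 total timing of 4400 x 2250 pixels, and ~14.4 Gbps of usable video payload out of HDMI 2.0's 18 Gbps link rate (after 8b/10b TMDS coding). Treating 4:2:2 as an average bits-per-pixel figure is also a simplification of how HDMI actually packs it:

```python
# Rough sketch: which 4K/60 formats fit in the HDMI 2.0 budget.
# ASSUMPTIONS (mine): CTA-861 4K/60 total timing (4400 x 2250 incl. blanking),
# and ~14.4 Gbps payload out of the 18 Gbps link after 8b/10b TMDS coding.

PIXEL_RATE = 4400 * 2250 * 60      # pixels per second, including blanking
PAYLOAD_CAP = 18e9 * 8 / 10        # ~14.4 Gbps usable for video payload

def avg_bits_per_pixel(sampling, depth):
    """Average bits per pixel for a YCbCr sampling mode at a given depth.

    4:4:4 carries 3 full components per pixel; 4:2:2 averages 2 (Y plus
    one alternating chroma); 4:2:0 averages 1.5.
    """
    components = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[sampling]
    return components * depth

for sampling in ("4:4:4", "4:2:2", "4:2:0"):
    for depth in (8, 10, 12):
        gbps = PIXEL_RATE * avg_bits_per_pixel(sampling, depth) / 1e9
        verdict = "fits" if gbps <= PAYLOAD_CAP / 1e9 else "too big"
        print(f"4K/60 {sampling} {depth}b: {gbps:5.2f} Gbps ({verdict})")
```

Under those assumptions, 4K/60 4:4:4 fits only at 8b, while 4:2:2 and 4:2:0 fit all the way up to 12b -- which lines up with the limits described above.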
Some basics here:
4:4:4 is jargon that means every pixel has its own, individual color data as well as its brightness data (gray scale).
4:2:2 means color information is present only half as often horizontally as brightness information. There are 3 components to the data: Y for luminance (brightness), plus Cb and Cr for how to color that brightness. This means the data along a given line goes Y, Cb, Y, Cr, Y, Cb, Y, Cr, etc.
4:2:0 means color information is present only half as often as brightness information *BOTH* horizontally and vertically. This is the format the data is stored on the disc, because it takes less space to store each image. This WORKS because the human eye is far less sensitive to resolution in colors than in gray scale.
4:2:0 is used for SD-DVD, Blu-ray, and now UHD.
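The storage savings are easy to count. Here's a quick sketch (the UHD frame size and the arithmetic are mine, just to put numbers on it):

```python
# Samples stored per frame under each sampling scheme, for a UHD frame.
# Illustrative arithmetic only -- real discs also compress heavily on top.

WIDTH, HEIGHT = 3840, 2160
luma = WIDTH * HEIGHT                    # every pixel always gets its own Y

schemes = {
    "4:4:4": luma + 2 * luma,            # Cb and Cr at full resolution
    "4:2:2": luma + 2 * (luma // 2),     # chroma halved horizontally
    "4:2:0": luma + 2 * (luma // 4),     # chroma halved both ways
}

for name, samples in schemes.items():
    print(f"{name}: {samples:,} samples/frame "
          f"({samples / schemes['4:4:4']:.0%} of 4:4:4)")
```

So 4:2:0 stores exactly half the samples of 4:4:4 -- the savings that make it the natural format for the disc.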
But if you think about it, every pixel on the display needs its own color before it can light up! That means before the pixels light up the 4:2:0 coming off the disc has to be raised to 4:4:4. This process -- called Color Upsampling -- happens vertically to go from 4:2:0 to 4:2:2 and horizontally to go from 4:2:2 to 4:4:4. Think of it as a special type of upscaling -- except just for colors! The colors for a given pixel are established by using math on the color values found in the data near that pixel.
So do you want the player to do those two stages of Color Upsampling, or do you want the TV to do it? It's your choice. But it's going to happen one way or another, because every pixel has to have its own color before it can light up: 4:4:4.
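Those two stages can be sketched in a toy example. Real players and TVs use interpolation filters with math on nearby samples; simple sample repetition (nearest neighbor) is used here only to make the vertical-then-horizontal structure visible:

```python
# Toy sketch of the two Color Upsampling stages: 4:2:0 -> 4:2:2 -> 4:4:4.
# Nearest-neighbor repetition for clarity; real hardware interpolates.

def upsample_vertical(chroma):
    """4:2:0 -> 4:2:2: give every line its own chroma row by repeating rows."""
    return [row for row in chroma for _ in range(2)]

def upsample_horizontal(chroma):
    """4:2:2 -> 4:4:4: give every pixel its own chroma sample by repeating."""
    return [[c for c in row for _ in range(2)] for row in chroma]

# A 2x2 block of Cb samples covering a 4x4-pixel area (4:2:0).
cb_420 = [[100, 110],
          [120, 130]]

cb_422 = upsample_vertical(cb_420)       # 4 rows x 2 samples
cb_444 = upsample_horizontal(cb_422)     # 4 rows x 4 samples -- one per pixel
```

After both stages, every one of the 16 pixels has its own Cb value, which is the whole point: the panel can't light a pixel until it does.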
(RGB, by the way, is always 4:4:4, in the sense that every pixel gets its own R, G, and B, values which define both its brightness and its color.)
There are three components per pixel -- either Y, Cb, and Cr, or R, G, and B. The Color Depth is the number of bits used for each component. So Color Depth 12-bits means 36 bits total per pixel.
But if you send YCbCr 4:2:2 or 4:2:0 over the HDMI cable, the color components aren't being transmitted as often as the luminance (Y) components. So 4K/24 4:2:2 12b is lower bandwidth on the HDMI cable than 4K/24 4:4:4 12b. And 4K/24 4:2:0 12b is lower still!
If you don't mind the display doing the Color Upsampling, sending 4:2:0 or 4:2:2 would be the natural choice because that means less bandwidth on the HDMI cable and thus less chance of HDMI signal problems.
But it is possible to SCREW UP Color Upsampling. If the player does it right and the TV does it wrong (in some cases) you'd want the player to do it.
Since we are talking about bugs here, there's little logic to it -- little ability to predict what will happen. You can just try it and SEE if things look better with the player doing the Color Upsampling.
If not, send the lower bandwidth format (fewer HDMI problems) and let the TV do the Color Upsampling.
As an added complication, not all TVs will accept all formats. The Player will sort this out during the HDMI handshake -- sending a legal signal that the TV says it can accept -- trying to stay as close as possible to what you TOLD the player to send out. And you can use the on-screen Info displays from the Player to see what's actually happening -- what's actually being sent to the TV.
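The logic of that fallback looks something like the sketch below. This is purely hypothetical pseudologic of my own -- the function names, the capability set, and the preference ordering are invented for illustration, not how any actual player's firmware works:

```python
# Hypothetical sketch of the handshake fallback described above: the player
# learns what the TV accepts (advertised via its EDID) and sends the closest
# legal format to what you asked for. All names/orderings here are mine.

def pick_output(requested, tv_supported, fallbacks):
    """Return the requested format if the TV accepts it; otherwise the
    first supported fallback (ordered from closest to the request)."""
    if requested in tv_supported:
        return requested
    for candidate in fallbacks:
        if candidate in tv_supported:
            return candidate
    raise RuntimeError("no mutually supported format")

tv = {"4:2:2 12b", "4:2:0 10b", "4:4:4 8b"}   # example TV capabilities
choice = pick_output("4:4:4 10b", tv,
                     ["4:2:2 12b", "4:2:0 10b", "4:4:4 8b"])
print(choice)
```

Here the TV can't take the requested 4:4:4 10b, so the player quietly falls back -- which is exactly why the on-screen Info display is the only way to know what's really going over the cable.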
Then there's Color Depth. SD-DVD and Blu-ray are on disc as 8b. Current UHD discs -- which use HDR-10 for their HDR -- use 10b.
So why use 12b? Where would the extra bits COME FROM? Well where they come from is ROUNDING in the video processing. As such the real difference in 12b vs 10b *SHOULD* be subtle. But now we are back to bugs. Some displays will handle 12b better than 10b -- just a characteristic of how they process video. Other displays will receive 12b and immediately strip off the low order bits turning it into 10b. For those displays, sending 12b is a waste of bandwidth.
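What "stripping off the low order bits" means is easy to show. A sketch (my own illustration -- real displays may round rather than truncate):

```python
# Sketch of a display discarding the low-order bits of a 12b signal.
# Two 12-bit samples that differ only in the bottom two bits -- the bits
# produced by rounding in the player's processing -- become identical at 10b.

def truncate(sample_12b, target_bits=10):
    """Drop the low-order bits of a 12-bit sample (truncation, not rounding)."""
    return sample_12b >> (12 - target_bits)

a = 0b100110101101
b = 0b100110101110
print(truncate(a), truncate(b))   # same 10-bit value for both
```

For a display that does this, the extra 2 bits per component never reach the panel -- sending 12b to it is just extra HDMI bandwidth for nothing.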
Confused yet? Then I suggest you start with the AUTO settings for Resolution, Color Space, and Color Depth until you have time to experiment and see if you find reason to prefer something different.