Originally Posted by thepoohcontinuum
B4) Why do TV / Movie studios do chroma subsampling?
For the most part, it’s because of modern limitations in data storage capacity and transmission bandwidth.
For example, all blurays are subsampled down to 4:2:0 at the mastering studio (this is according to the official bluray spec). As a result, most bluray titles have an average video bitrate of 25 Mbps. Assuming a 2 hour movie, you need a storage medium that can hold ~22.5 GBytes, which a bluray disc can do no problem. Now let's do 4:4:4 (no subsampling) on that same 2 hour movie. Going from 4:2:0 to 4:4:4 quadruples the amount of information required. So that 25 Mbps becomes 100 Mbps, and 22.5 GBytes becomes 90 GBytes; not even a dual-layer bluray disc can hold that much data.
If a bluray disc can’t hold it, don’t expect satellite/cable/fiber to transmit this kind of data either.
Note: the numbers used in this example are under review; will update when straightened out
Those numbers are rather wrong.
Note: for all of this I assume 8 bit component depth.
4:4:4 RGB or YCbCr has 3 full-resolution components, so 8*3 = 24 bits per pixel total. One 1920x1080 frame is therefore 1920*1080*3*8 bits = 49.8 Mbit, which at 24 fps gives a data rate of 49.8*24 = 1.2 Gbps.
Chroma subsampling only affects the two chroma planes, not luma. Luma stays at full resolution (1920*1080), while in 4:2:0 each chroma plane is half resolution in each dimension (1920/2 * 1080/2). One frame of 4:2:0 is therefore (1920*1080 + 1920/2 * 1080/2 + 1920/2 * 1080/2) * 8 bits = 24.9 Mbit, exactly half of 4:4:4. At 24 fps this gives a data rate of 597 Mbps.
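If you want to sanity-check the arithmetic above yourself, here's a quick back-of-the-envelope script (assumes the same parameters as above: 8-bit components, 1920x1080, 24 fps):

```python
# Uncompressed frame sizes and data rates for 4:4:4 vs 4:2:0.
W, H, BITS, FPS = 1920, 1080, 8, 24

# 4:4:4 - three full-resolution planes.
bits_444 = W * H * 3 * BITS
# 4:2:0 - full-resolution luma plus two quarter-resolution chroma planes.
bits_420 = (W * H + 2 * (W // 2) * (H // 2)) * BITS

print(bits_444 / 1e6)        # 49.7664  Mbit per frame (~49.8)
print(bits_420 / 1e6)        # 24.8832  Mbit per frame (~24.9, exactly half)
print(bits_444 * FPS / 1e9)  # 1.1943936 Gbps (~1.2)
print(bits_420 * FPS / 1e6)  # 597.1968  Mbps (~597)
```

Note the 2x ratio, not the 4x the quoted post claims: only the two chroma planes shrink, so 4:2:0 carries 1.5 samples per pixel versus 3 for 4:4:4.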
But comparing uncompressed bandwidth numbers is only meaningful for HDMI, not Blu-ray or HDTV transmission as those are always compressed. And over HDMI, only 4:2:2 and 4:4:4 are supported, not 4:2:0.
It's a little hard to compare how different subsamplings affect compression, but chroma carries much less information than luma regardless of subsampling and thus compresses extremely well, to the point that the great majority of the bandwidth goes to luma data. With a modern codec like H.264, 4:4:4 does not take significantly more bandwidth than 4:2:0, and in some cases (e.g. upsampling a 4:2:0 source to encode as 4:4:4) it can actually take less bandwidth, due to the larger prediction modes available to 4:4:4 and the better interpolation filter (H.264 only).
The main reason movies are all 4:2:0 is simply that that's how things were done long before MPEG (it's the original lossy compression, by a factor of two). The main reason I can think of for modern TVs to subsample internally is to reduce the processing power required for all the image processing they do. Or maybe for a PenTile matrix, but no TVs use one as far as I'm aware.
Originally Posted by thepoohcontinuum
When HDMI audio extensions are enabled, *something* causes 4:4:4 to fail. I have no idea if this is a TV issue, or a video card issue, or a HDMI issue, or an EDID specification issue, or a combination.
RGB support (4:4:4) is required for all HDMI devices. This is the native format of the framebuffer in computers, making YCbCr support completely optional. There is absolutely no reason for the video card to convert to YCbCr, let alone subsequently subsample to 4:2:2 before sending the data over HDMI. The only place the conversion and subsampling would logically take place is in the TV.
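To make concrete what that conversion-plus-subsampling step involves, here's an illustrative sketch. It uses the full-range BT.709 coefficients and naive pairwise chroma averaging; the function names are mine, and real hardware would use studio-range (16-235) scaling and proper filtering, so treat this as a toy model only:

```python
def clamp(v):
    """Keep a component within 8-bit range."""
    return max(0.0, min(255.0, v))

def rgb_to_ycbcr_bt709(r, g, b):
    """Full-range BT.709 RGB -> YCbCr for 8-bit components."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = clamp(128 + (b - y) / 1.8556)  # 1.8556 = 2 * (1 - 0.0722)
    cr = clamp(128 + (r - y) / 1.5748)  # 1.5748 = 2 * (1 - 0.2126)
    return y, cb, cr

def subsample_422(row):
    """4:2:2: keep Y for every pixel; each horizontal pair shares one Cb/Cr."""
    out = []
    for i in range(0, len(row) - 1, 2):
        (y0, cb0, cr0), (y1, cb1, cr1) = row[i], row[i + 1]
        out.append((y0, y1, (cb0 + cb1) / 2, (cr0 + cr1) / 2))
    return out

# A neutral grey pixel keeps both chroma components at the 128 midpoint,
# which is why grey scales survive subsampling untouched.
row = [rgb_to_ycbcr_bt709(128, 128, 128), rgb_to_ycbcr_bt709(128, 128, 128)]
print(subsample_422(row))
```

The point the code makes visible: the luma channel is never touched, so any loss from 4:2:2 is confined to color detail between horizontally adjacent pixels.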
I'll have to try my TV later, I'm honestly quite surprised that getting 4:4:4 from TVs is hard.