Originally Posted by mpgxsvcd
Are you stating this because UHD @ 60 FPS 4:4:4 10 bit would exceed the 18 Gbps maximum throughput of HDMI 2.0? What would the bandwidth be for UHD @ 60 FPS 4:4:4 10 bit?
3840 x 2160 = 8,294,400 pixels per frame
8,294,400 pixels per frame x 30 bits per pixel = 248,832,000 bits per frame (Note: it's 30 bits per pixel because we are using 10 bits per primary and there are 3 primaries)
YUV 4:4:4 has no chroma subsampling, so there is no reduction in bits per frame
248,832,000 bits per frame x 60 frames per second = 14,929,920,000 bits per second = 14.9 Gbps (SI-units) or 13.9 Gibps (1024-based)
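The arithmetic above can be sketched as a small Python helper (the function name is mine, just for illustration):

```python
def video_bandwidth_bps(width, height, fps, bits_per_sample, samples_per_pixel=3):
    """Raw video payload in bits per second.

    samples_per_pixel is 3 for RGB / YUV 4:4:4 (one sample of each
    primary or component per pixel, with no chroma subsampling).
    """
    return width * height * fps * bits_per_sample * samples_per_pixel

bw = video_bandwidth_bps(3840, 2160, 60, 10)  # 2160p60, 10-bit, 4:4:4
print(bw)           # 14,929,920,000 bits per second
print(bw / 1e9)     # ~14.93 Gbps (SI, decimal)
print(bw / 2**30)   # ~13.9 Gibps (1024-based)
```

Note this counts only the active video payload, matching the hand calculation above.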
Bear in mind that while 14.9 Gbps is well below the 18 Gbps cap for HDMI 2.0 Level A, we have only calculated the video payload. We have not accounted for audio, blanking intervals, HDCP 2.2 overhead, etc. Since I do not know how much bandwidth those items require, I can't mathematically show that the total exceeds 18 Gbps. All I can point to is the chart in the HDMI 2.0 FAQ on HDMI.org, which shows support for 2160p60 YUV 4:2:0 all the way up to 16-bit, but only shows support for 2160p60 RGB/YUV 4:4:4 at 8-bit. Note that it supports 2160p24 RGB/YUV 4:4:4 even at 16-bit.
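As a sanity check on that chart, here is a quick sketch comparing the video payload of the modes it lists (helper name is mine; 4:2:0 averages 1.5 samples per pixel, since it keeps full-resolution luma plus one Cb and one Cr per 2x2 pixel block):

```python
def bw_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Raw video payload in SI gigabits per second."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# (samples_per_pixel: 3 for 4:4:4, 1.5 for 4:2:0)
modes = [
    ("2160p60 4:2:0 16-bit", 3840, 2160, 60, 16, 1.5),
    ("2160p60 4:4:4  8-bit", 3840, 2160, 60,  8, 3),
    ("2160p60 4:4:4 10-bit", 3840, 2160, 60, 10, 3),
    ("2160p24 4:4:4 16-bit", 3840, 2160, 24, 16, 3),
]
for name, *args in modes:
    print(f"{name}: {bw_gbps(*args):.2f} Gbps")
```

The modes the chart supports all land at roughly 12 Gbps or less of video payload, while 2160p60 4:4:4 10-bit is the outlier at ~14.9 Gbps, which is at least consistent with the chart's cutoff.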
If we could get these 4K source devices to output 10-bit 2160p24 YUV 4:2:0 content without increasing the frame rate to 60 fps, that would solve a lot of problems. Likewise, if we could get them to output 10-bit 2160p60 YUV 4:2:0 without upsampling the chroma back to 4:4:4, that would solve other issues. I suspect the problem is that they need to insert GUI overlays and perform other processing that is best done with full-resolution chroma, and they don't want people complaining about a GUI that operates at less than 60 fps. So, something has to give.