Originally Posted by Ron Jones
HDMI 2.0 will have a total data rate limit of 18 Gbps, and that sets a hard limit on the maximum bit depth vs. resolution it will be able to support for a given chroma subsampling scheme. Remember that Rec. BT.2020 for UHD includes 8K video as well as 4K, and HDMI 2.0 was not intended to support everything defined by BT.2020. Increased bit depth directly results in increased data rates, so there will be a limit; it's just a question of where it is. Even though video will use the bulk of HDMI's data capacity, the data rate budget must also include capacity to carry uncompressed multichannel audio and control information. Working within the data rate limit, the real question for 4K video is which combinations of bit depth and chroma subsampling scheme (e.g., 4:2:0 = lowest data rates, 4:4:4 = highest data rates) can technically be supported, and to what extent politics within the HDMI Forum has artificially constrained HDMI 2.0's supported capabilities to something less than what would be technically viable.
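To see roughly where that wall sits, here is a back-of-envelope sketch. It's a simplified model of my own, not anything out of the HDMI spec: it counts active pixels only, ignores blanking intervals, audio, and control data, and treats 8b/10b TMDS coding as a flat 20% overhead, leaving ~14.4 Gbps of the 18 Gbps for payload.

```python
# Back-of-envelope data-rate calculator for 4K at 60 fps.
# Simplified model: active pixels only; blanking, audio, and control
# data are ignored; 8b/10b TMDS coding leaves ~14.4 of 18 Gbps usable.

HDMI20_PAYLOAD_GBPS = 18 * 8 / 10  # ~14.4 Gbps after 8b/10b overhead

# Average bits per pixel for each chroma subsampling scheme,
# expressed as a multiple of the bit depth.
SUBSAMPLING_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, fps, bit_depth, subsampling):
    """Approximate uncompressed video payload in Gbps."""
    bits_per_pixel = bit_depth * SUBSAMPLING_FACTOR[subsampling]
    return width * height * fps * bits_per_pixel / 1e9

for sub in ("4:2:0", "4:2:2", "4:4:4"):
    for depth in (8, 10, 12):
        rate = data_rate_gbps(3840, 2160, 60, depth, sub)
        verdict = "fits" if rate <= HDMI20_PAYLOAD_GBPS else "exceeds"
        print(f"4K60 {sub} {depth}-bit: {rate:5.1f} Gbps ({verdict} the ~14.4 Gbps budget)")
```

Even this crude model lands on the familiar answer: 4K60 clears the budget at every bit depth in 4:2:0 and 4:2:2, but 4:4:4 only fits at 8 bits.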
If these limitations are real, it will be a huge disappointment.
They took almost two years to put together this spec, and it is already obsolete before it is released because it does not have full support for 4K.
Why is it so impossible for these short-sighted people to plan a bit ahead and build in extra capacity (bandwidth)?
The cost increase per HDMI chip would be minuscule relative to the cost of a 4K projector or receiver. Whether the HDMI chip costs $1, $10, or $30 in a $3,000 piece of equipment is totally irrelevant; even $30 is only 1% of $3,000.
If it costs that much, then let Joe Six-Pack keep equipment built to the current standard; he probably doesn't even care about or appreciate 4K.
From the consumer's point of view, the HDMI consortium would have done much better to hold off another year and build in capacity for full 8K support, including 3D (without any artificial constraints in the chip).
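For scale, the same simplified model above puts numbers on what that headroom would mean. Treating frame-packed 3D as a straight doubling of the data rate is my own assumption, since no particular 3D format is specified here.

```python
# What "full 8K support" would demand under the same simplified model
# (active pixels only; no blanking, audio, or TMDS overhead).
pixels_per_sec = 7680 * 4320 * 60                     # 8K at 60 fps
rate = pixels_per_sec * 16 * 3.0 / 1e9                # 4:4:4 at 16 bits
print(f"8K60 4:4:4 16-bit:       {rate:.0f} Gbps")    # ~96 Gbps
print(f"...with frame-packed 3D: {rate * 2:.0f} Gbps")  # ~191 Gbps
```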
Chips are cheap - receivers/projectors are expensive.
If broadcasters are concerned about current bandwidth, they can transmit content at lower standards. Just because the HDMI chip has full support for 4:4:4 16-bit color does not mean they could not broadcast content in 4:2:0 at 10 bits to start with and then transition to 4:4:4 16-bit color as bandwidth capacity becomes available. But do not cripple the chip's support artificially.
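To put numbers on that gap, here is a quick sketch using the same simplified active-pixel model as above:

```python
# The broadcast format and the chip's maximum capability are far apart.
pixels_per_sec = 3840 * 2160 * 60                     # 4K at 60 fps

broadcast  = pixels_per_sec * 10 * 1.5 / 1e9          # 4:2:0 at 10 bits
chip_limit = pixels_per_sec * 16 * 3.0 / 1e9          # 4:4:4 at 16 bits

print(f"broadcast today:      {broadcast:.1f} Gbps")   # ~7.5 Gbps
print(f"chip headroom target: {chip_limit:.1f} Gbps")  # ~23.9 Gbps
```

The 4:2:0 10-bit signal needs less than a third of the bandwidth the fully capable chip would accept, so crippling the chip buys broadcasters nothing.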
The ideal would be to move away from HDMI entirely. HDBaseT is the ideal: it does not impose any of these artificial constraints, it runs over standard Cat5e/6 wiring, and it supports much longer distances (up to 100 m over a single cable).