Originally Posted by madshi
Ok, but we're not talking about just one format out of many. We're talking about a key format here (24fps 4:4:4). So if you don't know this for sure, it would be worth investigating, IMHO.
We don't know for sure yet which chroma format and bitdepth 4K Blu-Rays are going to be encoded in, but 4:4:4 10bit (or even 12bit) seems a likely choice. Which means that for best quality transport to the display we need the display to accept at least 4:4:4 10bit at 24fps. Can the 10.2 Gbps chips in the Sony VW500/600/1100 do that or not? That's really important to know...
That, however, I do not agree with. The chances of Bluray 4K going for any 444 format are simply zero in my opinion. Video is 422. Desktop and games are 444.
444 to 422 loses almost nothing for standard video content, so it's very unlikely they would waste valuable space, especially for download/streaming, on info that is not strictly needed.
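Just to illustrate what 444 to 422 actually throws away, here's a toy sketch (my own hypothetical helper, not from any codec library): luma keeps its full resolution, and only the chroma planes lose half their horizontal samples, typically by averaging neighbouring pairs.

```python
# Toy 4:4:4 -> 4:2:2 chroma subsampling (illustrative only, not a real codec):
# luma is untouched; each chroma row is averaged in horizontal pairs.

def subsample_422(chroma_row):
    """Average horizontal pairs of chroma samples (4:4:4 -> 4:2:2)."""
    return [(chroma_row[i] + chroma_row[i + 1]) / 2
            for i in range(0, len(chroma_row) - 1, 2)]

cb = [100, 102, 130, 128, 90, 92, 60, 64]
print(subsample_422(cb))  # [101.0, 129.0, 91.0, 62.0]
```

Half the colour samples remain, but all the luma (i.e. the detail) survives, which is why 4:2:2 looks near-identical to 4:4:4 for camera-sourced video.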
We'll be lucky if we get 422 or even 420 at 10 bits. There's quite a good chance we get 422 8 bits. There's very little chance of 422 12 bits, although I read somewhere that 422 has to be encoded in 12 bits, so maybe it will be 422 12 bits or 420 12 bits. I fear most of the improvement will come from the resolution and the better compression of h265/HEVC, rather than from super high bit depth and unrestricted chroma.
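For what it's worth, the bandwidth arithmetic supports 422 over 444. A rough sketch (my own numbers, assuming the standard CTA-861 frame totals of 5500x2250 for 2160p24, and the HDMI rule that 4:4:4 deep colour scales the pixel clock with bit depth while 4:2:2 rides a fixed-size container up to 12 bits):

```python
# Rough HDMI TMDS bandwidth estimate for 3840x2160 sources.
# Assumptions: CTA-861 totals (5500x2250 @ 24 Hz), 3 TMDS channels,
# 10 transmitted bits per 8 data bits; 4:4:4 deep colour scales the
# pixel clock, 4:2:2 is carried in a fixed container up to 12 bits.

def tmds_gbps(h_total, v_total, fps, bits_per_component, chroma="4:4:4"):
    """Approximate total TMDS data rate in Gbit/s."""
    pixel_clock = h_total * v_total * fps  # Hz
    if chroma == "4:4:4":
        pixel_clock *= bits_per_component / 8  # deep colour raises the clock
    return pixel_clock * 3 * 10 / 1e9  # 3 channels, 10b per 8b

print(tmds_gbps(5500, 2250, 24, 10, "4:4:4"))  # ~11.14 Gbit/s
print(tmds_gbps(5500, 2250, 24, 12, "4:2:2"))  # ~8.91 Gbit/s
```

If those timing figures apply, 4K24 at 422 12 bits (~8.9 Gbit/s) fits inside a 10.2 Gbit/s link, while 444 10 bits (~11.1 Gbit/s) does not, which is exactly why the 444-vs-422 question matters for the 10.2 Gbit/s chips.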
I know, it's disappointing, but it's the same as being stuck with rec 709 or x.v.color.
If the Bluray 4K specs are released this year (and I hope they will be), the present Sony models will be compatible; it's as simple as that. Otherwise Sony will not agree to the standard and it won't happen.
There is absolutely no chance IMHO we'll get better than 422 12 bits in x.v.color with Bluray 4K if it happens this year, and even that is pushing it.
So whatever happens with Bluray 4K, the 500/600/1100/upgraded 1000 will be compatible. Or the standard won't be agreed until new models are available, which can support the new format.
The manufacturers are limiting the standards, it's not the other way around sadly.
This is why HDMI 2.0 is so conservative. It has to be realistic.
However, there is more of a question mark over the current models' compatibility re UHDTV, which does require 10 or 12 bits at 50/60p, something they cannot support with their limited 10.2 Gbit/s implementation. But as the specs are in flux, who knows what they will settle on? The UHDTV specs also require a rec2020 gamut, and no consumer display can achieve that, so they'll have to give us more realistic specs at some point or there won't be any UHDTV broadcast until... 2020
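The 50/60p arithmetic makes the point. Assuming the CTA-861 totals for 2160p60 (4400x2250, i.e. a 594 MHz pixel clock), even the fixed-clock 422 path needs far more than 10.2 Gbit/s:

```python
# 2160p60: assumed CTA-861 totals 4400 x 2250 -> 594 MHz pixel clock.
# 3 TMDS channels x 10 transmitted bits per 8 data bits.
gbps = 4400 * 2250 * 60 * 3 * 10 / 1e9
print(gbps)  # 17.82
```

~17.8 Gbit/s needs the full 18 Gbit/s HDMI 2.0 implementation, which is why 10/12-bit 50/60p UHDTV is out of reach for the 10.2 Gbit/s chips.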
By the way, I'd be super happy to be wrong re Bluray 4K, as 444 10 or 12 bits would be awesome.