Originally Posted by avernar
No! HDMI 1.4 has YCbCr 4:2:2 pixel encoding. It can transport 10 and 12 bit pixels just fine.
We're talking about how Dolby Vision transports its proprietary signal, which includes Metadata, a Base Layer, and an optional Enhancement Layer (the layers can be two separate HEVC video streams), all over a single HDMI connection. It's data wrapped up and tunnelled in 8-bit RGB.
And that is why the HDMI info screen says it's 8-bit RGB: it IS 8-bit RGB at the transport layer, and the transport layer is all that screen is looking at.
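To make the bandwidth argument concrete, here's a rough sketch (NOT Dolby's actual packing scheme, which is proprietary; the byte layout below is invented purely for illustration) of why 12-bit 4:2:2 data fits losslessly inside an 8-bit RGB 4:4:4 container: a 4:2:2 pixel pair is 4 x 12 = 48 bits, and two 8-bit RGB pixels are 2 x 24 = 48 bits, so the higher-bit-depth samples can be carried byte-for-byte while the link itself only ever sees 8-bit RGB.

# Conceptual sketch only: an invented layout showing 12-bit 4:2:2 samples
# riding inside nominal 8-bit RGB pixels. Dolby's real tunnelling format is
# proprietary; this just demonstrates that the bit budget works out.

def pack_12bit_422_pair(y0: int, y1: int, cb: int, cr: int):
    """Pack two 12-bit luma samples and one shared 12-bit Cb/Cr pair (4:2:2)
    into two nominal 8-bit 'RGB' pixels (6 bytes = 48 bits = 4 x 12 bits)."""
    for v in (y0, y1, cb, cr):
        assert 0 <= v < 4096, "samples must be 12-bit"
    bits = (y0 << 36) | (cb << 24) | (y1 << 12) | cr   # 48 bits total
    raw = bits.to_bytes(6, "big")
    # Each run of 3 bytes becomes one '8-bit RGB' pixel on the wire.
    return [(raw[0], raw[1], raw[2]), (raw[3], raw[4], raw[5])]

def unpack_12bit_422_pair(px0, px1):
    """Sink side: recover the four 12-bit samples from the two RGB pixels."""
    bits = int.from_bytes(bytes(px0) + bytes(px1), "big")
    return ((bits >> 36) & 0xFFF, (bits >> 12) & 0xFFF,
            (bits >> 24) & 0xFFF, bits & 0xFFF)

# The 12-bit samples survive the 8-bit RGB "container" round trip intact.
pixels = pack_12bit_422_pair(0xABC, 0x123, 0x7FF, 0x800)
assert unpack_12bit_422_pair(*pixels) == (0xABC, 0x123, 0x7FF, 0x800)

The point isn't the exact layout, it's that the transport can be, and is reported as, 8-bit RGB while the payload inside it is 12-bit.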
Originally Posted by avernar
Where the heck did you get this tunneling theory from?
I should really ask you where the heck you got your irrational disbelief of new information from? It's not theory, it's fact; you can find it in posts on the Calibration threads (the HDFury thread is a good resource), in posts from Spectracal, all over. Here's one, and no, it's not the only one.
I'm more disappointed that you just flatly stated "that's wrong", misleading readers, without wanting to learn anything, merely because you didn't know about it. Dolby Vision's method of transporting its signal is not news and has been known about since the start of Dolby bloody Vision!