Originally Posted by aaronwt
What color space is being sent? Are you sure it's actually 8-bit? Some devices erroneously show 8-bit when being sent 4:2:2 at 12-bit. At certain frame rates it's impossible to get 8-bit, because the HDMI spec doesn't allow sending it at 8-bit there, only at 12-bit. And many devices erroneously report 8-bit or no bit depth at all.
It's sent in BT.2020, and I'm convinced it is in this colour space because the colours look correct and HDR is engaged on my projector. The signal info on my projector shows an 8-bit 4:2:2 signal, and in this scenario I notice colour banding. My Denon receiver (X1400H) has Enhanced 4K mode enabled and also shows the signal as BT.2020 YCbCr 4K HDR 24 Hz; however, it seemingly doesn't report the bit depth.
My test file is a 4K Blu-ray remux: 23.976 fps, YUV 4:2:0, 10-bit depth.
Do you think this can be attributed to the fact that, under HDMI settings, my Nvidia Shield TV only offers the following three 10-bit BT.2020 modes?
3840x2160 60 Hz YUV 420 10-bit Rec. 2020
3840x2160 59.940 Hz YUV 420 10-bit Rec. 2020
3840x2160 50 Hz YUV 420 10-bit Rec. 2020
There is no 10-bit Rec. 2020 mode at 3840x2160 23.98 Hz! This seems absurd. I also tried MrMC, and no matter what I do I cannot get a 10-bit HDR signal to my projector at 23.98 Hz.
The only logical conclusion that I can make is that the Nvidia Shield TV does not support 10-bit HDR at 23.98 Hz.
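For what it's worth, the limitation doesn't appear to be bandwidth. Here's a rough back-of-envelope sketch (my own simplified maths, not anything from Nvidia) using the publicly documented CTA-861 total raster sizes for 4K and the 600 MHz TMDS character-rate ceiling of HDMI 2.0. Under those assumptions, 10-bit at 23.976 Hz fits comfortably even at full 4:4:4 chroma, while at 59.94 Hz it only fits at 4:2:0 — which matches the modes the Shield actually exposes:

```python
# Back-of-envelope HDMI 2.0 bandwidth check (simplified; assumptions noted).
# HDMI 2.0 caps the TMDS character rate at 600 MHz per lane.
HDMI20_MAX_TMDS_MHZ = 600

# CTA-861 total raster sizes (active + blanking) for 3840x2160:
# the 24/23.976 Hz timing uses a 5500x2250 total raster, 60/59.94 Hz uses 4400x2250.
TIMINGS = {
    "4K @ 23.976 Hz": (5500, 2250, 24000 / 1001),
    "4K @ 59.94 Hz":  (4400, 2250, 60000 / 1001),
}

def tmds_character_rate_mhz(h_total, v_total, fps, bits_per_component, chroma):
    """TMDS character rate in MHz for RGB/4:4:4 and 4:2:0 signals.

    Simplifications: 4:4:4 scales the clock by bpc/8; 4:2:0 halves it.
    (4:2:2 is a special case -- HDMI packs it into a 12-bit container
    at the 8-bit clock rate, which is one reason displays often report
    4:2:2 signals as 12-bit regardless of the source depth.)
    """
    pixel_clock_mhz = h_total * v_total * fps / 1e6
    if chroma == "4:2:0":
        pixel_clock_mhz /= 2
    return pixel_clock_mhz * bits_per_component / 8

for name, (h, v, fps) in TIMINGS.items():
    for chroma in ("4:4:4", "4:2:0"):
        rate = tmds_character_rate_mhz(h, v, fps, 10, chroma)
        verdict = "fits" if rate <= HDMI20_MAX_TMDS_MHZ else "exceeds"
        print(f"{name} {chroma} 10-bit: {rate:.1f} MHz -> {verdict} HDMI 2.0")
```

So a 10-bit 23.98 Hz output mode would be well within HDMI 2.0 limits; its absence looks like a firmware/output-mode restriction on the Shield rather than a spec constraint.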