It's possible that they don't see the need to support 12bit color input when the panel itself is "only" 10bit anyway.
Well, 12bit is supposed to come with UHD Phase 2 in 2017/2018. Maybe the panel is not "true" native 12bit. The only way to find out would be with 12bit test patterns: if there is any dithering, then it's not true 12bit.
Well, this is something the HDMI Forum should be working towards: getting HDMI to support 12bit 4:4:4 at 120fps, or heck, go all the way to 16bit 4:4:4 at 120fps. Then we would truly have something wonderful and a lot more future-proof.

You won't be getting 4:4:4 at those bit depths due to bandwidth limitations in HDMI 2.0, but 4:2:2 is supposedly supported. I've no hardware that can verify this though.
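The bandwidth claim can be sanity-checked with back-of-the-envelope arithmetic. The sketch below (Python) compares uncompressed video data rates against HDMI 2.0's roughly 14.4 Gbit/s of effective throughput (18 Gbit/s TMDS minus 8b/10b encoding overhead). The flat 10% blanking-overhead factor is an assumption standing in for exact CTA-861 timings, so treat the numbers as rough:

```python
# Back-of-the-envelope HDMI bandwidth check. The 10% blanking overhead
# is an assumed approximation, not an exact CTA-861 timing.

HDMI20_DATA_RATE = 14.4e9  # HDMI 2.0: 18 Gbit/s TMDS minus 8b/10b overhead

def bits_per_pixel(bit_depth, subsampling):
    """Average bits per pixel for common YCbCr subsampling modes."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
    return bit_depth * samples_per_pixel[subsampling]

def required_rate(width, height, fps, bit_depth, subsampling, blanking=1.10):
    """Approximate uncompressed data rate in bit/s, including blanking."""
    return width * height * fps * bits_per_pixel(bit_depth, subsampling) * blanking

for fps, depth, sub in [(60, 12, "4:4:4"), (60, 12, "4:2:2"),
                        (120, 12, "4:4:4"), (120, 12, "4:2:2"),
                        (120, 16, "4:4:4")]:
    need = required_rate(3840, 2160, fps, depth, sub)
    verdict = "fits" if need <= HDMI20_DATA_RATE else "exceeds"
    print(f"4K{fps} {depth}bit {sub}: {need / 1e9:5.1f} Gbit/s -> {verdict} HDMI 2.0")
```

By these rough numbers, 12bit 4:2:2 squeaks under the limit at 4K60 (HDMI carries 12bit 4:2:2 in the same 24-bit-per-pixel container as 8bit 4:4:4), while 12bit 4:4:4 at 4K60, and everything at 120fps, blows well past it.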
It's always baby steps with HDMI. It's a consumer-oriented cable standard with a lot of legacy baggage, so you don't see radical increases in bandwidth and efficiency, or features that no video content uses yet, like you do with DisplayPort / MHL.