Originally Posted by Marc Alexander
Strangely there are no choices for 10, 12, or 16-bit Deep Color like in past Sony BDPs (I have the S790, S6200, and S5500). Just Deep Color Auto or OFF. I kept it set to AUTO.
YCbCr 4:2:2 (10-bit)
YCbCr 4:4:4 (12-bit)
I will have to test with deep color off to see if it allows for RGB 10-bit. I'm not at home today so I will have to test tomorrow.
What I would try is:
- set the HDMI output of the 4K player to RGB 12-bit (RGB is by definition, iirc, always in 4:4:4 format)
- set Deep Color to ON (deep color just means a bit depth higher than 8-bit; I think the term covers 10, 12, and 16-bit alike)
+ make sure the w900a is set up correctly in game mode, and with the HDMI range set to Full (Full is probably the better setting, but either might work, depending on whether it is a user-adjustable setting on the player)
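One reason the whole chain has to agree on these settings: link bandwidth scales with bit depth, so Deep Color at 12-bit costs 1.5x the data of standard 8-bit. A back-of-the-envelope sketch (my own illustration, not from any HDMI tool; it counts active pixels only and ignores blanking and encoding overhead):

```python
def data_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Approximate active-pixel data rate in Gbit/s (blanking ignored)."""
    return width * height * fps * bits_per_channel * channels / 1e9

for depth in (8, 10, 12):
    rate = data_rate_gbps(1920, 1080, 60, depth)
    print(f"1080p60 RGB {depth}-bit: ~{rate:.2f} Gbit/s (active pixels only)")
```

So 10-bit RGB is roughly 3.7 Gbit/s and 12-bit roughly 4.5 Gbit/s of active-pixel data at 1080p60, which is why Deep Color only works when player, cable, and TV all negotiate it.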
I suspect you will notice a significant difference compared to your initial default setup.
Note: for "best" results, the 2 or 3 different conversion steps I mentioned in earlier posts must all be implemented together.
Note 2: I suspect this Sony player does at least 1 of those 3 steps correctly (once you use the right settings mentioned above), maybe even 2 or all 3 (but sadly no x.v.Color, that would have been the cherry on top).
My reason for suggesting 10-bit RGB output instead of 12-bit is:
- iirc, 10-bit is the max allowed under the 1080p video spec, so sending 12-bit could force an additional conversion step in the chain that further degrades the video signal
I am 99% sure our w900a panel is 10-bit (remarkably high quality in an era when most panels were 6- or 8-bit). The "12-bit" reference is most likely 10-bit hardware plus 2 extra bits of software dithering (nice to have, but it doesn't make the panel 12-bit).
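To illustrate the "10-bit hardware plus 2 bits of dithering" idea, here is a toy simulation (my own sketch, not Sony's actual algorithm): quantize a 12-bit code to 10 bits each frame after adding a little random noise, and the time-average lands between the 10-bit steps even though each individual frame cannot.

```python
import random

def dither_12_to_10(value_12bit, frames=1000, rng=random.Random(0)):
    """Quantize a 12-bit value to 10 bits per frame with random dithering;
    the time-average approximates the 12-bit level on a 10-bit panel."""
    total = 0
    for _ in range(frames):
        # add noise in [0, 4) before dropping the 2 LSBs
        dithered = min(4095, value_12bit + rng.randrange(4))
        total += (dithered >> 2) << 2  # what the 10-bit panel shows, on the 12-bit scale
    return total / frames

v = 2050                    # a 12-bit code halfway between two 10-bit steps
print(dither_12_to_10(v))   # averages close to 2050, not snapped to 2048 or 2052
```

The panel only ever shows 2048 or 2052, but flickering between them 50/50 averages out to 2050 over time, which is exactly the trick temporal dithering plays on your eye.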
Even if our w900a keeps reporting "12 bit" when it is sent certain video signals, that doesn't mean it can accept (and display) 12-bit unprocessed. E.g., when it receives 10-bit RGB 4:4:4 1080p, I (and others) have confirmed with test images that it displays this correctly and unprocessed; I haven't seen anybody confirm this for 12-bit. Since 12-bit is your only RGB output setting available, however, use it and see if the TV accepts it. But as explained, even if the TV accepts it and reports "12 bit", that doesn't mean it hasn't added an otherwise unnecessary processing step compared to sending 10-bit RGB 4:4:4 (this is a minor issue, though).
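If you want to run a test-image check yourself, one simple approach is a 10-bit grayscale ramp (a generic sketch of one possible pattern, not the specific images used in this thread). Written as a 16-bit ASCII PGM, one column per 10-bit level: on a true 10-bit path the ramp looks smooth, while an 8-bit path shows visible bands 4 columns wide.

```python
# Build a 1024-column grayscale ramp, one column per 10-bit code value,
# and save it as an ASCII PGM (maxval 1023, i.e. 10-bit data).
WIDTH, HEIGHT, MAXVAL = 1024, 64, 1023

rows = []
for _ in range(HEIGHT):
    rows.append(" ".join(str(x) for x in range(WIDTH)))

with open("ramp_10bit.pgm", "w") as f:
    f.write(f"P2\n{WIDTH} {HEIGHT}\n{MAXVAL}\n")
    f.write("\n".join(rows) + "\n")
```

You would still need a playback path that actually delivers the file at 10-bit to the TV, of course; a player or PC that silently truncates to 8-bit will show bands regardless of what the panel can do.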
Disclaimer: I am still looking into the HDMI spec issues for 1080p, and into how the 1080p video signal spec interacts with the HDMI port and cable specs; I am just summarizing how I would try it, based on what I have figured out so far.