Originally Posted by aohus
Therein lies the meat of the question I asked: does 8-bit 4:2:2 look worse than 10- or 12-bit 4:2:2 or 4:4:4 at 24p? I'm under the impression it does, since the display is only outputting a limited band of colors when it's forced to display at 8-bit...
Would love to hear the answer to this from anyone.
Depends what your source is. If it's, for example, a UHD Blu-ray, that's encoded in 10-bit 4:2:0, so if you drop to 8-bit you lose information, most visibly in colour gradations - you get horrible chroma banding on fine gradients, and skies are the favourite place to look for it!
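If you want to see the arithmetic behind the banding, here's a quick sketch of mine using NumPy - the gradient values are made up for illustration, not taken from any real disc. Truncating a gentle 10-bit sky gradient to 8-bit keeps only a quarter of its shades, and those coarser steps are exactly the bands you see:

```python
# Illustrative sketch: why 10-bit -> 8-bit truncation causes banding.
import numpy as np

# A gentle 10-bit gradient spanning 64 of the 1024 possible levels,
# roughly the kind of range a clear sky might cover.
sky_10bit = np.linspace(480, 543, 3840).round().astype(np.uint16)

# Naive truncation to 8-bit: drop the two least significant bits.
sky_8bit = (sky_10bit >> 2).astype(np.uint8)

print(len(np.unique(sky_10bit)))  # 64 distinct shades -> smooth
print(len(np.unique(sky_8bit)))   # 16 distinct shades -> visible bands
```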
This can be alleviated by a VP (video processor) dithering down to 8-bit, and arguably, if it's done properly, you probably won't notice any difference - but at that point you're still making stuff up, and personally, I'd rather not do that.
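For the curious, here's roughly what that dithering does - again just an illustrative NumPy sketch of the general technique, not what any particular VP actually implements. Adding about half an 8-bit step of random noise before quantising trades the hard band edges for fine grain the eye averages out:

```python
# Illustrative sketch of dithering: noise before quantisation
# breaks up the hard band edges from the previous example.
import numpy as np

rng = np.random.default_rng(0)
sky_10bit = np.linspace(480, 543, 3840)

# Add +/- half an 8-bit LSB of uniform noise (2 units in 10-bit
# terms), then quantise by rounding down to 8-bit.
noise = rng.uniform(-2.0, 2.0, sky_10bit.shape)
sky_8bit_dithered = np.clip(
    ((sky_10bit + noise) / 4.0).round(), 0, 255
).astype(np.uint8)

# Still ~16-17 levels, but intermixed pixel-to-pixel rather than
# in solid stripes - grain instead of bands.
print(len(np.unique(sky_8bit_dithered)))
```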
If the source is 8-bit - e.g. an HD Blu-ray - then it's encoded at 8 bits, so 8-bit output is ideal.
If your display is only 8-bit, it depends which device dithers best - the source or the display - and in the absence of a decent VP that will generally be the display, so again, output in 10-bit or 12-bit and let the display do the heavy lifting.
Note that over HDMI, if you output in 10-bit, the colour encoding must be 4:4:4, while 12-bit can be either 4:2:2 or 4:4:4 (4:2:2 on HDMI is always carried in a 12-bit container).
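If you want a feel for why 12-bit 4:2:2 is the combination that fits at 4K60, here's a back-of-envelope bandwidth calc. The numbers are my own assumptions (the standard CTA 4K60 timing of 4400x2250 total pixels and HDMI's 10/8 TMDS coding overhead) - treat it as a sketch, not spec text:

```python
# Back-of-envelope: TMDS bit rates for 4K60 vs HDMI 2.0's ~18 Gbps.
# Key point: HDMI packs two 12-bit 4:2:2 samples into the same
# 24 bits/pixel container that 8-bit 4:4:4 uses.
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60   # total pixels/s incl. blanking
TMDS_OVERHEAD = 10 / 8                # 8b/10b channel coding

def tmds_gbps(bits_per_pixel):
    return PIXEL_CLOCK_4K60 * bits_per_pixel * TMDS_OVERHEAD / 1e9

cases = {
    "8-bit 4:4:4":  3 * 8,   # 24 bpp
    "10-bit 4:4:4": 3 * 10,  # 30 bpp
    "12-bit 4:4:4": 3 * 12,  # 36 bpp
    "12-bit 4:2:2": 24,      # same container as 8-bit 4:4:4
}
for name, bpp in cases.items():
    rate = tmds_gbps(bpp)
    verdict = "fits" if rate <= 18 else "exceeds 18 Gbps"
    print(f"{name}: ~{rate:.1f} Gbps ({verdict})")
```

Running that shows 8-bit 4:4:4 and 12-bit 4:2:2 both land at ~17.8 Gbps, while 10-bit and 12-bit 4:4:4 blow past the HDMI 2.0 limit - which is why 4:2:2 exists as the 12-bit option.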
Zidoo Z9S | Nvidia Shield Pro | HDFury Vertex² | 32TB mirrored storage
Denon AVC-X8500H | JVC DLA X7000 | Screen Research 9ft multi aspect ClearPix 2 ISF screen
3 x PMC IB2S fronts | 4 x PMC Wafer 2 rears | 4 x PMC Wafer 1 ceiling | 2 x M&K SS550 THX height | Rel Stentor III
A pair of slippers, a comfy sofa and a glass of 16 year old single malt