What is the ideal "Display information" readout for the best possible HDR video file playback and HDR streaming, and why? Or is there even such a thing as "ideal" information?
For example, when I connect my LG OLED 4K HDR TV directly to my PC, I get the following under "Display information": 8-bit with RGB.
I've read that this is ideal because it passes the HDR metadata through to the TV for processing rather than letting Windows have a hand in it, although I don't know whether that claim is accurate. I've also read that you want to see at least "10-bit" under "Display information", not 8-bit, since HDR is "HDR10", but I don't know if that claim is accurate either.
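As far as I understand it (and please correct me if this is off), HDR10 video is encoded as 10-bit PQ, so an 8-bit link forces a requantize somewhere in the chain. Here are a few lines of Python just to show the scale of the difference; the numbers are straight powers of two, nothing vendor-specific:

```python
# Steps per channel at each bit depth; HDR10 content is mastered at 10-bit,
# so this is my rough picture of what an 8-bit link has to squeeze it into.
for bits in (8, 10, 12):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps:>4} steps/channel, "
          f"{steps ** 3 / 1e6:,.1f} million colors")
```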
I have a Vertex that allows me to set different EDID info, which in turn influences what Windows 10 shows under "Display information". For example, with my PC connected to the Vertex and then to my LG OLED 4K HDR TV, I see this: 10-bit with YCbCr420.
Is this better than 8-bit/RGB? Should it be better than 8-bit/RGB? Frankly, to my eye, HDR playback looks better with 8-bit/RGB than with 10-bit/YCbCr420 in the HDR samples I've tested; with 10-bit/YCbCr420 the colors seem oversaturated and the blacks overly dark.
And then there's a mode I found when googling, 12-bit/YCbCr422. Is that even better than the two mentioned above? Ideally, is that what you'd want to see if possible?
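Trying to compare these three modes on paper, here's a quick bits-per-pixel sketch I worked out (my own arithmetic, using the standard chroma-subsampling ratios; happy to be corrected):

```python
# Component samples carried per 4-pixel block for each chroma format.
SAMPLES_PER_4_PIXELS = {
    "RGB/4:4:4": 12,   # every pixel keeps all 3 components
    "YCbCr 4:2:2": 8,  # 4 luma + 2 Cb + 2 Cr (chroma halved horizontally)
    "YCbCr 4:2:0": 6,  # 4 luma + 1 Cb + 1 Cr (chroma quartered)
}

def bits_per_pixel(bit_depth, chroma):
    """Average bits per pixel = depth x samples per block / 4 pixels."""
    return bit_depth * SAMPLES_PER_4_PIXELS[chroma] / 4

for depth, chroma in [(8, "RGB/4:4:4"), (10, "YCbCr 4:2:0"), (12, "YCbCr 4:2:2")]:
    print(f"{depth}-bit {chroma}: {bits_per_pixel(depth, chroma):.0f} bits/pixel")
# -> 24, 15 and 24 bits/pixel respectively
```

If that arithmetic holds, 10-bit/YCbCr420 actually carries fewer bits per pixel than 8-bit/RGB: finer gradations per channel, but only a quarter of the chroma samples. Maybe that trade-off is part of what I'm seeing in my samples?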
And, finally, why does my display information show "only" 8-bit/RGB when connected directly to my 4K HDR TV? Shouldn't Windows default to something higher? Could it be my cable? I'm not sure why that would be an issue; it's certified for 4K60Hz HDR, 18.2 Gbps, HDCP 2.2 and 4:4:4, as are the other cables I've tried, which gave me the same results.
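To sanity-check the cable theory, I tried the link-bandwidth math myself. This is a back-of-the-envelope sketch assuming the standard CTA-861 timing for 3840x2160@60 (4400 x 2250 pixels including blanking) and HDMI 2.0's 600 MHz TMDS character-rate ceiling (18 Gbps over 3 lanes); the 4:2:2 packing rule is my reading of the HDMI spec, so please correct me if I have it wrong:

```python
# Required TMDS character rate for 4K60 in each mode vs. the HDMI 2.0 limit.
HDMI20_MAX_TMDS_MHZ = 600.0                    # 600 MHz x 10 bits x 3 lanes = 18 Gbps
PIXEL_CLOCK_4K60_MHZ = 4400 * 2250 * 60 / 1e6  # 594 MHz with CTA-861 blanking

def tmds_clock_mhz(bit_depth, chroma):
    clock = PIXEL_CLOCK_4K60_MHZ
    if chroma == "4:2:0":
        clock /= 2               # two 4:2:0 pixels share one TMDS character
    if chroma != "4:2:2":
        clock *= bit_depth / 8   # deep color inflates the clock for RGB/4:4:4/4:2:0
    # 4:2:2 packs up to 12-bit components at the 8-bit rate, so no scaling
    return clock

for depth, chroma in [(8, "RGB"), (10, "RGB"), (10, "4:2:0"), (12, "4:2:2")]:
    mhz = tmds_clock_mhz(depth, chroma)
    verdict = "fits" if mhz <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
    print(f"{depth:>2}-bit {chroma}: {mhz:6.1f} MHz -> {verdict}")
```

If that math is right, 10-bit RGB at 4K60 simply doesn't fit through an 18 Gbps link, so Windows dropping to 8-bit/RGB on a direct connection would be a bandwidth limit rather than a cable fault; but I'd appreciate confirmation from someone who knows the spec better than I do.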
Any clarification regarding this topic would be greatly appreciated.