Ideal Display Information For HDR? - AVS Forum | Home Theater Discussions And Reviews
post #1 of 3 - 06-11-2019, 01:38 AM - aeneas01 (Advanced Member, Thread Starter)
Ideal Display Information For HDR?

What is the ideal "display information" readout that you would like to see for the best possible HDR video file playback and HDR streaming, and why? Or is there even such a thing as "ideal" information?

For example, when I connect my LG OLED 4K HDR TV directly to my PC, I get the following in "display information":

[screenshot: Windows 10 "display information" reporting 8-bit / RGB]

I've read that this is ideal because it passes the HDR metadata straight to the TV for processing rather than letting Windows have a hand in it, although I don't know whether that claim is accurate... I've also read that you want to see at least "10-bit" in "display information", not 8-bit, given that HDR is "HDR10", but again I don't know whether that claim is accurate either.

I have a Vertex that allows me to set different EDID info, which in turn influences what Windows 10 shows in "display information". For example, with my PC connected to the Vertex and then to my LG OLED 4K HDR TV, I can see this:

[screenshot: Windows 10 "display information" reporting 10-bit / YCbCr 4:2:0]

Is this better than 8-bit/RGB? Should it be better than 8-bit/RGB? Frankly, to my eye, HDR playback looks better with 8-bit/RGB than with 10-bit/YCbCr 4:2:0 in the HDR samples I've tested; with 10-bit/YCbCr 4:2:0 the colors seem overly saturated and the blacks overly dark.

And then there's this, which I found while googling: 12-bit/YCbCr 4:2:2. Is that even better than the two options mentioned above? Ideally, is this what you would want to see if possible?

[screenshot: display information reporting 12-bit / YCbCr 4:2:2]

And finally, why would my display information show "only" 8-bit/RGB when connected directly to my 4K HDR TV? Shouldn't Windows default to something higher? Could it be my cable? I'm not sure why that would be an issue; it's certified for 4K60Hz HDR 18.2Gbps HDCP 2.2 4:4:4, as are the other cables I've tried, which gave me the same results.
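
For what it's worth, I tried to rough out the bandwidth numbers myself. This is just a back-of-the-envelope sketch in Python, assuming standard CTA-861 4K60 timing (4400 x 2250 total pixels with blanking, a 594 MHz pixel clock) and the HDMI 2.0 TMDS character-rate ceiling of 600 MHz, so please correct me if those assumptions are off:

Code:
# Rough check of which 4K60 pixel formats fit within HDMI 2.0 (18 Gbps).
# Assumptions: CTA-861 timing of 4400 x 2250 total pixels at 60 Hz
# (594 MHz pixel clock) and a 600 MHz TMDS character-rate ceiling.

PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6   # = 594.0
TMDS_LIMIT_MHZ = 600.0                     # HDMI 2.0 maximum

def tmds_rate_mhz(bits_per_component, chroma):
    """Approximate TMDS character rate needed for a given pixel format."""
    if chroma == "4:2:0":
        # 4:2:0 is carried at half the pixel rate, then scaled for deep color.
        return PIXEL_CLOCK_MHZ / 2 * bits_per_component / 8
    if chroma == "4:2:2":
        # HDMI carries 4:2:2 in a 12-bit container at the base pixel rate,
        # so 8/10/12-bit 4:2:2 all cost the same as 8-bit 4:4:4.
        return PIXEL_CLOCK_MHZ
    # RGB / YCbCr 4:4:4 scales directly with bit depth.
    return PIXEL_CLOCK_MHZ * bits_per_component / 8

for bits, chroma in [(8, "RGB"), (10, "RGB"), (12, "RGB"),
                     (12, "4:2:2"),
                     (8, "4:2:0"), (10, "4:2:0"), (12, "4:2:0")]:
    rate = tmds_rate_mhz(bits, chroma)
    verdict = "fits" if rate <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"{bits:>2}-bit {chroma:<5} -> {rate:6.1f} MHz ({verdict})")

If I've done that right, 10-bit or 12-bit RGB at 4K60 simply doesn't fit within 18 Gbps, which would explain why Windows falls back to 8-bit RGB over a direct connection and why the deeper bit depths only show up as 4:2:2 or 4:2:0, but I'd appreciate a sanity check on that.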

Any clarification regarding this topic would be greatly appreciated.
post #2 of 3 - 06-11-2019, 03:26 AM - long_pn (Member)
As I understand it, HDR, color bit depth, and resolution are independent of one another. I'm enjoying HDR on my 1080p 8-bit VA monitor. 10-bit is just what the HDR10 spec calls for, and HDR10 is only one of several HDR standards.
RGB is like a lossless format in music, whereas 4:2:0 and 4:2:2 are lossy in the chroma channels; that's why you can see a difference. If you're using madVR, 8-bit RGB is generally recommended because madVR's dithering is very good and sending RGB avoids extra conversions in the GPU or the display.
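
If it helps, here is a tiny numerical sketch of both points in Python (my own toy example, not madVR's actual dithering algorithm): how much chroma data each format keeps, and why an 8-bit signal with good dithering can still carry a 10-bit gradient without visible banding.

Code:
# Toy illustration: chroma subsampling ratios, and 10-bit -> 8-bit
# quantization with and without dithering. Not madVR's algorithm.
import numpy as np

# Chroma samples kept per 2x2 block of pixels.
for name, kept in [("4:4:4 / RGB", 4), ("4:2:2", 2), ("4:2:0", 1)]:
    print(f"{name:<11} keeps {kept}/4 chroma samples")

# A smooth 10-bit ramp spanning only 8 ten-bit code values.
ramp = np.linspace(512, 520, 2048)

# Plain truncation to 8 bits collapses the ramp into a few flat bands.
truncated = np.round(ramp / 4)

# Adding a little random noise before rounding (a crude dither) keeps
# each pixel at 8 bits, but spreads the rounding error around.
rng = np.random.default_rng(0)
dithered = np.round(ramp / 4 + rng.uniform(-0.5, 0.5, ramp.size))

# Both outputs use only a handful of 8-bit code values; the difference
# is in how those values are arranged spatially.
print("distinct 8-bit levels, no dither  :", len(np.unique(truncated)))
print("distinct 8-bit levels, with dither:", len(np.unique(dithered)))

# Average over 64-pixel blocks: truncation shows a staircase, while
# dithering tracks the original 10-bit ramp (give or take some noise).
def block_means(x):
    return x.reshape(-1, 64).mean(axis=1)

print("ramp       :", np.round(block_means(ramp), 1)[:6])
print("no dither  :", np.round(block_means(truncated * 4), 1)[:6])
print("with dither:", np.round(block_means(dithered * 4), 1)[:6])

madVR's actual dithering is far more sophisticated than plain random noise, but the principle is the same, which is why a well-dithered 8-bit RGB output can look as smooth as a native 10-bit one.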

Last edited by long_pn; 06-11-2019 at 03:32 AM.
post #3 of 3 - 06-16-2019, 04:37 PM - aeneas01 (Advanced Member, Thread Starter)
Quote: Originally Posted by long_pn
As I understand it, HDR, color bit depth, and resolution are independent of one another. I'm enjoying HDR on my 1080p 8-bit VA monitor. 10-bit is just what the HDR10 spec calls for, and HDR10 is only one of several HDR standards.
RGB is like a lossless format in music, whereas 4:2:0 and 4:2:2 are lossy in the chroma channels; that's why you can see a difference. If you're using madVR, 8-bit RGB is generally recommended because madVR's dithering is very good and sending RGB avoids extra conversions in the GPU or the display.
Interesting, thanks!