PC to HDR TV: RGB Full 8 bpc vs YCbCr422 10 bpc? - Page 4 - AVS Forum
post #91 of 104, 04-27-2020, 01:11 PM
Quote:
Originally Posted by stuartasmith85
Another apology for resurrecting an old thread, but this seems like the most informed discussion of this incredibly confusing topic, so I'm hoping someone can clarify.

I am running a desktop PC with a GeForce RTX 2070 card, connected to a Panasonic DX-902B TV (running through a Denon AVR-x3200w amp, although I've tried connecting the PC directly to the TV and it didn't change the outcome).

With HDR enabled in Windows Display Settings, if I then go to Advanced Display Settings, the Display Information tells me that the bit depth is 8-bit with dithering and the color format is RGB.

If I change the refresh rate to 30 Hz, the bit depth goes up to 12-bit.

It's hard to tell because the Panasonic TV doesn't have a particularly detailed read-out of source information, but I think that the TV displays 10-bit HDR fine via Xbox One X and Apple TV.

I'm really just trying to check: is it right that at 60 Hz the only option is RGB 8-bit, or is something wrong?
RGB 8-bit at 4K 60 Hz is the only option until HDMI 2.1 arrives. It's visually indistinguishable from 10-bit. A 10-bit signal between source and display is required only when the source cannot perform dithering. Consoles and Blu-ray players do not perform dithering like Windows, so they support HDR only in 10-bit mode, requiring YCbCr 4:2:0 to do 4K 60 Hz over HDMI 2.0.
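
To see why, it helps to run the bandwidth numbers. Below is a rough back-of-the-envelope sketch in Python (my own illustration; the 4400 x 2250 total timing for 4K 60 Hz and the TMDS 8b/10b overhead are the standard CTA-861 figures):

Code:
# Rough HDMI 2.0 bandwidth check for 4K 60 Hz pixel formats.
# Assumes CTA-861 timing for 3840x2160@60 (4400x2250 total incl.
# blanking -> 594 MHz pixel clock) and TMDS 8b/10b coding, which
# leaves 18 Gbit/s * 0.8 = 14.4 Gbit/s for pixel data.

PIXEL_CLOCK_HZ = 4400 * 2250 * 60        # 594 MHz
HDMI20_DATA_RATE = 18e9 * 8 / 10         # 14.4 Gbit/s usable

# Effective bits per pixel on the link for each format.
# Note: 4:2:2 is carried in a 12-bit container over TMDS.
formats = {
    "RGB / 4:4:4  8-bit": 3 * 8,     # 24 bpp
    "RGB / 4:4:4 10-bit": 3 * 10,    # 30 bpp
    "YCbCr 4:2:2 12-bit": 2 * 12,    # 24 bpp
    "YCbCr 4:2:0 10-bit": 1.5 * 10,  # 15 bpp
}

for name, bpp in formats.items():
    rate = PIXEL_CLOCK_HZ * bpp
    verdict = "fits" if rate <= HDMI20_DATA_RATE else "does NOT fit"
    print(f"{name}: {rate / 1e9:5.2f} Gbit/s -> {verdict}")

RGB 8-bit squeaks in at 14.26 Gbit/s, RGB 10-bit would need 17.82 Gbit/s, and that gap is exactly why HDMI 2.0 sources fall back to subsampled formats for 10-bit HDR.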

KurianOfBorg is offline  
post #92 of 104, 04-27-2020, 01:31 PM
Quote:
Originally Posted by KurianOfBorg
RGB 8-bit at 4K 60 Hz is the only option until HDMI 2.1 arrives. It's visually indistinguishable from 10-bit. A 10-bit signal between source and display is required only when the source cannot perform dithering. Consoles and Blu-ray players do not perform dithering like Windows, so they support HDR only in 10-bit mode, requiring YCbCr 4:2:0 to do 4K 60 Hz over HDMI 2.0.
Thank you so much for the prompt reply, that’s really helpful!
stuartasmith85 is offline  
post #93 of 104, 04-27-2020, 01:35 PM
Quote:
Originally Posted by KurianOfBorg
RGB 8-bit at 4K 60 Hz is the only option until HDMI 2.1 arrives. It's visually indistinguishable from 10-bit. A 10-bit signal between source and display is required only when the source cannot perform dithering. Consoles and Blu-ray players do not perform dithering like Windows, so they support HDR only in 10-bit mode, requiring YCbCr 4:2:0 to do 4K 60 Hz over HDMI 2.0.
Wait, does that mean you can keep RGB 8-bit 4:4:4 even when playing HDR games on Windows? On PC there's no need for 10-bit and no increased bandwidth, so you can keep 4K 60 Hz RGB?

That's nice if so. Now if only HDR worked better on Windows...
UltimateDisplay is online now  
post #94 of 104, 05-01-2020, 04:10 PM
Quote:
Originally Posted by KurianOfBorg
RGB 8-bit at 4K 60 Hz is the only option until HDMI 2.1 arrives. It's visually indistinguishable from 10-bit. A 10-bit signal between source and display is required only when the source cannot perform dithering. Consoles and Blu-ray players do not perform dithering like Windows, so they support HDR only in 10-bit mode, requiring YCbCr 4:2:0 to do 4K 60 Hz over HDMI 2.0.

Hi! Thanks for the info!


When HDMI 2.1 GPUs arrive, what should the optimal settings be in the control panel, and on the TV?

RGB 10-bit Full, or YCbCr 4:4:4 Limited (can it go to Full on HDMI 2.1)?

Black level set to Low or High? (Using my OLED as an example.)
Victor Hugo Benin is offline  
post #95 of 104, 05-04-2020, 11:01 PM
Wait, so when I'm in the NVIDIA panel and I choose RGB 10-bit for my HDR1000 monitor, it's not needed? 8-bit is the same?

sjchmura is online now  
post #96 of 104, 05-07-2020, 10:26 PM
This is an informative thread, but one thing I don't think I've seen addressed is this: I have a Vizio V-series TV with a video setting called PC mode, which I've read means it properly supports RGB 4:4:4. My question is, if I set my pixel format to RGB 8-bit, everything should be good on the desktop, playing Blu-rays, and in non-HDR games, but what happens with an HDR game? I assume that, because of HDMI 2.0 limitations, the game automatically sets the pixel format to YCbCr 4:2:2 or 4:2:0 10-bit. Won't that cause problems, since it is now sending a limited 16-235 signal while the TV is expecting RGB 4:4:4? I'm just trying to figure out the best setting so I'm not having to change things depending on what I'm doing.
aarons915 is online now  
post #97 of 104, 05-07-2020, 10:29 PM
Quote:
Originally Posted by aarons915
This is an informative thread, but one thing I don't think I've seen addressed is this: I have a Vizio V-series TV with a video setting called PC mode, which I've read means it properly supports RGB 4:4:4. My question is, if I set my pixel format to RGB 8-bit, everything should be good on the desktop, playing Blu-rays, and in non-HDR games, but what happens with an HDR game? I assume that, because of HDMI 2.0 limitations, the game automatically sets the pixel format to YCbCr 4:2:2 or 4:2:0 10-bit. Won't that cause problems, since it is now sending a limited 16-235 signal while the TV is expecting RGB 4:4:4? I'm just trying to figure out the best setting so I'm not having to change things depending on what I'm doing.
PC mode on TVs typically disables additional processing like dynamic contrast etc. to reduce latency. RGB should be supported even outside PC mode. PC mode is great for SDR but may negatively impact HDR as processing is typically disabled and TVs have limited HDR dynamic range unlike a DisplayHDR 1000 gaming monitor.

PC games don't change the resolution or colour format unless you use fullscreen exclusive mode. Even then they usually change only the resolution and refresh rate but don't touch the colour format. This is true even for HDR. Most games will happily continue to run in 8-bit RGB mode for HDR. The colour format may inadvertently be reduced to YCbCr422 or YCbCr420 if the resolution or refresh rate is too high for RGB.
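
If you want to check what mode a game actually left you in, here's a small sketch assuming the pywin32 package (my own illustration; Windows only reports the desktop mode here, not the HDMI colour format, which you still have to read from the GPU control panel or the TV's info screen):

Code:
# Print the current display mode using pywin32 (pip install pywin32).
# Run it on the desktop, then again while a fullscreen-exclusive game
# is running, to see whether the game actually changed the mode.
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"Resolution:   {dm.PelsWidth} x {dm.PelsHeight}")
print(f"Refresh rate: {dm.DisplayFrequency} Hz")
# BitsPerPel is the desktop framebuffer depth (normally 32), not the
# bit depth of the HDMI/DP output signal.
print(f"Desktop bpp:  {dm.BitsPerPel}")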

KurianOfBorg is offline  
post #98 of 104, 05-07-2020, 10:38 PM
Quote:
Originally Posted by Victor Hugo Benin
When HDMI 2.1 GPUs arrive, what should the optimal settings be in the control panel, and on the TV?

RGB 10-bit Full, or YCbCr 4:4:4 Limited (can it go to Full on HDMI 2.1)?

Black level set to Low or High? (Using my OLED as an example.)
RGB 10-bit would be ideal for PCs on HDMI 2.1. YCbCr444 may result in incorrect gamma in applications that behave correctly (i.e. that encode with video gamma when the output is YCbCr), as the desktop always uses sRGB gamma and these monitors apply sRGB gamma even in YCbCr mode. The current G-SYNC HDR monitors always use sRGB gamma on DisplayPort by default even if the signal is YCbCr, but they correctly switch between sRGB and YCbCr gamma on HDMI for consoles etc.
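
To make the full-vs-limited distinction concrete, here's an illustrative sketch of the standard BT.709 conversion from full-range RGB to limited-range YCbCr (constants straight from the BT.709 spec; nothing here is specific to any control panel). This 16-235 remapping is what the TV's black level setting has to agree with:

Code:
# Full-range 8-bit RGB -> limited-range BT.709 YCbCr (8-bit).
# Luma is scaled into 16..235, chroma into 16..240 centred on 128.

def rgb_full_to_ycbcr_limited(r, g, b):
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    # BT.709 luma coefficients.
    y = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn
    cb = (bn - y) / 1.8556       # -0.5 .. +0.5
    cr = (rn - y) / 1.5748       # -0.5 .. +0.5
    # Quantise to limited ("video") range.
    return round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr)

# Full-range black and white land on 16 and 235, not 0 and 255:
print(rgb_full_to_ycbcr_limited(0, 0, 0))        # (16, 128, 128)
print(rgb_full_to_ycbcr_limited(255, 255, 255))  # (235, 128, 128)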

Quote:
Originally Posted by sjchmura
Wait, so when I'm in the NVIDIA panel and I choose RGB 10-bit for my HDR1000 monitor, it's not needed? 8-bit is the same?
Yes, 8-bit RGB 120 Hz can be used at all times. The current monitors are not even 10-bit panels - they are 8-bit + FRC. So you might as well let Windows do the dithering and run a higher refresh rate.
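
As a toy illustration of why dithering closes the gap (a stand-in for what the driver's temporal dithering does across frames, not the actual algorithm): quantise a 10-bit ramp straight to 8-bit and neighbouring codes collapse into 4-wide bands, but average many noisy quantisations and the result tracks the 10-bit original.

Code:
# Dithered 8-bit vs plain 8-bit quantisation of a 10-bit ramp.
import random

random.seed(0)
ramp10 = [i / 1023.0 for i in range(1024)]  # ideal 10-bit ramp

def quantise8(v):
    return min(255, max(0, round(v * 255))) / 255.0

# Truncation: every 4 neighbouring 10-bit codes collapse into one band.
plain = [quantise8(v) for v in ramp10]

def dithered(v, frames=256):
    # Average many quantisations with +/- half-LSB noise, a rough
    # stand-in for temporal dithering across successive frames.
    total = sum(quantise8(v + (random.random() - 0.5) / 255.0)
                for _ in range(frames))
    return total / frames

err_plain = max(abs(a - b) for a, b in zip(plain, ramp10))
err_dith = max(abs(dithered(v) - v) for v in ramp10[::16])
print(f"plain:    {len(set(plain))} distinct levels, max error {err_plain:.5f}")
print(f"dithered: mean tracks the ramp,   max error {err_dith:.5f}")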

KurianOfBorg is offline  
post #99 of 104, 05-08-2020, 08:04 AM
Quote:
Originally Posted by KurianOfBorg
PC mode on TVs typically disables additional processing like dynamic contrast etc. to reduce latency. RGB should be supported even outside PC mode. PC mode is great for SDR but may negatively impact HDR as processing is typically disabled and TVs have limited HDR dynamic range unlike a DisplayHDR 1000 gaming monitor.

PC games don't change the resolution or colour format unless you use fullscreen exclusive mode. Even then they usually change only the resolution and refresh rate but don't touch the colour format. This is true even for HDR. Most games will happily continue to run in 8-bit RGB mode for HDR. The colour format may inadvertently be reduced to YCbCr422 or YCbCr420 if the resolution or refresh rate is too high for RGB.
If that's the case, then it seems RGB 8-bit would be fine to use, but if my TV switches to HDR10 mode when an HDR game is played, doesn't that mean it is now in 10-bit mode, or does it always show that when it receives an HDR signal? I'm starting to think just leaving my setting at 4:2:2 10-bit won't hurt anything.
aarons915 is online now  
post #100 of 104, 05-08-2020, 12:19 PM
Quote:
Originally Posted by aarons915
If that's the case, then it seems RGB 8-bit would be fine to use, but if my TV switches to HDR10 mode when an HDR game is played, doesn't that mean it is now in 10-bit mode, or does it always show that when it receives an HDR signal? I'm starting to think just leaving my setting at 4:2:2 10-bit won't hurt anything.
HDR is activated with metadata and not based on the bit depth. YCbCr422 / 420 will have fringing on text on the desktop and in games - it's only suitable for video which is typically recorded in 4:2:0. YCbCr422 / 420 10-bit is worse than RGB 8-bit in every way. Even YCbCr444 will require black level expansion and is worse than RGB, so just use RGB.
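
Here's a toy sketch of where the fringing comes from, using a 4:2:2-style horizontal chroma average (illustrative pixel values, not a real encoder): luma survives per pixel, but any colour edge that falls inside a chroma pair gets blended across it.

Code:
# 4:2:2-style chroma subsampling on one row of (Y, Cb, Cr) pixels:
# keep luma per pixel, average chroma over horizontal pairs, then
# duplicate it back. The colour edge sits mid-pair (pixels 2|3),
# so the two colours get blended together -> fringing.
row = [(0.3, 0.2, 0.8)] * 3 + [(0.3, 0.8, 0.2)] * 5

def subsample_422(pixels):
    out = []
    for i in range(0, len(pixels), 2):
        y0, cb0, cr0 = pixels[i]
        y1, cb1, cr1 = pixels[i + 1]
        cb, cr = (cb0 + cb1) / 2, (cr0 + cr1) / 2  # one chroma per pair
        out += [(y0, cb, cr), (y1, cb, cr)]        # upsample by repeat
    return out

for before, after in zip(row, subsample_422(row)):
    note = "  <- chroma smeared" if before != after else ""
    print(before, "->", tuple(round(c, 2) for c in after), note)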
KurianOfBorg is offline  
post #101 of 104, 05-08-2020, 07:35 PM
So is running RGB 10-bit gaining anything on my HDR1000 monitor (Philips 43")? It supposedly has around 97% DCI-P3 colour coverage, but I'm assuming that while it says 10-bit panel, it's likely 8-bit + FRC?

sjchmura is online now  
post #102 of 104, 05-08-2020, 08:17 PM
Quote:
Originally Posted by sjchmura
So is running RGB 10-bit gaining anything on my HDR1000 monitor (Philips 43")? It supposedly has around 97% DCI-P3 colour coverage, but I'm assuming that while it says 10-bit panel, it's likely 8-bit + FRC?
Some workstation applications that do not use the new DXGI formats for HDR & WCG will not perform dithering and will use 10-bit only if the desktop runs in 10-bit mode. These typically require a Quadro / Radeon Pro GPU as well.
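
For background on why an undithered HDR pipeline wants 10 bits: PQ (SMPTE ST 2084) spreads 0-10,000 nits across the code range, and the luminance jump between adjacent codes grows large at 8 bits. A quick check using the constants from the ST 2084 spec:

Code:
# Luminance step between adjacent PQ codes at 8-bit vs 10-bit.
# Constants are from SMPTE ST 2084.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a normalised PQ code value (0..1) to luminance in nits."""
    ep = e ** (1 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

for bits in (8, 10):
    levels = 2 ** bits
    code = round(0.75 * (levels - 1))  # a code in the bright region
    lo = pq_to_nits(code / (levels - 1))
    hi = pq_to_nits((code + 1) / (levels - 1))
    print(f"{bits}-bit: around {lo:6.0f} nits, next code adds {hi - lo:5.1f} nits")

Around 1,000 nits, a single 8-bit step is roughly 37 nits (almost 4%), well into visible-banding territory, while a 10-bit step is about 9 nits, which is why sources that can't dither really do need the 10-bit signal.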
KurianOfBorg is offline  
post #103 of 104, 05-08-2020, 08:27 PM
So on my 2080 Ti and some HDR game, there's no point? Wow.

So is the LG OLED a true 10-bit panel, or also 8-bit + FRC? If so, why not just run even the OLED in RGB 8-bit and not worry?

sjchmura is online now  
post #104 of 104, 05-08-2020, 08:32 PM
Quote:
Originally Posted by sjchmura
So on my 2080 Ti and some HDR game, there's no point? Wow.

So is the LG OLED a true 10-bit panel, or also 8-bit + FRC? If so, why not just run even the OLED in RGB 8-bit and not worry?
One real-world benefit (for now) is you can view 10-bit WCG images (but still not HDR) in Firefox only if you run the desktop in 10-bit mode. But if you have to switch to 4:2:2 or 4:2:0 down from RGB, it'll look like crap so it's pointless. Edge uses the new DXGI formats so it can render HDR & WCG images with dithering even if the desktop is 8-bit. Once Firefox is updated to use the new DXGI formats it will also support HDR and WCG while the desktop is 8-bit.

IIRC the current OLED TVs are 10-bit + FRC and can accept up to 12-bit input. I cannot see any difference between 8-bit with dithering, 10-bit, and 12-bit output from my PC on my LG B7. They all look identical in a banding test. I run it at 4K 60 Hz RGB 8-bit. If you match the refresh rate to the movie framerate (24 / 30 Hz) so that the TV can perform frame interpolation, you can run 10-bit / 12-bit at the lower refresh rate, but there's no visible difference.
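
For anyone who wants to reproduce that banding test, here's a small sketch assuming the Pillow imaging library (my own test pattern, not a calibration standard). It writes a shallow grey ramp; shown fullscreen, undithered 8-bit output produces obvious vertical bands:

Code:
# Generate a grey ramp for an on-screen banding test.
# Assumes Pillow: pip install Pillow
from PIL import Image

W, H = 3840, 2160
# Only 64 grey levels across the full width -> each code is ~60 px
# wide, so undithered 8-bit steps are easy to spot.
ramp = [64 + (x * 64) // W for x in range(W)]
img = Image.new("L", (W, H))
img.putdata(ramp * H)
img.convert("RGB").save("banding_ramp.png")
print("wrote banding_ramp.png")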

KurianOfBorg is offline  