NVIDIA Output Color Format - 4K Gaming
post #1 of 14 - 01-09-2019, 08:17 AM - mickey79 (AVS Forum Special Member) - Thread Starter

NVIDIA Output Color Format - 4K Gaming

Hi All,

I use an HTPC equipped with an ASUS ROG Strix GTX 1080 for 4K gaming on a TCL 6 Series 55R617 LED TV, which supports both HDR & Dolby Vision.

I have a question about the NVIDIA Control Panel, specifically the NVIDIA Color Settings section.

I'm not an expert on these things and don't really understand the different options. I wanted some advice on what most gamers use and recommend for 4K HDR/Dolby Vision gaming.

My question is about "Output Color Format", "Output Color Depth" and "Output Dynamic Range", which seem to be interrelated, since the available options change based on the choices you make.

If I select "RGB" in Output Color Format, then Output Dynamic Range can be set to "Full".
However, Output Color Depth can only be set to 8bpc. The other options disappear.

If I select any of the other 3 "YCbCr" options, then Output Color Depth allows 8bpc, 10bpc & 12bpc.
However, Output Dynamic Range can only be set to "Limited".

I'm looking for the best or the most recommended combination here for 4K HDR Gaming.

Essentially, is 4K HDR Gaming best with RGB / 8bpc / Full, or one of the YCbCr options (which one?) / 12bpc / Limited?

Thanks in advance!

7.1.4: Polk RTi12 Fronts, RTi8 Surrounds, Monitor70 Surround Backs, CSi A6 Center, Klipsch R-14M Front Height & Rear Height + Outlaw LFM-1 EX Subwoofer.
JVC NX5 True 4K HDR Projector / Denon X6300H AVR
Intel Broadwell-E 6 Core (H2O) / EVGA RTX 2080Ti FTW3 Ultra Hybrid (H2O) HTPC
LG UBK80 UHD Blu-ray / XBOX One X

post #2 of 14 - 05-23-2019, 03:56 AM - FGEvans (Member)

I would also love to know the answer to this.
post #3 of 14 - 05-23-2019, 04:20 AM - cathodeRay (Member)

This is considered to be one of the best videos for understanding the Nvidia settings: [embedded video]. Use it to match the settings to your TV's capabilities/specs (10-bit color, for example, since there are no 12-bit TVs yet).
post #4 of 14 - 05-27-2019, 06:38 AM - Friendlys (Advanced Member)

Quote:
Originally Posted by mickey79 View Post
Essentially, is 4K HDR Gaming best with RGB / 8bpc / Full, or one of the YCbCr options (which one?) / 12bpc / Limited? [...]

For regular (SDR) gaming, 8-bit RGB Full is best. For HDR gaming, 10-bit YCbCr Limited is best. The reason RGB (and YCbCr 4:4:4) isn't available at 10-bit is the bandwidth restriction of HDMI 2.0. You will still get a much larger color palette using 10-bit Limited than 8-bit Full. As I said, though, for regular gaming only use Full, because colors will be out of whack if you try to use 10-bit color on a non-HDR game.
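
For reference, here is the rough arithmetic behind that HDMI 2.0 limit (a sketch only, assuming standard CTA-861 4K60 timing with a 594 MHz pixel clock, roughly 14.4 Gbps of usable TMDS data rate after 8b/10b encoding, and HDMI's fixed 12-bit container for 4:2:2):

Code:
# Back-of-the-envelope HDMI 2.0 bandwidth check for 4K @ 60 Hz.
# Assumptions (not from the thread): CTA-861 4K60 timing runs a 594 MHz
# pixel clock, and HDMI 2.0 offers 18 Gbps raw TMDS, ~14.4 Gbps usable
# after 8b/10b encoding. YCbCr 4:2:2 over HDMI travels in a fixed 12-bit
# container, so its rate does not change with 8/10/12 bpc.

PIXEL_CLOCK_HZ = 594e6          # 3840x2160 @ 60 Hz including blanking
USABLE_GBPS = 18 * 8 / 10       # ~14.4 Gbps of actual video data

def bits_per_pixel(bpc, subsampling):
    """Average bits per pixel for a given bit depth and chroma format."""
    if subsampling == "4:4:4":   # RGB or YCbCr 4:4:4 - three full samples
        return 3 * bpc
    if subsampling == "4:2:2":   # fixed 12-bit container on HDMI
        return 2 * 12
    if subsampling == "4:2:0":   # chroma halved both ways: 1.5 samples/pixel
        return 1.5 * bpc
    raise ValueError(subsampling)

for fmt, bpc in [("4:4:4", 8), ("4:4:4", 10), ("4:2:2", 10), ("4:2:0", 10)]:
    gbps = PIXEL_CLOCK_HZ * bits_per_pixel(bpc, fmt) / 1e9
    verdict = "fits" if gbps <= USABLE_GBPS else "exceeds HDMI 2.0"
    print(f"{fmt} {bpc:>2}-bit: {gbps:5.2f} Gbps -> {verdict}")

# Typical output:
#   4:4:4  8-bit: 14.26 Gbps -> fits
#   4:4:4 10-bit: 17.82 Gbps -> exceeds HDMI 2.0
#   4:2:2 10-bit: 14.26 Gbps -> fits
#   4:2:0 10-bit:  8.91 Gbps -> fits

This lines up with what the control panel shows at 4K60: full-chroma output tops out at 8-bit, while 4:2:2 and 4:2:0 leave room for 10-bit and above.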
post #5 of 14 - 05-27-2019, 10:50 AM - mickey79 (AVS Forum Special Member) - Thread Starter

Quote:
Originally Posted by Friendlys View Post
For HDR gaming, 10-bit YCbCr Limited is best. [...]

Ah! This is exactly the kind of info I was looking for. Thank you so much for the response. I see three different options for YCbCr: 4:2:2, 4:4:4 & 4:2:0. For HDR, which one should I pair with 10-bit Limited?

Most of my library is non-HDR, so I'll continue to use 8-bit Full.

However, I have at least 15 titles which are HDR now, and the list is growing. I'll use YCbCr 10-bit Limited for these.

Wish there was a script that could switch these settings with one click (a rough sketch of the idea is below).

Thanks!!
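
There is no official one-click switcher that I know of, but a toggle script could be organized roughly like this (purely a sketch; set_output_color() is a hypothetical stand-in, not a real NVIDIA API, which you would have to back with your own NVAPI wrapper or similar):

Code:
# Hypothetical one-click toggle between an SDR and an HDR output preset.
# NOTE: set_output_color() is a made-up placeholder; NVIDIA does not ship
# a public command line for this, so the function body must be wired to
# whatever backend you use (e.g. a custom NVAPI wrapper). The structure
# only shows how such a toggle script could be laid out.
import argparse

PRESETS = {
    "sdr": {"format": "RGB",         "depth": 8,  "range": "Full"},
    "hdr": {"format": "YCbCr 4:2:2", "depth": 10, "range": "Limited"},
}

def set_output_color(fmt: str, depth: int, rng: str) -> None:
    """Placeholder: apply the output color settings via your own backend."""
    print(f"Applying {fmt} / {depth}-bit / {rng} (stub - wire up a real backend here)")

def main() -> None:
    parser = argparse.ArgumentParser(description="Toggle GPU output color preset")
    parser.add_argument("preset", choices=PRESETS, help="sdr or hdr")
    args = parser.parse_args()
    p = PRESETS[args.preset]
    set_output_color(p["format"], p["depth"], p["range"])

if __name__ == "__main__":
    main()

Usage would then be a desktop shortcut per preset, e.g. "python toggle_color.py hdr" before launching an HDR title and "python toggle_color.py sdr" afterwards.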

post #6 of 14 - 05-27-2019, 10:57 AM - mickey79 (AVS Forum Special Member) - Thread Starter

Did a little playing with the settings and discovered, on my panel:

4:4:4 only offers 8-bit.
4:2:0 only allows 8-bit & 10-bit.
4:2:2 allows all three options: 8, 10 & 12-bit.

So to answer my question above, for HDR gaming, should I choose YCbCr 4:2:2 / 10-bit / Limited?

Thanks!

post #7 of 14 - 05-27-2019, 02:28 PM - NewAger (Member)

Quote:
Originally Posted by mickey79 View Post
So to answer my question above, for HDR gaming, should I choose YCbCr 4:2:2 / 10-bit / Limited?
That's the idea. However, even after careful inspection with HDR games such as Resident Evil 2 and Hitman 2 played on a Sony X900F, I cannot tell the difference between forcing YCbCr422 10bit and just using default system values. I don't know if the software is automatically doing the right thing or if dithered 8-bit RGB is simply good enough for my eyes and display.
post #8 of 14 - 05-27-2019, 04:24 PM - mickey79 (AVS Forum Special Member) - Thread Starter

Quote:
Originally Posted by NewAger View Post
That's the idea. [...]

I agree with you.

Did some testing with Hitman & Far Cry 5 today after setting YCbCr 4:2:2 / 10-bit / Limited, comparing the same scenes against RGB / 8-bit / Full. After going back and forth, I could see absolutely no difference with my eyes on my panel; if anything, brightness looked better with RGB, so I decided to stay with RGB.

But this was a good test and good info. At least now I know I'm on the right settings and don't have to wonder anymore.

Thanks for your help!

post #9 of 14 - 05-27-2019, 08:31 PM - NewAger (Member)

Quote:
Originally Posted by mickey79 View Post
I could see absolutely no difference with my eyes on my panel; if anything, brightness looked better with RGB, so I decided to stay with RGB. [...]

Yeah, at some point I learned that just leaving it at the default RGB values for my display is a stress-free way to go. There are way too many variables involved to get caught up in all of this.

I used one HDR display that never looked right with Resident Evil 2 when forcing YCbCr 4:2:2 10-bit Limited: the visuals would always be washed out in HDR and nothing could fix it. Using the system default RGB setting and forcing Limited on the display side made it look absolutely perfect without having to even touch the in-game HDR brightness sliders.

Lack of standardization causes nothing but headaches.
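
For reference, that washed-out look is the classic symptom of a full/limited range mismatch somewhere in the chain. A minimal sketch of the levels math, using 8-bit video levels (16-235 for limited range vs 0-255 for full range):

Code:
# Why a range mismatch looks washed out (8-bit example, sketch only).
# Limited ("video") range puts black at code 16 and white at 235;
# full ("PC") range uses 0-255. If the source sends limited range but the
# display treats the signal as full range, black is shown at 16/255
# brightness and white at 235/255 -> raised blacks and dimmed whites.

LIMITED_BLACK, LIMITED_WHITE = 16, 235

def shown_as_full(code: int) -> float:
    """Brightness (0..1) a full-range display gives an incoming code value."""
    return code / 255

def shown_as_limited(code: int) -> float:
    """Brightness (0..1) a limited-range display gives, clipping out-of-range codes."""
    b = (code - LIMITED_BLACK) / (LIMITED_WHITE - LIMITED_BLACK)
    return min(max(b, 0.0), 1.0)

# A limited-range signal sends black as code 16 and white as code 235.
print(shown_as_full(LIMITED_BLACK))     # ~0.063 -> black looks grey (washed out)
print(shown_as_full(LIMITED_WHITE))     # ~0.922 -> white looks dim
print(shown_as_limited(LIMITED_BLACK))  # 0.0    -> correct black when levels match
print(shown_as_limited(LIMITED_WHITE))  # 1.0    -> correct white when levels match

Forcing Limited on the display side, as described above, is effectively making both ends of the chain agree on the same mapping.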
post #10 of 14 - 05-28-2019, 05:31 PM - mickey79 (AVS Forum Special Member) - Thread Starter

Quote:
Originally Posted by NewAger View Post
Yeah, at some point I learned that just leaving it at the default RGB values for my display is a stress-free way to go. [...]

100% +1.

Are you also using RGB / 8-bit / Full for video, both HD/SDR & UHD/HDR? I have been, as I preferred it.

Thanks!!

post #11 of 14 - 05-28-2019, 09:38 PM - NewAger (Member)

Quote:
Originally Posted by mickey79 View Post
Are you also using RGB / 8-bit / Full for video, both HD/SDR & UHD/HDR? [...]

When I watched my Blu-ray rips on the PC, I used the default RGB 8-bit Full system setting and let PowerDVD handle the rest (which it did perfectly well). These days I just stream video and watch Hulu/Netflix through my Roku Ultra for added convenience.

post #12 of 14 - 06-07-2019, 07:09 AM - FGEvans (Member)

Cheers for the gaming tips. I wonder whether I should force 4:2:2 at 10-bit for HDR films or leave it at 8-bit Full RGB.

What is confusing, however, is that most guides suggest that you cannot have true HDR at 8-bit:

http://community.cedia.net/blogs/dav...tes-for-4k-hdr

I found this quite useful:

https://docs.google.com/spreadsheets...Wwc/edit#gid=0

post #13 of 14 - 06-09-2019, 07:22 AM - FGEvans (Member)

I am still none the wiser as to whether to use 8-bit RGB or 10-bit 4:2:2.

On the Xbox, the recommended setup is to choose 8-bit, but when playing HDR the Xbox will automatically switch to 4:2:2 at 10-bit, i.e. it overrides the 8-bit setting. The PC does not do this if you pick 8-bit RGB; the display still receives an 8-bit RGB HDR signal. It seems counterproductive to send the display an 8-bit signal, since that allows far fewer colour gradations than a 10-bit signal.
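
For reference, the difference is in gradations per channel rather than the size of the colour gamut itself (the gamut is set by the colour space, e.g. BT.2020, not the bit depth). A quick arithmetic sketch:

Code:
# 8-bit vs 10-bit vs 12-bit: steps per channel and total code values.
# Bit depth controls how finely the colour space is stepped; the reachable
# range of colours is defined by the colour space, not the bit depth.
for bpc in (8, 10, 12):
    steps = 2 ** bpc                 # levels per channel
    total = steps ** 3               # total RGB code value combinations
    print(f"{bpc:>2}-bit: {steps:>5} steps/channel, {total:,} total values")

#  8-bit:   256 steps/channel, 16,777,216 total values
# 10-bit:  1024 steps/channel, 1,073,741,824 total values
# 12-bit:  4096 steps/channel, 68,719,476,736 total values

Fewer steps per channel is what makes banding more likely with 8-bit HDR unless the output is dithered, which is where the next post picks up.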
post #14 of 14 - 06-09-2019, 10:03 PM - NewAger (Member)

Quote:
Originally Posted by FGEvans View Post
I am still none the wiser as to whether to use 8-bit RGB or 10-bit 4:2:2. [...]

For HDR video on the PC, it may be technically better to set the OS video settings to 4:2:2 at 10-bit. Even then, though, you have to worry about what the software you're using makes of it. I've learned that asking the relevant content developers for the answer is a futile endeavor. "Forget it, Jake. It's PC town."

Regardless, you still reap the benefits of HDR through a properly dithered RGB 8-bit output. You should be able to see for yourself.
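
For reference, a tiny sketch of what dithering buys when a 10-bit gradient is squeezed into 8 bits (this uses simple random dither; the driver's dithering is presumably more sophisticated, but the principle is the same):

Code:
# Quantizing a smooth 10-bit ramp to 8 bits, with and without dither.
# Plain rounding creates visible bands (runs of identical output codes);
# adding sub-LSB noise before rounding trades those bands for fine grain
# that averages out to the intended level.
import random

ramp_10bit = [i / 1023 for i in range(1024)]           # smooth 0..1 gradient

quantized = [round(v * 255) for v in ramp_10bit]        # straight 8-bit quantize
dithered  = [min(255, max(0, round(v * 255 + random.uniform(-0.5, 0.5))))
             for v in ramp_10bit]

def avg_run_length(values):
    """Average length of runs of identical output codes (longer = more banding)."""
    runs, run = [], 1
    for a, b in zip(values, values[1:]):
        if a == b:
            run += 1
        else:
            runs.append(run)
            run = 1
    runs.append(run)
    return sum(runs) / len(runs)

print("avg flat run, quantized:", avg_run_length(quantized))  # ~4 codes wide
print("avg flat run, dithered: ", avg_run_length(dithered))   # noticeably shorter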