
mickey79 01-09-2019 08:17 AM

NVIDIA Output Color Format - 4K Gaming
 
Hi All,

I use an HTPC equipped with an ASUS ROG Strix GTX 1080 for 4K Gaming on a TCL 6 Series 55R617 LED, which supports both HDR & Dolby Vision.

I have a question about the NVIDIA Control Panel, specifically the NVIDIA Color Settings section.

http://www.damiansj.com/exgpd/ASSETS/GRUBS/ncp_v1.png

I'm not an expert on these things and don't really understand too much about the different options. I wanted some advice on what most gamers use and recommend for 4K HDR/Dolby Vision gaming.

My question is regarding the "Output Color Format", "Output Color Depth" and "Output Dynamic Range", which seem to be correlated as options populate based on the choices you make.

If I select "RGB" in Output Color Format, then Output Dynamic Range can be set to "Full".
However, Output Color Depth can only be set to 8bpc. The other options disappear.

If I select any of the other 3 "YCbCr" options, then Output Color Depth allows 8bpc, 10bpc & 12bpc.
However, Output Dynamic Range can only be set to "Limited".

I'm looking for the best or the most recommended combination here for 4K HDR Gaming.

Essentially, is 4K HDR Gaming best with RGB / 8bpc / Full, or one of the YCbCr options (which one?) / 12bpc / Limited?

Thanks in advance!

FGEvans 05-23-2019 03:56 AM

I would also love to know the answer to this

cathodeRay 05-23-2019 04:20 AM

This is considered to be one of the best videos for understanding Nvidia settings: use it to match the settings to your TV's capabilities/specs (10-bit color, for example, as there are no 12-bit TVs yet).

Friendlys 05-27-2019 06:38 AM

Quote:

Originally Posted by mickey79 (Post 57406094)
Essentially, is 4K HDR Gaming best with RGB / 8bpc / Full, or one of the YCbCr options (which one?) / 12bpc / Limited?

For regular gaming, 8-bit RGB Full is the best. For HDR gaming, 10-bit YCbCr Limited is the best. The reason RGB tops out at 8-bit and you have to drop to YCbCr for 10-bit is the bandwidth restriction of HDMI 2.0. You will still have a much larger color palette using 10-bit Limited than 8-bit Full. As I said though, for regular gaming only use RGB Full, because colors will be out of whack if you try to use 10-bit color on a non-HDR game.
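
For anyone who wants to see the arithmetic behind that bandwidth claim, here is a rough back-of-the-envelope check in Python. It assumes the standard CTA-861 4K60 timing (4400 x 2250 total pixels, i.e. a 594 MHz pixel clock) and HDMI 2.0's 18 Gbit/s link rate (about 14.4 Gbit/s of usable video data after 8b/10b encoding). It is only a sketch of the bandwidth math, not of how the driver actually decides which options to show.

Code:

# Rough check of which 4K60 format/bit-depth combinations fit inside HDMI 2.0.
# Assumes CTA-861 4K60 timing: 4400 x 2250 total pixels -> 594 MHz pixel clock,
# and ~14.4 Gbit/s of usable video data (18 Gbit/s link rate, 8b/10b encoding).
PIXEL_CLOCK_HZ = 4400 * 2250 * 60        # 594 MHz
USABLE_LIMIT_BPS = 18e9 * 8 / 10         # ~14.4 Gbit/s

# Average components sent per pixel: RGB/4:4:4 sends three full-resolution
# channels, 4:2:2 halves the two chroma channels, 4:2:0 quarters them.
COMPONENTS_PER_PIXEL = {"RGB / 4:4:4": 3.0, "YCbCr 4:2:2": 2.0, "YCbCr 4:2:0": 1.5}

for fmt, comps in COMPONENTS_PER_PIXEL.items():
    for bpc in (8, 10, 12):
        rate = PIXEL_CLOCK_HZ * comps * bpc
        verdict = "fits" if rate <= USABLE_LIMIT_BPS else "exceeds HDMI 2.0"
        print(f"{fmt:12} {bpc:2} bpc: {rate / 1e9:5.2f} Gbit/s ({verdict})")

The pattern lines up with what the control panel exposes at 4K60: RGB/4:4:4 only fits at 8 bpc, while 4:2:2 leaves enough headroom for 10 and 12 bpc.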

mickey79 05-27-2019 10:50 AM

Quote:

Originally Posted by Friendlys (Post 58103684)
For regular gaming, 8-bit RGB Full is the best. For HDR gaming, 10-bit YCbCr Limited is the best. The reason RGB tops out at 8-bit and you have to drop to YCbCr for 10-bit is the bandwidth restriction of HDMI 2.0. You will still have a much larger color palette using 10-bit Limited than 8-bit Full. As I said though, for regular gaming only use RGB Full, because colors will be out of whack if you try to use 10-bit color on a non-HDR game.

Ah! This is exactly the kind of info I was looking for. Thank you so much for the response. I see three different options for Ycbcr - 422, 444 & 420. For HDR, which one should I use to pair with 10 bit limited?

Most of my library is non-HDR, so I'll continue to use 8bit Full.

However, I have at least 15 titles which are HDR now and the list is growing. Will use Ycbcr 10bit limited for these.

Wish there was a script which could switch the settings with one click.
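
On the script idea: NVIDIA doesn't ship an official command-line tool for these settings (they're exposed through its NvAPI driver interface), so the sketch below only shows the shape such a one-click switcher could take. The nvcolorctl.exe helper it calls is a made-up placeholder, and the two profiles are simply the combinations discussed in this thread.

Code:

# Sketch of a one-click profile switcher. "nvcolorctl.exe" is a made-up
# placeholder: NVIDIA exposes these colour controls via NvAPI rather than an
# official CLI, so a real script would need a small wrapper (or a third-party
# tool) in its place.
import subprocess
import sys

PROFILES = {
    # The two combinations discussed in this thread.
    "sdr": {"format": "RGB",      "depth": "8bpc",  "range": "Full"},
    "hdr": {"format": "YCbCr422", "depth": "10bpc", "range": "Limited"},
}

def apply(name: str) -> None:
    p = PROFILES[name]
    # Hypothetical invocation; substitute whatever wrapper you end up using.
    subprocess.run(
        ["nvcolorctl.exe",
         "--format", p["format"],
         "--depth", p["depth"],
         "--range", p["range"]],
        check=True,
    )

if __name__ == "__main__":
    apply(sys.argv[1] if len(sys.argv) > 1 else "sdr")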

Thanks!!

mickey79 05-27-2019 10:57 AM

Did a little playing with settings and discovered, on my panel:

444 only gives 8-bit as an option.
420 only allows 8-bit & 10-bit as options.
422 allows all three: 8, 10 & 12-bit.

So to answer my question above, for HDR gaming, should I choose Ycbcr422 10bit limited?

Thanks!

NewAger 05-27-2019 02:28 PM

Quote:

Originally Posted by mickey79 (Post 58104604)
So to answer my question above, for HDR gaming, should I choose Ycbcr422 10bit limited?

That's the idea. However, even after careful inspection with HDR games such as Resident Evil 2 and Hitman 2 played on a Sony X900F, I cannot tell the difference between forcing YCbCr422 10bit and just using default system values. I don't know if the software is automatically doing the right thing or if dithered 8-bit RGB is simply good enough for my eyes and display.

mickey79 05-27-2019 04:24 PM

Quote:

Originally Posted by NewAger (Post 58105366)
That's the idea. However, even after careful inspection with HDR games such as Resident Evil 2 and Hitman 2 played on a Sony X900F, I cannot tell the difference between forcing YCbCr422 10bit and just using default system values. I don't know if the software is automatically doing the right thing or if dithered 8-bit RGB is simply good enough for my eyes and display.

I agree with you.

Did some testing with Hitman & Far Cry 5 today after setting YCbCr422 10-bit Limited, tested against RGB 8-bit Full on the same visuals. Ultimately, after going back and forth, I could see absolutely no difference with my eyes and my panel; if anything, I found brightness better with RGB, so I decided to stay with RGB.

But this was a good test and good info. At least now I know I'm on the right settings and don't have to wonder anymore.

Thanks for your help!

NewAger 05-27-2019 08:31 PM

Quote:

Originally Posted by mickey79 (Post 58105818)
I agree with you.

Did some testing with Hitman & Far Cry 5 today after setting YCbCr422 10-bit Limited, tested against RGB 8-bit Full on the same visuals. Ultimately, after going back and forth, I could see absolutely no difference with my eyes and my panel; if anything, I found brightness better with RGB, so I decided to stay with RGB.

But this was a good test and good info. At least now I know I'm on the right settings and don't have to wonder anymore.

Thanks for your help!

Yeah, at some point I learned that just leaving it at default RGB values for my display is a stress free way to go. There are way too many variables involved to get caught up in all of this.

I used one HDR display that never looked right with Resident Evil 2 when forcing YCbCr422 10-bit Limited, in that the visuals would always be washed-out in HDR and nothing could fix it. Using the system default RGB setting and forcing Limited on the display side made it look absolutely perfect without having to even touch the in-game HDR brightness sliders.

Lack of standardization causes nothing but headaches.

mickey79 05-28-2019 05:31 PM

Quote:

Originally Posted by NewAger (Post 58106558)
Yeah, at some point I learned that just leaving it at default RGB values for my display is a stress free way to go. There are way too many variables involved to get caught up in all of this.

I used one HDR display that never looked right with Resident Evil 2 when forcing YCbCr422 10-bit Limited, in that the visuals would always be washed-out in HDR and nothing could fix it. Using the system default RGB setting and forcing Limited on the display side made it look absolutely perfect without having to even touch the in-game HDR brightness sliders.

Lack of standardization causes nothing but headaches.

100% +1.

Are you also using RGB/8/Full for video? HD/SDR & UHD/HDR? I have been as I preferred it.

Thanks!!

NewAger 05-28-2019 09:38 PM

Quote:

Originally Posted by mickey79 (Post 58110666)
100% +1.

Are you also using RGB/8/Full for video? HD/SDR & UHD/HDR? I have been as I preferred it.

Thanks!!

When I watched my Blu-ray rips on the PC, I did use default RGB 8-bit Full as the system setting and let PowerDVD handle the rest (which it did perfectly well). I currently just stream video and watch Hulu/Netflix through my Roku Ultra for the purpose of added convenience.

FGEvans 06-07-2019 07:09 AM

Cheers for the gaming tips. I wonder whether I should force 4:2:2 at 10-bit for HDR films or leave it at 8-bit Full RGB.

What is confusing, however, is that most guides suggest that you cannot have true HDR at 8-bit:

http://community.cedia.net/blogs/dav...tes-for-4k-hdr

I found this quite useful:

https://docs.google.com/spreadsheets...Wwc/edit#gid=0

FGEvans 06-09-2019 07:22 AM

I am still none the wiser as to whether to use 8-bit RGB or 10-bit 4:2:2.

On the Xbox the recommended setup is to choose 8-bit, but when playing HDR the Xbox will automatically pick 4:2:2 at 10-bit, i.e. it overrides the 8-bit setting. However, the PC does not do this if you pick 8-bit RGB; the display is still receiving an 8-bit RGB HDR signal. It seems counterproductive to send the display an 8-bit signal, as that means the colour palette is a lot smaller than with a 10-bit signal.
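
To put a rough number on that last point: full-range 8-bit RGB has 256 code values per channel, while limited-range (video-level) 10-bit YCbCr still has 877 luma codes (64 to 940, the 10-bit scaling of the familiar 16-235) even after giving up the full range. A quick sanity check:

Code:

# Steps per channel: full-range 8-bit uses codes 0-255; limited-range 10-bit
# video puts luma at codes 64-940.
full_8bit_steps = 256
limited_10bit_steps = 940 - 64 + 1   # 877

print(f"8-bit full:     {full_8bit_steps} steps/channel, "
      f"~{full_8bit_steps ** 3:,} combinations")
print(f"10-bit limited: {limited_10bit_steps} steps/channel, "
      f"~{limited_10bit_steps ** 3:,} combinations")

(The cube is only approximate for YCbCr, since chroma uses a slightly wider 64 to 960 range, but the point stands: going "limited" costs far less precision than dropping from 10-bit to 8-bit does.)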

NewAger 06-09-2019 10:03 PM

Quote:

Originally Posted by FGEvans (Post 58160274)
I am still none the wiser as to whether to use 8-bit RGB or 10-bit 4:2:2.

On the Xbox the recommended setup is to choose 8-bit, but when playing HDR the Xbox will automatically pick 4:2:2 at 10-bit, i.e. it overrides the 8-bit setting. However, the PC does not do this if you pick 8-bit RGB; the display is still receiving an 8-bit RGB HDR signal. It seems counterproductive to send the display an 8-bit signal, as that means the colour palette is a lot smaller than with a 10-bit signal.

For HDR video on the PC, it may be technically better to set the OS video settings to 422 at 10-bit. Even then, though, you have to worry about what the software you're using makes of it. I've learned that asking relevant content developers for the answer is a futile endeavor. "Forget it, Jake. It's PC town."

Regardless, you still reap the benefits of HDR through a properly dithered RGB 8-bit output. You should be able to see it for yourself.
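
On that dithering point, here is a small illustration (NumPy, made-up numbers, not anything the driver actually runs) of why a well-dithered 8-bit signal can look so close to 10-bit: plain truncation of a smooth 10-bit ramp produces wide flat bands, while adding about half a code value of noise before rounding breaks those bands up so the local average still tracks the original.

Code:

import numpy as np

def widest_flat_run(a):
    """Length of the longest run of identical adjacent values (a proxy for visible banding)."""
    boundaries = np.flatnonzero(np.concatenate(([True], np.diff(a) != 0, [True])))
    return int(np.diff(boundaries).max())

# A smooth 10-bit ramp across a 4K-wide row, standing in for an HDR gradient.
ramp_10bit = np.linspace(0, 1023, 3840)

# Plain truncation to 8 bits: four adjacent 10-bit codes collapse into one
# 8-bit code, so the ramp turns into wide, visible steps.
truncated = np.floor(ramp_10bit / 4).astype(np.uint8)

# Dithered quantization: add roughly half an 8-bit step of noise before
# rounding, so each pixel is still 8-bit but the local average follows the
# original 10-bit value.
rng = np.random.default_rng(0)
dithered = np.clip(np.round(ramp_10bit / 4 + rng.uniform(-0.5, 0.5, ramp_10bit.size)),
                   0, 255).astype(np.uint8)

print("widest flat band, truncated:", widest_flat_run(truncated), "pixels")
print("widest flat band, dithered: ", widest_flat_run(dithered), "pixels")

The truncated ramp should show flat bands roughly 15 pixels wide (3840 pixels spread over 256 codes), while the dithered one breaks them down to just a few pixels, which is why a dithered 8-bit output can be hard to tell apart from true 10-bit at normal viewing distances.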

