LG C9 PC Input vs Game Console, "the struggles are real".
There is a ton of conflicting information about which is better for gaming. I have done a lot of internet reading on this topic, but to be clear, I am NOT an expert or professional. I just want to point others to some of the information I have gathered so they can hopefully make a more educated decision.
Before anything else: there is a ton of information in the Official Owners thread here on AVS Forum; do yourself a favor and read through it, as that thread has a lot to teach about displays and video processing. There is also a ton of information in the Display Calibration section of AVS Forum, so I suggest reading the C9/C8/C7 threads over there as well.
To Start:
Per Wiki
EDID: Extended Display Identification Data is a metadata format for display devices to describe their capabilities to a video source (e.g. a graphics card or set-top box). The data format is defined by a standard published by the Video Electronics Standards Association (VESA).
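To make that a little less abstract, here is a tiny Python sketch of the very start of an EDID block (only the first 10 of the 128 base-block bytes, hand-assembled for illustration rather than dumped from an actual C9):

```python
# An EDID base block starts with a fixed 8-byte header, then packs the
# manufacturer's PNP ID into bytes 8-9 as three 5-bit letters.
edid = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00,  # fixed EDID header
              0x1E, 0x6D])                                     # manufacturer ID

assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"  # sanity-check the header

word = (edid[8] << 8) | edid[9]  # big-endian 16-bit manufacturer field
mfg = "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))
print(mfg)  # GSM -- the PNP ID registered to LG Electronics (Goldstar)
```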
There are valid signals and invalid signals for each version of HDMI. Since we are talking about the new LG C9 and only have HDMI 2.0 sources and cables, let us concern ourselves only with HDMI 2.0b. Here is a link to the supported 4K formats; scroll down to the bottom of the page, and then to the bottom of the chart where it lists 4K@60.
http://www.hstecc.com/Support-467.aspx
Math time
HDMI 2.0b can only handle 18 Gbps max. Even if the TV displays an image, it will not be correct if the signal is out of spec. Here is a link to a bandwidth calculator; if you want to play around with numbers, click the Advanced tab instead of Basic.
https://www.extron.com/product/videotools.aspx
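If you don't feel like opening the calculator, the math is simple enough to sketch yourself in Python. This assumes the standard CTA-861 4K@60 timing (4400 x 2250 total pixels including blanking) and the 10/8 overhead of TMDS 8b/10b encoding; the exact figures Extron's tool uses may differ slightly:

```python
# Back-of-the-envelope HDMI 2.0b TMDS bandwidth.
H_TOTAL, V_TOTAL, HZ = 4400, 2250, 60   # CTA-861 4K@60 timing, incl. blanking
TMDS_OVERHEAD = 10 / 8                  # 8b/10b line coding
LIMIT_GBPS = 18.0                       # HDMI 2.0b ceiling

def gbps(bits_per_pixel):
    return H_TOTAL * V_TOTAL * HZ * bits_per_pixel * TMDS_OVERHEAD / 1e9

# Effective bits per pixel: 4:4:4 = 3 x depth, 4:2:2 = 2 x depth, 4:2:0 = 1.5 x depth
for label, bpp in [("4:4:4  8-bit", 3 * 8), ("4:4:4 10-bit", 3 * 10),
                   ("4:2:2 12-bit", 2 * 12), ("4:2:0 12-bit", 1.5 * 12)]:
    rate = gbps(bpp)
    print(f"4K@60 {label}: {rate:5.2f} Gbps -> {'fits' if rate <= LIMIT_GBPS else 'over the limit'}")
```

Notice that 4K@60 4:2:2 12-bit lands on the same 17.82 Gbps as 4:4:4 8-bit, which matters later in this post.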
So now let's try to quickly understand what the TV will do with each of these signals: RGB/4:4:4, 4:2:2, and 4:2:0. Vincent Teoh has a wonderful video on what a display device does with each of these formats. Please watch the entire video.
To briefly summarize:
4K@24 4:2:0 is not valid at any bit depth
4K@60 4:2:0 is valid at all bit depths
4K@60 4:2:2 is only valid at 12-bit
4K@60 4:4:4 (RGB) is only valid at 8-bit
If you have watched the video, Vincent goes over what a TV can do with each signal. Since this is a thread about the LG C9, we need to figure out what this particular TV does with each one. From what people have been reporting, and to my limited understanding, when placed in PC mode the LG C9 accepts 4:4:4 fully and does not down-convert it. If you played around with the bandwidth calculator, you would see that 4K@60 4:4:4 8-bit is 17.82 Gbps, almost 100% of the available bandwidth of HDMI 2.0b!
When in regular mode (Game Console, DVD, etc.) the TV accepts a 4:4:4 signal but down-converts it to 4:2:2 and then up-converts it back to 4:4:4 before displaying. This means that if you send it 4:2:2 to begin with, it accepts the signal and only performs a single conversion. This is mostly a moot point, because at the end of the day 4:4:4 is properly handled by the C9; it just adds an extra layer of conversion. People have reported color banding being very prevalent in PC mode. The best guess I can offer is that PC mode either strips the TV of some image quality processing, or that using nearly 100% of the HDMI bandwidth limits some functions. Maybe a qualified person can chime in and help me understand what is going on.
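To make that conversion talk concrete, here is a toy Python sketch (a deliberate simplification, not the C9's actual scaler, which presumably uses better filtering) of what a 4:4:4 -> 4:2:2 -> 4:4:4 round trip does to one row of chroma samples:

```python
# 4:2:2 keeps full luma resolution but halves horizontal chroma resolution,
# so alternating chroma detail gets averaged away in the round trip.
chroma_row = [10, 200, 10, 200, 10, 200, 10, 200]  # one row of 4:4:4 chroma samples

# 4:4:4 -> 4:2:2: keep one chroma sample per pair (simple pair average)
subsampled = [(a + b) / 2 for a, b in zip(chroma_row[0::2], chroma_row[1::2])]

# 4:2:2 -> 4:4:4: duplicate each sample back out (nearest-neighbor upsample)
restored = [s for s in subsampled for _ in range(2)]

print(restored)  # [105.0, 105.0, ...] -- the alternating detail is gone
```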
So this leads us to see that PC input at 4K@60 4:4:4 8-bit is perfectly playable if you are playing SDR games. Some of you are probably thinking at this point, "Well, if 8-bit is fine, why can't I use 10, or even better, 12 bits?" If you were to check the math above, you would see that those combinations require too much bandwidth (4K@60 4:4:4 10-bit alone works out to roughly 22.28 Gbps) and are therefore not listed in the 2.0b spec.
But why did you buy this beefy 4K HDR TV if you're limiting yourself to just SDR? Are there better options? Yes, yes there are. Read on to find out.
Now on to bit depth and HDR
HDR is very specific. It is encoded at 4:2:0 10-bit with a LIMITED dynamic range (more on this later).
As Vincent explains in the video above, 12-bit acts as a buffer against rounding errors and banding, meaning you should be able to safely pass 8-bit/10-bit content through a 12-bit signal.
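Here is a quick Python sketch (my own illustration, not tied to any particular encoder or to the C9's processing) of why extra bits help against banding: quantize a smooth gradient at each bit depth and count how many distinct steps survive. Fewer steps means coarser, more visible bands:

```python
# Quantize a smooth 0..1 gradient at several bit depths.
samples = [i / 4095 for i in range(4096)]  # a smooth gradient

for bits in (8, 10, 12):
    levels = 2 ** bits - 1
    quantized = {round(v * levels) for v in samples}
    print(f"{bits}-bit: {len(quantized)} distinct steps")
```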
By all accounts the LG C9 seems to process 12-bit accurately. And remember that chart I linked earlier? Only 4:2:2 and 4:2:0 are valid signals at 12-bit, so we have two options. With the provided information this should be an easy choice: since the TV converts to 4:2:2 internally anyway (and 4:2:2 carries twice the chroma resolution of 4:2:0), we should probably send it 4:2:2.
Great! So now we can play all of our PC games correctly in SDR and HDR! All we have to do is switch a ton of options around every time we fire up a different game... What if I told you there was a more convenient way, with just a slight loss in quality?
Convenience
Finding a single set of settings that works for both SDR and HDR is very, very convenient.
Remember when I said that as long as your display processes 12-bit correctly, you can use that instead of 8/10-bit? What if I also said there is just a slight loss in quality when gaming in 4:2:2 vs 4:4:4? Sure, your desktop and text will be a little fuzzier, but if you're overly concerned with browsing the internet on your C9 OLED, then why are you looking at a gaming thread??
We should also touch on dynamic range now. I will keep it simple.
Black Level (located in the picture mode settings)
Low = Limited (video levels, 16-235)
High = Full (PC levels, 0-255)
Make them match. If you set Limited in the Nvidia Control Panel, choose Low on the TV. If you go Full in the control panel, go High on the TV. Remember, HDR is ALWAYS LIMITED.
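To put numbers on Limited vs Full, here is a tiny Python sketch of the standard 8-bit level mappings (the helper names are mine, just for illustration). A mismatch between source and TV is exactly why blacks look washed out on one wrong combination and crushed on the other:

```python
# Limited (video) range puts black at 16 and white at 235;
# Full (PC) range uses the whole 0-255 scale.

def full_to_limited(v: int) -> int:
    """Map a PC-levels value (0-255) onto video levels (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Map a video-levels value (16-235) back onto PC levels (0-255)."""
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))   # 16 235 -- black and white, video levels
print(limited_to_full(16), limited_to_full(235))  # 0 255  -- back to PC levels
```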
Ma boob tubes seddins:
Nvidia Control Panel
Use Nvidia Color Settings
Desktop Color Depth: 32-bit
Output Color Format: YCbCr 4:2:2
Output Color Depth: 12 bpc
Output Dynamic Range: Limited
TV settings:
Black Level: Low
I use the above for both HDR and SDR and am perfectly satisfied with the results. I don't have to mess with any settings other than the annoying Windows desktop HDR toggle (can we please get rid of it, Microsoft, and switch modes automatically?). Hopefully there is some accuracy in what I have written here; maybe some technical people can chime in and point out where I am wrong.