
2019 LG C9–E9 dedicated GAMING thread, consoles and PC

819K views 7K replies 557 participants last post by  b0rnarian 
#1 · (Edited)
The LG OLED C9 and E9 are fantastic displays for watching movies/TV, and even better for playing console and/or PC games. With many cutting edge features like Variable Refresh Rates, G-sync, HDMI 2.1, Auto Low Latency Mode, HGiG and more, it's a great way to enhance the gaming experience.


But with the cutting edge often come questions about setting up the display for the best performance and picture quality. This is the thread to discuss that! What works well for you, what do you find confusing, what games really show off this beautiful display...?


If you are a dedicated console or PC gamer and have an LG OLED, or are thinking of getting one, please SUBSCRIBE to this thread and participate in the discussion.
 
#2 ·
Thank you
 
#4 ·
G-Sync settings: what should be set in the Nvidia Control Panel vs. in-game settings?
 
#5 ·
Good point, this one is confusing and a perfect topic for a FAQ. I had to Google this last night because I was getting tearing with G-Sync on. It turns out that if the framerate goes over the monitor's refresh rate (120Hz at 1440p), screen tearing will result even with G-Sync on. In this case, my understanding is it's best to enable V-sync in-game if the game doesn't have a frame cap, to keep the framerate from exceeding the refresh rate.
 
#26 ·
LG C9 PC Input vs Game Console, "the struggles are real".

There is a ton of conflicting information regarding which is better for gaming. I have done lots of internet reading on this topic, but I am NOT an expert or professional. I just want to point others to some of the information I have gathered so they can hopefully make a more educated guess.

To start, there is a ton of information in the Official Owners thread here on AVS Forum; do yourself a favor and read through its contents. That thread has a lot to teach about displays and video processing. There is also a ton of information in the Display Calibration section of AVS Forum; I suggest reading the C9/C8/C7 threads over there as well.


To Start:
Per Wiki
EDID: Extended Display Identification Data is a metadata format for display devices to describe their capabilities to a video source (e.g. a graphics card or set-top box). The data format is defined by a standard published by the Video Electronics Standards Association (VESA).

There are valid signals and invalid signals for each version of HDMI. Since we are talking about the new LG C9 and only have HDMI 2.0 cables, let us concern ourselves with only HDMI 2.0b. Here is a link to the supported 4K formats. Please scroll down to the bottom of the page and then to the bottom of the chart where it lists 4K@60.

http://www.hstecc.com/Support-467.aspx


Math time
HDMI 2.0b can only handle 18Gbps max. Even if the TV displays an image, it will not be correct if the signal is out of spec. Here is a link to a data-rate calculator; if you want to play around with the numbers, click the Advanced tab instead of Basic.

https://www.extron.com/product/videotools.aspx
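If you'd rather see the math itself than use the calculator, here's a rough sketch of how the TMDS numbers work out (my own back-of-envelope code, not from the Extron tool; it assumes the standard CTA-861 4K timing of 4400x2250 total pixels including blanking, and HDMI's 8b/10b encoding):

```python
# Back-of-envelope HDMI 2.0 TMDS bandwidth check.
# Assumes CTA-861 4K timing (4400 x 2250 total pixels including blanking)
# and TMDS 8b/10b encoding (10 bits on the wire per 8 bits of data).
def tmds_gbps(h_total, v_total, refresh_hz, bpc, chroma):
    pixel_clock = h_total * v_total * refresh_hz
    if chroma == "4:2:2":
        clock = pixel_clock              # 4:2:2 rides in a fixed 12-bit container
    elif chroma == "4:2:0":
        clock = pixel_clock / 2 * bpc / 8
    else:                                # "4:4:4" or RGB
        clock = pixel_clock * bpc / 8
    return clock * 3 * 10 / 1e9          # 3 TMDS channels, 8b/10b overhead

print(tmds_gbps(4400, 2250, 60, 8, "4:4:4"))    # 17.82 -- just fits in 18 Gbps
print(tmds_gbps(4400, 2250, 60, 10, "4:4:4"))   # 22.275 -- over the 2.0b limit
print(tmds_gbps(4400, 2250, 60, 12, "4:2:2"))   # 17.82 -- valid
```

That first number is where the 17.82Gbps figure comes from, and the second shows why 10-bit 4:4:4 simply doesn't fit.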

So now let's quickly try to understand what the TV will do with each of these signals: RGB/4:4:4/4:2:2/4:2:0. Vincent Teoh has a wonderful video on what a display device does with each of the formats. Please watch the entire video.



To briefly summarize:
4K@24 4:2:0 is not valid for any bit depth
4K@60 4:2:0 is valid for all bit depths
4K@60 4:2:2 is only valid for 12-bit
4K@60 4:4:4 (RGB) is only valid for 8-bit

So if you have watched the video, Vincent goes over what the TV will do with each signal. Now, since this is a thread about the LG C9, we need to figure out what this TV is doing with each signal. From what people have been reporting, and to my limited understanding, when placed in PC mode the LG C9 accepts 4:4:4 fully and does not down-convert. If you played around with the bandwidth calculator, you would see that 4K@60 4:4:4 8-bit is 17.82Gbps, almost 100% of the available bandwidth of HDMI 2.0b!

When in regular mode (Game Console, DVD, etc.) the TV accepts the 4:4:4 signal but then down-converts it to 4:2:2 and then up-converts it back to 4:4:4 before displaying. Meaning, if you send it 4:2:2 to begin with, it accepts the signal and only performs a single conversion. Now, this is a moot point because at the end of the day 4:4:4 is properly handled by the C9; it just adds an extra layer of conversion. People have reported color banding being very prevalent in PC mode. The best guess I can offer is that PC mode either strips the TV of image-quality functions, or that using nearly 100% of the HDMI bandwidth limits some functions. Maybe a qualified person can chime in and help me understand what is going on.

So this leads us to see that PC input 4K@60 4:4:4 8-bit is perfectly playable if you are playing SDR games. Some of you are probably thinking at this point, "Well, if 8-bit is fine, why can't I use 10 or, even better, 12 bits?" If you were to check the math link up above, you would see that this combination requires too much bandwidth and therefore is not listed in the 2.0b spec.

But why did you buy this beefy 4K HDR TV if you're limiting yourself to just SDR? Are there better options? Yes, yes there are. Read on to find out.


Now on to bit depth and HDR
HDR is very specific. It is encoded at 4:2:0 10bit with a LIMITED dynamic range (more on this later)
As Vincent has told us above, 12-bit acts as a buffer against errors and banding, meaning you should be able to safely play 8-bit/10-bit content through a 12-bit signal.

By all accounts the LG C9 seems to process 12-bit accurately, and remember that chart I linked earlier? Only 4:2:2 and 4:2:0 are valid signals at 12-bit. So we have two options: 4:2:0 and 4:2:2. With the provided information this should be an easy choice: since the LG wants a 4:2:2 signal, we should probably send it one.

Great! So now we can properly play all of our PC games in SDR and HDR! All we have to do is switch a ton of options around every time we fire up a different game! What if I told you there was a more convenient way, with just a slight loss in quality?


Convenience
Finding a single setting that works for both SDR and HDR is very, very convenient.

Remember when I said that as long as your display processes 12-bit correctly, you can use that instead of 8/10-bit? What if I also said there is just a slight loss in quality when gaming in 4:2:2 vs 4:4:4? Sure, your desktop text is a little fuzzier, but if you're overly concerned with browsing the internet on your C9 OLED, then why are you looking at a gaming thread??

Also, we should touch on dynamic range now. I will keep it simple.
Black Level (located in the picture mode settings):
Low = Limited (video levels)
High = Full (PC levels)

Make them match. If you set Limited in the Nvidia Control Panel, choose Low on the TV. If you go Full in the control panel, go High on the TV. Remember: HDR is ALWAYS LIMITED.
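For the curious, Low/Limited and High/Full correspond to the standard 8-bit quantization ranges: limited video levels put black at code 16 and white at 235, while full PC levels use the whole 0-255 range. A quick sketch of the mapping (the function name is mine, just for illustration):

```python
# Limited (video) vs full (PC) 8-bit quantization ranges.
# Limited puts black at code 16 and white at 235; full uses 0-255.
def full_to_limited(v):
    """Map a full-range 0-255 code value into the limited 16-235 range."""
    return round(16 + v * 219 / 255)

print(full_to_limited(0))    # 16  -- black
print(full_to_limited(255))  # 235 -- white
```

This is why a mismatch looks washed out or crushed: the TV expects black at one code value and the source puts it at another.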


Ma boob tubes seddins:
Nvidia Control Panel
Use Nvidia Color Settings
32bit Desktop color depth
Output Color Format: YCbCr 4:2:2
Output Color Depth: 12 bpc
Output Dynamic Range: Limited

TV settings:

Black Level: Low

I use the above for HDR and SDR and am perfectly satisfied with the results. I don't have to mess with any settings other than the annoying Windows desktop HDR toggle (can we please get rid of it, Microsoft, and switch modes automatically?). Hopefully there is some accuracy in what I have written here; maybe some technical people can chime in and point out where I am wrong.
 
#27 · (Edited)
Just a FYI, don't use this for HDMI 2.1 resolutions. The encoding for 18Gbps-48Gbps is different (16b/18b instead of 8b/10b) so the data rate number won't be correct.
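To put numbers on that: with 16b/18b coding, the 48Gbps headline figure for HDMI 2.1 FRL is the raw link rate, not usable video data. Rough arithmetic of mine, not from the calculator:

```python
# HDMI 2.1 FRL link budget: 4 lanes x 12 Gbps raw, 16b/18b line coding
lanes = 4
gbps_per_lane = 12
raw_gbps = lanes * gbps_per_lane      # the 48 Gbps headline figure
usable_gbps = raw_gbps * 16 / 18      # ~42.7 Gbps of actual video data
print(raw_gbps, round(usable_gbps, 2))
```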

4K@60 4:2:2 is only valid for 12bit
4:2:2 is a 12bit pixel format. But you can send 8 or 10 bit pixels inside it by padding the lower bits with zero. This is why you'll see "8, 10 or 12" listed for 4:2:2 on the various HDMI resolution charts.
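In other words, the padding is just a left shift with zeros in the low bits; something like this (illustration only):

```python
# Carrying an 8- or 10-bit sample in the 12-bit 4:2:2 container:
# the low bits are simply zero-padded.
def pad_to_12bit(sample, source_bpc):
    return sample << (12 - source_bpc)

print(hex(pad_to_12bit(0xFF, 8)))    # 0xff0 -- 8-bit white, low 4 bits zeroed
print(hex(pad_to_12bit(0x3FF, 10)))  # 0xffc -- 10-bit white, low 2 bits zeroed
```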

When in regular mode (Game Console, DVD, etc) the TV accepts the 4:4:4 signal but then down converts it to 4:2:2 and then up converts it back to 4:4:4 before displaying.
Not sure if it goes back to 4:4:4. It definitely goes to RGB as that's what the calibration functions operate on.

HDR is very specific. It is encoded at 4:2:0 10bit with a LIMITED dynamic range (more on this later)
Limited quantization range. Dynamic range is how that quantization range maps to brightness.
 
#30 ·
In my specific configuration, a 1080 Ti sending 12-bit 4:2:2 in PC mode actually has much worse banding compared to sending 8-bit. Also, it seems that enabling 6-bit temporal dithering via a registry edit almost completely removes PC banding when sending YCbCr 4:4:4 8-bit in PC mode, while for some reason 8/10-bit temporal dithering is worse.
 
#31 ·
If you set Limited in the Nvidia Control Panel choose Low on the TV. If you go Full in the control panel go High in the TV. Remember HDR is ALWAYS LIMITED

I meant to post this the other day.


I'm actually going to test this tonight, setting Black Level to Low and Limited in NVCP, to see if there's a difference. Should there be a difference from High/Full?
 
#68 · (Edited)
I used RGB Full @ 8-bit @ HDR too, and I can't remember why I haven't used it again, lol. You are totally right.
But what I remember is that if you play with HDR, it's a must to set the brightness of the C9 to 47-48, because at 50 the black levels are way too elevated.

My god, The Division 2 Coney Island DLC blows my mind @ 4K and HDR.
 
#64 ·
Amen to that. I just upgraded a bunch of my speakers and now it's almost a religious experience to play an Xbox One X game.
 
#42 · (Edited)
Quote:
If you set Limited in the Nvidia Control Panel choose Low on the TV. If you go Full in the control panel go High in the TV. Remember HDR is ALWAYS LIMITED


I meant to post this the other day. I'm actually going to test this tonight, setting Black Level to Low and Limited in NVCP, to see if there's a difference. Should there be a difference from High/Full?
Huge difference. I'm glad I read this. If anyone is going to do some real gaming and you have an NVIDIA card, go with these simple settings.


0-255/Full in NVCP and Black Level High in the Game picture mode: this setting is a little too bright in games, almost washed out in some areas in HDR. You would have to test for yourself. (Even though HDR gaming is still new, it's nice to experiment with some older games like Fallout 3.)


So I changed the setting to Limited in NVCP and Black Level Low, and it's so much better visually. :D Really excellent detail, along with G-Sync and an FPS cap. Runs like butta. My .02.
 
#43 · (Edited)
Playing Far Cry New Dawn on PC with my C9 OLED. The loading screen has perfect black (notice the black bars at the top and bottom), but once it gets in-game, the screen flashes for a second and the black becomes grey, so the black during gameplay is not truly black but grey, which greatly reduces the contrast. Something is wrong with the HDR of this game. Has anyone who plays this game noticed the same thing? Please note that this is not a black level mismatch, and I have no such issue with some other HDR games.
 
#47 ·
Sometimes going back to Windows and then back to the game can fix some HDR issues with games on Windows.
Either go windowed mode in-game and then back to full-screen,
or Alt-Enter sometimes works as well.
Not sure this will work, but it's worth a try. I know it used to work previously, but I usually use SDR most of the time.
 
#44 ·
I've tried setting the Windows desktop to HDR on my laptop connected to the C9 via HDMI, but some web pages show a very odd grey colour instead of white. Anyone else get this?

It's fine in SDR mode; I just don't want to miss out on HDR in games.

Incidentally, should the TV kick into HDR mode automatically if you run a full-screen HDR game on a connected PC/laptop?


 
#45 ·
Hello all.

New to this forum, but I've already found some helpful and interesting stuff here.

Proud owner of a 65" C9, loving it for gaming.

Recently I've noticed when launching a game (Fallout 4) that, while HDR is active on the desktop, it is not in the actual game. The HDR logo pops up, but when entering the TV's picture settings it only shows the SDR modes etc., so it is clearly not in HDR mode.
@22Green, I've read you're playing Fallout 3 with HDR, so I was wondering why it wouldn't work with Fallout 4?

Playing at 1440p@120Hz with G-Sync. Framerate capped below the maximum refresh rate with RTSS.

NVCP RGB/Limited, HDMI Black Level Low.
 
#46 ·
Hello all.

New to this forum, but I've already found some helpful and interesting stuff here.

Proud owner of a 65" C9, loving it for gaming.

Recently I've noticed when launching a game (Fallout 4) that, while HDR is active on the desktop, it is not in the actual game. The HDR logo pops up, but when entering the TV's picture settings it only shows the SDR modes etc., so it is clearly not in HDR mode.

@22Green, I've read you're playing Fallout 3 with HDR, so I was wondering why it wouldn't work with Fallout 4?

Playing at 1440p@120Hz with G-Sync. Framerate capped below the maximum refresh rate with RTSS.

NVCP RGB/Limited, HDMI Black Level Low.
I don't see Fallout 4 in the list below, so I'm pretty sure the game does not support HDR on PC:
https://www.pcgamingwiki.com/wiki/S...Feature-2Fintro/outrotemplate=Feature-2Foutro
 
#59 · (Edited)
Excellent thread!

So, about the black levels Low/High...

I originally thought using High and Full was the way to go, but it seems that breaks HDR on PC, so for HDR you must use Low + Limited? Is that the case? And then just leave Low + Limited for SDR gaming as well?
For SDR = High + Full.
For HDR = Low + Limited.

Btw, I don't use PC mode, because it has way more color banding than console mode when using HDR. In SDR at 4:4:4 8-bit it's fine, but you lose some picture settings, so I stick with console mode.
 
#72 ·
For me, with the B7 and now the C9, I have always used YCbCr 4:4:4 in PC mode.
I always found less banding that way rather than using RGB or other YCbCr settings. I used to use the Unbox Therapy channel on YouTube, with its grey background, to test for banding.
I haven't tested much more since upgrading to the C9, just used the same settings.

But I am interested in an earlier post about the 6-bit dithering hack that @Nanekiu did. How do you do this?
 
#77 ·
Here's a tip for PC gamers on a C9: make sure your GPU's control panel is set so the display performs the scaling, not the GPU.


I couldn't figure out why I wasn't getting over 60fps (V-sync on) in full-screen games set to 2560x1440 resolution. It was because I had reset my display driver, and it reverted back to having the GPU do the scaling, so full-screen 1440p games were being scaled up to 4K and limited to 60Hz, instead of running at the proper 1440p at 120Hz.
 
#80 ·
I recently had an AMD RX560 running and as far as I can tell there's no easy way in the AMD panel to turn off GPU scaling (there is an option but it didn't actually change the setting). There is no resolution option whatsoever on the AMD panel since they want you to use the Windows 10 display setting for that, which forces GPU scaling. So if this is true for other AMD cards, the way to switch resolutions without GPU scaling is to go into the advanced display settings and then click on "Display adapter properties for (your display)". Then right on the Adapter tab, click on "List All Modes" and then select or double click on the resolution/refresh rate combination you want, then OK, and OK again.

Nvidia of course makes this easier, you can do everything from the NV control panel.

On the other hand, most games will change the resolution without GPU scaling, so if you set the resolution within the game it's probably easier.
 
#86 ·
That's what I did, and also of course this TV only supports VRR with Nvidia cards in the form of G-Sync. I was having other issues with the AMD card as well but not really on topic. Also to be fair the RX 560 is pretty old, and my "RX560" was actually a rebranded RX460, AMD at one point just rebranded them even though they were lower performing (14 CU vs 16 CU for the actual RX560). Newer cards may or may not have these issues. I got a 1650 Super and it's all right for HTPC, and I only buy old games on sale so I can get away with 80-120Hz G-Sync gaming at 1080p.
 
#88 ·


Game mode vs Cinema on C9 (helpful education)
 
#89 ·
#95 ·
PSA: INSTANT GAME RESPONSE MODE ON THE C9 CAUSES MACROBLOCKING & NEAR-BLACK FLASHING PROBLEMS. Tested with Stranger Things S03. Several scenes have this near-black flashing problem that persists since the C8 (as demonstrated by Vincent here: https://www.youtube.com/watch?v=KjObx--Oq8g). Macroblocking also becomes more severe, and someone already reported this is due to Instant Game Response mode increasing gamma in the 5-15 IRE range.
Another issue is that disabling Instant Game Response mode seems to remove G-Sync settings from the Nvidia Control Panel, but G-Sync still works for some reason. Everyone, please test this with low-bitrate sources and tweet it to @LGUS/@Vincent_Teoh/@BigJohnnyArcher/@EvilBoris so the problem can be quickly patched in the next update from LG.
 
#96 ·
PSA: INSTANT GAME RESPONSE MODE ON THE C9 CAUSES MACROBLOCKING & NEAR BLACK FLASHING PROBLEM.

Thanks for the info -- I have seen a bit of near-black flashing on occasion, but not all the time. I wondered if it was my video card, and hoped it wasn't something in the TV, like the power supply. So I have Instant Game Response turned off for now; does this increase input lag at all?
 