
2019 LG C9–E9 dedicated GAMING thread, consoles and PC

#1 · (Edited)
The LG OLED C9 and E9 are fantastic displays for watching movies and TV, and even better for playing console and/or PC games. With many cutting-edge features like Variable Refresh Rate (VRR), G-Sync, HDMI 2.1, Auto Low Latency Mode, HGiG and more, they're a great way to enhance the gaming experience.


But with the cutting edge often come questions about setting up the display for the best performance and picture quality. This is the thread to discuss all of that! What works well for you, what don't you understand, what games really show off this beautiful display...?


If you are a dedicated console or PC gamer and have an LG OLED, or are thinking of getting one, please SUBSCRIBE to this thread and participate in the discussion.
 
#2,886 ·
Also, 04.90.33 does NOT fix pixel shift. Not for me. I'm guessing that's in the 05.xx.xx firmware?
 
#2,887 · (Edited)
So, I did a 2-hour gradient / color banding test with PC / console mode, RGB Full, YCbCr444, etc. with the latest engineer FW (05.00.03) and latest NV hotfix driver.
My conclusion: PC mode and HDR are no longer an issue. I can't spot a difference between console and PC mode while HDR is on. I tested RDR2, Division 2, Forza Horizon 4. There is just no difference; I even think PC mode is very, very slightly better now, or maybe that was a placebo. But 12-bit is clearly doing a better job for gradients, especially with test patterns.

Can someone confirm this? I mean, that's ****ing awesome. Whether you set it to PC mode with RGB Full 12-bit or YCbCr444 12-bit, both look great.
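For anyone who wants a feel for why higher bit depth helps gradients on test patterns, here is a rough numpy sketch (purely illustrative, not what the TV's processing actually does with a 12-bit signal on a 10-bit panel): it quantizes the same ramp at 8, 10 and 12 bits and counts the distinct steps a banding pattern would reveal.

[CODE]
import numpy as np

def quantize_ramp(bits, width=3840):
    """Quantize a 0..1 horizontal ramp to the given bit depth."""
    levels = 2 ** bits - 1
    ramp = np.linspace(0.0, 1.0, width)
    return np.round(ramp * levels) / levels

for bits in (8, 10, 12):
    distinct = len(np.unique(quantize_ramp(bits)))
    # Fewer distinct levels across the same ramp = wider, more visible bands.
    print(f"{bits}-bit ramp over 3840 px: {distinct} distinct levels")
[/CODE]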
 
#2,888 ·
So, I did a 2-hour gradient / color banding test with PC / console mode, RGB Full, YCbCr444, etc. with the latest engineer FW (05.00.03) and latest NV hotfix driver.
My conclusion: PC mode and HDR are no longer an issue. I can't spot a difference between console and PC mode while HDR is on. I tested RDR2, Division 2, Forza Horizon 4. There is just no difference; I even think PC mode is very, very slightly better now, or maybe that was a placebo. But 12-bit is clearly doing a better job for gradients, especially with test patterns.

Can someone confirm this? I mean, that's ****ing awesome. Whether you set it to PC mode with RGB Full 12-bit or YCbCr444 12-bit, both look great.
Same here. I tested console mode, RGB Full, Limited. Then the same for HDR. And then again both with ALLM on.
Then I went to PC mode, again, RGB Full, Limited. And the same for HDR. And then again both with ALLM on.


I also did a black level test for near blacks. Level 1, the step closest to 0 (black), seems to be suffering from a slight black crush for me (with PC input and ALLM turned on in SDR, which is where this issue appears). So I set it to ISF Bright Mode while in PC input with ALLM. PC input with HDR and ALLM is fine.
Full introduces more noticeable banding, where I can see more 'steps'. So I stuck with Limited and Black Level Low.
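For a rough picture of why a full/limited range conversion can cost gradient steps and crush near-blacks, here is a tiny illustrative snippet (my own simplified math; real pipelines may dither or convert at higher precision): remapping 8-bit full-range codes into limited range merges some adjacent levels.

[CODE]
import numpy as np

# Remap 8-bit full-range codes (0-255) into limited range (16-235).
# Assumption: a naive single-step 8-bit remap with rounding and no dithering.
full = np.arange(256)
limited = np.clip(np.round(16 + full * 219 / 255), 16, 235).astype(int)

print("distinct codes before remap:", len(np.unique(full)))     # 256
print("distinct codes after remap: ", len(np.unique(limited)))  # 220 -> lost steps
[/CODE]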

The program I use to test is the 'Display Tester' app on the Windows Store. It's free for most of the tests, which is where it really matters. Some tests specifically require you to enable HDR, which is how I know the app tracks whether HDR is on while testing.

[Screenshots: SDR, SDR with HDMI diagnostic information displayed, HDR, HDR with HDMI diagnostic information displayed]
 
#2,895 ·
Honestly, and this won't help us now, but with the new HDMI 2.1 consoles coming out, AMD HDMI 2.1 cards coming very soon, and Nvidia HDMI 2.1 cards currently on and continuing to enter the market... it's going to be the 2021 TVs that have the best support. Pretty exciting times we're living in, technology-wise.

Question: do the C9 and CX have the same guts? CPU, etc.?
 
#2,896 ·
LG says the C9 has an α9 Gen2 CPU and the CX has an α9 Gen3 CPU, so I guess not the same. The C9 uses a chip called HAWK2 for HDMI 2.1. Not sure if the CX integrates HDMI 2.1 directly into the α9 or not, but they certainly do not contain the same chips inside.
 
#2,905 ·
Can anyone with real knowledge of 12-bit vs. 10-bit explain which is better and why? I've read that if you select 12, you're really downscaling to 10 because our panels are 10-bit. I just wonder if anyone has hard facts on the difference and why. Everyone has an opinion, but I'm trying to figure out why one is better than the other on our 10-bit panels.
 
#2,906 ·
Unfortunately I'm not seeing any difference with 10-bit/12-bit or Full/Limited in terms of banding or black levels. Black level - Lagom LCD test
10-bit/12-bit banding looks exactly the same, and the first row of the black test is all super dark black with Full/High or Limited/Low. For some reason Limited/High seems to show much more on the black test, but I think I'm going to stick with RGB Full/High/10-bit for now.
 
#2,907 ·
Unfortunately I'm not seeing any difference with 10-bit/12-bit or Full/Limited in terms of banding or black levels. Black level - Lagom LCD test
10-bit/12-bit banding looks exactly the same, and the first row of the black test is all super dark black with Full/High or Limited/Low. For some reason Limited/High seems to show much more on the black test, but I think I'm going to stick with RGB Full/High/10-bit for now.
I've been rolling with RGB Full / High / 10-bit as well. Can't tell any difference either way in 10 vs. 12 when I change between the two.
 
#2,909 ·
Hi y'all - I just ordered a 15' Blue Jeans HDMI cable to plug my PC into my LG C9 for 4K/120Hz gaming. Am I going to need to upgrade the HDMI cable that's plugged into the eARC HDMI port, which communicates with my receiver? It's currently a Zeskit "8K" cable.

Edit: I'm upgrading the PC-to-TV cable, since I'm experiencing audio drops on 05.00.03.
 
#2,912 ·
Hi. Can you try the method that I quoted below? I tried using a 2m 48Gbps HDMI cable and the issue was still there. I tried the method below and it finally fixed the issue. When I went back to my 5m 18Gbps cable, the issue wasn't present anymore either. So it really was just a Windows sound settings issue.

Once you've made sure this doesn't work for you, then a DDU (Display Driver Uninstaller) pass is needed to really scrub any drivers that were tuned for your old graphics card and its limitations, followed by a fresh install of the Nvidia drivers.

And remember, for eARC issues: if a full 4K 120Hz video signal is being fed to the TV from your PC, then the cable can surely pass the audio fine. Make sure, then, that your AVR HDMI cable has Ethernet; that is the cable where the audio is passed through the Ethernet channel and not the main lines.

Edit: eARC dropouts are really gone, even though a sample rate of 48,000Hz is still being reported as 192,000Hz.
What I did to solve the eARC issue, instead of DDU, was to go to the Windows Sound Settings, click on Properties for the LG TV, go to Advanced, and untick 'Give exclusive mode applications priority' and 'Allow applications to take exclusive control of this device'.
After clicking Apply, just play some audio; you will only be limited to PCM, no bitstream. Then re-enable 'Allow applications to take exclusive control of this device' and 'Give exclusive mode applications priority'.

It seems to reset the exclusive-mode issue I had previously. So now Dolby Atmos for Home Theatre works fine without any audio dropouts. DTS:X and Dolby Atmos too.

So now that VRR is available and eARC is really fixed with this engineering firmware 05.00.03, I don't have many qualms left. Except for getting 4:4:4 working outside of PC Mode, since the next-gen consoles will also be able to deliver 4:4:4 through HDMI 2.1 and there is no need to subsample the image to 4:2:2.
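As a rough sense of why 4:2:2 subsampling was needed before full HDMI 2.1 bandwidth was available, here is some back-of-the-envelope math (my own, ignoring HDMI blanking intervals and encoding overhead, so real link requirements are higher):

[CODE]
# Uncompressed video data rate in Gbps, ignoring blanking/encoding overhead.
# 4:4:4/RGB carries 3 samples per pixel; 4:2:2 averages 2 samples per pixel.
def gbps(width, height, hz, bits_per_sample, samples_per_pixel):
    return width * height * hz * bits_per_sample * samples_per_pixel / 1e9

print("4K 120Hz 10-bit 4:4:4:", round(gbps(3840, 2160, 120, 10, 3), 1), "Gbps")
print("4K 120Hz 10-bit 4:2:2:", round(gbps(3840, 2160, 120, 10, 2), 1), "Gbps")
print("4K  60Hz 10-bit 4:4:4:", round(gbps(3840, 2160, 60, 10, 3), 1), "Gbps")
[/CODE]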
 
#2,910 ·
Have any non-Americans managed to get enrolled on the FW beta list?

In Denmark there's no chat support (only useless phone support), so I tried the US chatbot service and got through to Jay, who was helpful enough but apologized and said the FW update could not be offered to non-US LG TVs.
 
#2,914 ·
Easy ****ing peasy. Wow. Got the Club 3D cable. No more artifacting. The Zeskit cables are a confirmed crapshoot even though they specify 8K capability. Used the Remocon controller to upgrade the firmware. Had to restart the PC and TV. G-SYNC now works beautifully. Playing Horizon Zero Dawn at 90 to 120 fps without a single hitch or issue. The game is smooth as ****ing silk, and the G-SYNC indicator is on.

Amazing. This is what I bought this 3090 FTW3 Ultra for. Next gen is here!
 
#2,916 ·
Easy ****ing peasy. Wow. Got the Club 3D cable. No more artifacting. The Zeskit cables are a confirmed crapshoot even though they specify 8K capability. Used the Remocon controller to upgrade the firmware. Had to restart the PC and TV. G-SYNC now works beautifully. Playing Horizon Zero Dawn at 90 to 120 fps without a single hitch or issue. The game is smooth as ****ing silk, and the G-SYNC indicator is on.

Amazing. This is what I bought this 3090 FTW3 Ultra for. Next gen is here!
I have the 3090 TUF OC, and aren't they beasts! Good times...
 
#2,923 ·
Never experienced it. Which game lets you see that flicker in dark/bright scenes?

What's the input lag at 4K 4:4:4 120Hz? How about 4K 60Hz 4:4:4 VRR?
4K 60Hz - around 13-15 ms
4K 120Hz - around 6 ms

VRR means your input lag will fluctuate up and down depending on the refresh rate being displayed on screen. But since the current HDMI 2.1 GPU owners are running an RTX 3080 or 3090, the frame rate will usually be above 60 fps. With that in mind, it just means your input lag will be less than 15 ms, fluctuating from around 13 ms down to 6 ms, based on the fps of the game, which determines the VRR refresh rate (see the rough sketch below).

Since no one has yet tested the C9 running at 4K 120Hz with ALLM, this is based on the closest comparison: 1080p and 1440p @ 120Hz.

Info from Rtings LG C9 review under input lag.
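To put the "fluctuates between roughly 15 ms and 6 ms" point into numbers, here is a quick, purely illustrative interpolation between the two figures quoted above; the real lag-versus-refresh-rate curve is not linear, so treat it as a ballpark only.

[CODE]
# Linear interpolation between the two quoted figures: ~15 ms at 60 Hz and
# ~6 ms at 120 Hz. Purely a ballpark; actual lag does not scale linearly.
def estimated_lag_ms(hz, lag_60=15.0, lag_120=6.0):
    hz = max(60.0, min(120.0, hz))
    return lag_60 + (lag_120 - lag_60) * (hz - 60.0) / 60.0

for fps in (60, 80, 100, 120):
    print(f"{fps:3d} fps under VRR -> roughly {estimated_lag_ms(fps):.1f} ms")
[/CODE]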
 
#2,938 ·
So I got my Blue Jeans 10ft cable in yesterday and promptly returned the Zeskit cable... well, the Blue Jeans actually has this really strange disconnect/reconnect every 1-2 hours or so, which has been annoying the hell out of me... I should have at least kept the Zeskit for a day or so, because this is a nightmare during gaming. Hoping Club3D or Monoprice fix the issue, but that'll be a few days...

Anyways... I was looking through some old Reddit posts and someone recommended forcing "Perform scaling on" to GPU. It was already on the 'No Scaling' option, but I clicked the "Override the scaling mode" option, then changed Display to GPU under the "Perform scaling on" option. This gave me black bars at 3840x2160. I had to switch to 4096x2160 to get a proper full screen. I made sure my C9 was set to 16:9. I thought the C9 was 3840 native... could it be this new cable, or have I been playing with the wrong settings all this time?
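For what it's worth, here is a quick aspect-ratio sanity check (just simple arithmetic on my part; it doesn't explain the behavior being reported): with aspect-preserving scaling onto the C9's native 3840x2160 16:9 panel, a 3840x2160 signal should fill the screen exactly, while 4096x2160 (roughly 1.90:1) would normally be the one that ends up with bars.

[CODE]
PANEL_W, PANEL_H = 3840, 2160  # C9 native panel resolution (16:9)

def bars_after_fit(src_w, src_h):
    """Fit a source resolution into the panel preserving aspect ratio;
    return total pillarbox (left+right) and letterbox (top+bottom) pixels."""
    scale = min(PANEL_W / src_w, PANEL_H / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return PANEL_W - out_w, PANEL_H - out_h

for res in ((3840, 2160), (4096, 2160)):
    pillar, letter = bars_after_fit(*res)
    print(res, "-> pillarbox px:", pillar, ", letterbox px:", letter)
[/CODE]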
 
#2,942 ·
[QUOTE = "jeffytt, post: 60158655, member: 9524835"]
Так что вчера я принес свой 10-футовый кабель Blue Jeans и сразу вернул кабель Zeskit ... ну, у синих джинсов действительно странное отключение / подключение каждые 1-2 часа или около того, что меня чертовски раздражает ... по крайней мере, держали Zeskit в течение дня или около того, потому что это кошмар во время игр. Надеюсь, что Club3d или Monoprice решат проблему, но это займет несколько дней ...

В любом случае ... Я просматривал некоторые старые сообщения Reddit, и кто-то рекомендовал принудительно включить "Выполнить масштабирование" на GPU. Он уже был на опции «Без масштабирования», но я щелкнул опцию «Переопределить режим масштабирования», а затем изменил отображение на GPU под опцией «Выполнить масштабирование». Это дало мне черные полосы на разрешении 3840x2160. Мне пришлось переключиться на 4096x2160, чтобы получить полноценный полноэкранный режим. Я убедился, что мой C9 настроен на 16: 9. Я думал, что у C9 родной 3840 ... может быть, это новый кабель или я все это время играл с неправильными настройками?
[/ QUOTE]
 