
·
Registered
Joined
·
466 Posts
What you are saying has nothing to do with what is in the video. This is not a brightness-fluctuation issue. It's a near-black gamma shift that exists regardless of framerate or frame spikes. You can even see the difference between G-Sync on/off in a static picture.

The brightness at a static 120 Hz will be different from the brightness at whatever framerate your refresh rate is matched to.

Just adjust brightness with G-Sync on at a framerate you can hold, and the black levels will be fine.
 

·
Registered
Joined
·
466 Posts
I’ve been playing Modern Warfare with G-Sync plus HDR and have had no issues with raised black levels.

1440p G-Sync with my FPS capped at 80. The picture mode I'm using is HDR Cinema.
 

·
Registered
Joined
·
466 Posts
You can't lower the brightness setting? You mean even with the brightness setting at zero, it's still grey?



I don't use G-Sync/VRR yet, and won't until we get 4K/120, as I prefer 4K/60 with BFI (TruMotion user). If I can't hit a stable 60 there, I set a custom resolution of 3840x1600 for ultrawide gaming (I don't mind the black bars). I hate how 1440p looks on the TV, even if it's 120 fps.
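For what it's worth, the geometry of that 3840x1600 trick is easy to sanity-check. A quick sketch of the letterbox math (plain arithmetic, nothing TV-specific assumed):

```python
# Rough letterbox math for running 3840x1600 on a 3840x2160 (16:9) panel.
panel_w, panel_h = 3840, 2160
custom_w, custom_h = 3840, 1600

bar_height = (panel_h - custom_h) // 2   # per bar, top and bottom
aspect = custom_w / custom_h             # 2.40:1, classic ultrawide

print(f"Aspect ratio: {aspect:.2f}:1")
print(f"Black bar height: {bar_height} px each")  # 280 px top and bottom
print(f"Pixels rendered: {custom_w * custom_h / (panel_w * panel_h):.0%} of native")  # 74%
```

So you render about 74% of the native pixel count, which is where the framerate headroom comes from.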


I find that if you set scaling to 150%, desktop use is fine. The default 175% is terrible. 1440p plus some sharpening looks great in games, in my opinion.
 

·
Registered
Joined
·
10,165 Posts
Discussion Starter #344 (Edited)
I just did some testing and can't confirm what @Duc Vu is saying about the black levels looking grey with g-sync on. However, I CAN confirm there's something up with black levels and g-sync.

Using this test pattern: https://i.imgur.com/AfO4paZ.jpg and viewing it fullscreen (F11 in Firefox), look at boxes 2 and 3 on the top row. With G-Sync off via the NVIDIA Control Panel, the boxes look correct. Turn G-Sync on and adjust the TV's contrast from the default 85 (Game mode, input labeled PC) down to 84 or up to 86, and the boxes don't behave as they should. Box 3 actually gets darker when raising the contrast to 86, yet it also gets darker when lowering the contrast to 84. Strange.
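If anyone wants to generate a similar near-black pattern locally instead of pulling the imgur link, here's a minimal sketch using Pillow. The grey levels below are my own assumed steps for illustration, not the exact values from that image:

```python
# Minimal near-black test pattern: a row of boxes at low 8-bit grey levels.
# Requires Pillow (pip install Pillow). The level steps are assumptions,
# not the exact values from the linked imgur pattern.
from PIL import Image, ImageDraw

levels = [0, 2, 4, 8, 12, 16, 20, 24]   # 8-bit grey values for boxes 1-8
box_w, box_h = 240, 240

img = Image.new("RGB", (box_w * len(levels), box_h), (0, 0, 0))
draw = ImageDraw.Draw(img)
for i, lvl in enumerate(levels):
    draw.rectangle([i * box_w, 0, (i + 1) * box_w - 1, box_h - 1],
                   fill=(lvl, lvl, lvl))

img.save("near_black_pattern.png")  # view fullscreen, compare G-Sync on/off
```

Viewing the saved PNG fullscreen should make any near-black shift between G-Sync on and off easier to spot.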

So while black levels appear black to me (I'll test this more later), the contrast with G-Sync on is doing something odd compared to G-Sync off. But the workaround of setting Contrast to 84 with G-Sync on seems to match, by eye, Contrast at 85 with G-Sync off.

More testing is in order, but it seems like a simple workaround/adjustment fixes this on my setup. EDIT: I had Gamma set to 2.2 during these tests.
 

·
Registered
Joined
·
466 Posts
I ran the HDR display test with G-Sync on, and had the notifier on to confirm G-Sync was active. 0 nits was pure black. No issues whatsoever with raised blacks with HDR plus G-Sync.

Also, I checked out that test pattern DaverJ put up. I saw no difference between G-Sync on vs. off.


 

·
Registered
Joined
·
464 Posts
I just did some testing and can't confirm what @Duc Vu is saying about the black levels looking grey with g-sync on.
Do you have the latest firmware, 04.71.25, installed on your C9? It was just released yesterday in my area. Before that firmware update, black was black, and the only issue was the gamma shift I have mentioned repeatedly on this forum. Now, after the firmware update, black becomes grey.
 

·
Registered
Joined
·
464 Posts
Do you have the latest firmware 04.71.25 installed on your C9?

Yes, updated last night. Black levels are still inky black in games with G-Sync on; just tested in Destiny 2 (HDR), a game with notoriously dark sections.
OK, I just switched from 4:2:2 12-bit to RGB 8-bit and then back to 4:2:2 12-bit, and now the black level is fine. Seems like just a stupid temporary bug.
 

·
Registered
Joined
·
1,049 Posts
Considering the Xbox One does HDMI VRR with the C9, which would fall under adaptive sync through an AMD GPU, I have high hopes the next-gen consoles will also support HDMI VRR on the C9, even though I'm still iffy on the 'FreeSync' moniker being supported on the C9 (the CX will). There is technically NO reason why the C9 wouldn't accept FreeSync, as it's just open-source adaptive sync branded by AMD anyway. But with the G-Sync Compatible branding, I fear that LG will actively keep the TV from allowing adaptive sync from the next-gen consoles because they're not 'Nvidia-equipped', even though there's nothing in the actual TV to stop them from just enabling it. But then again, the Xbox already does it, so we'll see.

I just really don't want the Xbox Series X and PS5 to lack VRR on these TVs. That would be a travesty, especially since there's nothing on the TV that is holding it back.

Wait and see, as with all things.


I'm also just waiting on Club3D or another manufacturer to make a DisplayPort 1.4 to HDMI 2.1 adapter so current graphics cards can do 4K/120.

They've been teased for about a year now, but still nothing. I guess we'll have to wait until HDMI 2.1 sources come out for these adapters to appear, even though PCs could benefit from one TODAY.
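Part of why that adapter is hard: the bandwidth math is tight. A back-of-the-envelope sketch (payload only, ignoring blanking overhead, so real link requirements are a bit higher):

```python
# Back-of-the-envelope check: can 4K/120 fit through DisplayPort 1.4?
# Payload only -- blanking intervals are ignored, so actual requirements
# are somewhat higher than this.
w, h, fps = 3840, 2160, 120
bpc = 10                        # bits per colour channel (HDR)
bits_per_pixel = bpc * 3        # RGB / 4:4:4

payload_gbps = w * h * fps * bits_per_pixel / 1e9
dp14_usable_gbps = 25.92        # HBR3 x4 lanes after 8b/10b encoding

print(f"4K/120 10-bit payload: ~{payload_gbps:.1f} Gbit/s")   # ~29.9
print(f"DP 1.4 usable:          {dp14_usable_gbps} Gbit/s")
print("Needs DSC compression" if payload_gbps > dp14_usable_gbps
      else "Fits uncompressed")
```

That's why such an adapter needs an active, DSC-capable converter chip rather than a passive cable, which presumably explains the delay.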
 

·
Registered
Joined
·
526 Posts
Considering the Xbox One does HDMI VRR with the C9 ... I fear that LG will actively keep the TV from allowing adaptive sync from the next-gen consoles because they're not 'Nvidia-equipped'. But then again, the Xbox already does it, so we'll see.
Yeah, the fact that they advertised the CX as FreeSync and G-Sync Compatible worries me slightly, not just for consoles but for GPUs too. The new Navi cards are looking really good at the moment, but we'll have to wait and see what this 4K high-refresh-rate update does.
 

·
Registered
Joined
·
316 Posts
Considering the Xbox One does HDMI VRR with the C9 ... I'm still iffy on the 'FreeSync' moniker being supported on the C9 (the CX will).
FreeSync via HDMI: AMD's custom solution for adding variable refresh rate support over HDMI. It was developed years ago because, at the time, HDMI had no official support for VRR. From what I know, AMD's implementation is based on VESA's Adaptive-Sync (the same standard used over DisplayPort).

HDMI VRR: With HDMI 2.1, the HDMI Forum finally added VRR to the specification. From now on, this will become the new standard that companies support. From what I have heard, HDMI VRR is also inspired by Adaptive-Sync, but it is (slightly?) different from AMD's approach.

In theory there is no need for FreeSync via HDMI anymore, unless you want to support older consoles/GPUs/monitors. Going forward, HDMI VRR will be the way to go. This is the standard that AVRs, TVs, consoles, and GPUs with HDMI 2.1 will support. NVIDIA is actually using the official HDMI VRR spec, not AMD's FreeSync-via-HDMI implementation.

Microsoft has already announced support for HDMI VRR and FreeSync (via HDMI). This can be seen in a video Digital Foundry posted; they got access to the entire spec sheet of the Series X. It's visible at 0:06.

https://youtu.be/oNZibJazWTo?t=6
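On PC, a rough way to see whether a display advertises HDMI 2.1 features like HDMI VRR is to look for the HDMI Forum vendor block in its EDID. Here's a heuristic sketch, assuming you've already dumped the EDID to a file with whatever tool your OS/GPU provides; it's a naive byte search, not a proper CTA-861 parser:

```python
# Heuristic: look for the HDMI Forum vendor-specific data block
# (OUI C4-5D-D8, stored little-endian) in a raw EDID dump. The HDMI VRR
# capability fields live inside that block. A naive byte search like this
# can spot the block but won't reliably decode VRRmin/VRRmax.

HF_OUI = bytes([0xD8, 0x5D, 0xC4])   # little-endian byte order in the EDID

def has_hdmi_forum_block(edid: bytes) -> bool:
    return HF_OUI in edid

with open("edid.bin", "rb") as f:    # path is a placeholder
    edid = f.read()

print("HDMI Forum VSDB found" if has_hdmi_forum_block(edid)
      else "No HDMI Forum VSDB advertised")
```

A display that only supports FreeSync-over-HDMI (AMD's pre-2.1 method) wouldn't necessarily carry this block, which matches the distinction described above.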
 


·
Registered
Joined
·
10,165 Posts
Discussion Starter #355
But with the G-Sync Compatible branding, I fear that LG will actively keep the TV from allowing adaptive sync from the next-gen consoles because they're not 'Nvidia-equipped', even though there's nothing in the actual TV to stop them from just enabling it. But then again, the Xbox already does it, so we'll see.

I just really don't want the Xbox Series X and PS5 to lack VRR on these TVs. That would be a travesty, especially since there's nothing on the TV that is holding it back.
Yeah, the fact that they advertised the CX as FreeSync and G-Sync Compatible worries me slightly, not just for consoles but for GPUs too. The new Navi cards are looking really good at the moment, but we'll have to wait and see what this 4K high-refresh-rate update does.



To reiterate what @fenster3000 said: moving forward we are looking at HDMI 2.1 for VRR, not FreeSync. The Xbox Series X, PlayStation 5, and theoretically upcoming GPUs will support VRR through HDMI 2.1, and the C9 supports HDMI 2.1 devices.
FreeSync mentions aside, there is no reason to fear that the C9 won't support VRR for the Series X or PS5. All indications are that VRR support will be there.
 

·
Registered
Joined
·
1,049 Posts
So the only holdout in terms of adaptive sync support on the C9 is AMD GPUs on PC? If that's it, I'm OK with that. As long as Nvidia GPUs and consoles do VRR, I'm good. Sucks for AMD card users, though. But since LG is apparently talking to AMD, this could theoretically arrive through an update in the future.

I'm mainly sticking with Nvidia here because DLSS 2.0 and above is an absolute game changer, and unless AMD's deep-learning tech gets to that level, Nvidia has the advantage of offering essentially near-native 4K quality at the cost of 1080p/1440p, which is huge. Wolfenstein: Youngblood's DLSS turned me around on the tech. It was incredible. Not like the blurfest that was DLSS at launch.

I'll easily take DLSS 4K that can reach near 120 fps over native 4K closer to 60. HDMI 2.1 support can't come soon enough. The wait has been excruciating.
 

·
Registered
Joined
·
143 Posts
I'm mainly sticking with Nvidia here because DLSS 2.0 and above is an absolute game changer ... Not like the blurfest that was DLSS at launch.
So true. My jaw dropped yesterday after testing Control with DLSS 2.0 at 1440p. Even 1080p DLSS looks better than native 4K in this game. Absolutely stunning.

I would never use native 4K again if DLSS could produce image quality like this in every game. This feature has to be pushed hard. It's such a game changer.
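The raw pixel-count side of that trade-off is easy to quantify. A quick sketch (shading cost isn't perfectly linear in pixel count, so treat the ratios as rough):

```python
# Rough render-cost ratios: common DLSS internal resolutions vs native 4K.
# Shading cost is not perfectly linear in pixel count; ratios are indicative.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

native_px = resolutions["4K"][0] * resolutions["4K"][1]
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {native_px / px:.2f}x cheaper than native 4K")
```

Rendering internally at 1440p touches only 1/2.25 of the pixels of native 4K, which is roughly where the jump from ~60 fps toward ~120 fps comes from.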
 

·
Registered
Joined
·
316 Posts
So the only holdout in terms of adaptive sync support on the C9 is AMD GPUs on PC?

Only current and older AMD GPUs. Upcoming AMD GPUs will also support HDMI 2.1, which means HDMI VRR will be implemented. A few years ago AMD actually mentioned they were planning to add HDMI VRR support to current GPUs via a driver update, but we have not heard anything since. Future GPUs, though, will have HDMI VRR because it is a mandatory feature.
 

·
Registered
Joined
·
1,665 Posts
Hey guys, I've finally received my 77" C9. This will be used in my living room for a gaming computer setup and as a daily driver for TV and movies. I've done a lot of reading that suggests never turning the OLED Light above 70 and setting Contrast to 90, so I've gone ahead and done that, as well as setting dark mode throughout Windows. I also set the taskbar and bookmark bars to auto-hide.

Is all of this standard practice, or am I just being extremely cautious? Are there any other settings I need to adjust to help reduce the likelihood of burn-in? Sorry for the newb questions, but this is my first OLED and I wanted to reach out to other owners to make sure I take the necessary precautions.


Sent from my iPhone using Tapatalk
 

·
Registered
Joined
·
384 Posts
Hey guys, I've finally received my 77" C9. ... Are there any other settings I need to adjust to help reduce the likelihood of burn-in?
Contrast is usually correct at 85.
Get the Spears & Munsil disc for the basic setup of your TV.
I found that most of the time a gamma of 2.2 doesn't crush black, compared to 2.4, where I have to bring the Brightness up to 52.
100 nits is around OLED Light 26, for a dark room.
200 nits is around OLED Light 68, which is good for daylight conditions.
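Taking those two data points at face value, a linear interpolation gives a quick ballpark for other OLED Light values (panel response isn't actually linear and varies unit to unit, so this is only a rough guide):

```python
# Ballpark OLED Light -> nits estimate from the two data points quoted above
# (OLED Light 26 ~ 100 nits, OLED Light 68 ~ 200 nits). Linear interpolation
# only; real panel behaviour is non-linear and varies between units.
def oled_light_to_nits(light: float) -> float:
    x0, y0 = 26, 100.0
    x1, y1 = 68, 200.0
    return y0 + (light - x0) * (y1 - y0) / (x1 - x0)

for light in (26, 40, 50, 68):
    print(f"OLED Light {light:>2} ~ {oled_light_to_nits(light):.0f} nits")
```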


Sent from my SM-G950F using Tapatalk
 