
1021 - 1040 of 6733 Posts

Premium Member · 28 Posts
My current desktop is 2560x1440 @ 120 Hz

4:2:2 10/12 bit vs 4:4:4 8 bit vs RGB full

I'm not able to do 4:4:4 10 bit. I'm using a Zeskit 22 Gbps HDMI 2.0 cable.

Which of the above should be set for this TV in PC mode?
4:4:4 8 bit is the best option for resolution and minimal banding. 8 bit RGB full is the native format of Windows, but it will have more banding with the TV in PC mode. There is no point in using PC mode if you are going to use a 4:2:2 mode.

4:2:2 is a much bigger quality drop than 8 bit vs. 10 bit. 4:4:4 video consists of three planes, each at 2560x1440; 4:2:2 is made by simply halving the two chroma planes horizontally to 1280x1440 (4:2:0 halves them again vertically, to 1280x720). Also, these LG sets do not handle 10 bit well in PC mode. I see more banding with 10 bit input than I do with 8 bit input, so 10 bit is worse than 8 bit even if you can do 10 bit 4:4:4. I use 8 bit for everything on these LG OLEDs when in PC mode. The only point of 10 bit is less banding (or lower dithering noise if there is no visible banding at either depth), so if the banding is worse with 10 bit you should definitely not use it.
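If it helps to picture it, here's a rough NumPy sketch of what that chroma downsampling does to the planes (made-up plane contents, just to show the shapes):

```python
import numpy as np

# Hypothetical 2560x1440 YCbCr 4:4:4 frame: three full-resolution planes.
h, w = 1440, 2560
y  = np.random.randint(0, 256, (h, w), dtype=np.uint8)   # luma
cb = np.random.randint(0, 256, (h, w), dtype=np.uint8)   # blue-difference chroma
cr = np.random.randint(0, 256, (h, w), dtype=np.uint8)   # red-difference chroma

# 4:2:2 keeps luma intact but halves the chroma planes horizontally
# (2560x1440 -> 1280x1440); 4:2:0 halves them in both directions
# (2560x1440 -> 1280x720). Simple averaging stands in for a real filter.
cb_422 = cb.reshape(h, w // 2, 2).mean(axis=2)
cr_422 = cr.reshape(h, w // 2, 2).mean(axis=2)
cb_420 = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
cr_420 = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print(y.shape, cb_422.shape, cb_420.shape)  # (1440, 2560) (1440, 1280) (720, 1280)
```

The luma plane keeps full detail either way, which is why subsampled video still looks fine for most content, but colored text and fine desktop UI elements show the loss immediately.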
 

Registered · 414 Posts
I have been using what was recommended by the person who replied to you. I leave AMD Freesync off, ALLM on and VRR on.
Thanks, same here. I noticed that to enable VRR and ALLM on the Xbox I need to have Instant Game Response enabled in the TV settings anyway; otherwise they were greyed out.
 

Premium Member · 28 Posts
What about the HDMI mode and watching 4:2:0 UHD Blu-rays (MPC with madVR)? Would it still make sense to run it at 10-bit?
I see no benefit to 10 bit in the other modes, but at least it isn't worse than 8 bit like it is in PC mode. 10 bit is never worth dropping to 4:2:2 or 4:2:0 from the GPU, though. I can see using HDMI mode on the TV, but sending full range RGB from the GPU is always the best quality option when using madVR. If you can send 10 bit full range RGB at the refresh rate you want, and are not using PC mode, then I might use 10 bit instead of 8 bit, but only then.

Especially with madVR, 10 bit is not really worth anything; madVR's dithering is so good that 8 bit is effectively identical to 10 bit in every way. The benefit to 10 bit with madVR is only less dithering noise, but since I do not notice the dithering noise at 8 bit, this is only a theoretical improvement for me. On these LG OLEDs, in PC mode, the banding is distinctly worse when sending them 10 bit. There seems to be something wrong with how 10 bit is handled in PC mode, which makes me suspicious of 10 bit input in any mode, but I cannot actually notice any differences when not in PC mode. :eek:
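For anyone curious why good dithering makes 8 bit a non-issue, here's a toy NumPy illustration of the principle (not madVR's actual algorithm, just random 1-LSB dither on a shallow gradient):

```python
import numpy as np

# A shallow gradient in a dark scene, stored at high precision;
# this is the kind of content where 8-bit banding shows up.
grad = np.linspace(0.00, 0.05, 2560)

# Plain rounding to 8 bit: the gradient collapses into ~14 flat bands,
# each roughly 200 pixels wide.
banded = np.round(grad * 255) / 255

# Dithering: add about one quantisation step of noise before rounding.
# Each pixel is still 8 bit, but the local average now tracks the original
# gradient, so the steps dissolve into fine noise instead of bands.
rng = np.random.default_rng(0)
dithered = np.round(grad * 255 + rng.uniform(-0.5, 0.5, grad.shape)) / 255

win = 64  # compare local averages over 64-pixel windows
err = lambda a: np.abs(a.reshape(-1, win).mean(1) - grad.reshape(-1, win).mean(1)).max()
print("distinct 8-bit levels:", np.unique(banded).size)
print("worst local-average error, banded:   %.5f" % err(banded))
print("worst local-average error, dithered: %.5f" % err(dithered))
```

The dithered version tracks the source far more closely on average, which is why the extra two bits buy you almost nothing once the renderer dithers well.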
 

Registered · 33 Posts
This is an HDMI 2.0 limitation. You need an HDMI 2.1 GPU for 10-bit 4:4:4 full. You can probably get this color space at 30 Hz, but who wants to use that.
Except it's not; you can't do 1080p 10-bit 4:4:4 full either, even though HDMI 2.0 has more than enough bandwidth for it. It's a stupid driver-level limitation on NVIDIA's part.
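Rough back-of-envelope numbers, assuming the standard CTA/CVT-RB pixel clocks (which include blanking, so exact figures vary slightly by timing):

```python
# HDMI 2.0 carries 18 Gbps of TMDS bandwidth; after 8b/10b encoding
# roughly 14.4 Gbps is left for actual video data.
HDMI20_DATA_GBPS = 18.0 * 8 / 10

modes = {
    # name: (approx. pixel clock in MHz, bits per pixel at RGB / 4:4:4)
    "2160p60  8-bit RGB/4:4:4":  (594.00, 24),
    "2160p60 10-bit RGB/4:4:4":  (594.00, 30),
    "1440p120 10-bit RGB/4:4:4": (497.75, 30),
    "1080p120 10-bit RGB/4:4:4": (297.00, 30),
}

for name, (mhz, bpp) in modes.items():
    gbps = mhz * 1e6 * bpp / 1e9
    verdict = "fits" if gbps <= HDMI20_DATA_GBPS else "exceeds"
    print(f"{name}: {gbps:5.1f} Gbps -> {verdict} HDMI 2.0")
```

1080p 10-bit 4:4:4 comes in well under the limit, which is exactly the point: nothing about the link stops it, only the driver.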

We'll know more in about a month in any case.
 

Registered · 246 Posts
What are the best settings for PS4 Pro? Not sure which I should use: Limited on the PS4 with Low on the CX, or Full and High. Auto seems to choose the wrong one.

Sent from my SM-G975U using Tapatalk
 

Registered · 2,763 Posts
What are the best settings for PS4 Pro? Not sure which I should use: Limited on the PS4 with Low on the CX, or Full and High. Auto seems to choose the wrong one.

Sent from my SM-G975U using Tapatalk
Just match the settings, i.e. CX Low with PS4 Pro Limited, or CX High with PS4 Pro Full.
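The reason matching matters: Limited/Low treats codes 16-235 as black-to-white, Full/High treats 0-255 that way, and if the two ends disagree you either crush or wash out the picture. A tiny illustrative sketch (hypothetical 8-bit values, not anything console-specific):

```python
# TV set to Low / limited: it expands 16-235 out to 0-255.
def expand_limited_to_full(v):
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Mismatch 1: source sends FULL range into a TV expecting LIMITED.
# Shadow detail below 16 gets crushed to black, highlights above 235 clip.
print(expand_limited_to_full(8), expand_limited_to_full(240))   # 0 255

# Mismatch 2: source sends LIMITED range into a TV expecting FULL.
# Nothing expands the signal, so black sits at code 16 and white at 235:
# raised blacks and a washed-out, low-contrast image.
print(16 / 255, 235 / 255)   # ~0.06 "black", ~0.92 "white"
```

Matched settings at either end (Low + Limited or High + Full) land on essentially the same picture; it's only the mismatch that hurts.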
 

Registered · 450 Posts · LG CX65+Sonos Arc/Sub/OneSL, LG CX 48+Sonos Beam, Shield TV Pro, Google TV, XSX, Switch, RTX2070S
Why am I unable to select 10 bit color in nVidia control panel on RTX card at any chroma, resolution (2160p, 1440p, 1080p), or refresh rate? I see people here claim they can choose 10 bit. I have RTX 2070 Super and 48 Gbps Monoprice HDMI cable. Thanks!
 

Registered · 246 Posts
Why am I unable to select 10 bit color in nVidia control panel on RTX card at any chroma, resolution (2160p, 1440p, 1080p), or refresh rate? I see people here claim they can choose 10 bit. I have RTX 2070 Super and 48 Gbps Monoprice HDMI cable. Thanks!
I also have a 2070 Super. I'm able to do 10 or 12 bit in 4:2:2 and 4:2:0 at 1080p, as well as 1440p 120 Hz. I can only do 60 Hz at 4K, which I believe is the way it should be until new cards come out.


Do you have the cable going straight into the TV from the PC, or is it connected through a receiver of some type?

Sent from my SM-G975U using Tapatalk
 

Registered · 450 Posts · LG CX65+Sonos Arc/Sub/OneSL, LG CX 48+Sonos Beam, Shield TV Pro, Google TV, XSX, Switch, RTX2070S
I also have a 2070 Super. I'm able to do 10 or 12 bit in 4:2:2 and 4:2:0 at 1080p, as well as 1440p 120 Hz. I can only do 60 Hz at 4K, which I believe is the way it should be until new cards come out.

Sent from my SM-G975U using Tapatalk
Just to confirm, are you using the HDMI port or the DP to HDMI converter?
 

Registered · 450 Posts · LG CX65+Sonos Arc/Sub/OneSL, LG CX 48+Sonos Beam, Shield TV Pro, Google TV, XSX, Switch, RTX2070S
I connected the TV straight to the PC. I wanted to go through my receiver, but it doesn't support VRR and ALLM.

Sent from my SM-G975U using Tapatalk
Sorry, I meant: are you using the HDMI port on the RTX 2070 Super, or DisplayPort with a DP to HDMI converter? Any idea why I would be unable to select anything besides 8 bit at 4:2:2/4:2:0 over HDMI? I am using a 48 Gbps HDMI cable directly from the GPU to the TV. I am able to do 8/10/12 bit over DP (on a different monitor).

And yes, 2160p60 full/4:4:4 12 bit is not currently possible over HDMI 2.0 due to the bandwidth limitation. And 10 bit is not supported at full chroma over HDMI by Nvidia on GeForce drivers, which is something they will hopefully change with Ampere; otherwise I'm going to have to consider switching to AMD.
 

Registered · 246 Posts
Sorry, I meant: are you using the HDMI port on the RTX 2070 Super, or DisplayPort with a DP to HDMI converter? Any idea why I would be unable to select anything besides 8 bit at 4:2:2/4:2:0 over HDMI? I am using a 48 Gbps HDMI cable directly from the GPU to the TV. I am able to do 8/10/12 bit over DP (on a different monitor).

And yes, 2160p60 full/4:4:4 12 bit is not currently possible over HDMI 2.0 due to the bandwidth limitation. And 10 bit is not supported at full chroma over HDMI by Nvidia on GeForce drivers, which is something they will hopefully change with Ampere; otherwise I'm going to have to consider switching to AMD.
Yes, I'm using HDMI to HDMI. Not sure why you're not able to select 10 bit. Could it be the converter? Have you tried without it?

Sent from my SM-G975U using Tapatalk
 

Premium Member · 28 Posts
Why am I unable to select 10 bit color in nVidia control panel on RTX card at any chroma, resolution (2160p, 1440p, 1080p), or refresh rate? I see people here claim they can choose 10 bit. I have RTX 2070 Super and 48 Gbps Monoprice HDMI cable. Thanks!
In my case it is because I am using DP to HDMI with a Club3D CAC-1085.

But I do see 10 bit options with YCbCr 4:2:2 or 4:2:0 using HDMI to HDMI. I have to apply 8 bit YCbCr before I can see the >8 bit modes though.

Edit: You are not missing anything useful. 10 bit is never worth dropping to YCbCr 4:2:2; a higher refresh rate could be worth it, but 10 bit is not.

I also have a 2070 Super. I'm able to do 10 or 12 bit in 4:2:2 and 4:2:0 at 1080p, as well as 1440p 120 Hz. I can only do 60 Hz at 4K, which I believe is the way it should be until new cards come out.
Yes, 10 bit with 4:2:2 and 4:2:0 are both options for me too. I am using an LG CX, a 2080 Ti on driver 451.67, and Windows 10 2004.
 

Registered · 444 Posts
Maybe it's just me, but to my eyes I prefer the way Standard mode looks in games. I wish I could bring the Game mode settings closer to that; something about Game mode seems a bit dim and washed out in comparison. Do you mean inaccurate in terms of picture quality or input lag? I do notice the input lag isn't as low as in Game mode, of course.
Super late reply, but what I meant to you and Jason is that Standard is really inaccurate when it comes to picture quality, and Vivid is the setting farthest from accurate. If you prefer the way it looks that's alright, it's your TV, but just understand it's not supposed to look like that. Filmmaker, the ISF modes and the Cinema mode in HDR are the most accurate. Warm 2 color temp and color at 50 only look off because we are mostly used to looking at screens that push blue, so warm looks wrong compared to what we are used to.
 

Registered · 450 Posts · LG CX65+Sonos Arc/Sub/OneSL, LG CX 48+Sonos Beam, Shield TV Pro, Google TV, XSX, Switch, RTX2070S
The Series X will have a 40 Gbps HDMI port; they have put up some of the slides from their upcoming presentation at Hot Chips, for those still worried about 40 vs 48. [attached Hot Chips slide]
Well, this should be a good predictor of what Big Navi will be offering.
 

Registered · 450 Posts · LG CX65+Sonos Arc/Sub/OneSL, LG CX 48+Sonos Beam, Shield TV Pro, Google TV, XSX, Switch, RTX2070S
Super late reply, but what I meant to you and Jason is that Standard is really inaccurate when it comes to picture quality, and Vivid is the setting farthest from accurate. If you prefer the way it looks that's alright, it's your TV, but just understand it's not supposed to look like that. Filmmaker, the ISF modes and the Cinema mode in HDR are the most accurate. Warm 2 color temp and color at 50 only look off because we are mostly used to looking at screens that push blue, so warm looks wrong compared to what we are used to.
I've tried using Warm2 for a few days and ultimately decided to just go with Medium (in Game Mode). I really couldn't get used to the appearance, and I realize I am sacrificing color accuracy. But so long as color contrast and black levels are maintained, I'm okay with that.
 