

·
Registered
Joined
·
680 Posts
Discussion Starter #1,481
This might never be fixed!
I'm on a C9 and had a 2080 Ti.
I used a DP 1.4 to HDMI 2.1 adapter and had all the same issues you have, including the lack of G-SYNC.
Basically, I suspect Ampere has no native HDMI 2.1 support [some brands like Gigabyte even offer two HDMI 2.1 ports], so they use a chip that converts the DP signal to an HDMI signal.
It's a cheap solution.
Damn, imagine if this is true, and the backlash Nvidia and LG are going to get.
 

·
Registered
Joined
·
39 Posts
This might never be fixed!
I'm on a C9 and had a 2080 Ti.
I used a DP 1.4 to HDMI 2.1 adapter and had all the same issues you have, including the lack of G-SYNC.
Basically, I suspect Ampere has no native HDMI 2.1 support [some brands like Gigabyte even offer two HDMI 2.1 ports], so they use a chip that converts the DP signal to an HDMI signal.
It's a cheap solution.
It's been documented that this 4:4:4 at 120Hz issue does not exist on a C9 with Ampere / an RTX 3080, and on a CX it even works fine at 60Hz. It therefore seems perfectly reasonable that there is something goofy with the CX that could be corrected in software/firmware, since everything is OK at 60Hz (i.e. it's not a hardware problem).

The DP 1.4 to HDMI 2.1 dongle seems to have lots of problems according to all the reviews, despite recent firmware updates, and Nvidia needs to certify any monitor or display as G-SYNC Compatible, so I'm not surprised it's broken in the first place. The dongle was just a short-term bridge until native HDMI 2.1 devices started to emerge. Like I said, LG and Nvidia are obviously doing some cross-marketing here for their mutual benefit.

I haven't seen any evidence that AIB boards like the one Gigabyte is offering are using any kind of conversion chip for the additional HDMI 2.1 connector. Despite having five outputs, the card only allows a maximum of four simultaneously, so it's just a convenience if someone needs two of the four to be HDMI.
 

·
Registered
Joined
·
142 Posts
It's been documented that this 4:4:4 at 120Hz issue does not exist on a C9 with Ampere / an RTX 3080, and on a CX it even works fine at 60Hz. It therefore seems perfectly reasonable that there is something goofy with the CX that could be corrected in software/firmware, since everything is OK at 60Hz (i.e. it's not a hardware problem).

The DP 1.4 to HDMI 2.1 dongle seems to have lots of problems according to all the reviews, despite recent firmware updates, and Nvidia needs to certify any monitor or display as G-SYNC Compatible, so I'm not surprised it's broken in the first place. The dongle was just a short-term bridge until native HDMI 2.1 devices started to emerge. Like I said, LG and Nvidia are obviously doing some cross-marketing here for their mutual benefit.

I haven't seen any evidence that AIB boards like the one Gigabyte is offering are using any kind of conversion chip for the additional HDMI 2.1 connector. Despite having five outputs, the card only allows a maximum of four simultaneously, so it's just a convenience if someone needs two of the four to be HDMI.
Are you sure there is no issue on the C9? People on Reddit say the C9 has the same issue:
https://www.reddit.com/r/OLED/comments/ivt082
I can't test it now; I sold my 2080 Ti and am using the DP 1.4 to HDMI 2.1 dongle on the iGPU of a 10700K [I get 4K/60Hz, but HDR is always stuck at 8-bit and YCbCr mode doesn't work].

If what you say is true, then I'm all set for the future. I really hope we won't find out that Nvidia is using an adapter to convert DisplayPort to HDMI 2.1 to save money. I mean, they could use a different or better chip, one that has more features than the dongle, but still not native HDMI 2.1.
 

·
Registered
Joined
·
194 Posts
I was about to pull the trigger on the 77” CX, but this now makes me wonder if I should wait. In my experience, there's little incentive for TV manufacturers to update TVs, and IIRC, LG said they'd have AirPlay 2, then backed out, then, with pressure, reversed course.
 

·
Registered
Joined
·
39 Posts
Are you sure there is no issue on the C9? People on Reddit say the C9 has the same issue:
https://www.reddit.com/r/OLED/comments/ivt082
I can't test it now; I sold my 2080 Ti and am using the DP 1.4 to HDMI 2.1 dongle on the iGPU of a 10700K [I get 4K/60Hz, but HDR is always stuck at 8-bit and YCbCr mode doesn't work].

If what you say is true, then I'm all set for the future. I really hope we won't find out that Nvidia is using an adapter to convert DisplayPort to HDMI 2.1 to save money. I mean, they could use a different or better chip, one that has more features than the dongle, but still not native HDMI 2.1.
There is a sister thread to this over on the [H]ardForum. Reports of it working with the C9 start on page 101 - https://hardforum.com/threads/lg-48cx.1991077/page-101
 

·
Registered
Joined
·
680 Posts
Discussion Starter #1,487 (Edited)
I believe @domenicdistefano was talking about the 4:4:4 downsampling bug in PC mode (which, based on reports, plagues only the CX, not the C9), while @ViruzzX was talking about the G-SYNC black screen/lost signal bug (which, based on reports, plagues both the CX and the C9).
 

·
Registered
Joined
·
194 Posts
Honest question, but can anyone really tell the difference between 4:4:4 and 4:2:2?
 

·
Registered
Joined
·
142 Posts
I believe @domenicdistefano was talking about the 4:4:4 downsampling bug in PC mode (which, based on reports, plagues only the CX, not the C9), while @ViruzzX was talking about the G-SYNC black screen/lost signal bug (which, based on reports, plagues both the CX and the C9).
Yep, I was talking about broken G-SYNC. I hope Nvidia fixes it, and I'm sure it's not the TV, since it worked on the 2080 Ti, so it's Nvidia for sure.
 

·
Registered
LG CX 48, Vizio OLED55-H1, nVidia Shield TV, Google TV, nVidia RTX 2070S FE
Joined
·
368 Posts
This was the main reason I bought the CX; this has to be addressed. I have a brand new one just delivered. I'm not opening it up if it doesn't support 10-bit @ 4:4:4, and I will be getting my money back.
I'd return it and get a C9. You'll have as much luck buying a 3080 right now as you will getting LG to update the firmware to address this or other issues on the CX.
 

·
Registered
Joined
·
680 Posts
Discussion Starter #1,492
Honest question, but can anyone really tell the difference between 4:4:4 and 4:2:2?
I never tried to look for the difference myself in content like games and movies, so if someone can point out an example, that would be helpful, but I've heard it's so minimal it's unnoticeable. Maybe some text on certain color backgrounds will have problems. Anyway, as a very picky person when it comes to picture quality, I'd love for things to work as intended and for 4:4:4 to be shown on this TV the way it's supposed to be shown.
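If you want to check for yourself, the classic test is saturated colored text and single-pixel color stripes. Here's a minimal sketch (my own example using Python and Pillow, which I'm assuming is installed; the output filename is just a placeholder) that generates such a pattern. Displayed at native resolution, the red text edges fringe and the stripes smear into a muddy purple when the chroma is being subsampled:

```python
# Minimal sketch: generate a chroma-subsampling test pattern (assumes Pillow is installed).
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1920, 1080
img = Image.new("RGB", (WIDTH, HEIGHT), (0, 0, 255))  # solid blue background
draw = ImageDraw.Draw(img)

# Red text on blue: the two colors have similar luma, so the edge detail
# lives mostly in the chroma channels and degrades visibly under 4:2:2/4:2:0.
draw.text((50, 50), "4:4:4 test - this should look crisp", fill=(255, 0, 0))

# 1-pixel vertical stripes alternating red/blue; 4:2:2 halves horizontal
# chroma resolution, so these blend together instead of staying distinct.
for x in range(50, 600):
    color = (255, 0, 0) if x % 2 == 0 else (0, 0, 255)
    draw.line([(x, 200), (x, 400)], fill=color)

img.save("chroma_test.png")  # placeholder filename; view at native resolution
```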
 

·
Registered
Joined
·
142 Posts
Honest question, but can anyone really tell the difference between 4:4:4 and 4:2:2?
I don't. I got used to it, and it looks perfectly fine. When I was using my 2080 Ti on the C9, to get an easy automatic switch to HDR mode I always used YCbCr/Limited/4:2:2/12-bit in 2D, so when I ran an HDR movie it automatically enabled the right HDR mode. If you use RGB/4:4:4/8-bit/Full for SDR, then run an HDR movie and check the Windows 10 display settings, it always says HDR 8-bit with dithering; basically, Windows can enable or disable HDR mode automatically, but it can't enable the proper color mode for you (switch to YCbCr/Limited and enable 4:2:2 and 10 or 12 bit).
The reason I'm hyped for HDMI 2.1 is that, in theory, we should get 4K/120Hz/G-SYNC plus 4:4:4/Full/10-bit, so when you enable HDR you always get the best color mode in both SDR and HDR.

But honestly, if you watch 4K HDR Blu-ray movies [or encodes] or HDR streaming, it's 4:2:0/10-bit/Limited anyway; that's the official HDR standard for media.
You can take any Blu-ray .ts file and check the metadata for yourself: it's 4:2:0.
Because the HDR format was finalized in the HDMI 2.0 era, all current HDR content is optimized for that anyway, because we had nothing better.
I'm sure media will stay this way forever, for backwards compatibility with HDMI 2.0 devices and due to bandwidth [4:4:4 takes more space than 4:2:0].
But video games, in theory, should support HDR in RGB/4:4:4/Full; only in theory, though, since nobody has used it before, and on consoles, for example, HDR games adhere to the same format as HDR movies.
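Since this all comes down to link bandwidth, here is a rough back-of-the-envelope Python sketch (my own arithmetic, counting active pixels only and ignoring blanking, using the commonly cited ~14.4 Gbit/s of usable HDMI 2.0 throughput and ~42.7 Gbit/s for HDMI 2.1 after encoding overhead) showing why 4K/120Hz/4:4:4/10-bit needs HDMI 2.1 while the usual 4:2:0 media formats fit comfortably in HDMI 2.0:

```python
# Rough estimate of uncompressed video data rate for active pixels only;
# blanking and audio are ignored, so real link requirements are somewhat higher.

# Samples per pixel for each pixel format (R+G+B, or luma + subsampled chroma).
SAMPLES_PER_PIXEL = {
    "RGB / 4:4:4": 3.0,   # full resolution in all three channels
    "4:2:2":       2.0,   # chroma at half horizontal resolution
    "4:2:0":       1.5,   # chroma at half horizontal and half vertical resolution
}

# Approximate usable throughput after link-encoding overhead.
HDMI_2_0_GBPS = 18 * 8 / 10    # 18 Gbit/s TMDS with 8b/10b -> ~14.4 Gbit/s
HDMI_2_1_GBPS = 48 * 16 / 18   # 48 Gbit/s FRL with 16b/18b -> ~42.7 Gbit/s

def data_rate_gbps(width, height, refresh_hz, bit_depth, pixel_format):
    """Active-pixel video data rate in Gbit/s."""
    bits_per_pixel = bit_depth * SAMPLES_PER_PIXEL[pixel_format]
    return width * height * refresh_hz * bits_per_pixel / 1e9

for fmt in SAMPLES_PER_PIXEL:
    for hz, depth in [(60, 8), (60, 10), (120, 10), (120, 12)]:
        rate = data_rate_gbps(3840, 2160, hz, depth, fmt)
        fits_20 = "yes" if rate <= HDMI_2_0_GBPS else "no"
        fits_21 = "yes" if rate <= HDMI_2_1_GBPS else "no"
        print(f"4K {hz}Hz {depth}-bit {fmt:12s}: {rate:5.1f} Gbit/s "
              f"(fits HDMI 2.0: {fits_20}, HDMI 2.1: {fits_21})")
```

For example, 4K/120Hz/10-bit at 4:4:4 comes out to roughly 30 Gbit/s of active video data alone, well past what HDMI 2.0 can carry but within HDMI 2.1, while 4K/60Hz/10-bit 4:2:0 is only around 7.5 Gbit/s.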
 

·
Registered
Joined
·
216 Posts
Honest question, but can anyone really tell the difference between 4:4:4 and 4:2:2?
On text it's very obvious; colored text looks tattered.
I use readouts and chat apps, and I also have in-game chat windows in a lot of games that use different colored text to highlight different things like combat text, notifications, "hyperlink" item names you can click, etc. "Playing card"-style stat sheets for items, creatures, characters, inventory items, etc. often have their own color themes for backgrounds and text. In some games there are name plaques hovering over people, creatures, and vehicles; games have in-game text on signs and storefronts, and some have maps with map text and other fine colored detail.

When my 43" Samsung NU6900 TV, which I'm using as a side monitor, dropped back to its input being named "HDMI" instead of "PC" after I unplugged it, it dropped RGB, and I could instantly see how tattered my text was until I set the input name back to PC, which is required for it to accept and display RGB.

Dropping from RGB down to a watered-down, lower chroma resolution is unacceptable to me for PC use, including gaming.

[RTINGS 4:2:2 text example image, from "Chroma Subsampling: 4:4:4 vs 4:2:2 vs 4:2:0"]



------------------------------------
Informative Reddit Thread
https://www.reddit.com/r/hardware/comments/8rlf2z
Since RGB-format images don't have luma or chroma components, you can't have "chroma subsampling" on an RGB image, since there are no chroma values for you to subsample in the first place. Terms like "RGB 4:4:4" are redundant/nonsensical. RGB format is always full resolution in all channels, which is equivalent or better than YCbCr 4:4:4. You can just call it RGB, RGB is always "4:4:4".
Also, chroma subsampling is not a form of compression, because it doesn't involve any de-compression on the receiving side to recover any of the data. It is simply gone. 4:2:2 removes half the color information from the image, and 4:2:0 removes 3/4 of it, and you don't get any of it back. The information is simply removed, and that's all there is to it. So please don't refer to it as "4:2:2 compression" or "compressed using chroma subsampling" or things like that, it's no more a form of compression than simply reducing resolution from 4K to 1080p is; that isn't compression, that's just reducing the resolution. By the same token, 4:2:2 isn't compression, it's just subsampling (reducing the resolution on 2/3 of the components).
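To make the "the information is simply gone" point concrete, here's a tiny sketch (my own NumPy illustration, not from the linked thread) that subsamples a made-up chroma plane the way 4:2:0 does and then rebuilds it the only way a receiver can, by repeating the surviving samples:

```python
# 4:2:0 keeps one chroma sample per 2x2 block; "upsampling" on the display side
# just repeats that sample -- the discarded values are gone, nothing is decompressed.
import numpy as np

rng = np.random.default_rng(0)

# Fake chroma plane (e.g. Cb) for an 8x8 patch, values 0..255.
cb = rng.integers(0, 256, size=(8, 8)).astype(np.float64)

# 4:2:0 subsampling: reduce each 2x2 block to a single sample (here, its average).
cb_420 = cb.reshape(4, 2, 4, 2).mean(axis=(1, 3))

# Reconstruction on the receiving side: nearest-neighbor repeat of each sample.
cb_rebuilt = np.repeat(np.repeat(cb_420, 2, axis=0), 2, axis=1)

# The original fine chroma detail cannot be recovered.
print("samples kept:  ", cb_420.size, "of", cb.size)          # 16 of 64
print("mean abs error:", np.mean(np.abs(cb - cb_rebuilt)))    # > 0 for any detailed plane
```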

 

·
Registered
LG CX 48, Vizio OLED55-H1, nVidia Shield TV, Google TV, nVidia RTX 2070S FE
Joined
·
368 Posts
If you are interested in raising awareness about the major issues on the CX, then I recommend tweeting @LGUS, @BigJohnnyArcher, @HDTVTest, and perhaps @rtingsdotcom. Also, consider supporting my posts at the LG Community Forums:

120 Hz at RGB/YCbCr 444 in PC mode causing chroma subsampling

DV causing raised blacks (LG has supposedly acknowledged this, source?)

VRR & FreeSync causing gamma shift/raised near blacks (LG has acknowledged this)

VRR causing flashing in dark scenes

EDID error preventing 5.1/7.1 surround sound on Windows 10 (LG has acknowledged this on C9 but still hasn't released a fix)
 

·
Registered
Joined
·
677 Posts
I haven't seen any evidence that AIB boards like the one Gigabyte is offering are using any kind of conversion chip for the additional HDMI 2.1 connector. Despite having five outputs, the card only allows a maximum of four simultaneously, so it's just a convenience if someone needs two of the four to be HDMI.
Do you have a source for this?
 

·
Registered
Joined
·
194 Posts
I don't. I got used to it, and it looks perfectly fine. When I was using my 2080 Ti on the C9, to get an easy automatic switch to HDR mode I always used YCbCr/Limited/4:2:2/12-bit in 2D, so when I ran an HDR movie it automatically enabled the right HDR mode. If you use RGB/4:4:4/8-bit/Full for SDR, then run an HDR movie and check the Windows 10 display settings, it always says HDR 8-bit with dithering; basically, Windows can enable or disable HDR mode automatically, but it can't enable the proper color mode for you (switch to YCbCr/Limited and enable 4:2:2 and 10 or 12 bit).
The reason I'm hyped for HDMI 2.1 is that, in theory, we should get 4K/120Hz/G-SYNC plus 4:4:4/Full/10-bit, so when you enable HDR you always get the best color mode in both SDR and HDR.

But honestly, if you watch 4K HDR Blu-ray movies [or encodes] or HDR streaming, it's 4:2:0/10-bit/Limited anyway; that's the official HDR standard for media.
You can take any Blu-ray .ts file and check the metadata for yourself: it's 4:2:0.
Because the HDR format was finalized in the HDMI 2.0 era, all current HDR content is optimized for that anyway, because we had nothing better.
I'm sure media will stay this way forever, for backwards compatibility with HDMI 2.0 devices and due to bandwidth [4:4:4 takes more space than 4:2:0].
But video games, in theory, should support HDR in RGB/4:4:4/Full; only in theory, though, since nobody has used it before, and on consoles, for example, HDR games adhere to the same format as HDR movies.
This is helpful. I'm hoping, then, that you or someone else here can help answer my question. I have a dedicated theater room (most of the gear in my sig is correct, except for a newer preamp; it's where I watch movies/TV most of the time, and all processing is handled by my Lumagen), but in my main living room I have a 75” Sony X900E (I believe that's the correct model). I have a PC I built that's dedicated to it for 4K gaming, VR for my Index, and some streaming content (though I mostly use either a Shield or an Apple TV).

The computer has a 2080 Ti in it on Windows 10. The issue I'm having is that while I have set 4:2:2 at both 10 and 12 bit (to test all settings), when I put on an HDR game, it doesn't enable HDR on the TV. The TV works fine on this HDMI port for all the streaming boxes and adjusts automatically.

I've heard that I need to turn on HDR manually through the Windows display settings before putting on an HDR game, but I wanted to confirm, as that seems like a terrible process. Is this how you went about it? Because that would mean Plex and Netflix also wouldn't engage HDR without changing the display settings first.

It was this annoyance (and my inability to figure it out) that led me to the CX. I just want it to “just work”. 120Hz + G-SYNC would also be nice, however.
 

·
Registered
Joined
·
31 Posts
This is a good watch if you have a couple of minutes. I mostly notice banding in sky and land areas, but in theory it would remove that.

 

·
Registered
Joined
·
31 Posts
This is helpful. I'm hoping, then, that you or someone else here can help answer my question. I have a dedicated theater room (most of the gear in my sig is correct, except for a newer preamp; it's where I watch movies/TV most of the time, and all processing is handled by my Lumagen), but in my main living room I have a 75” Sony X900E (I believe that's the correct model). I have a PC I built that's dedicated to it for 4K gaming, VR for my Index, and some streaming content (though I mostly use either a Shield or an Apple TV).

The computer has a 2080 Ti in it on Windows 10. The issue I'm having is that while I have set 4:2:2 at both 10 and 12 bit (to test all settings), when I put on an HDR game, it doesn't enable HDR on the TV. The TV works fine on this HDMI port for all the streaming boxes and adjusts automatically.

I've heard that I need to turn on HDR manually through the Windows display settings before putting on an HDR game, but I wanted to confirm, as that seems like a terrible process. Is this how you went about it? Because that would mean Plex and Netflix also wouldn't engage HDR without changing the display settings first.

It was this annoyance (and my inability to figure it out) that led me to the CX. I just want it to “just work”. 120Hz + G-SYNC would also be nice, however.
It usually switches automatically for you in-game, although it does not in Red Dead Redemption 2. For that one you have to switch to HDR in the display settings before you load the game; that allows the HDR setting to appear in-game.
 

·
Registered
Joined
·
427 Posts
Honest question, but can anyone really tell the difference between 4:4:4 and 4:2:2?
Personally, I have a hard time seeing the difference in real-world applications (games, movies, etc.). Even in the text tests it's not a dramatic difference to me; if I stand up and examine the pixels on my TV closely I can see it, but certainly not from sitting back on my couch. UHD Blu-rays, as far as I know, are 4:2:0, which also looks pretty good to me. Having said that, for PC desktop use 4:4:4 certainly makes sense to have, especially given that the 48" model was marketed as a monitor and the C9 supports it. It would be extremely obnoxious on LG's part if there's a hardware-based reason for the downsampling of 4:4:4 to 4:2:2 at 120Hz, but I certainly wouldn't be annoyed to the point where I'd want to trade for a C9.
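If you want to verify the 4:2:0 point on your own files, here is a quick sketch (mine, assuming ffprobe from FFmpeg is installed; "movie.ts" is just a placeholder path) that reads the pixel format of the first video stream:

```python
# Check a video file's pixel format with ffprobe (assumes FFmpeg/ffprobe is installed).
import json
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=pix_fmt,width,height", "-of", "json", "movie.ts"],
    capture_output=True, text=True, check=True,
)
stream = json.loads(result.stdout)["streams"][0]
# HDR Blu-ray rips typically report something like "yuv420p10le", i.e. 4:2:0 10-bit.
print(stream["pix_fmt"], stream["width"], "x", stream["height"])
```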
 