
Registered · 152 Posts
Are you referring to 4k support only? I don't have a 4k monitor, but I am able to select 10 bit at 1440p120 full RGB/YCbCr 444 in NVCP.
No, I was referring to any resolution. Interesting that 1440p works for you at 10-bit. Were you able to apply that setting and then check that you are indeed running at 10-bit? Sometimes the Nvidia Control Panel lets you see choices that it does not support - once you click OK it will not actually switch to the selected mode.

If it does work, does it also work for 1920x1080?

It would be useful to know what video card you are using, what type of connection (DP or HDMI), and what display, too!


The reasons for this constraint are not technical; they are to prevent companies from using professional software (CAD/CAM/CAE) on computers with consumer video cards.
 

Registered · 124 Posts
The reasons for this constraint are not technical; they are to prevent companies from using professional software (CAD/CAM/CAE) on computers with consumer video cards.
If that's the case then I find it hard to believe next gen cards will be any different.
 

Registered · 54 Posts
Please post a screenshot of this, and specify which display, GPU and DP/HDMI connection.


I just checked my own NVCP and indeed, 10-bit only shows up for me in 422 mode, even though there are no bandwidth constraints.
However, I am able to select 12-bit in 444 mode, which is baffling (10-bit isn't supported, but 12-bit is?)
I am using an RTX 2070 Super with a Samsung C27HG70. I have attached pictures of the NVCP and a few test patterns I found. I am only able to select/apply 10 bpc with DisplayPort (at either 1080p120 or 1440p120, RGB Full/Limited, YCbCr 422/444). I can only select 8 bpc with HDMI. However, the test patterns I found look the same to me regardless of whether I use HDMI or DP.

Picture album here: https://imgur.com/a/4svxGGc
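If you want a pattern that actually distinguishes 8 bpc from 10 bpc, a shallow grayscale ramp works better than most downloads: at 8 bpc you should see clear banding steps, at 10 bpc it should look smooth. Here is a minimal sketch for generating one (assuming numpy and imageio are installed; the filename is arbitrary). Keep in mind it only proves anything when viewed in something that actually outputs 10-bit - most image viewers render at 8 bpc regardless of the display mode.

Code:
# Generate a 16-bit grayscale PNG with a shallow luminance ramp.
# Over this narrow range an 8 bpc pipeline only has ~26 distinct code values
# across the image (visible banding), while a 10 bpc pipeline has ~102 and
# the ramp should look smooth.
import numpy as np
import imageio.v3 as iio   # pip install numpy imageio

WIDTH, HEIGHT = 2560, 512
LOW, HIGH = 0.25, 0.35     # narrow range exaggerates the banding

ramp = np.linspace(LOW, HIGH, WIDTH)
row = np.round(ramp * 65535.0).astype(np.uint16)   # keep full 16-bit precision
image = np.tile(row, (HEIGHT, 1))                  # repeat the row vertically

iio.imwrite("ramp_10bit_test.png", image)
print("Wrote ramp_10bit_test.png")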
 

Registered · 659 Posts
But they did release a driver enabling 10-bit on GeForce cards in summer 2019.
We need to establish whether it's only for 422 or not.




Besides, how would GeForce cards do HDR in Windows if they don't support 10-bit?
 

Registered · 659 Posts
I am using an RTX 2070 Super with a Samsung C27HG70. I have attached pictures of the NVCP and a few test patterns I found. I am only able to select 10 bpc with DisplayPort (at 1440p120 RGB Full/Limited, YCbCr 422/444). I can only select 8bpc with HDMI. However, the test patterns I found look the same to me regardless of whether I use HDMI or DP.

Picture album here: https://imgur.com/a/4svxGGc
Thanks!

So 10-bit 444 does work.
It might really just be a device (monitor vs TV) and DP/HDMI issue then, maybe.

In theory, HDMI 2.1 should fix this, and Nvidia should have the presence of mind to support 444 10-bit on HDMI 2.1.




Edit: actually, might it be a bandwidth issue? IIRC HDMI 2.0 lacks the bandwidth for 1440p120 444 10-bit, whereas DP has enough.
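For reference, some rough numbers (assuming CVT reduced-blanking totals of roughly 2720 x 1525 for 2560x1440 at 120 Hz - the exact blanking varies by timing standard - and the usual 8b/10b line-code overhead on both HDMI 2.0 TMDS and DP 1.4 HBR3):

Code:
# Back-of-the-envelope bandwidth check for 1440p120 RGB/4:4:4.
# Approximate CVT-RB totals for 2560x1440 @ 120 Hz; exact blanking varies.
h_total, v_total, refresh_hz = 2720, 1525, 120
pixel_clock_hz = h_total * v_total * refresh_hz        # ~498 MHz

for label, bpc in (("8 bpc ", 8), ("10 bpc", 10)):
    gbps = pixel_clock_hz * bpc * 3 / 1e9              # 3 components per pixel
    print(f"{label} video data: {gbps:5.2f} Gbit/s")

# Usable payload after 8b/10b encoding:
print(f"HDMI 2.0 limit (3 x 6.0 Gbit/s TMDS): {3 * 6.0 * 0.8:5.2f} Gbit/s")
print(f"DP 1.4 limit  (4 x 8.1 Gbit/s HBR3):  {4 * 8.1 * 0.8:5.2f} Gbit/s")

So 1440p120 10-bit RGB lands just above what HDMI 2.0 can carry, while 8-bit fits and DP 1.4 has headroom for both. If it were purely a bandwidth limit, though, 10-bit 444 should show up over HDMI at lower resolutions, which would be worth testing.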
 

Registered · 35 Posts
Can I exchange the motherboard of a 65CX with that of my 65C9? I understand that it is the same screen, so the differences are on the motherboard.

Sent from my GM1917 using Tapatalk

Why wouldn't you just return the CX if you don't like it and buy a C9 like most sane people would? Not sure I understand what the point of this question is.
 

Registered · 659 Posts
I am using an RTX 2070 Super with a Samsung C27HG70. I have attached pictures of the NVCP and a few test patterns I found. I am only able to select/apply 10 bpc with DisplayPort (at either 1080p120 or 1440p120 RGB Full/Limited, YCbCr 422/444). I can only select 8bpc with HDMI. However, the test patterns I found look the same to me regardless of whether I use HDMI or DP.

Picture album here: https://imgur.com/a/4svxGGc
Can you please test:


If you select 1080p for HDMI (maybe even 1080p60), does it still not let you select 10-bit with 444?
 

Registered · 152 Posts
JasonAVSF, thanks! So, it works when using DP! I wonder since when. And I also wonder why this is not the case over HDMI as well.

UltimateDisplay, 12-bit output was never crippled to begin with.

The 10-bit OpenGL framebuffer limitation was specifically to prevent professional software from working on consumer cards. Most CAD packages are designed around OpenGL because they were multi-platform and OpenGL was the only multi-platform graphics backend. The reason Nvidia decided to allow 10-bit framebuffers on the GeForce series last year might have been that OpenGL HDR games would not be possible without it, who knows...
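To be clear about what was gated: the old restriction was on an application asking the driver for a 10-bit-per-channel OpenGL default framebuffer, not on the signal format NVCP sends over the cable. Here is a minimal sketch of that request, using the Python glfw and PyOpenGL bindings as an example (my choice of bindings, nothing Nvidia-specific); whether the driver actually grants 10 bits depends on the GPU, driver and display mode.

Code:
# Ask for a 10/10/10/2 default framebuffer, then query what the driver granted.
import glfw
from OpenGL.GL import (GL_BACK_LEFT, GL_FRAMEBUFFER,
                       GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE,
                       glGetFramebufferAttachmentParameteriv)

if not glfw.init():
    raise SystemExit("GLFW init failed")

# Hints are requests, not guarantees - the driver picks the closest match.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(640, 480, "10 bpc framebuffer test", None, None)
if not window:
    raise SystemExit("Could not create a window")
glfw.make_context_current(window)

red_bits = glGetFramebufferAttachmentParameteriv(
    GL_FRAMEBUFFER, GL_BACK_LEFT, GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE)
print("Red bits actually granted:", red_bits)   # typically 8 if the driver says no

glfw.terminate()

The NVCP output depth discussed above is a separate knob: it controls what goes over the cable, while this controls what an OpenGL application is allowed to render into.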
 

Registered · 149 Posts
Can you please test:


If you select 1080p for HDMI (maybe even 1080p60), does it still not let you select 10-bit with 444?
I tried this on my 2080 Ti and the only options are 8 bpc and 12 bpc. With any other chroma setting I can pick 10 bpc though.
 

Registered · 659 Posts
JasonAVSF, thanks! So, it works when using DP! I wonder since when. And I also wonder why this is not the case over HDMI as well.

UltimateDisplay, 12-bit output was never crippled to begin with.

The 10-bit OpenGL framebuffer limitation was specifically to prevent professional software from working on consumer cards. Most CAD packages are designed around OpenGL because they were multi-platform and OpenGL was the only multi-platform graphics backend. The reason Nvidia decided to allow 10-bit framebuffers on the GeForce series last year might have been that OpenGL HDR games would not be possible without it, who knows...
This is quite bizarre, but if 12-bit over HDMI really works whereas 10-bit over HDMI does not...

then it is an advantage for the C9's 48 Gbps ports.


It's not like Nvidia is locking it out anymore, since it works fine on DP. So it's literally just an HDMI thing then? HDMI 2.1 should be fixing this.
 

Registered · 659 Posts
This could be a good question to ask Rtings for their CX review:


- Are you able to get 10-bit 444 on the CX with a GeForce RTX GPU? What about an AMD GPU?
 

Registered · 149 Posts
This could be a good question to ask Rtings for their CX review:


- Are you able to get 10-bit 444 on the CX with a GeForce RTX GPU? What about an AMD GPU?
Unless there's something I'm missing, I can't find a way to enable 10 bit at any resolution on 444 with my RTX 2080 Ti.
 

Premium Member · 3,513 Posts
I don't have a 2020 OLED (at least not yet; I was possibly eyeing the 48" model), so forgive my ignorance on the matter - but does it not even support plain old uncompressed PCM?




Do you want me to hold you to that? :p

The current rumors are that RDNA2 is looking promising and that the lack of any news is a case of "no news is good news", especially since RTG under Raja previously took the more "noisy" hype and marketing approach on less-than-compelling GPU launches (see: Raja's infamous "poor Volta" and "make some noise" marketing with regard to Vega, also Intel's perpetual hype machine around their future GPUs ever since Raja joined Intel).
I'm just wondering what kind of software tricks will be exclusive to AMD. With the new consoles there's opportunity. DLSS 3.0 on Nvidia working with any TAA is exciting, and VRS (Variable Rate Shading) is exciting... I'm curious whether ray tracing will be "open-sourced" and compatible with both vendors' cards (implementation-wise). That's why I've been paranoid about my C9 order. I'm hoping the new GPUs offer HDMI 2.1 VRR capability, or that the C9 (formally) implements FreeSync (I'm aware there's a workaround).
 

Registered · 2,093 Posts · Discussion Starter #2,837
I've seen this spread all over the internet as though it were a fact, but no one has ever provided any proof that this is the case. The B series has ALWAYS been essentially the same as the C series, with the main differences being the design and the processor. There's zero reason to believe LG would suddenly change this year and make the BX have only 2 HDMI 2.1 ports. There's also no logical business reason for LG to suddenly change how they do business, since the B series is usually the previous year's C specs.
Yes, the BX has fewer HDMI 2.1 ports than the B9. See the FAQ for more info!

I know the conversation here is mainly about how LG has essentially done a bait and switch with regard to the HDMI 2.1 spec, but I have a question regarding BFI. Sorry, I'm a newb here, but just how do the 3 different choices of BFI work? I get what BFI is, but what exactly are the "low, medium & high" settings - what does each selection represent? Thank you to anyone who can enlighten me here.
You can find more info about BFI and its effects on the image in the first 6 posts of this thread.

  • DTS and DTS-HD support for USB and HDMI sources - the internal decoder is missing from the factory (as announced in the documentation) and will probably never be added to the 2020 generation; what is puzzling is that DTS and DTS-HD are not even permitted to pass through via ARC/eARC;
Confirming the above is correct, even while using an AVR that supports eARC with lossless audio. Any video content containing anything beyond a Dolby Digital 5.1 audio track plays with no sound. So this means that regular DTS, DTS-HD and DTS:X, in addition to Dolby TrueHD and Atmos, don't work. I tested this with the built-in Photos and Video player and the Plex app on webOS, on my LG CX 77" model.
DTS and DTS-HD are unsupported from the factory in the 2020 models - see the FAQ in posts 1-6.
The webOS apps can't decode any lossless HD audio because the Alpha 9 has only an ARC-capable digital output - see the FAQ in posts 1-6.

Otherwise, the CX should support TrueHD/MAT quite well after the 03.00.45 firmware:

Code:
    MAT (MLP):
      Max channels: 8
      Supported sample rates (kHz): 192 176.4 96 88.2 48 44.1
    Dolby Digital+:
      Max channels: 8
      Supported sample rates (kHz): 48 44.1
      Supports Joint Object Coding
      Supports Joint Object Coding with ACMOD28
    AC-3:
      Max channels: 6
      Supported sample rates (kHz): 48 44.1 32
      Maximum bit rate: 640 kb/s
    Linear PCM:
      Max channels: 8
      Supported sample rates (kHz): 192 176.4 96 88.2 48 44.1 32
      Supported sample sizes (bits): 24 20 16
The only way I can get around it is to have my Nvidia Shield running through eARC via my AVR (a Sony STR-DN1080, which supports select HDMI 2.1 features such as eARC for lossless audio and ALLM); that way I can play back DTS, DTS-HD and DTS:X content, in addition to Dolby TrueHD and Atmos. Both the Kodi and Plex apps on the Shield work just fine (the AVR lists the output audio as DTS, DTS-HD, DTS:X, Atmos etc.).
Right now, we have info that DTS/DTS-HD cannot pass through the CX via eARC to a receiver because of a lack of proper EDID.
Are you sure that DTS/DTS-HD works via eARC on a CX?
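For anyone who wants to check what the TV (or the eARC chain) actually advertises: the supported audio formats are listed as 3-byte Short Audio Descriptors in the CTA-861 extension block of the EDID. Here is a rough sketch for scanning a raw EDID dump - the edid.bin path is just a placeholder for however you exported yours:

Code:
# List the audio formats advertised in a raw EDID dump (CTA-861 Short Audio
# Descriptors). Format codes: 1=LPCM, 2=AC-3, 7=DTS, 10=E-AC-3 (DD+),
# 11=DTS-HD, 12=MAT (Dolby TrueHD).
FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "E-AC-3 (DD+)",
           11: "DTS-HD", 12: "MAT (TrueHD)"}

def advertised_audio(edid: bytes):
    found = set()
    for base in range(128, len(edid), 128):       # walk the extension blocks
        if edid[base] != 0x02:                    # 0x02 = CTA-861 extension
            continue
        dtd_offset = edid[base + 2]               # where detailed timings start
        i = base + 4                              # data blocks begin here
        while i < base + dtd_offset:
            tag, length = edid[i] >> 5, edid[i] & 0x1F
            if tag == 1:                          # Audio Data Block
                for sad in range(i + 1, i + 1 + length, 3):
                    code = (edid[sad] >> 3) & 0x0F
                    found.add(FORMATS.get(code, f"format {code}"))
            i += 1 + length
    return found

edid = open("edid.bin", "rb").read()              # placeholder path to your dump
print("Sink advertises:", ", ".join(sorted(advertised_audio(edid))))

The capability dump quoted above only lists MAT, Dolby Digital+, AC-3 and LPCM, which is consistent with DTS being stripped from the EDID.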

My issue now, with the complete lack of support for lossless audio on the LG CX, is that if I connect my PC directly to the TV (as opposed to through my AVR) so that I can get G-Sync support, I won't be able to play PC games with Atmos audio, which is messed up. Unless there's some way to use the second HDMI port on my GPU to route audio directly to my AVR (not sure how I would go about setting that up in Windows 10)? Same deal with my Xbox One X: I can't get ALLM and VRR working on the Xbox One X while it is connected via my Sony STR-DN1080 AVR, which by the way supports some HDMI 2.1 features such as eARC and ALLM. So connecting it directly to the LG CX means that I'll get ALLM and VRR but no Atmos, thanks to LG's incredibly stupid oversight.
What firmware version are you on right now?
 

Registered · 659 Posts
RTX GPUs already offer HDMI-VRR (on HDMI 2.0); that's exactly what G-Sync Compatible with LG OLEDs is. "G-Sync Compatible" over HDMI is literally just enabling HDMI-VRR.

AMD HDMI 2.1 GPUs should also have HDMI-VRR (not FreeSync, but HDMI-VRR), so they will work with the C9 too, pretty much removing the need for FreeSync.
The existence of HDMI-VRR more or less makes both G-Sync modules and FreeSync obsolete...

And I expect the RTX 3080 Ti to keep the performance crown, tbh.
 

Registered · 659 Posts
Is there anyone here with an AMD GPU and an LG C9/CX?

If so, can you please see if you can enable 10-bit color with 444 chroma at any resolution (like 1080p or 60 Hz) on your OLED? And maybe check 12-bit 444 at the same time too. Thanks.
It seems 10-bit 444 doesn't work on GeForce over HDMI, so I'm curious whether it does on AMD.
 

Registered · 2,980 Posts
If an AMD GPU is worse than Nvidia's, I won't buy it just for 10-bit, lol.
Plus, right now they don't even have a 2080 Ti alternative.


And frankly, the RX 5700 series drivers were pretty bad for months after release.
ATI has had pretty bad drivers for 25 years. That's why I stopped buying them even though the hardware is usually quite nice. I have stopped hoping they will ever take driver quality seriously.
 