

·
Registered
Joined
·
9,387 Posts
The last web page capture is from April 20th.
And? Honestly, who is going to win this? Either people buy it or they don't at the end of the day. It is their product. Companies can and do make last-minute changes every day. Just like we get features six months later that weren't originally sold or announced with the TV.
 

·
Premium Member
Joined
·
2,051 Posts
And? Honestly, who is going to win this? Either people buy it or they don't at the end of the day. It is their product. Companies can and do make last-minute changes every day. Just like we get features six months later that weren't originally sold or announced with the TV.
It was after the release, which means people bought it before they fixed their advertising mistake. The rest is between those who want to return it and the dealer/LG.
 

·
Registered
Joined
·
22 Posts
Not at all. Where did you hear that NVIDIA can't output 10bpc?

If you play a game right now in HDR, the GPU is outputting 10bpc.

I can confirm this, as I have the CX 77 and a Titan RTX, and everything is working as expected.

The one area where there may be some confusion is applications like Photoshop. NVIDIA used to limit non-fullscreen applications to 8bpc output unless you were using a Quadro professional GPU, but I don't think that's the case anymore.
To give some more detail, the 8bpc limit you're referencing was only for OpenGL applications on GeForce & Titan cards. Last year NVIDIA did remove the limitation in the Studio drivers. I'm not sure if that change has made its way into the Game Ready drivers, though.

source: https://www.anandtech.com/show/14682/nvidia-siggraph-2019-nv-to-enable-30bit-opengl-support-on-geforce-cards

I got curious and did some digging through release notes. I think the answer is that OpenGL applications should have access to 10bpc in both the Studio drivers and the Game Ready drivers on GeForce & Titan cards.


  • 10bpc support was added to the Studio driver with version 431.70 (Release 430), release date 2019.7.29.

  • As best I can tell, the next Game Ready driver released was version 436.15 WHQL (Release 435), release date 2019.8.27.
The release notes PDF for the Game Ready driver Release 435 states:

What’s New in Release 435
This section summarizes the driver changes previously introduced in Release 435.
New Features
Added support for 30-bit color.
So it looks like OpenGL applications should be able to use 10bpc with both the Studio and Game Ready drivers on GeForce & Titan cards.
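
If anyone wants to check this from the application side, below is a minimal sketch of how a Windows OpenGL program asks the driver for a 30-bit (10bpc) pixel format. The attribute tokens come from the standard WGL_ARB_pixel_format extension; I'm assuming wglChoosePixelFormatARB has already been loaded through the usual dummy-context/wglGetProcAddress bootstrapping, which I've left out for brevity, so treat this as illustrative rather than copy-paste ready.

    // Minimal sketch: request a 10-bits-per-channel (30-bit color) pixel
    // format on Windows. Assumes <windows.h>, <GL/gl.h>, and <GL/wglext.h>
    // are included and that wglChoosePixelFormatARB was already obtained
    // from a dummy context via wglGetProcAddress (bootstrapping omitted).
    int choose30BitFormat(HDC hdc, PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
    {
        const int attribs[] = {
            WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
            WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
            WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
            WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
            WGL_RED_BITS_ARB,       10,  // 10 bits per channel instead of 8
            WGL_GREEN_BITS_ARB,     10,
            WGL_BLUE_BITS_ARB,      10,
            WGL_ALPHA_BITS_ARB,     2,   // 10+10+10+2 fills a 32-bit framebuffer
            0                            // attribute list terminator
        };
        int format = 0;
        UINT numFormats = 0;
        // On GeForce/Titan drivers older than the releases above, this is
        // where a 30-bit request would come back empty outside of Quadro.
        if (!wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &format, &numFormats)
            || numFormats == 0)
            return 0;
        return format; // hand this to SetPixelFormat() as usual
    }

If that call returns a valid format on a current Game Ready driver, it lines up with the Release 435 notes quoted above.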
 

·
Registered
Joined
·
149 Posts
Not at all. Where did you hear that NVIDIA can't output 10bpc?

If you play a game right now in HDR, the GPU is outputting 10bpc.

I can confirm this, as I have the CX 77 and a Titan RTX, and everything is working as expected.

The one area where there may be some confusion is applications like Photoshop. NVIDIA used to limit non-fullscreen applications to 8bpc output unless you were using a Quadro professional GPU, but I don't think that's the case anymore.
Something is not right with my setup. I can't change from 8 bpc in the NVIDIA Control Panel at all, and Windows is detecting my CX as an 8-bit display, or, with HDR switched on, as an 8-bit display with dithering.
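
In case it helps anyone debug the same thing, you can ask Windows directly what bit depth it thinks each display path is running at, independent of the Control Panel UI. A rough sketch using the DXGI API from the Windows 10 SDK (needs IDXGIOutput6, i.e. a recent Windows 10; link against dxgi.lib):

    // Rough sketch: print the active bits-per-color for each connected output.
    // Compile with: cl /EHsc bitdepth.cpp dxgi.lib
    #include <dxgi1_6.h>
    #include <cstdio>

    int main()
    {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
            IDXGIOutput* output = nullptr;
            for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
                IDXGIOutput6* output6 = nullptr;
                if (SUCCEEDED(output->QueryInterface(__uuidof(IDXGIOutput6), (void**)&output6))) {
                    DXGI_OUTPUT_DESC1 desc = {};
                    output6->GetDesc1(&desc);
                    // BitsPerColor is what the OS believes the link is running at
                    wprintf(L"%s: %u bits per color, color space %d\n",
                            desc.DeviceName, desc.BitsPerColor, (int)desc.ColorSpace);
                    output6->Release();
                }
                output->Release();
            }
            adapter->Release();
        }
        factory->Release();
        return 0;
    }

If BitsPerColor comes back as 8 even with HDR on, the OS really is sending 8-bit plus dithering, which matches what the advanced display settings page is reporting.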
 

·
Registered
Joined
·
2,098 Posts
I find it interesting that people in this thread are ready to look to other vendors (Vizio and the like) as a hope for full 48Gbps connections, yet nobody wants to consider that Nvidia isn't the only maker of discrete GPUs...

Yes, I understand the displeasure from current Nvidia owners, but those very same current GPUs aren't capable of HDMI 2.1 anyway - there's nothing stopping you from simply not giving Nvidia your money if you're unhappy with their behavior.
 

·
Registered
Joined
·
27 Posts
The last web page capture is from April 20th.
And? Honestly, who is going to win this? Either people buy it or they don't at the end of the day. It is their product. Companies can and do make last-minute changes every day. Just like we get features six months later that weren't originally sold or announced with the TV.
Getting new features via an update is 100% different from falsely advertising the specs of the TV. And for the record, April 20th is not a "last minute" change. I ordered on April 17th.
 

·
Registered
Joined
·
22 Posts
I find it interesting that people in this thread are ready to look to other vendors (Vizio and the like) as a hope for full 48Gbps connections, yet nobody wants to consider that Nvidia isn't the only maker of discrete GPUs...
That is a good point; Intel is coming out with a discrete GPU soon! ;)


Plus, even if a current NVIDIA HDMI 2.0 GPU has restrictions on when it outputs 10-bit, that doesn't mean the same will hold for the upcoming HDMI 2.1 cards. I don't think we can really know until we have an NVIDIA HDMI 2.1 GPU to test with.
 

·
Registered
Joined
·
9,387 Posts
Getting new features via an update is 100% different from falsely advertising the specs of the TV. And for the record, April 20th is not a "last minute" change. I ordered on April 17th.
Okay...so who is filing the suit so I can sit with my popcorn and watch?
 

·
Registered
Joined
·
2,098 Posts
It makes you wonder if it is a firmware change, not a hardware one, and whether they could do 48 if they wanted to cut other processing out.
If so, then an easy workaround would be to simply re-allocate the processing and re-enable the 12-bit pipeline when game mode, VRR, or Filmmaker Mode is enabled, as all three modes by definition bypass a lot of said post-processing.


That is a good point; Intel is coming out with a discrete GPU soon! ;)
I get it, Nvidia fans would rather succumb to Stockholm syndrome before ever considering using an AMD GPU (I hope none of them are Linux users then :p).

...yet many of them probably own a PlayStation, Xbox, or older Nintendo console that also uses an AMD GPU.
 

·
Registered
Joined
·
659 Posts
If an AMD GPU is worse than Nvidia's, I won't buy it just for 10-bit, lol.
Plus right now they don't even have a 2080 Ti alternative.


And frankly, the RX 5700 series drivers were pretty bad for months after release.
 

·
Registered
Joined
·
22 Posts
I get it, Nvidia fans would rather succumb to Stockholm syndrome before ever considering using an AMD GPU (I hope none of them are Linux users then :p).

...yet many of them probably own a PlayStation, Xbox, or older Nintendo console that also uses an AMD GPU.
Jokes aside, switching to an AMD GPU would be better than switching from OLED to LCD; at least that would make more sense to me. Although none of this matters until we see what the new GPUs support.

Since there are no HDMI 2.1 GPUs yet from any manufacturer, I don't think we should be flipping out just yet about 10-bit support for 4K 120 Hz.
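
For what it's worth, the 40Gbps vs 48Gbps side of this is mostly arithmetic. Here's a back-of-the-envelope sketch; the assumptions (standard CTA 4400x2250 total timing for 4K120 RGB, FRL's 16b/18b line coding, no DSC, ignoring audio and packet overhead) are mine, not anything LG or NVIDIA have published:

    // Rough HDMI 2.1 bandwidth estimate for 4K120 RGB at various bit depths.
    // Assumptions (mine): CTA-861 timing of 4400x2250 total / 3840x2160 active,
    // FRL 16b/18b coding, no DSC, no audio or packet overhead counted.
    #include <cstdio>

    int main()
    {
        const double hTotal = 4400, vTotal = 2250;    // pixels including blanking
        const double hActive = 3840, vActive = 2160;  // visible pixels
        const double refresh = 120.0;                 // Hz
        const double payload40 = 40e9 * 16.0 / 18.0;  // ~35.6 Gbps usable on a 40G link
        const double payload48 = 48e9 * 16.0 / 18.0;  // ~42.7 Gbps usable on a 48G link

        const int depths[] = {8, 10, 12};
        for (int bpc : depths) {
            const double bitsPerPixel = 3.0 * bpc;    // RGB / 4:4:4
            const double active = hActive * vActive * refresh * bitsPerPixel;
            const double full   = hTotal  * vTotal  * refresh * bitsPerPixel;
            printf("%2d bpc: active %.1f Gbps, full timing %.1f Gbps (fits 40G payload: %s)\n",
                   bpc, active / 1e9, full / 1e9,
                   active <= payload40 ? "yes" : "no");
        }
        printf("40G payload ~%.1f Gbps, 48G payload ~%.1f Gbps\n",
               payload40 / 1e9, payload48 / 1e9);
        return 0;
    }

Under those assumptions, 4K120 10-bit RGB active video is roughly 30 Gbps, which fits within the ~35.6 Gbps payload of a 40Gbps FRL link, while 12-bit needs roughly 36 Gbps and realistically wants the 48Gbps link. If that math holds, 10-bit 4K120 should be fine on the CX's 40Gbps ports; 12-bit is what the cap actually costs.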
 

·
Registered
Joined
·
54 Posts
I know the conversation here is mainly about how LG has essentially done a bait and switch with regard to the HDMI 2.1 spec, but I have a question regarding BFI. Sorry, I'm a newb here, but just how do the three different choices of BFI work? I mean, I get what BFI is, but what exactly are the "low, medium, & high" settings? What does each selection represent? Thank you to any of you who can enlighten me here.


Sent from my iPhone using Tapatalk
 

·
Registered
Joined
·
2,098 Posts
If an AMD GPU is worse than Nvidia's, I won't buy it just for 10-bit, lol.
Plus right now they don't even have a 2080 Ti alternative.
We're obviously talking about future HDMI 2.1-capable GPUs, you know, like the RDNA2-powered stuff that the PS5 and Xbox Series X, for example, are based on (with the latter apparently comparable to a 2080 Super).


And frankly, the RX 5700 series drivers were pretty bad for months after release.
They worked great on Linux. :p

(I know, I know, AVS Forum members are probably some of the least likely people to use Linux.)


Newbie here. Got the LG CX 65 this weekend and was expecting an upgrade in all aspects compared to my old Sony 8500D.

So far I am noticing judder, artifacts, and noise in moving content, even with TruMotion on, with any content, gaming included. My old Sony handled motion a lot better and didn't have that soap opera effect, which strains my head.
Unfortunately you picked a very bad time to ask for help... I'm no expert, and sadly the more knowledgeable folks seem more interested in the 40Gbps vs 48Gbps stuff right now.

The only thing I can say is that maybe you're sensitive to the fast pixel response time and are therefore noticing frame rates seeming choppier despite nothing actually being different framerate-wise.

One thing you may want to try is fiddling around with the black frame insertion options.
 

·
Registered
Joined
·
592 Posts
Welp, returned my 65CX and got a 65C9... put $600 back in my pocket and hopefully now don't have to think about this anymore. In hindsight, I really wonder if the CEC issues I was having with my Shield had anything to do with the new HDMI 2.1-lite ports...
 

·
Registered
Joined
·
100 Posts
Newbie here. Got the LG CX 65 this weekend and was expecting an upgrade in all aspects compared to my old Sony 8500D. Got it because I wanted a better overall set for all content, and especially for the new gaming consoles to be released later this year, with better input lag and HDMI 2.1 support.



So far I am noticing judder, artifacts, and noise in moving content, even with TruMotion on, with any content, gaming included. My old Sony handled motion a lot better and didn't have that soap opera effect, which strains my head.



I have the latest firmware. I really hope there is something I am missing in the settings, as I am starting to think I have made a huge mistake in buying this set. The picture is phenomenal with slow-moving content, I can see that. Any help or info would be greatly appreciated! TIA
You need to turn TruMotion off, or set the de-judder slider very low, to 1 or 2, if you want a small amount of smoothing without too much soap opera effect.
 

·
Registered
Joined
·
488 Posts
Welp, returned my 65CX and got a 65C9... put $600 back in my pocket and hopefully now don't have to think about this anymore. In hindsight, I really wonder if the CEC issues I was having with my Shield had anything to do with the new HDMI 2.1-lite ports...
Let us know.
 

·
Registered
Joined
·
276 Posts
So, I'm having a weird issue, HDMI-CEC related I guess? I prefer not to use HDMI-CEC, but I need it turned on in order to use ARC. I have the power sync setting under Device Connection turned off. On my Apple TV I have "Control TVs and Receivers" set to off. Here's what is happening: I turn the TV on, switch to the Apple TV input, and then manually turn the Apple TV on. If I switch to another input (with the Apple TV still on) and then back to the Apple TV input, the TV seems to forcefully turn the Apple TV off.

Anyone have any idea what combo of HDMI-CEC wonkiness I need to tweak? Thanks.
 