

·
Registered
Joined
·
35 Posts
That's some grade A BS. How are these good for future-proofing anymore? Reconsidering that 77-inch CX I wanted over Black Friday...
 

·
Registered
Joined
·
107 Posts
I think this is important to keep in mind. While it sucks that the previous model had it and this one doesn't, nothing is currently able to take advantage of it, and who knows whether it would truly make a difference since the panels are 10-bit.

So what does this all mean in practical terms? Well, while we don’t yet know for certain the absolute maximum potential output capability of the next generation of consoles, it seems likely that a TV being able to support 10-bit 4K at 120Hz with 10-bit RGB 4:4:4 will be enough to get the job done.

Even if a game was potentially able to up its output to 12-bit, it’s perhaps debatable how visible the difference would be given that all TVs are currently only 10-bit. And LG (and other brands taking a similar not-quite-full-48Gbps approach to HDMI 2.1 this year) might argue that it can make a more visible difference with the extra power it’s making available to its video processing systems by limiting the HDMI 2.1 bandwidth.
 

·
Registered
Joined
·
57 Posts
It is a 10-bit panel. Accepting a 12-bit input is essentially pointless.


As long as it accepts and displays 4K120 4:4:4 10-bit HDR, it's good and any extra bandwidth is unnecessary. I would like to overclock it to 123 Hz for G-Sync though, since lots of games have an in-game framerate limiter at fixed intervals like 30, 60, 120, etc. So hopefully there is a bit of bandwidth to spare.

If the panels were 8K, higher than 120 Hz, or 12-bit, you might want the extra flexibility that the higher bandwidth could provide, but they aren't, so there is no need for it. I was planning on getting a 48" CX and this won't affect my purchase at all.
 

·
Registered
Joined
·
3,315 Posts
Yeah, what a bunch of BS.

Last year's C9 has it - but this year's CX doesn't?!

Plus LG also dropped DTS support from the CX - something the C9 had.

What the hell LG?!
 

·
Registered
Joined
·
3,315 Posts
It is a 10-bit panel. Accepting a 12-bit input is essentially pointless.


As long as it accepts and displays 4K120 4:4:4 10-bit HDR, it's good and any extra bandwidth is unnecessary. I would like to overclock it to 123 Hz for G-Sync though, since lots of games have an in-game framerate limiter at fixed intervals like 30, 60, 120, etc. So hopefully there is a bit of bandwidth to spare.

If the panels were 8K, higher than 120 Hz, or 12-bit, you might want the extra flexibility that the higher bandwidth could provide, but they aren't, so there is no need for it. I was planning on getting a 48" CX and this won't affect my purchase at all.

Not from what I have been told...

From what I have been told a 12-bit signal still looks "better" - even though it is a 10-bit panel.

Not as good as it would look on a 12-bit panel - but it does look "better".

In any event - it's BS that LG would do this.
 

·
Registered
Joined
·
54 Posts
Not from what I have been told...

From what I have been told a 12-bit signal still looks "better" - even though it is a 10-bit panel.

Not as good as it would look on a 12-bit panel - but it does look "better".

In any event - it's BS that LG would do this.

Who told you this? I’d like a source to read up a bit, thanks.


Sent from my iPhone using Tapatalk
 

·
Premium Member
Joined
·
431 Posts
It is a matter of which does the better 12-bit to 10-bit conversion: the TV or the graphics card. My money would be on the graphics card doing a better job, especially since graphics cards get updated yearly while the TV would stay the same.
 

·
Premium Member
Joined
·
2,102 Posts
It is a matter of which does the better 12-bit to 10-bit conversion: the TV or the graphics card. My money would be on the graphics card doing a better job, especially since graphics cards get updated yearly while the TV would stay the same.
Nope. You just drop the two rightmost (least significant) bits.

000000000000 -> 0000000000
000000000001 -> 0000000000
000000000010 -> 0000000000
000000000011 -> 0000000000
000000000100 -> 0000000001
000000000101 -> 0000000001
000000000110 -> 0000000001
000000000111 -> 0000000001
000000001000 -> 0000000010
000000001001 -> 0000000010
000000001010 -> 0000000010
000000001011 -> 0000000010
etc.
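
For anyone who wants to see it in code, here's a minimal sketch of that truncation (the function name is just mine for illustration; whether a given TV or GPU truncates, rounds, or dithers isn't something settled here):

# Minimal sketch (my own illustration): 12-bit -> 10-bit by dropping the two
# least significant bits, exactly as in the table above.

def truncate_12_to_10(value_12bit: int) -> int:
    """Map 0..4095 down to 0..1023 by discarding the two lowest bits."""
    return value_12bit >> 2

if __name__ == "__main__":
    for v in (0b000000000000, 0b000000000111, 0b000000001011):
        print(f"{v:012b} -> {truncate_12_to_10(v):010b}")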
 

·
Registered
Joined
·
2,837 Posts
Discussion Starter #12
While I’ve only owned LG OLEDs, I think the point I’d be upset about is that they knew this all along, never clarified it before putting the sets on sale, and only made a statement once someone figured it out through the EDID data. That’s a little bit shady...
 

·
Registered
Joined
·
2 Posts
Posting my Twitter comments here for reference:


No HDMI 2.1 4K or 8K TVs do 12-bit - you know that, John. Not even 2019 LG TVs with HDMI 2.1. I'm sorry, John, but the headline is misleading.
So the effective bandwidth for 4K120 HDR in a 4K TV or 8K60 HDR in an 8K TV is 40 Gbps over FRL (HDMI 2.1) - 1/4


Sony & Samsung HDMI 2.1 4K/8K TVs do not do 12-bit either. That's why they're also specifying 40 Gbps for FRL (HDMI 2.1). Because that's what the viewer gets. Yet there is no mention of this in the story?
By the same definition virtually no HDMI 2.0 TVs are "full HDMI 2.0" (see chart here: flatpanelshd.com/pictures/hdmi20_chart.jpg ) - 2/4


Technically, in some instances, you can claim 48 Gbps for the first leg of the journey (4x HDMI 2.0 pipes -> 1x HDMI 2.1 - like LG 2019 TVs) but that doesn't get passed to the TV processing circuit (today). HDMI 2.1 is defined by the interface standard, not the TV's limitations - 3/4


Like Rec.2020, HDMI 2.1 and DisplayPort 2.0 are not just about the immediate benefits. They are designed as systems that also enable the next generation of video. Besides overhead, there is a 'buffer' in there for further improvements (EDIT: 12-bit systems) - 4/4
 

·
Premium Member
Joined
·
2,102 Posts
By the same definition virtually no HDMI 2.0 TVs are "full HDMI 2.0" (see chart here: flatpanelshd.com/pictures/hdmi20_chart.jpg ) - 2/4
TVs can receive the formats you circled there. I just had my Xbox send the first one into my C9 earlier today as a test. The second one is what it uses for HDR10 (4:2:2 is always sent as 12-bit).

But yeah, it’s converted to 10-bit sooner or later in the TV’s processing pipeline.
 

·
Registered
Joined
·
2 Posts
4K60 4:2:2 12-bit HDR10 from your Xbox? You sure about that?
Did you check with the built-in LG TV HDMI diagnostics tool? Last time I checked with an LG 2019 TV, Xbox dropped its output to 8-bit if you forced it to do 4K60 4:2:2 HDR10. Switching to 10-bit HDR10 would force the Xbox to go to 4:2:0 chroma subsampling.

I don't have an LG 2019 OLED and an Xbox One X available at this time to re-check.

But either way it's always converted to 10-bit before it gets passed to the TV's processing circuit. In the case of LG 2019 TVs and FRL signals (HDMI 2.1), they technically use 4x HDMI 2.0 pipes behind the HDMI sink. Bandwidth is then merged into one signal before it gets passed to the TV's processing circuit. That's most likely why its EDID identifies as 48 Gbps capable. But the TV will not let 12-bit pass to the TV circuit. There are no 12-bit TVs available on the consumer market.

By the same definition a 4K TV with HDMI 2.1 would not be "full HDMI 2.1" if it didn't accept 8K signals.
 

·
Registered
Joined
·
2,098 Posts
I'm going to re-repost what I said elsewhere in the other threads:

The only situation where I could see a 12-bit input being beneficial at all, even with a 10-bit panel, is if games used Dolby Vision without / instead of HGIG. But considering that HGIG looks to be the go-to solution for gaming HDR going forward, I question whether there will really be any 12-bit 4K 120Hz 4:4:4 sources in the next decade.

(For clarification, 40 Gbps is enough for 10-bit 4K 120Hz 4:4:4 and even 16-bit 4K 120Hz 4:2:2.)
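
As a rough back-of-the-envelope check of those numbers (active pixels only; this ignores blanking and FRL coding overhead, so the real link-rate requirements are somewhat higher):

# Raw payload rates for uncompressed 4K120 video, active pixels only.
# Real HDMI 2.1 FRL links also carry blanking and coding overhead, so the
# actual bandwidth needed is higher than these ballpark figures.

def payload_gbps(width, height, fps, bits_per_component, chroma="4:4:4"):
    # 4:4:4 carries 3 full components per pixel; 4:2:2 averages 2 per pixel
    # (full-rate luma plus half-rate chroma).
    components_per_pixel = 3 if chroma == "4:4:4" else 2
    return width * height * fps * components_per_pixel * bits_per_component / 1e9

print(payload_gbps(3840, 2160, 120, 10, "4:4:4"))  # ~29.9 Gbps  (10-bit 4:4:4)
print(payload_gbps(3840, 2160, 120, 12, "4:4:4"))  # ~35.8 Gbps  (12-bit 4:4:4)
print(payload_gbps(3840, 2160, 120, 16, "4:2:2"))  # ~31.9 Gbps  (16-bit 4:2:2)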
 

·
Registered
Joined
·
64 Posts
LG, get your crap together. Although really the blame lies with the HDMI Forum for not creating an all-inclusive HDMI 2.1 standard.
 

·
Premium Member
Joined
·
2,102 Posts
4K60 4:2:2 12-bit HDR10 from your Xbox? You sure about that?
Did you check with the built-in LG TV HDMI diagnostics tool? Last time I checked with an LG 2019 TV, Xbox dropped its output to 8-bit if you forced it to do 4K60 4:2:2 HDR10. Switching to 10-bit HDR10 would force the Xbox to go to 4:2:0 chroma subsampling.
There is only one HDMI 4:2:2 pixel format and it's 12-bit. Any 8 or 10 bit signal is padded with zeros on the low end before it's sent. You can see this in that chart that's posted pretty much everywhere:

[Image: HDMI 2.0 resolutions.png]

The C9 can't tell if it's 8, 10 or 12 bit, so it just displays 8-bit. That's also why your chart says 8, 10 or 12 for that resolution, and why all three bit depths use the same bandwidth for 4:2:2. The Xbox is most likely rendering at 10-bit, but it could very well be 12-bit; we have no way of telling.

My point is that if a TV advertises in its EDID that it can do 4K 60Hz, it must accept a 4:2:2 12-bit signal. But they are free to immediately lop off the bottom 2 or 4 bits and process it as an 8- or 10-bit signal afterwards.

By the same definition a 4K TV with HDMI 2.1 would not be "full HDMI 2.1" if it didn't accept 8K signals.
A 4K TV would never advertise 8K in its EDID, so it still complies with the HDMI 2.1 standard when it rejects an 8K signal. The 4:2:2 12-bit pixel format is mandatory for whatever resolution and refresh rate combo it advertises in its EDID, however. But I agree with you that it doesn't have to process it at 12-bit.
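
If it helps, here's a small sketch (my own illustration, not text from the HDMI spec) of how an 8- or 10-bit value rides in the 12-bit 4:2:2 container and can be lopped back down on the sink side:

# Illustration only: lower-depth components are zero-padded into the 12-bit
# 4:2:2 container on the way out, and the sink can drop the low bits again
# to process the signal at its native depth - no information is lost.

def pad_to_12bit(value: int, source_bits: int) -> int:
    """Zero-pad the low end, e.g. 10-bit 0b1010101010 -> 12-bit 0b101010101000."""
    return value << (12 - source_bits)

def lop_to_bits(container_value: int, target_bits: int) -> int:
    """Drop the low bits of the 12-bit container value."""
    return container_value >> (12 - target_bits)

ten_bit = 0b1010101010                      # 682
on_the_wire = pad_to_12bit(ten_bit, 10)     # 2728 (0b101010101000)
recovered = lop_to_bits(on_the_wire, 10)    # 682 again
print(ten_bit, on_the_wire, recovered)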
 

·
Registered
Joined
·
1,098 Posts
Is this the Rasmus Larsen or an imposter?
 