Originally Posted by obveron
This is just straight confusing. Surely LG have industrial tools and test equipment that can stand in as an HDMI 2.1 source. I don't believe they could have built an HDMI 2.1 display and validated its performance without a source. So just because consumers don't have sources yet, LG must have them. It just doesn't add up that the FW is presently incapable of accepting the source. I doubt LG would lie about the 2.1 compatibility, but something seems fishy to me.
There's nothing confusing about it. It's just you confusing yourself with unnecessary questions.
HDMI 2.1 has been talked about since 2017, because the standard was already finalized back then. We were waiting for manufacturers to implement it. But because of the sudden increase in bandwidth, a lot of things had to change down the line.
For example, suddenly, 8K panels and even up to 10K were possible. Videos and games no longer needed to be chroma sub-sampled. New features such as ALLM and VRR are now available at the HDMI port level. Videos can now be remastered in 8K, even at 60fps. So many overwhelming opportunities, but the other components were not yet ready, such as what I mentioned, building new panels and all.
So it didn't come in 2017. In 2018, some manufacturers started introducing some HDMI 2.1 features by backporting them to HDMI 2.0 hardware, like the Xbox One X. And some manufacturers like to squeeze as much juice as possible out of the current generation so that people will be forced to upgrade again in the future, like the AVR makers, who are also taking their time. This suddenly showed us how broken eARC on HDMI 2.0 hardware really is, with the audio and video syncing issues and all.
The hardware has been ready for a few years now; the features just need to be implemented, that is all. Thus, the physical HDMI port is capable of receiving 48Gbps. Most importantly, cables need to be validated to make sure they can carry that much data.
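To see why that 48Gbps ceiling matters, here is a rough back-of-the-envelope sketch of uncompressed video data rates. This is my own simplification: it ignores blanking intervals and link-layer encoding overhead, so real figures are somewhat higher, but it shows why 4K120 10-bit RGB is impossible on HDMI 2.0's ~18Gbps link yet fits comfortably in HDMI 2.1's 48Gbps.

```python
def raw_video_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    """Approximate uncompressed video data rate in Gbps.

    Simplified: counts active pixels only, ignoring blanking
    intervals and TMDS/FRL encoding overhead.
    """
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K @ 60 Hz, 8-bit RGB: roughly 11.9 Gbps, which HDMI 2.0 can carry
rate_4k60 = raw_video_rate_gbps(3840, 2160, 60, 8)

# 4K @ 120 Hz, 10-bit RGB: roughly 29.9 Gbps, well beyond HDMI 2.0,
# which is why HDMI 2.0 sources fall back to chroma sub-sampling
rate_4k120 = raw_video_rate_gbps(3840, 2160, 120, 10)

print(f"4K60 8-bit RGB:   {rate_4k60:.1f} Gbps")
print(f"4K120 10-bit RGB: {rate_4k120:.1f} Gbps")
```

The second figure is the whole story in one number: no amount of firmware work squeezes ~30Gbps of video through an 18Gbps link, so new silicon and validated cables were unavoidable.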
As for the HDMI chipset controller, the TV is currently running on an HDMI 2.0 chip, but there is another chip meant for HDMI 2.1, as seen on this forum. Since there are still no HDMI 2.1 output devices (no BD players, no Android TV boxes, no graphics cards with HDMI 2.1 ports), why worry about when the HDMI 2.1 chip will be activated? The standard has already been finalized. If the HDMI Forum kept changing the physical components, no manufacturer could properly implement it for years to come. Thus, the physical parts need to be standardized for improved compatibility and to ensure things will work.
There was someone here in the forum working in the industry who said they are testing an HDMI 2.1 output device against the C9 running a firmware that utilizes the HDMI 2.1 chip.
So with LG making some of the few TVs that support HDMI 2.1, you can be assured that a lot of manufacturers in the industry might be using this TV as a reference to ensure compatibility with their hardware. All the more reason not to worry about device incompatibility.
The HDMI 2.1 chip will be put to use once the first HDMI 2.1 output devices hit the market next year.
My bet is on a DP 1.4 to HDMI 2.1 adapter. But given that the current adapters can't pass VRR, it's not looking too promising. I guess the next earliest device would be a graphics card or a console, since they have the most to gain from the extra bandwidth, as opposed to the movie industry, which doesn't see much gain from adopting HDMI 2.1 so quickly.