
1021 - 1040 of 1157 Posts

·
Registered
Joined
·
284 Posts
It’s late. I’m tired. I’m sorry but I’m still missing the point you are trying to make.
The point I am trying to make about HDMI is that it is not a very well thought out design. Quite embarrassing, as a matter of fact. Wouldn't you agree?
 

·
Registered
Joined
·
3,031 Posts
HDMI is cost-sensitive. Unless you can point me to a PC with 40G Ethernet for $500, a small desktop switch for $1,000, or 40G-equipped display devices. And the receiver also has to do audio decoding, amplification and many other things besides switching packets.

When you're designing enterprise equipment, it's a lot easier when you have a much bigger budget to play with. HDMI is a cost-sensitive interface: it has to make concessions to backwards compatibility and has to cost as little as possible, because price is everything.

The good news, though, is that HDMI 2.1 might cause the price of 40G Ethernet to drop, because the standards groups will see what HDMI has done and possibly incorporate its techniques into the standard as a lower-cost option.
 

·
Premium Member
Joined
·
413 Posts
This thread has outlived its usefulness...
You are free to leave whenever you wish; nobody is stopping you. HDMI 2.1 problems in AVRs are not a continuous stream of news like Covid-19, and no one would want them to be, so you should not expect to hear about them every single day. We are not tech journalists either, unless some members here publish professionally.

The fact that members are creative, ask questions, keep us informed, exchange thoughts and use humour to keep the thread alive should be applauded and respected. Instead of complaining about its usefulness, you could make a positive contribution while we wait for more news. I have learnt a lot from other members here, and it is going to get more interesting in the weeks and months ahead.

We all know that a lot of important news, releases and testing results will start coming in from Yamaha, Integra, Onkyo, Pioneer and several high-end brands too. It is worth keeping the thread alive, as relevant material will arrive soon. The HDMI 2.1 transition will last a few years, so there is plenty of information to expect, verify and digest. By the way, what happened to Sony in the AVR department?

The first gen of HDMI 2.1 chips was a disappointment, and we have discussed it extensively here. The second gen of chips is already in devices being tested, and we are eagerly anticipating more news; we heard that from Trinnov too. The third gen is certainly in design by now. It will be fascinating to see how many gens of chips it takes to get all the important features right. The thread will stay alive, one would hope, until those matters are ironed out and for as long as members wish to contribute.
 

·
Premium Member
Joined
·
413 Posts
The good news, though, is that HDMI 2.1 might cause the price of 40G Ethernet to drop, because the standards groups will see what HDMI has done and possibly incorporate its techniques into the standard as a lower-cost option.
By the way, what is the speed of Ethernet in HDMI cables with Ethernet?
 

·
Premium Member
Joined
·
3,215 Posts
The point I am trying to make about HDMI is that it is not a very well thought out design. Quite embarrassing, as a matter of fact. Wouldn't you agree?
No. It has its warts but it does do what it was designed for. My Xbox has no issue talking to my C9.

What do you think has not been well thought out design-wise?
 

·
Premium Member
Joined
·
3,215 Posts
The good news, though, is that HDMI 2.1 might cause the price of 40G Ethernet to drop, because the standards groups will see what HDMI has done and possibly incorporate its techniques into the standard as a lower-cost option.
I don't see anything from HDMI making it back to 40GBase-T. They use completely different encoding schemes. Plus the fact that 40GBase-T is full duplex.

By the way, what is the speed of Ethernet in HDMI cables with Ethernet?
100 Mbps, half duplex. That feature was abandoned long ago; eARC now makes use of that channel.
 

·
Premium Member
Joined
·
413 Posts
I don't see using an off-the-shelf PC motherboard as being an advancement. Trinnov hardware itself is not expensive. What you are paying for is their software and the development that went into it.
Shall we say a modernisation instead of an advancement?
 

·
Registered
Joined
·
284 Posts
No. It has its warts but it does do what it was designed for. My Xbox has no issue talking to my C9.

What do you think has not been well thought out design-wise?
Not using the OSI model, which resulted in problems extending the protocol's features. HDMI cannot even transfer subtitles separately, so display devices cannot control subtitle settings individually. Half-duplex Ethernet as part of the HDMI spec was the icing on the cake. Who designs a half-duplex protocol over short-distance copper in 2010?
 

·
Premium Member
Joined
·
3,215 Posts
Not using the OSI model, which resulted in problems extending the protocol's features. HDMI cannot even transfer subtitles separately, so display devices cannot control subtitle settings individually. Half-duplex Ethernet as part of the HDMI spec was the icing on the cake. Who designs a half-duplex protocol over short-distance copper in 2010?
TCP/IP does not use the OSI model either. Not using that model is not a design flaw. And for a point-to-point A/V link it adds even more complexity where vendors can get things wrong. Not to mention wasting precious bandwidth on all the overhead of the various layers.

Looks like there was not much demand for having the display overlay the subtitles. This seems to be more of a high-level "who should do what" decision than an actual HDMI design flaw. It would be easy enough to add data packets to carry subtitles, just like the audio packets do.
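
To make that concrete, here is a minimal sketch of the idea in Python, using a made-up packet-type scheme rather than HDMI's actual data-island packet format: a sink that routes typed packets could accept a new subtitle type the same way it accepts audio samples, and an older sink would simply drop the unknown type.

```python
# Hypothetical sketch, NOT the real HDMI packet format: HDMI already carries
# typed packets during blanking, so a subtitle type could slot in alongside
# audio the same way.
from dataclasses import dataclass

# Illustrative packet-type IDs; the real spec defines its own codes.
AUDIO_SAMPLE = 0x02
SUBTITLE = 0x7F  # made-up type for this sketch

@dataclass
class Packet:
    ptype: int      # one-byte type field, as in HDMI packet headers
    payload: bytes

def handle(pkt: Packet) -> None:
    # A sink that drops unknown types stays backward compatible:
    # an older display would simply ignore SUBTITLE packets.
    if pkt.ptype == AUDIO_SAMPLE:
        print("route to audio decoder")
    elif pkt.ptype == SUBTITLE:
        print("overlay text:", pkt.payload.decode("utf-8"))

handle(Packet(SUBTITLE, b"Hello, world"))
```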

Yes, adding Ethernet was a dumb idea in hindsight. But they only had one pair to work with without redesigning the cable, and it was not until gigabit Ethernet that each pair became bidirectional. Most likely they would have made it full duplex eventually, but the explosion of WiFi just killed off that feature.

What I think is a major flaw in HDMI is that they embedded the audio in the video stream. They should have given audio a separate pair of wires with a separate audio EDID block: basically what eARC does now, but in the forward direction.
 

·
Premium Member
Joined
·
413 Posts
What I think is a major flaw in HDMI is that they embedded the audio in the video stream. They should have given audio a separate pair of wires with a separate audio EDID block: basically what eARC does now, but in the forward direction.
True. The ghost display can be a pain in the backside when trying to route lossless audio from a PC to an AVR. Clumsy.
 

·
Registered
Joined
·
284 Posts
TCP/IP does not use the OSI model either. Not using that model is not a design flaw. And for a point-to-point A/V link it adds even more complexity where vendors can get things wrong. Not to mention wasting precious bandwidth on all the overhead of the various layers.
Why is the bandwidth precious? Have you heard of HDCP? It hinders performance more than the ~5% overhead cost of TCP.

TCP/IP does use the concept of a layered architecture and implements most of the OSI layers, the exception being ARP, which sits somewhere between layers 2 and 3 depending on who you ask.
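
For what it's worth, the ~5% figure holds up as a back-of-envelope calculation, assuming a standard 1500-byte Ethernet MTU, IPv4 and TCP headers without options, and counting preamble and inter-frame gap as consumed wire time:

```python
# Back-of-envelope TCP/IP overhead over Ethernet. Assumptions: 1500-byte MTU,
# IPv4 and TCP without options, preamble + inter-frame gap counted as wire time.
PREAMBLE = 8      # preamble + start-of-frame delimiter
ETH_HEADER = 14   # dst MAC + src MAC + EtherType
ETH_FCS = 4       # frame check sequence
IFG = 12          # minimum inter-frame gap
IP_HEADER = 20    # IPv4, no options
TCP_HEADER = 20   # no options

payload = 1500 - IP_HEADER - TCP_HEADER                  # 1460 application bytes
wire = PREAMBLE + ETH_HEADER + 1500 + ETH_FCS + IFG      # 1538 bytes on the wire
print(f"TCP/IP overhead: {1 - payload / wire:.1%}")      # ~5.1%
```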

Looks like there was not much demand for having the display overlay the subtitles. This seems to be more of a high-level "who should do what" decision than an actual HDMI design flaw. It would be easy enough to add data packets to carry subtitles, just like the audio packets do.
Yeah, the HDMI consortium was too busy preaching the superiority of the new media link to bother with details like subtitles.

Yes, adding Ethernet was a dumb idea in hindsight. But they only had one pair to work with without redesigning the cable, and it was not until gigabit Ethernet that each pair became bidirectional. Most likely they would have made it full duplex eventually, but the explosion of WiFi just killed off that feature.
They should have made Ethernet the lowest protocol above the physical layer and not worried about copper pairs.


What I think is a major flaw in HDMI is that they embedded the audio in the video stream. They should have given audio a separate pair of wires with a separate audio EDID block: basically what eARC does now, but in the forward direction.
Now you are thinking like a real HDMI engineer. A separate pair of wires for each stream. Have you heard of multiplexing?
 

·
Premium Member
Joined
·
3,215 Posts
Why is the bandwidth precious? Have you heard of HDCP? It hinders performance more than the ~5% overhead cost of TCP.
HDCP uses XOR, which does not add any overhead bandwidth-wise.
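
A minimal sketch of why, with a toy keystream standing in for HDCP's actual cipher: XOR encryption maps each byte to exactly one output byte, so the ciphertext is the same length as the plaintext and nothing extra goes over the wire.

```python
# Toy XOR stream cipher. The keystream below is a stand-in for illustration,
# NOT HDCP's actual cipher; the point is only the zero-byte size overhead.
import random

def keystream(seed: int, n: int) -> bytes:
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

def xor_cipher(data: bytes, seed: int) -> bytes:
    return bytes(d ^ k for d, k in zip(data, keystream(seed, len(data))))

pixels = bytes(range(16))                    # pretend video data
enc = xor_cipher(pixels, seed=42)
assert len(enc) == len(pixels)               # zero added bytes on the wire
assert xor_cipher(enc, seed=42) == pixels    # the same operation decrypts
```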

TCP/IP does use the concept of a layered architecture and implements most of the OSI layers, the exception being ARP, which sits somewhere between layers 2 and 3 depending on who you ask.
Again, the overhead of a network model with multiple receivers and transmitters is overkill for a point-to-point A/V link.

They should have made Ethernet the lowest protocol above the physical layer and not worried about copper pairs.
Let's add even more overhead. HDMI TMDS is a continuous byte stream; packing that into Ethernet frames would just add overhead for no real benefit. Source and destination addresses are not needed, and the preamble is not needed for a continuous stream of data.
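
Roughly quantifying that, assuming full-size 1500-byte frames: even bare Ethernet framing, with none of the IP/TCP headers counted above, burns about 2.5% of the link on bytes that carry no video.

```python
# Per-frame cost of wrapping a continuous byte stream in bare Ethernet frames
# (no IP/TCP on top). Assumption: full-size 1500-byte payloads.
PREAMBLE, HEADER, FCS, IFG = 8, 14, 4, 12    # wire bytes per frame
PAYLOAD = 1500

dead_bytes = PREAMBLE + HEADER + FCS + IFG   # 38 bytes carrying no video
overhead = dead_bytes / (dead_bytes + PAYLOAD)
print(f"framing overhead: {overhead:.1%}")   # ~2.5%
```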

Now you are thinking like a real HDMI engineer. A separate pair of wires for each stream. Have you heard of multiplexing?
That multiplexing is what gave us this mess, with AVRs having to dig into the video stream to get at the audio. It would be trivial to split off the audio if it were sent separately. Then we wouldn't need the mistake that was ARC to send audio back; the audio pair's direction would simply be reversed. It would solve a lot of the problems we're dealing with now.
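
A toy illustration of the two designs (hypothetical framing, not real TMDS): with audio interleaved into one stream the receiver has to walk every packet just to reach the audio, whereas with a dedicated channel the split is free.

```python
# Hypothetical framing for illustration only, not the real TMDS format.
VIDEO, AUDIO = 0, 1

muxed = [(VIDEO, b"frame0"), (AUDIO, b"L0R0"),
         (VIDEO, b"frame1"), (AUDIO, b"L1R1")]

# Interleaved design: the AVR parses the whole stream just to reach the audio.
audio_from_mux = [payload for tag, payload in muxed if tag == AUDIO]

# Separate-pair design: audio arrives on its own channel, no parsing needed.
audio_channel = [b"L0R0", b"L1R1"]

assert audio_from_mux == audio_channel
```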

tl;dr: Designing HDMI to run as a network protocol wastes bandwidth and adds unnecessary complexity.
 

·
Registered
Joined
·
12,236 Posts
The point I am trying to make about HDMI is that it is not a very well thought out design. Quite embarrassing, as a matter of fact. Wouldn't you agree?
IMHO...
The biggest issue with HDMI is not the technical specification but the number of non-certified HDMI products. HDMI has 20 test centers in Japan, China, Korea, Taiwan, the USA and Europe, but unfortunately many brands try to do this testing in-house, and those in-house procedures fall short: minimal testing is done for interoperability with other brands' HDMI products. The early versions of HDMI had many compatibility issues, but eventually the brands got better with their testing programs.

The exception comes whenever a new version of HDMI is released, such as 2.1: one can read the various forums and see the multiple HDMI 2.1 issues from brands that don't submit their products to an HDMI test center for certification. And whenever a later HDMI version upgrades the video specification, as with 8K and 48 Gbps, the AVR brands, which have limited video design expertise, face a major challenge. New processors and integrated circuits released before they have been fully tested add to the challenge.

Just my $0.02... ;)
 

·
Registered
Joined
·
284 Posts
Again, the overhead of a network model with multiple receivers and transmitters is overkill for a point-to-point A/V link.
HDMI is not point-to-point; there are possible configurations with more than two devices. It is more like a broadcast medium.


Let's add even more overhead. HDMI TMDS is a continuous byte stream; packing that into Ethernet frames would just add overhead for no real benefit. Source and destination addresses are not needed, and the preamble is not needed for a continuous stream of data.
How do you monitor the error rate? Oh, you don't. Nice. Out of sight, out of mind.

That multiplexing is what gave us this mess, with AVRs having to dig into the video stream to get at the audio. It would be trivial to split off the audio if it were sent separately. Then we wouldn't need the mistake that was ARC to send audio back; the audio pair's direction would simply be reversed. It would solve a lot of the problems we're dealing with now.
What planet are you from? ;)


tl;dr: Designing HDMI to run as a network protocol wastes bandwidth and adds unnecessary complexity.
 

·
Premium Member
Joined
·
413 Posts
Gents, your HDMI quarrel is fascinating, but it's not going to get us unstuck from it on AVRs.

Let's focus our energy and minds on how to get AVR companies to offer us a new, additional interface on that dinosaur board.

Anyone with contacts in the industry? How can we convince them to introduce DisplayPort over USB-C? We need a modernisation of ports.
 

·
Registered
Joined
·
284 Posts
Gents, your HDMI quarrel is fascinating, but it's not going to get us unstuck from it on AVRs.

Let's focus our energy and minds on how to get AVR companies to offer us a new, additional interface on that dinosaur board.

Anyone with contacts in the industry? How can we convince them to introduce DisplayPort over USB-C? We need a modernisation of ports.
Let's hire a marketing agency to push for the new port, touting its superior capabilities, and then charge for the privilege of using it. Let's not pay attention to the fact that, when watching streaming, physical media, broadcasts or anything outside PC/gaming, there is a 1000x bandwidth bloat that the current crop of cables is having trouble handling.
 