
Is HDMI screwing up our signals due to FAULTY design?

AVS has posted an article that appears to say "yes, it is".


Here's the most pertinent section, though it's worth reading the whole thing.


Impedance: if the characteristic impedance of the cable doesn't match the impedance of the source and load circuits, the impedance mismatch will cause portions of the signal to be reflected back and forth in the cable. The same is true for variations in impedance from point to point within the cable.

Crosstalk: when signals are run in parallel over a distance, the signal in one wire will induce a similar signal in another, causing interference.

Inductance: just as capacitance smears out changes in voltage, inductance--the relationship between a current flow and an induced electromagnetic field around that flow--smears out changes in the rate of current flow over time.

Impedance, in particular, becomes a really important concern any time the cable length is more than about a quarter of the signal wavelength, and becomes increasingly important as the cable length becomes a greater and greater multiple of that wavelength. The signal wavelength, for one of the color channels of a 1080p HDMI signal, is about 16 inches, making the quarter-wave a mere four inches--so impedance is an enormous consideration in getting HDMI signals to propagate along a cable without serious degradation.
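
A rough sketch of that math in Python, under two inferred assumptions (the relevant frequency is half the per-channel TMDS bit rate, i.e. the fastest alternating bit pattern, and propagation is at free-space speed, which together reproduce the ~16-inch figure); a real cable's velocity factor of roughly 0.7-0.85 would make the quarter-wave point even shorter.

Code:
# Rough quarter-wavelength estimate for one TMDS color channel.
# Assumptions (inferred, not stated in the article): fundamental = half the
# per-channel bit rate, and free-space propagation; a real cable's velocity
# factor shortens the wavelength further.

C = 299_792_458                     # speed of light in vacuum, m/s
M_TO_IN = 39.3701                   # meters to inches

def quarter_wave_inches(pixel_clock_hz, velocity_factor=1.0):
    bit_rate = pixel_clock_hz * 10          # 10 TMDS bits per pixel clock, per channel
    fundamental = bit_rate / 2              # fastest alternating bit pattern
    wavelength_m = velocity_factor * C / fundamental
    return wavelength_m * M_TO_IN / 4

print(quarter_wave_inches(148.5e6))         # 1080p/60: ~4.0 in (matches the ~16 in wavelength)
print(quarter_wave_inches(74.25e6))         # 720p/1080i: ~8.0 in
print(quarter_wave_inches(148.5e6, 0.8))    # 1080p/60 with a 0.8 velocity factor: ~3.2 in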


Impedance is a function of the physical dimensions and arrangement of the cable's parts, and the type and consistency of the dielectric materials in the cable. There are two principal sorts of cable "architecture" used in data cabling (and HDMI, being a digital standard, is really a data cable), and each has its advantages. First, there's twisted-pair cable, used in a diverse range of computer-related applications. Twisted-pair cables are generally economical to make and can be quite small in overall profile. Second, there's coaxial cable, where one conductor runs down the center and the other is a cylindrical "shield" running over the outside, with a layer of insulation between. Coaxial cable is costlier to produce, but has technical advantages over twisted pair, particularly in the area of impedance.


It's impossible to control the impedance of any cable perfectly. We can, of course, if we know the types of materials to be used in building the cable, create a sort of mathematical model of the perfect cable; this cable has perfect symmetry, perfect materials, and manufacturing tolerances of zero in every dimension, and its impedance is fixed and dead-on-spec. But the real world won't allow us to build and use this perfect cable. The dimensions involved are very small and hard to control, and the materials in use aren't perfect; consequently, all we can do is control manufacturing within certain technical limits. Further, when a cable is in use, it can't be like our perfect model; it has to bend, and it has to be affixed to connectors.


So, what do we get instead of perfect cable, with perfect impedance? We get real cable, with impedance controlled within some tolerance; and we hope that we can make the cable conform to tolerances tight enough for the application to which we put it. As it happens, some types of impedance variation are easier to control than others, so depending on the type of cable architecture we choose, the task of controlling impedance becomes harder or easier. Coaxial cable, in this area, is clearly the superior design; the best precision video coaxes have superb bandwidth and excellent impedance control. Belden 1694A, for example, has a specified impedance tolerance of +/- 1.5 ohms, which is just two percent of the 75 ohm spec; and that tolerance is a conservative figure, with the actual impedance of the cable seldom off by more than half an ohm (2/3 of one percent off-spec). Twisted pair does not remotely compare; getting within 10 or 15 percent impedance tolerance is excellent, and the best bonded-pair Belden cables stay dependably within about 8 ohms of the 100 ohm spec.


If we were running a low bit-rate through this cable, it wouldn't really matter. Plus or minus 10 or 15 ohms would be "good enough" and the interface would work just great. But the bitrate demands placed on HDMI cable are severe. At 1080i, the pixel clock runs at 74.25 MHz, and each of the three color channels sends a ten-bit signal on each pulse of the clock, for a bitrate of 742.5 Mbps. What's worse, some devices are now able to send or receive 1080p/60, which requires double that bitrate.
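
A minimal Python check of that arithmetic (pixel clock times ten TMDS bits per clock, per color channel):

Code:
# Per-channel TMDS bit rate = pixel clock x 10 bits per clock (figures from the text above).

def tmds_bit_rate_mbps(pixel_clock_mhz, bits_per_clock=10):
    return pixel_clock_mhz * bits_per_clock

for name, clock_mhz in [("720p/1080i", 74.25), ("1080p/60", 148.5)]:
    per_channel = tmds_bit_rate_mbps(clock_mhz)
    print(f"{name}: {per_channel} Mbps per channel, {3 * per_channel} Mbps over three channels")
# 720p/1080i: 742.5 Mbps per channel, 2227.5 Mbps over three channels
# 1080p/60: 1485.0 Mbps per channel, 4455.0 Mbps over three channels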


Impedance mismatch, at these bitrates, causes all manner of havoc. Variations in impedance within the cable cause the signal to degrade substantially, and in a non-linear way that can't easily be EQ'd or amplified away. The result is that the HDMI standard will always be faced with serious limitations on distance. We have found that, at 720p and 1080i, well-made cables up to around 50 feet will work properly with most, but not all, source/display combinations. If 1080p becomes a standard, plenty of cables which have been good enough to date will fail. And it gets worse...


In June 2006, the HDMI organization announced the new HDMI 1.3 spec. Among other things, the 1.3 spec offers new color depths which require more bits per pixel. The HDMI press release states:


"HDMI 1.3 increases its single-link bandwidth from 165MHz (4.95 gigabits per second) to 340 MHz (10.2 Gbps) to support the demands of future high definition display devices, such as higher resolutions, Deep Color and high frame rates."


So, what did they do to enable the HDMI cable to convey this massive increase in bitrate? If your guess is "nothing whatsoever," you're right. The HDMI cable is still the same four shielded 100-ohm twisted pairs, still subject to the same technical and manufacturing limitations. And don't draw any consolation from those modest "bandwidth" requirements, stated in Megahertz; those numbers are the frequencies of the clock pulses, which run at 1/10 the rate of the data pairs, and why the HDMI people chose to call those the "bandwidth" requirements of the cable is anyone's guess. The only good news here is that the bitrates quoted are the summed bitrates of the three color channels -- so a twisted pair's potential bandwidth requirement has gone up "only" to 3.4 Gbps rather than 10.2.
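
Applying the same clock-times-ten arithmetic to the press-release figures confirms that closing point; a quick Python sketch:

Code:
# TMDS clock (the spec's "bandwidth" figure) vs. per-pair and aggregate bit rates.

def hdmi_rates_gbps(tmds_clock_mhz):
    per_pair = tmds_clock_mhz * 10 / 1000     # ten bits per clock on each data pair
    return per_pair, 3 * per_pair             # per pair, and summed over the three data pairs

for clock in (165, 340):
    per_pair, total = hdmi_rates_gbps(clock)
    print(f"{clock} MHz clock -> {per_pair:.2f} Gbps per pair, {total:.2f} Gbps aggregate")
# 165 MHz clock -> 1.65 Gbps per pair, 4.95 Gbps aggregate
# 340 MHz clock -> 3.40 Gbps per pair, 10.20 Gbps aggregate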


You can read the rest of the article here:
http://www.audioholics.com/education...tter-with-hdmi


This does not sound good to me. It sounds like a standard designed by committee for a different purpose than the one it's now being used for, and that mismatch is causing all kinds of problems we probably weren't even aware of.


Whether or not this provides us with significant PQ or AQ interference issues is an interesting question. Anyone know the answer?
Looks like the bottom line is metal HDMI cable lengths are limited. Nothing new. In the real world (as opposed to the AVS world) very few people are affected by the length restrictions. And there are other solutions for longer lengths - fiber, cat5, repeaters, amplifiers.


larry
Any metal cable, whether it's HDMI, DVI, cat5, etc., is subject to these parasitics and Xtalk. The manufacturers who make the better cables do a better job at controlling these parasitics and minimizing Xtalk.
The only thing which is surprising about the article is that they state that HDMI has no error correction. If that's the case, then the spec's designers have effectively taken the position that occasional single-bit errors are unnoticeable or negligible.


The sky has not fallen. I would say the real-world issues are HDCP handshaking errors and the lack of a physical lock holding the connector in the first design iteration.
The "transport" of HDMI (and DVI), TMDS, does not use error correction. However, the way TMDS encodes the data to minimize transitions (hence the name Transition Minimized Differential Signaling) results in random data bits getting corrupted if a bit is missed. This type of data corruption usually results in "sparklies" on your display. Also, HDMI audio data is encoded for error detection/correction. I'm guessing 1 bit correction/2 bit detection but I don't recall.


larry
While I'm not going to try to argue with the subjective opinions offered in the Audioholics article, I will state the following points regarding the robustness of the HDMI technology:


1) The HDMI spec did implement technology specifically designed to mitigate the technical challenges of sending very high data rates over a cost-effective copper cable. The HDMI 1.3 spec added the provision for an equalizer to be designed into a receiver chip (in fact, it is required for any receiver chip that supports >5Gbps, and optional in all others). From our experience, the EQ has a very significant effect in compensating for the attenuation and other losses commonly seen. In fact, we have seen that a 10.2Gbps HDMI signal can reliably be sent over a cost-effective 10m+ cable when the receiver has an EQ. Some chip companies have designed receivers with EQ that is so good, we've seen 1080p run at 20m. How much does an EQ add to the margins? I'm told by chip designers in the industry that it's easier to reliably decode a 10.2Gbps signal with a receiver that has an EQ than a 5Gbps signal with a receiver that doesn't, so quite significant.


The HDMI 1.3 spec also added selective pre-emphasis, which allows a transmitter to overdrive transitions, so that the signal gets a boost from the sending side. This, too, gives more margin for driving longer cables with greater attenuation losses.
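
A toy Python illustration of the pre-emphasis idea, with made-up drive levels rather than the values defined in the spec: the transmitter drives the first bit period after each transition harder, so the high-frequency content that a long cable attenuates most starts out with extra amplitude.

Code:
# Toy pre-emphasis: boost the drive level for the first bit period after each transition.
# (Illustrative amplitudes only; not the levels defined in the HDMI spec.)

def pre_emphasize(bits, normal=1.0, boosted=1.4):
    levels = []
    prev = bits[0]
    for i, b in enumerate(bits):
        amp = boosted if (i == 0 or b != prev) else normal
        levels.append(amp if b else -amp)           # differential drive: +amp for 1, -amp for 0
        prev = b
    return levels

print(pre_emphasize([0, 0, 1, 1, 1, 0, 1]))
# [-1.4, -1.0, 1.4, 1.0, 1.0, -1.4, 1.4]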


Finally, we added Source Termination to further add to the bandwidth robustness. The way this works is based on some basic serial transmission-line characteristics, such as:

- Changes to impedance are bad, as they cause reflections back to the source of the signal.

- Connectors frequently have impedance mismatches and cause some reflection

- High impedance sources will then bounce that reflection back down the line, resulting in a bad quality signal.

So with Source Termination, any signals that are partially reflected from the sink get absorbed by the source instead of bouncing back to the sink. Empirical testing shows that Source Termination can open up an eye diagram significantly, especially on cables with high impedance at the input end.
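
The textbook relation behind those points is the reflection coefficient, gamma = (Z - Z0)/(Z + Z0). A small Python sketch with made-up impedance values (only the 100-ohm differential figure comes from the HDMI spec) shows why a matched, terminated source absorbs a reflection while a high-impedance source sends nearly all of it back down the cable:

Code:
# Reflection coefficient at an impedance discontinuity: gamma = (Z - Z0) / (Z + Z0).
# The 100-ohm figure is the HDMI differential spec; the other values are made-up examples.

def reflection_coefficient(z_ohms, z0_ohms=100.0):
    return (z_ohms - z0_ohms) / (z_ohms + z0_ohms)

print(f"connector bump (115 ohm): {reflection_coefficient(115):+.2f}")        # ~7% reflected
print(f"matched (terminated) source: {reflection_coefficient(100):+.2f}")     # ~0% re-reflected
print(f"high-impedance source (10k): {reflection_coefficient(10_000):+.2f}")  # ~98% re-reflected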


2) The encoding methods used in HDMI are very similar to error correction in their effect, in that they reduce the probability of an error in the first place (a kind of pre-emptive error correction). We use:
Video data: TMDS 8b to 10b encoding
Audio data: TERC4 4b to 10b encoding
Control data: 2b to 10b encoding

Additionally, we use industry-established ECC (Error Correction Code) parity for the data island packets (which contain the audio and control data). You can see here that these encoding methods were chosen with careful consideration, giving higher amounts of error protection to the data that is most critical and noticeable (control signals and audio).


On a last note, I'll add that the HDMI specification was designed by a consortium of industry experts with a great deal of experience in audio, video, cabling, serial signaling, etc. Yes, there are a great number of technical hurdles in making a reliable, high-speed serialized data communication interface, but that's where innovation comes into play. Further, the technology put into HDMI is not merely theoretical, but validated with actual data from real systems so that the products that hit the shelves will just work.
There were receivers with equalizers before HDMI 1.3, so what is the point? Adding optional "provisions" for equalization while "dictating" a lesser 5dB cable for the new bit rates is disgraceful. The only beneficiaries here are the cable vendors of the world.


There is nothing wrong with a cable within 10-15% of the characteristic impedance. Ethernet did 1.25Gbps at 10^-12 bit error rates with worse. There is, however, something wrong with defining DVI/HDMI to be a transport system that does not use equalization or timing recovery. That is an artifact of DVI being a short-reach PC video transport that was not designed for its current application.


If consumers are seeing sparkles, that likely means the receiver is doing worse than a 10^-6 bit error rate. Again, disgraceful. Even with the inherent spec issues, that is poor implementation at the silicon level.
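
To put a 10^-6 error rate in perspective, rough arithmetic (assuming errors are spread evenly in time and each corrupted video bit is a potential visible sparkle):

Code:
# Errors per second at TMDS line rate for a given bit error rate.
# (Rough arithmetic; assumes errors are evenly distributed over time.)

def errors_per_second(bit_rate_bps, ber):
    return bit_rate_bps * ber

for ber in (1e-6, 1e-9, 1e-12):
    rate = errors_per_second(1.485e9, ber)        # one 1080p color channel at 1.485 Gbps
    print(f"BER {ber:.0e}: {rate:,.3f} errors per second per channel")
# BER 1e-06: 1,485.000 errors per second per channel
# BER 1e-09: 1.485 errors per second per channel
# BER 1e-12: 0.001 errors per second per channel (about one every eleven minutes)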


In short, some of the issues with HDMI are fundamental to the spec, others are in the implementation. HDCP/DDC interoperability is by far the biggest issue, which should take priority over all others, and counter-efforts like SimplayHD are flawed as well. These compatibility issues are usually due to the source device, which inherently uses too much software for handshaking, leaving room for bad implementations. As such, I'm not sure we'll ever see the end of these issues. Third-order HDMI issues like cable and jitter sensitivity can and should be eliminated by good implementation at the silicon level (not by paying $200 for cable or defining new cable types).
While I do notice a major increase in picture quality with HDMI, I always thought it would be a little better. I always thought it was my cable box, though.
HDMI can be cumbersome because of all the handshake issues... If all types worked well together life would be much easier...

Quote:
Originally Posted by dalmeida


In short, some of the issues with HDMI are fundamental to the spec, others are in the implementation. HDCP/DDC interoperability is by far the biggest issue, which should take priority over all others, and counter-efforts like SimplayHD are flawed as well. These compatibility issues are usually due to the source device, which inherently uses too much software for handshaking, leaving room for bad implementations. As such, I'm not sure we'll ever see the end of these issues. Third-order HDMI issues like cable and jitter sensitivity can and should be eliminated by good implementation at the silicon level (not by paying $200 for cable or defining new cable types).

The other comment I'll make is that we have yet to find a single set of HDMI/HDCP compliant devices that fail to interoperate, and we have investigated a large number of devices for interoperability issues. This data is what leads me to maintain the position that there are not inherent flaws with the HDMI specification that cause devices to fail, and failures are due to non-compliant implementations.
So what exactly does the HDMI Org, DCP, or Simplay do with this information once they investigate interoperability issues? Do they share or publish common implementation issues for others to learn from?


Saying compliant devices never fail to interoperate is a vague statement when one considers there is often room for interpreting a spec in the first place. I suspect most vendors thought they had followed the spec before putting their product in the market. I would also suspect that many of the identified interop issues are often from the same mistake (i.e. wrong EDID contents in display, software issue in source) so what is being done to help eliminate these grey zones in the spec?


That is where I believe Simplay is flawed. If there was a genuine interest in eliminating these issues, a more proactive effort would be taken in educating vendors rather than just being a gatekeeper.
What is the BER spec for the video portion of HDMI?

Also what sort of testing do you do on cables to make sure the design is compliant?

Quote:
Originally Posted by HDMI_Org


The other comment I'll make is that we have yet to find a single set of HDMI/HDCP compliant devices that fail to interoperate, and we have investigated a large number of devices for interoperability issues. This data is what leads me to maintain the position that there are not inherent flaws with the HDMI specification that cause devices to fail, and failures are due to non-compliant implementations.

For a simple device-to-device connection (DVD player to monitor) I would have to agree; add a "compliant" switcher of some sort and the problems begin (and don't end).


If I pull any more hair out trying to get all my HDMI sources to work with my projector through a switch, I won't have any left. By the way, it is not a cable-length issue, because I tried using a 1.5' cable on either end of the switchbox and I still can't get a reliable handshake with my cable box, Blu-ray, or HD DVD.


My latest solution is to put all my components on a table behind me with an HDMI cable hanging down from the projector so I can turn around and plug the cable into the right component.


Thanks HDMI, now I have to work to be entertained.
Update: I replaced my Monoprice 5x1 switch with a 6x1 switch from Satechi, which actually works. HDMI is to blame for incompatibility due to handshake issues, but if you are willing to keep on spending money and trying different products, you can get it to work eventually.

Quote:
Originally Posted by dalmeida


So what exactly does the HDMI Org, DCP, or Simplay do with this information once they investigate interoperability issues? Do they share or publish common implementation issues for others to learn from?

I can't speak for DCP or Simplay, but this is an example of how we may typically take action: when deemed appropriate, we may take this information and perform an investigation to find the root cause of the HDMI interoperability issues. Then we work with the manufacturer of the equipment that has the compliance issue to help them come up with a solution that can be distributed to the consumers with equipment in the field, and if applicable, a plan to make a change to production units.

Quote:
Saying compliant devices never fail to interoperate is a vague statement when one considers there is often room for interpreting a spec in the first place.

I would welcome any feedback pointing out a specific section of the specification that is technically vague. From our experience, this is not the case.

Quote:
I suspect most vendors thought they had followed the spec before putting their product in the market. I would also suspect that many of the identified interop issues are often from the same mistake (i.e. wrong EDID contents in display, software issue in source) so what is being done to help eliminate these grey zones in the spec?

Naturally, most respectable vendors do think they followed the specification in their products. However, there have been a few dynamics contributing to interoperability failures. The first is the aspect of compliance testing: in the past there was no compliance test spec or testing program for HDCP (which has been a significant cause of interoperability failures). Fortunately, this has been resolved. Why else have products been designed in a non-compliant manner? Make no mistake, HDMI technology is not simple, and it takes an experienced technical team to understand how to design a compliant product. I would say that the second dynamic is the learning curve that manufacturers have had to climb. This is common for any new advanced technology (USB, Bluetooth, 1394, etc.). Fortunately, manufacturers have for the most part caught up, and their designers understand the high-speed mixed-signal challenges of HDMI much better.