AVS has posted an article that appears to say "yes, it is".
Here's the most pertinent section, though it's worth reading the whole thing.
Impedance: if the characteristic impedance of the cable doesn't match the impedance of the source and load circuits, the impedance mismatch will cause portions of the signal to be reflected back and forth in the cable. The same is true for variations in impedance from point to point within the cable.
Crosstalk: when signals are run in parallel over a distance, the signal in one wire will induce a similar signal in another, causing interference.
Inductance: just as capacitance smears out changes in voltage, inductance--the relationship between a current flow and an induced electromagnetic field around that flow--smears out changes in the rate of current flow over time.
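(My aside, not the article's: a rough way to put a number on that impedance point is the standard reflection coefficient, which gives the fraction of signal voltage bounced back at an impedance step. The 100/115-ohm values below are made-up examples.)

```python
def reflection_coefficient(z_line, z_load):
    """Fraction of the incident voltage reflected where impedance z_line meets z_load."""
    return (z_load - z_line) / (z_load + z_line)

# Hypothetical example: a nominal 100-ohm pair running into a 115-ohm stretch of cable
print(reflection_coefficient(100, 115))  # ~0.07, i.e. roughly 7% of the signal reflects back
```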
Impedance, in particular, becomes a really important concern any time the cable length is more than about a quarter of the signal wavelength, and becomes increasingly important as the cable length becomes a greater and greater multiple of that wavelength. The signal wavelength, for one of the color channels of a 1080p HDMI signal, is about 16 inches, making the quarter-wave a mere four inches--so impedance is an enormous consideration in getting HDMI signals to propagate along a cable without serious degradation.
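(Again my own back-of-the-envelope check, not the article's math: assuming 1080p/60 at 1.485 Gbps per color channel, a fundamental of roughly half the bit rate, and free-space propagation speed -- a real cable's velocity factor would make the wavelength somewhat shorter -- you land close to the quoted 16 inches.)

```python
C = 3.0e8                     # m/s, speed of light (free space; a real cable is slower)
bit_rate = 1.485e9            # bits/s per TMDS color channel at 1080p/60 (my assumption)
fundamental = bit_rate / 2    # Hz, highest fundamental frequency of the bit stream
wavelength_in = C / fundamental * 39.37

print(wavelength_in)          # ~15.9 inches, close to the article's "about 16 inches"
print(wavelength_in / 4)      # quarter-wave, ~4 inches
```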
Impedance is a function of the physical dimensions and arrangement of the cable's parts, and the type and consistency of the dielectric materials in the cable. There are two principal sorts of cable "architecture" used in data cabling (and HDMI, being a digital standard, is really a data cable), and each has its advantages. First, there's twisted-pair cable, used in a diverse range of computer-related applications. Twisted-pair cables are generally economical to make and can be quite small in overall profile. Second, there's coaxial cable, where one conductor runs down the center and the other is a cylindrical "shield" running over the outside, with a layer of insulation between. Coaxial cable is costlier to produce, but has technical advantages over twisted pair, particularly in the area of impedance.
It's impossible to control the impedance of any cable perfectly. We can, of course, if we know the types of materials to be used in building the cable, create a sort of mathematical model of the perfect cable; this cable has perfect symmetry, perfect materials, and manufacturing tolerances of zero in every dimension, and its impedance is fixed and dead-on-spec. But the real world won't allow us to build and use this perfect cable. The dimensions involved are very small and hard to control, and the materials in use aren't perfect; consequently, all we can do is control manufacturing within certain technical limits. Further, when a cable is in use, it can't be like our perfect model; it has to bend, and it has to be affixed to connectors.
So, what do we get instead of perfect cable, with perfect impedance? We get real cable, with impedance controlled within some tolerance; and we hope that we can make the cable conform to tolerances tight enough for the application to which we put it. As it happens, some types of impedance variation are easier to control than others, so depending on the type of cable architecture we choose, the task of controlling impedance becomes harder or easier. Coaxial cable, in this area, is clearly the superior design; the best precision video coaxes have superb bandwidth and excellent impedance control. Belden 1694A, for example, has a specified impedance tolerance of +/- 1.5 ohms, which is just two percent of the 75 ohm spec; and that tolerance is a conservative figure, with the actual impedance of the cable seldom off by more than half an ohm (2/3 of one percent off-spec). Twisted pair does not remotely compare; getting within 10 or 15 percent impedance tolerance is excellent, and the best bonded-pair Belden cables stay dependably within about 8 ohms of the 100 ohm spec.
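(The percentages quoted there are just simple arithmetic on the article's own numbers:)

```python
def tolerance_pct(deviation_ohms, nominal_ohms):
    """Impedance deviation expressed as a percentage of the nominal spec."""
    return 100.0 * deviation_ohms / nominal_ohms

print(tolerance_pct(1.5, 75))   # 2.0   -> Belden 1694A spec tolerance
print(tolerance_pct(0.5, 75))   # ~0.67 -> the "seldom off by more than half an ohm" figure
print(tolerance_pct(8, 100))    # 8.0   -> best bonded-pair twisted-pair figure quoted
```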
If we were running a low bit-rate through this cable, it wouldn't really matter. Plus or minus 10 or 15 ohms would be "good enough" and the interface would work just great. But the bitrate demands placed on HDMI cable are severe. At 1080i, the pixel clock runs at 74.25 MHz, and each of the three color channels sends a ten-bit signal on each pulse of the clock, for a bitrate of 742.5 Mbps. What's worse, some devices are now able to send or receive 1080p/60, which requires double that bitrate.
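(The bitrate math works out like this; the 1080p/60 line simply doubles the 1080i figure, as the article says:)

```python
pixel_clock_1080i = 74.25e6            # Hz, as stated in the article
bits_per_clock = 10                    # 10-bit TMDS symbol per color channel per clock

per_channel_1080i = pixel_clock_1080i * bits_per_clock
print(per_channel_1080i / 1e6)         # 742.5 Mbps per color channel at 1080i

print(2 * per_channel_1080i / 1e9)     # 1.485 Gbps per channel at 1080p/60 (double)
```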
Impedance mismatch, at these bitrates, causes all manner of havoc. Variations in impedance within the cable cause the signal to degrade substantially, and in a non-linear way that can't easily be EQ'd or amplified away. The result is that the HDMI standard will always be faced with serious limitations on distance. We have found that, at 720p and 1080i, well-made cables up to around 50 feet will work properly with most, but not all, source/display combinations. If 1080p becomes a standard, plenty of cables which have been good enough to date will fail. And it gets worse...
In June 2006, the HDMI organization announced the new HDMI 1.3 spec. Among other things, the 1.3 spec offers new color depths which require more bits per pixel. The HDMI press release states:
"HDMI 1.3 increases its single-link bandwidth from 165MHz (4.95 gigabits per second) to 340 MHz (10.2 Gbps) to support the demands of future high definition display devices, such as higher resolutions, Deep Color and high frame rates."
So, what did they do to enable the HDMI cable to convey this massive increase in bitrate? If your guess is "nothing whatsoever," you're right. The HDMI cable is still the same four shielded 100-ohm twisted pairs, still subject to the same technical and manufacturing limitations. And don't draw any consolation from those modest "bandwidth" requirements, stated in megahertz; those numbers are the frequencies of the clock pulses, which run at 1/10 the rate of the data pairs, and why the HDMI people chose to call those the "bandwidth" requirements of the cable is anyone's guess. The only good news here is that the bitrates quoted are the summed bitrates of the three color channels -- so a twisted pair's potential bandwidth requirement has gone up "only" to 3.4 Gbps rather than 10.2.
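(To make that last point concrete -- again just arithmetic on the numbers in the quote:)

```python
tmds_clock = 340e6                # Hz, HDMI 1.3 maximum single-link clock
bits_per_clock = 10               # data runs at 10x the clock rate, as noted above
pairs = 3                         # three color/data twisted pairs

per_pair = tmds_clock * bits_per_clock
print(per_pair / 1e9)             # 3.4  Gbps demanded of each twisted pair
print(per_pair * pairs / 1e9)     # 10.2 Gbps total, matching the press release figure
```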
You can read the rest of the article here:
http://www.audioholics.com/education...tter-with-hdmi
This does not sound good to me. It sounds like a standard designed by committee for a different purpose than the one it's now being used for, and that mismatch is causing all kinds of problems we probably weren't even aware of.
Whether this actually translates into significant PQ or AQ issues is an interesting question. Anyone know the answer?