Based on my experience as an analog chip designer who once designed an HDMI 1.2a compliant receiver IC, most of the issues with HDMI relate to interoperability, and most of the interoperability issues relate to the HDCP protocol. That's essentially the copy protection that "scrambles" the data going over the wire. It has to do an authentication "handshake" over a backchannel to get things going. When it doesn't work, you'll see things like a picture that "blinks" on and off, or perhaps "snow" on the screen. Eventually a source may just give up and stop transmitting. Different manufacturers had different interpretations of the protocol. Unfortunately, HDCP was never part of the original compliance test procedure. It is now, and HDMI, to their credit, is addressing that. But it doesn't fix the legacy devices that are already out there and behave poorly.
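To make that failure mode concrete, here's a heavily simplified Python sketch of the handshake logic. The source/sink objects and their methods are hypothetical stand-ins for the DDC transactions, and compute_r0() stands in for the actual HDCP cipher, which I'm not reproducing; only the overall flow (exchange KSVs, wait, compare R0 against R0', retry or give up) follows the spec.

```python
# Simplified HDCP 1.x authentication flow. The source/sink objects and their
# method names are hypothetical stand-ins for real DDC reads/writes.
import time

def ksv_is_valid(ksv: int) -> bool:
    # A legal 40-bit KSV must contain exactly twenty 1s and twenty 0s.
    return bin(ksv & (2**40 - 1)).count("1") == 20

def authenticate(source, sink, retries=3):
    for _attempt in range(retries):
        an = source.generate_an()            # 64-bit session random value
        sink.write_an_aksv(an, source.aksv)  # sent over the DDC back-channel
        bksv = sink.read_bksv()
        if not ksv_is_valid(bksv):
            continue                         # bogus KSV: retry
        time.sleep(0.1)                      # sink gets >= 100 ms to compute R0'
        r0 = source.compute_r0(bksv, an)     # source-side result (cipher not shown)
        r0_prime = sink.read_r0_prime()      # sink-side result, read back over DDC
        if r0 == r0_prime:
            return True                      # authenticated; start encrypted video
        # Mismatch: this is where you see the picture blink as the source retries.
    return False                             # give up; some sources just stop transmitting
```

Two boxes that disagree on any of those steps (how long to wait, how many retries before giving up, what to do with a bad KSV) is exactly the kind of "different interpretation" that shows up as a blinking picture.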
HDMI is an improvement over DVI in that it introduced actual compliance testing. To make a DVI part, you essentially just have to claim it's DVI compliant; no one will check the validity of that statement. As a result, many DVI links didn't work well, which is why DVI is relegated essentially to computer video. I shouldn't divulge what's in the HDMI compliance test specification, but suffice it to say it enforces better designs. The HDMI connector does improve electrical characteristics over DVI; I can say that based on network analyzer measurements I've seen. As for its mechanical quality, that remains debatable.
HDMI 1.3 and up have better recommendations, such as cable termination on BOTH the source and sink ends and reference equalizer designs. The equalizers boost the high frequencies that get attenuated over long cable runs. HDMI is digital, so you'd think the actual electrical signals on the wires are "square" waves, but put them through 10+ meters of twisted-pair copper wire and see what they look like! The loss of high-frequency information "rounds" off the edges, and they shift in time a bit. That's what causes errors at the receiver: it "sees" a high where there should be a low, or vice versa. Classic serial-link bit-error stuff, and an EQ can fix it.

Electrical transmission lines (any kind of cable) also have reflections. Electrical waves are similar to sound waves: when they move through a medium (the wire) and encounter an abruptly different medium (something on a chip), there will be a reflection, like a sound echo. To fix that, transmission lines are terminated in what's called the "characteristic impedance". You've seen this before: 75 ohms is standard for video, 50 ohms for radio, 100 ohms for ethernet (cat5) cables. But a reflection can happen at either end of the cable, source or sink, so it's really good practice to terminate both ends. HDMI originally only used sink-end termination. I think 1.3 and up allows for source-end termination as well, which is a big improvement. My background prior to HDMI was in even higher-speed serial links (SerDes), where double termination is the rule of the day.
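If you want to see the EQ effect in numbers, here's a toy Python sketch (not an HDMI reference design; the sample rate, cable corner frequency, and boost factor are all made-up values). A one-pole low-pass stands in for cable loss and rounds the edges off a bit stream, then a simple difference-term boost restores them:

```python
import numpy as np

fs = 33e9             # simulation sample rate, 20 samples per bit (assumed)
bit_rate = 1.65e9     # one TMDS lane at 1080p60 runs at ~1.65 Gb/s
spb = int(round(fs / bit_rate))        # samples per bit

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 400)
tx = np.repeat(bits * 2.0 - 1.0, spb)  # ideal "square" waveform, +/-1

# Crude stand-in for cable loss: a single-pole low-pass. Real cable loss is
# skin-effect dominated, but one pole is enough to round off the edges.
fc = 300e6                                    # assumed corner frequency
a = 1.0 - np.exp(-2.0 * np.pi * fc / fs)      # IIR coefficient for that pole
rx = np.empty_like(tx)
rx[0] = a * tx[0]
for i in range(1, len(tx)):
    rx[i] = (1.0 - a) * rx[i - 1] + a * tx[i]   # rounded, time-smeared edges

# Simple linear EQ: add back a scaled "difference" (high-frequency) term.
# k is chosen to exactly undo the one-pole model above; a real receiver EQ
# only approximates the cable response, often adaptively.
k = (1.0 - a) / a
eq = rx + k * np.diff(rx, prepend=0.0)

# Slice each bit at its center and count decision errors before/after EQ.
centers = np.arange(len(bits)) * spb + spb // 2
err_raw = int(np.sum((rx[centers] > 0) != (bits == 1)))
err_eq = int(np.sum((eq[centers] > 0) != (bits == 1)))
print(f"decision errors without EQ: {err_raw}, with EQ: {err_eq}")
```

And here's the termination arithmetic, using the standard reflection-coefficient formula gamma = (Zterm - Z0) / (Zterm + Z0); the 90-ohm sink and open source values are just illustrative numbers, and cable loss is ignored:

```python
def gamma(z_term, z0=100.0):
    """Reflection coefficient at a termination: (Zterm - Z0) / (Zterm + Z0)."""
    return (z_term - z0) / (z_term + z0)

z0 = 100.0                            # HDMI TMDS pairs are ~100-ohm differential
sink_slightly_off = gamma(90.0, z0)   # a sink termination that's 10% low
source_open = gamma(1e9, z0)          # no source termination, ~open circuit
source_matched = gamma(100.0, z0)     # source terminated in Z0

# Whatever bounces off an imperfect sink travels back and re-reflects at the
# source. Sink-only termination sends that echo back at nearly full strength;
# terminating both ends kills it after one bounce.
echo_sink_only = abs(sink_slightly_off) * abs(source_open)
echo_both_ends = abs(sink_slightly_off) * abs(source_matched)
print(f"echo returning to the sink, sink-only termination: {echo_sink_only:.3f}")
print(f"echo returning to the sink, double termination:    {echo_both_ends:.3f}")
```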
So as time goes on it will get better, both through improved standards and improved compliance testing.
Another thing to watch, though, is wireless HD connectivity. Viable technologies exist, and I think one company already has a design win. After re-hooking up all my AV equipment last weekend, let me tell you, it would be a welcome change!