It's good to break down the problem by looking at what HDMI is and is not. HDMI was designed as a *secure* method for transmitting uncompressed video and audio from component to component. The emphasis on secure was intentional, since many people in the content-owner camp were paranoid about people stealing high-quality HD content straight off the connection (not realizing, of course, that most real pirates don't care much about image quality, hence the camcorder footage that shows up as bootlegs).
It was also designed for point-to-point, source-to-sink distribution. I'm sure someone thought about an AVR switching between sources, but it sure wasn't a priority. Also, when HDMI came out, its maximum speeds were pushing the limits of the available chipsets; there were multiple reports of overheating HDMI chipsets at the time.
The way HDMI works is through a series of handshakes. One of them, the EDID exchange, lets the sink (the destination) tell the source what capabilities the sink has, so the source only sends out a signal the sink can handle. If you are just doing point-to-point communication, this is easy. Also remember that, due to the original HDMI limitations, the system was designed to send only one video and one audio stream at any time. So you can't, for instance, send a stereo *and* a multichannel audio output at the same time. It can only be stereo *or* (really xor) multichannel.
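To make the EDID exchange concrete, here's a small illustrative sketch (not a real HDMI stack) of the kind of sanity check a source performs on the 128-byte base EDID block it reads from the sink: a fixed 8-byte header, all 128 bytes summing to zero mod 256, and capability fields such as the three-letter manufacturer ID packed into bytes 8-9. The specific block contents below are made up for the demo.

```python
# Sketch of validating and partially decoding a base EDID block.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    """A base EDID block is 128 bytes, starts with a fixed 8-byte
    header, and all 128 bytes (last one is the checksum) sum to 0
    modulo 256."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

def manufacturer_id(block: bytes) -> str:
    """Bytes 8-9 pack three 5-bit letters (1 = 'A') big-endian."""
    v = (block[8] << 8) | block[9]
    return ''.join(chr(((v >> s) & 0x1F) + ord('A') - 1)
                   for s in (10, 5, 0))

# Build a toy block: header, a made-up "ACM" manufacturer ID,
# zero-filled body, checksum byte patched in last.
block = bytearray(128)
block[:8] = EDID_HEADER
block[8], block[9] = 0x04, 0x6D          # encodes "ACM"
block[127] = (-sum(block[:127])) % 256   # force sum to 0 mod 256

print(edid_block_valid(bytes(block)))    # True
print(manufacturer_id(block))            # ACM
```

If any of those checks fail, the source has no trustworthy description of the sink and has to fall back to some default (or nothing), which is where a lot of "no picture" complaints start.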
Another handshake is HDCP, the copy protection. It is designed to encrypt the signal going out on the wire. The source and sink have unique keys, and they exchange a continuous series of values during transmission to keep the link secure (except, of course, that the keys have been compromised). The expected result of invalid keys is no picture or audio.
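As a rough sketch of why the key exchange works at all: in HDCP 1.x, each device holds 40 secret 56-bit keys derived from a secret symmetric master matrix, plus a public 40-bit KSV with exactly twenty 1 bits. Each side sums its own secret keys at the positions selected by the *other* side's KSV, and the symmetry of the matrix guarantees both arrive at the same shared key Km without ever sending it. The toy matrix below is randomly generated for illustration; the real master matrix was secret until it was reconstructed and published, which is the compromise mentioned above.

```python
# Toy model of HDCP 1.x shared-key derivation (illustration only).
import random

MOD = 2 ** 56  # HDCP 1.x device keys are 56-bit values

rng = random.Random(0)

# Stand-in for the secret master matrix; it must be symmetric.
master = [[rng.randrange(MOD) for _ in range(40)] for _ in range(40)]
for i in range(40):
    for j in range(i):
        master[j][i] = master[i][j]

def random_ksv():
    """A valid KSV is 40 bits with exactly twenty 1 bits."""
    bits = [1] * 20 + [0] * 20
    rng.shuffle(bits)
    return bits

def device_keys(ksv):
    """A device's 40 secret keys: master rows combined by its KSV."""
    return [sum(master[i][j] for j in range(40) if ksv[j]) % MOD
            for i in range(40)]

def shared_key(my_keys, their_ksv):
    """Km: sum of your own secret keys at the other side's KSV bits."""
    return sum(my_keys[i] for i in range(40) if their_ksv[i]) % MOD

aksv, bksv = random_ksv(), random_ksv()
source_keys, sink_keys = device_keys(aksv), device_keys(bksv)

# Both ends derive the same Km without it ever crossing the wire.
print(shared_key(source_keys, bksv) == shared_key(sink_keys, aksv))  # True
```

Once anyone knows the master matrix, they can compute the device keys for *any* valid KSV, which is why leaked keys broke the whole scheme rather than just one device.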
That brings us to bit errors. I've often felt the engineers and lawyers who designed the original HDMI standard were very optimistic people. They actually believed that HDMI would work most of the time, so when it doesn't, it doesn't fail gracefully. Some of that is the copy-protection paranoia, but some is simply that the methods for telling the customer something is wrong are very poor.
When you introduce bit errors into the equation, the result could be an EDID handshake indicating the sink can only accept a source type that doesn't exist (because the bit errors produced an invalid signal), or an HDCP handshake saying you don't have a valid (trustworthy) sink. Or the picture data could be so error-ridden that it can't be decrypted.
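A toy demonstration of the first failure mode: flip a single bit in an otherwise valid EDID block in transit, and the checksum no longer works out, so as far as the source can tell the sink reported no usable capabilities at all. (The block here is synthetic, built just for the demo.)

```python
# One bit error is enough to invalidate a whole EDID block.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_valid(block: bytes) -> bool:
    # 128 bytes, fixed header, all bytes sum to 0 mod 256.
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# A synthetic but internally consistent block.
good = bytearray(128)
good[:8] = EDID_HEADER
good[127] = (-sum(good[:127])) % 256

# The same block with a single bit flipped somewhere in transit.
bad = bytearray(good)
bad[54] ^= 0x01

print(edid_block_valid(bytes(good)), edid_block_valid(bytes(bad)))  # True False
```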
Of course, bad firmware can do everything bit errors can do. If the firmware can't understand one of the handshakes, the result can be the same as if the source or sink were reading corrupted data. For whatever reason, cable STB manufacturers seem to have the most problems getting their firmware right (even satellite boxes, on average, seem to have better firmware).
Now throw in a matrix switch or AVR, where the source is going to multiple sinks, and you have where HDMI is today. Luckily, the chipsets and firmware are improving; the level of complaints for simple setups seems to be going down over the years. But people are building more complex setups (Denon now offers multi-zone over HDMI, for instance), which is going to push HDMI into new problems.
There used to be a thread about what we would like to see in HDMI in the future. My choices would be:
1) Multiple audio streams (optional) - send both a multichannel and a stereo stream at the same time, which would solve most of the current matrix-switcher problems people have
2) A more robust error-detection system - if the EDID is in error, indicate that on the display. If it's an HDCP problem, show that. If there are too many bit errors to tell, show that. Of course, then you would have to make sure the firmware was telling the truth.
Add in what Joe said earlier and I think you'll have an answer to your questions.