Originally Posted by joeydrunk
Oh god, ok I'm done here. I don't drink by the way. You guys and your hdmi cables get so weird about everything. Just trying to ask questions and learn once again. Have fun in your amazingly diverse forum topic.
Thin skinned, eh? He who throws stones should expect a few back. He who spouts nonsense (or non-science) will get some grief back in this forum.
But...the answer to your question is that noise is random. If you assume a single-bit noise event, then you're talking about a single pixel glitching once across all those frames. But that effect is unlikely in practice - noise doesn't usually arrive one bit at a time, particularly on a high-speed serial link.
So if you get a noise burst, it will affect multiple bits, and as Colm said the result is multiple unreadable pixels. One thing we haven't mentioned is that there is also encryption on the video/audio (the HDMI philosophy being that all users are probably crooks - OK, I paraphrased). Since the encryption internals are not widely available (unlike the key), it's hard to say how much of the original signal you can recover on the other end once noise is introduced into the encrypted stream.
So combine that with random noise affecting multiple bits and you get a very observable effect, as stated previously, or HDCP might get in the way and not decode the signal at all. But what will not happen is all of the low-order bits changing in exactly the same way. That wouldn't be noise - it would be an intelligent pattern.
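To make the "random noise vs. intelligent pattern" point concrete, here's a rough Python sketch (the function, pixel values, and burst position are my own, purely illustrative) of what a burst error does to a serialized pixel stream: the flipped bits land wherever the burst happens to fall, straddling pixel boundaries, so neighboring pixels change by wildly different amounts - nothing like every pixel's low-order bit changing identically.

```python
# Hypothetical 8-bit pixel values for one scanline segment.
pixels = [128] * 16

def burst_noise(pixels, start_bit, length):
    """Flip a contiguous run of bits across the serialized pixel stream,
    modeling a noise burst on a high-speed serial link."""
    out = list(pixels)
    for bit in range(start_bit, start_bit + length):
        idx, pos = divmod(bit, 8)       # which pixel, which bit within it
        if idx < len(out):
            out[idx] ^= 1 << (7 - pos)  # MSB-first serialization (assumed)
    return out

# A 12-bit burst starting mid-pixel: it clobbers the low bits of one
# pixel and every bit of the next - scattered damage, not a pattern.
corrupted = burst_noise(pixels, start_bit=20, length=12)
changed = [i for i, (a, b) in enumerate(zip(pixels, corrupted)) if a != b]
print(changed)                          # -> [2, 3]
print([corrupted[i] for i in changed])  # -> [143, 127]
```

The two damaged pixels end up at 143 and 127 from an original 128 - visibly different errors from one and the same burst, which is why burst noise shows up as garbage pixels rather than a subtle uniform shift.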
It's not a wireless transmission medium like TV or satellite, with a compressed video signal where a bad BER is expected and error correction is built in. HDMI is uncompressed and designed for a wire only. For most users, the BER should be effectively zero at 1080p/60 (or any of the current HDMI formats) with a good, undamaged, certified-length cable. If not, the errors will be obvious.
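For a rough sense of scale on "BER should be non-existent," here's a back-of-the-envelope calculation (pixel-data rate only; TMDS encodes each 8-bit byte as 10 line bits, so the actual wire rate is higher):

```python
# Raw pixel data rate at 1080p/60, 24-bit color.
pixels_per_frame = 1920 * 1080
bits_per_second = pixels_per_frame * 24 * 60
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s of pixel data

# Even a BER of 1e-9 - respectable for an uncorrected link - would mean
# roughly 3 flipped bits every single second at this rate.
errors_per_second = bits_per_second * 1e-9
print(f"~{errors_per_second:.0f} bit errors/s at BER 1e-9")
```

That's why a marginal HDMI cable fails visibly (sparkles, dropouts, or no picture) rather than subtly: at these rates there is no middle ground between "essentially error-free" and "obviously broken."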
And, if someone is really worried about this, use component video. No bit errors, and with a good, properly tuned cable the picture quality at the same resolution should be comparable between HDMI and component. Or someone could pump a signal down an HDMI pipe and compare the results with the source. That's essentially what an eye pattern test does (we talked about this last time), and it is required for high-speed certification at data rates exceeding those of all current HDMI chipsets.
http://www.hdmi.org/installers/eyediagram.aspx
http://www.bluejeanscable.com/articl...c-versions.htm
And, glad we talked before (thanks Chris for the link). I went back and re-read the February appends. I think Colm gave you really good advice back then that still applies,
"Neither of us has knocked the cables you asked about. The only thing we have knocked is the way companies like Monster and Chord advertise.
Perhaps you should follow your own advice and research how HDMI cables work and the factors that affect the signal..."