Originally Posted by RWetmore
This is not correct. If it were, there wouldn't be the need for error correction circuitry at the receiving end.
Is there a reputable source about error correction and how the "cable" relates?
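For what it's worth, "error correction circuitry" at a receiver is usually just logic that checks redundant bits the transmitter added. A toy even-parity sketch in Python (purely illustrative; not claiming this is HDMI's actual mechanism):

# Toy illustration of link-level error detection (NOT HDMI's actual scheme):
# the sender appends a parity bit, the receiver re-computes it and can tell
# that a word arrived damaged, even though it cannot tell which bit flipped.

def add_parity(bits):
    """Append an even-parity bit to a list of 0/1 values."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """Return True if the received word passes the parity check."""
    return sum(word) % 2 == 0

original = [1, 0, 1, 1, 0, 1, 0, 0]
sent = add_parity(original)

received_ok = sent[:]                # undamaged copy
received_bad = sent[:]
received_bad[3] ^= 1                 # simulate one flipped bit in transit

print(check_parity(received_ok))     # True  -> word accepted as-is
print(check_parity(received_bad))    # False -> receiver knows this word is bad

Parity alone only detects a damaged word; actually correcting it takes a stronger code (see the Hamming sketch at the end of this post).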
A digital cable most definitely can and does distort the signal somewhat.
What is a "digital cable"? It's just a cable.
What is distorted, and how is it distorted to the point where it becomes "visible or audible"?
The better the cable, the better and less distorted the 1s and 0s come through at the other end.
That may be correct, but it's just a cable. A well-constructed cable and following the distance recommendations are all that's needed.
What's the difference between a non-distorted vs. a less-distorted 1 or 0?
As long as the receiver can ultimately interpret all the 1s and 0s the same (and send the same exact bitstream to the display), then a better cable won't make any difference, of course.
How does the cable make sure the interpretation is accurate?
I'm saying this isn't universally the case at all, especially at very high data rates. If there is a difference (however small), it's a higher-quality transmission of what amount to little square waves, i.e. the 1s and 0s of the digital signal. They don't come through as perfect square waves at the other end of the cable, but are always somewhat distorted (and sometimes heavily distorted). A higher-quality cable delivers the signal with less distortion, i.e. closer to the original square waves, than a low-quality cable.
Sounds good, but not quite accurate. Given a $5 cable or a $500 passive HDMI cable of equal length and construction (with no "crap" in between source and destination), the cable shouldn't be a distortion factor for sine waves or square waves (or tsunami waves).
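To put a number on the "distorted square waves" point, here's a rough simulation (all parameters made up, no particular cable in mind) where the waveform comes out visibly rounded off, yet the receiver still recovers exactly the same bits:

# Rough numerical sketch (assumed parameters, not a model of any real cable):
# a bit pattern is sent as a square wave, the "cable" is approximated as a
# first-order low-pass filter that rounds off the edges, and the receiver
# samples each bit interval and compares against a threshold.
# The waveform is visibly distorted, yet the recovered bits are identical.

import numpy as np

bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
samples_per_bit = 50
tx = np.repeat(bits, samples_per_bit).astype(float)   # ideal square wave

# Crude first-order low-pass "cable": y[n] = y[n-1] + alpha*(x[n] - y[n-1])
alpha = 0.15
rx = np.empty_like(tx)
acc = 0.0
for i, x in enumerate(tx):
    acc += alpha * (x - acc)
    rx[i] = acc

# Receiver: sample mid-bit and slice against a 0.5 threshold
centers = np.arange(len(bits)) * samples_per_bit + samples_per_bit // 2
recovered = (rx[centers] > 0.5).astype(int)

print("max waveform error:", np.max(np.abs(rx - tx)))        # clearly nonzero
print("bits match exactly:", np.array_equal(recovered, bits)) # True

That's the whole argument in miniature: the analog waveform degrades, but as long as it's still on the right side of the threshold at the sampling instant, the bit that comes out is identical, not "less distorted".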
I'm also saying I don't think the error correction circuitry always provides lossless recovery (i.e. bit-for-bit exact to the original), but instead often provides interpolated recovery of damaged or distorted bits.
And if there is a difference, this is why.
Wonder what HDMI.org has to say?
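And on the "interpolated recovery" point: forward error correction, when the damage is within what the code can handle, hands back the exact original bits, not an approximation. A minimal Hamming(7,4) sketch (a textbook code, not whatever HDMI actually specifies):

# Minimal Hamming(7,4) sketch: any single flipped bit per 7-bit block is
# corrected, and the decoder returns the *exact* original data bits --
# there is no interpolation involved.

def hamming74_encode(d):
    """d = [d1, d2, d3, d4] -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3       # 0 = no error, else 1-based error position
    if pos:
        c[pos - 1] ^= 1              # flip the damaged bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
damaged = codeword[:]
damaged[5] ^= 1                      # one bit corrupted in transit

print(hamming74_decode(damaged) == data)   # True: recovery is bit-exact

If the damage exceeds what the code can correct, the result is an uncorrectable error, not an interpolation; whether a given link then conceals, drops, or passes the bad data is up to the equipment, not the cable.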