Originally Posted by HogPilot
The point of the review was not to push the idea that some HDMI cables pass a "better" picture than others. What it did illustrate is that once a cable's capabilities are exceeded, it becomes painfully apparent in the form of major data loss or complete picture loss. Basically, if a cable works as advertised, you'll see the whole picture bit for bit; if not, the picture will be unwatchable. There's no middle ground.
This is a completely different idea than the one pushed by some of the more "over-marketed" cable manufacturers - that somehow a better HDMI cable will pass a sharper, more saturated, or more 3D picture. True, one should pick a cable designed and engineered for the purpose for which it is intended, but that hardly means you're not getting all you can from your system if you didn't pay through the nose for ridiculously overpriced HDMI cables.
Originally Posted by nicholc2
HDMI is a digital solution. 1's and 0's. With digital, there is no degradation of signal like there is with analog. You either get it or you don't. If you've ever had a bad HDMI cable, you know what I'm talking about. A $10 cable you can get from monoprice will give you the exact same quality picture as a $100 cable you get at BB. Don't let anyone tell you any different. And if you don't believe me, just do a quick search and you'll find plenty of articles on this subject to back me up.
Also, I'm a computer engineer in my day job, so I know a little about 1's and 0's. ;-)
I was browsing through the AVS forums after a long time away and just had to respond to this and other similar posts.
HDMI is a digital solution, yes, but the idea that the signal suffers no degradation as long as a picture comes through is absolutely incorrect.
Just like early Ethernet, which uses Manchester encoding (Differential Manchester is actually Token Ring's scheme), the cable medium carries an encoded signal; HDMI's version is called TMDS (transition-minimized differential signaling). That is, dips and peaks in the electrical signal at a set oscillation. These are logical dips and peaks, though: the real waveform has quite a bit of sloping and other non-uniform structure. The encoder and decoder then rely on a tolerance level, effectively a voltage threshold, for judging the underlying bits.
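If you want to see what "encoded signal" means in practice, here's a tiny illustrative sketch (plain Python, with invented +/-1 voltage levels) of classic Manchester encoding - the early-Ethernet scheme I just mentioned, not HDMI's actual TMDS coding. Every bit becomes a guaranteed mid-bit voltage transition, and the decoder recovers the bit from the direction of that transition:

```python
def manchester_encode(bits):
    """Each bit becomes two half-bit voltage levels with a guaranteed
    mid-bit transition: 1 -> low-to-high, 0 -> high-to-low (the
    IEEE 802.3 convention)."""
    out = []
    for b in bits:
        out += [-1, +1] if b else [+1, -1]
    return out

def manchester_decode(levels):
    """Read the direction of each mid-bit transition to recover the bit."""
    return [1 if first < second else 0
            for first, second in zip(levels[::2], levels[1::2])]

bits = [1, 0, 1, 1, 0]
wire = manchester_encode(bits)   # what actually travels down the cable
assert manchester_decode(wire) == bits
print(wire)  # [-1, 1, 1, -1, -1, 1, -1, 1, 1, -1]
```

The point of the exercise: what's on the wire is an analog voltage pattern that a receiver has to *interpret*, not the pristine 1's and 0's themselves.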
Thing is, wires are not made of superconducting material (unless you live in liquid nitrogen), so they suffer from resistance. On top of that, they suffer from crosstalk. Examining the HDMI cable structure, it's clear the designers have tried to minimize crosstalk, but outside interference can still affect the signal. The result? Periodic fluctuations in the signal and, at times, incorrect decoding, once the resistance of the wires over a certain distance starts to shrink the difference between the peaks and dips in the encoded signal.
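To put some toy numbers on that, here's a hedged little simulation (plain Python again; the voltage levels, noise figure, and "attenuation" are made-up numbers, not HDMI specs) of a threshold decoder. Notice that shrinking the signal amplitude - which is what resistance does over distance - makes errors creep in gradually rather than killing the picture outright:

```python
import random

def link(bits, amplitude, noise_sigma):
    """Toy cable model: each bit is sent as +/-amplitude volts plus
    Gaussian noise (standing in for crosstalk and interference)."""
    return [(amplitude if b else -amplitude) + random.gauss(0, noise_sigma)
            for b in bits]

def bit_errors(bits, amplitude, noise_sigma):
    """Receiver decision: anything above 0 V is read as a 1."""
    received = [1 if v > 0.0 else 0 for v in link(bits, amplitude, noise_sigma)]
    return sum(sent != got for sent, got in zip(bits, received))

random.seed(1)
bits = [random.randint(0, 1) for _ in range(100_000)]

# Short, well-made cable: lots of margin, effectively zero errors.
print("good cable:    ", bit_errors(bits, amplitude=1.0, noise_sigma=0.15))

# Long or poor cable: attenuation has shrunk the peaks toward the
# threshold, so some samples land on the wrong side - bit errors,
# not an all-or-nothing cliff.
print("marginal cable:", bit_errors(bits, amplitude=0.3, noise_sigma=0.15))
```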
The signal may be digital, but it still rides on electricity between devices. If the signal gets degraded as described above, you'll still get a picture (assuming the degradation doesn't break the HDCP handshake), but it can very well corrupt the information at the pixel level. That may show up as incorrect colors, flickering or sparkling pixels, etc., while you still receive a full image. Some of which I have experienced personally.
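As a concrete (hypothetical) example of why a bad bit shows up as a color glitch rather than a dead screen: each color channel of a pixel is typically sent as an 8-bit value, so one flipped bit just changes that number.

```python
# A mid-gray pixel: 8-bit R, G, B channel values.
r, g, b = 128, 128, 128

# A single-bit error in the most significant bit of the red channel...
r_corrupted = r ^ 0b1000_0000

# ...turns (128, 128, 128) into (0, 128, 128): one gray pixel renders
# as teal for a frame - a flicker or sparkle, not a blank screen.
print((r_corrupted, g, b))
```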