Originally Posted by caunyd
or is it irrelevant, since it's digital, either you get the signal, or you don't?
I am sorry, but this is incorrect. I'll copy and paste my response from another topic on HDMI to explain why:
"HDMI is a digital solution, yes, but thinking it suffers no degradation if there is a picture is absolutely incorrect.
Just like classic Ethernet, which uses Manchester encoding, the cable medium actually carries an encoded signal: dips and peaks in the electrical level at a set rate. These are really logical dips and peaks, though; the actual waveform has quite a bit of sloping and other non-uniform structure. The encoder and decoder therefore use a particular tolerance level when judging the underlying signal.
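If it helps to picture what "judging against a tolerance" means, here's a toy Python sketch of Manchester-style encoding (just the idea, not the actual Ethernet or HDMI spec): each bit becomes a mid-bit swing between two levels, and the decoder compares the two halves of each bit period against a tolerance band.

```
# Toy sketch of Manchester-style line coding (illustrative, not any real spec).

def manchester_encode(bits, high=1.0, low=-1.0):
    """Encode each bit as a (first_half, second_half) voltage pair.
    Convention used here: 1 = low-to-high transition, 0 = high-to-low."""
    samples = []
    for b in bits:
        samples += [low, high] if b else [high, low]
    return samples

def manchester_decode(samples, tolerance=0.2):
    """Recover bits by comparing the two halves of each bit period.
    If the difference falls inside the tolerance band, the decoder can
    no longer tell which transition it saw."""
    bits = []
    for i in range(0, len(samples), 2):
        first, second = samples[i], samples[i + 1]
        if second - first > tolerance:
            bits.append(1)      # low-to-high
        elif first - second > tolerance:
            bits.append(0)      # high-to-low
        else:
            bits.append(None)   # ambiguous: signal too degraded to judge
    return bits

data = [1, 0, 1, 1, 0]
print(manchester_decode(manchester_encode(data)))  # [1, 0, 1, 1, 0] on a clean line
```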
Thing is, wires are not made of superconductive material, so they suffer from resistance. On top of that, they suffer from crosstalk. Examining the HDMI cable structure, it seems the designers have tried to minimize crosstalk, but outside influences can still affect the wires. The result? Periodic fluctuations in the signal and, at times, incorrect decoding, if the resistance of the wires over a long enough run starts to shrink the difference between the peaks and dips in the encoded signal.
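To make that "shrinking difference between peaks and dips" concrete, here's a rough numeric sketch with made-up voltages (not HDMI spec values): the receiver decides each bit with a simple threshold, and once the remaining swing is comparable to the crosstalk/noise, some bits get decided wrongly even though electricity is still arriving at the far end.

```
# Rough illustration with assumed numbers: attenuated swing + noise => bit errors.
import random

random.seed(0)
bits = [random.randint(0, 1) for _ in range(10_000)]

swing = 0.5   # volts of swing left after a long, resistive run (assumed)
noise = 0.25  # standard deviation of crosstalk/interference in volts (assumed)

errors = 0
for b in bits:
    tx = swing if b else -swing           # the level the source drove
    rx = tx + random.gauss(0, noise)      # what the receiver actually sees
    decoded = 1 if rx > 0 else 0          # simple threshold decision
    errors += (decoded != b)

print(f"bit errors: {errors} out of {len(bits)}")
```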
The signal may be digital, but it still uses electricity to carry the information between devices. If the signal gets degraded as described above, you'll still get a picture (assuming the degradation doesn't break the security handshaking), but it can very well affect the information at the pixel level. This may show up as incorrect colour representation, flickering pixels, etc., while you still receive a full image. Some of this I have experienced personally."
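To put the pixel-level point from that quote into numbers (hypothetical values): flip one high-order bit in an 8-bit colour channel and the pixel still renders, just with the wrong colour.

```
# One mis-read bit in a colour channel: still a valid pixel, wrong colour.
red = 0b10110100            # 180: the intended red intensity for one pixel
corrupted = red ^ (1 << 6)  # one high-order bit flipped by a decode error
print(red, corrupted)       # 180 vs 244 -- the image keeps coming, the colour is off
```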
Coaxial can run into exactly the same problems HDMI does. As the signal slowly degrades, it can affect how each bit is interpreted, giving rise to bit-level errors that turn into audio errors even while you're still receiving a full signal.
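Same idea on the audio side. A rough illustration with a hypothetical 16-bit PCM sample: one mis-read high bit turns a quiet sample into a loud transient, which you'd hear as a click or pop while the stream as a whole keeps playing.

```
# One bit error in a hypothetical 16-bit PCM sample.
sample = 1200                   # a fairly quiet sample value
corrupted = sample ^ (1 << 14)  # a single decode error in bit 14
print(sample, corrupted)        # 1200 vs 17584: an audible spike
```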
It gets a bit more complex when we're talking about optics carrying that signal. Fiber optic communication generally uses either short bursts or a constant stream that is modulated to carry the data (the latter being for much higher transfer speeds). Optical transmission is usually limited by the quality of the optic cable and the light source: some links use LEDs for simple, short point-to-point communication, while lasers are used for longer-distance runs (and also allow multiplexing). I would assume S/PDIF uses LEDs for practicality and economy's sake.
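For a feel of the distance limit, here's a back-of-the-envelope sketch. The launch power, per-metre loss, and receiver sensitivity below are assumed numbers for illustration, not S/PDIF or TOSLINK spec values.

```
# Assumed optical power budget: once the arriving power drops near the
# receiver's floor, it starts mis-judging pulses, i.e. bit-level errors.
launch_dbm = -15.0       # assumed LED launch power into the fibre
loss_db_per_m = 0.2      # assumed attenuation of cheap plastic optical fibre
sensitivity_dbm = -24.0  # assumed weakest level the receiver can judge reliably

for metres in (1, 5, 10, 25, 50, 100):
    received = launch_dbm - loss_db_per_m * metres
    status = "ok" if received > sensitivity_dbm else "marginal / erroring"
    print(f"{metres:>3} m: {received:6.1f} dBm  {status}")
```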
So, coaxial would, in theory, still be vulnerable to the same old problems (EMI, resistance, etc.), while optical would be vulnerable on long runs. That said, you're unlikely to notice any errors on coaxial: the digital signal has far more margin than an analog one, so it is less sensitive to fluctuations even when they do occur.
So, really, the two are about equal, with optical being the best if you're paranoid about interference, but possibly worse on long runs, since the light output decreases with distance, which can also lead to bit-level errors (optical receivers have tolerance levels for judging the signal too, because of stray light). This is only conjecture on my part; I haven't looked into the S/PDIF optical spec, so I'm not sure what the standard transmission source is (LED? laser?), and it also depends on the grade of your optical cable. With good optical cable you may easily outrun coaxial; LED optics can still run quite a long way before needing a repeater.
Either way, runs of that length are probably out of the question for any sane HT project, so my top choice would be optical. Realistically, though, coaxial will work just as well, unless your HT room is generating an amazing amount of interference.