Originally Posted by MarkHotchkiss
Originally Posted by arnyk
Characteristic impedance doesn't matter until a cable is more than 1/8 of a wavelength long, at the highest frequency transmitted.
And, of course, the manufacturers know this. I don't believe they even bother to control the impedance when molding short cables.
Exactly. I suspect that when a cable is billed as a video or digital audio cable, the actual wire has a characteristic impedance of something like 75 ohms.
At digital audio data rates, the reflections caused by impedance mismatches across RCA or BNC connectors are moot.
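To put a rough number on that 1/8-wavelength rule of thumb, here is a minimal sketch. The velocity factor and the S/PDIF framing figures are my assumptions for illustration, not from the posts above: a 48 kHz S/PDIF stream carries 64 bits per frame, and biphase-mark coding doubles the transition rate, giving a fundamental around 6.1 MHz.

```python
# Sketch: estimate the cable length at which characteristic impedance
# starts to matter, per the 1/8-wavelength rule of thumb quoted above.
# All numbers below are illustrative assumptions.

C = 299_792_458            # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66     # typical solid-polyethylene coax (assumption)

def eighth_wavelength_m(freq_hz: float) -> float:
    """Cable length in metres equal to 1/8 wavelength at freq_hz."""
    wavelength = C * VELOCITY_FACTOR / freq_hz
    return wavelength / 8

# S/PDIF at 48 kHz: 64 bits per frame, biphase-mark coding roughly
# doubles the transition rate -> fundamental near 6.144 MHz (assumption).
spdif_fundamental = 48_000 * 64 * 2
print(f"1/8 wavelength at S/PDIF rates: {eighth_wavelength_m(spdif_fundamental):.1f} m")
```

Under these assumptions the threshold comes out at around 4 metres, which is consistent with the point that short molded interconnects are well below the length where impedance control matters.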
One can learn a lot about this sort of thing by playing around with high-resolution video signals, such as one of the pieces of coax in an RGB monitor cable running at 2048 x 1080 x 60. The signal frequencies are easily 10 times what you would see with digital audio, and any reflections in the cable show up as fringing around the picture on the monitor.
The gold standard for this sort of thing is 75 ohm coax and impedance-matched BNC connectors. Generally, sleazing off to RCA connectors has no visible effects, even at 10 times the frequency of a digital audio signal. However, six inches or more of badly impedance-mismatched line, such as you might find in a cheap video switch, can cause visible effects. Multiply that length by 10 as a rough guide to what it takes to cause problems with digital audio. Most DACs are far more tolerant of dirty signals than video monitors are.
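A quick back-of-the-envelope check of the video-vs-audio comparison, under the same assumed velocity factor and S/PDIF framing as before. The pixel clock here ignores blanking intervals, so the real clock would be somewhat higher:

```python
# Sketch: compare a 2048 x 1080 x 60 video pixel clock with a digital
# audio fundamental, and find 1/8 wavelength at the video rate.
# Figures are illustrative assumptions, not measurements.

C = 299_792_458
VELOCITY_FACTOR = 0.66                  # typical coax (assumption)

pixel_clock = 2048 * 1080 * 60          # ~133 MHz, active pixels only
spdif_fundamental = 48_000 * 64 * 2     # ~6.1 MHz (assumption)

ratio = pixel_clock / spdif_fundamental
eighth_m = C * VELOCITY_FACTOR / pixel_clock / 8

print(f"video/audio frequency ratio: about {ratio:.0f}x")
print(f"1/8 wavelength at video rates: about {eighth_m * 100:.0f} cm")
```

With these assumptions the ratio comes out around 20x and the 1/8-wavelength threshold lands near 19 cm, i.e. in the same ballpark as the "six inches of mismatched line" that produces visible fringing.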