Originally Posted by jimshowalter
If bits are streamed and clocking is done on the receiving end, the cable is irrelevant. The bits could be typed in by hand on a manual typewriter, scanned, OCR'd, and turned into bits on the receiving end, and provided the same bits wind up there, the transport medium is utterly irrelevant.
I agree that timing is critically important when the samples are taken, but the cable isn't taking samples, it's just transmitting samples.
I agree that timing is critically important when the samples are converted back to analog, but the cable isn't performing D/A, it's just transmitting samples.
If there are digital transport standards idiotic enough to make the timing depend on the cable, they should be abolished. Are there such standards? And, even if there are, what's to keep the receiving end from buffering and reclocking the data?
Completely agree with this - the cable either works or doesn't. Lost samples sound VERY bad.
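To the quoted question about buffering and reclocking: that is exactly what a well-designed receiver does. Here is a toy sketch (not any particular chip's implementation) of the idea — arrival jitter gets absorbed by a FIFO, and output timing comes from a stable local clock instead of the cable:

```python
from collections import deque

class ReclockingBuffer:
    """Toy model: absorb arrival-time jitter by buffering samples,
    then draining them at a steady local clock rate."""
    def __init__(self, depth=64):
        self.fifo = deque(maxlen=depth)

    def on_receive(self, sample):
        # Called whenever a sample arrives -- arrival timing may be irregular.
        self.fifo.append(sample)

    def on_local_clock_tick(self):
        # Called by a stable local oscillator; output timing no longer
        # depends on how the samples arrived over the cable.
        return self.fifo.popleft() if self.fifo else 0  # 0 = underrun (audible!)

# Samples arrive in a jittery burst, then get drained steadily:
buf = ReclockingBuffer()
for s in [1, 2, 3, 4]:
    buf.on_receive(s)
out = [buf.on_local_clock_tick() for _ in range(4)]
# out == [1, 2, 3, 4] -- same bits, now on the local clock
```

The real engineering is in keeping the buffer from under- or over-running when the two clocks drift, but the principle is just this: the bits are decoupled from the transport timing.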
It is very easy to stay in the digital domain using SPDIF to SPDIF, AES/EBU to AES/EBU, etc., and verify bit-for-bit transfers with no corruption or degradation of the audio data. Studios all over the world do this routinely. The cable, if it works, works. The analog conversion is where the rubber meets the road. That said, you would be surprised how well good converters and good analog gear can reproduce audio.
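Verifying a transfer like that is simple in software: compare the PCM payload of the source file and the captured copy, byte for byte. A minimal sketch using Python's standard `wave` module (the file names in the usage example are hypothetical):

```python
import wave

def bit_identical(path_a, path_b):
    """Compare the PCM payload of two WAV files, ignoring header metadata."""
    def frames(path):
        with wave.open(path, "rb") as w:
            return w.readframes(w.getnframes())
    return frames(path_a) == frames(path_b)
```

Something like `bit_identical("mix_bounce.wav", "spdif_capture.wav")` returns True only when every sample matches exactly — one flipped bit anywhere and it fails.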
I did a test several years ago to help me understand, on a basic level, what was and was not important with digital audio workstations. I started with 16 tracks of digital audio recorded onto a pair of TASCAM DA88s: multi-track drums, bass, guitar, and keys. My goal was to do an identical mix in several DAWs, including Protools, Nuendo, Samplitude, and Sonar, and determine whether there would be any sonic differences. I also had a small Allen & Heath 16-channel analog mixer that I used for live sound — not very high-end or expensive, but of good basic quality — so I decided to do the same mix on that as well. I carefully calibrated the A&H inputs to provide the same channel gain as the DAWs, with all faders set to 0 and all tracks panned hard left or hard right (the hard panning was required because pan law differs from mixer to mixer). It was also imperative that the recording deck (a TASCAM DAT) fed from the A&H be clocked to the DA88s, or there would have been drift between the playback and the recording. Now I had 5 mixes — the four DAWs plus the A&H — all of equal length. I put them on a CD and proceeded to blind-compare them on random playback, making notes....
Cymbals clearer on #1, piano smoother on #2, bass more defined on #4... cymbals dull on #1 — say what? My notes after half an hour of listening were completely inconsistent, and I started really paying attention. Putting away expectations and being honest with myself, they all sounded the same. So it was back into the studio and into some digital analysis software. It turns out all 4 DAW mixes were indeed bit-for-bit identical — proving that math is math and computers are quite good at math. More impressive was the frequency response of the A&H mixer. While its mix was of course not bit-identical to the digital ones, it was incredibly close across the board. Granted, these were very simple, lab-ish mixes, but it goes to show how robust audio processing can be with decent equipment.
It also illustrated that sending the digital mixes across SPDIF into the TASCAM DAT, with the DAT locked to the SPDIF clock no less, produced exactly the same bit-for-bit mix as an internal software bounce. Digital audio cable transmission is impeccably reliable as long as the specs are followed reasonably well and the equipment is working properly.
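The "bit-for-bit identical" check the analysis software performed amounts to a null test: subtract one mix from the other, sample by sample, and see if anything survives. A toy sketch with stand-in sample values (not my actual mix data):

```python
# Null test: subtract one mix from another, sample by sample.
# An all-zero residual means the two mixes are bit-identical.
def null_test(samples_a, samples_b):
    return [a - b for a, b in zip(samples_a, samples_b)]

mix_daw_1 = [0, 1200, -340, 7]   # stand-in 16-bit sample values
mix_daw_2 = [0, 1200, -340, 7]
residual = null_test(mix_daw_1, mix_daw_2)
print("bit-identical" if not any(residual) else
      f"peak residual {max(map(abs, residual))}")
```

For the A&H mix the residual was of course nonzero, which is exactly what "not bit-identical but incredibly close" means: a small residual rather than a silent one.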
When I read the white paper on the ESS Sabre DACs used in the Oppo players and how they deal with re-clocking - http://www.esstech.com/PDF/sabrewp.pdf
- it seems they have it covered and should be doing a very good job with the D/A conversion.
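Re-clocking matters at the converter because a timing error on a fast-moving signal becomes an amplitude error: roughly the signal's slew rate times the jitter. A back-of-the-envelope calculation (my own, not from the white paper) shows why the tolerances are so tight for a full-scale 16-bit sine at 20 kHz:

```python
import math

# How much sampling-clock jitter keeps the worst-case amplitude error
# under 1 LSB for a full-scale 16-bit sine at 20 kHz?
f = 20_000              # signal frequency, Hz
fullscale_lsb = 2**15   # amplitude of a full-scale 16-bit sine, in LSBs
max_slew = 2 * math.pi * f * fullscale_lsb  # peak slew rate, LSB/second
jitter_limit = 1 / max_slew                 # seconds of jitter per 1 LSB of error
print(f"{jitter_limit * 1e12:.0f} ps")      # on the order of a few hundred ps
```

That works out to roughly a quarter of a nanosecond — which is why a DAC that buffers the incoming data and re-times it from its own low-jitter oscillator, rather than trusting the recovered SPDIF clock, is doing the right thing.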