Originally Posted by Tcatman
This comment about jitter in the recording stream of analog-to-digital conversion was made by a well-known DAC designer, who always asserts that reducing jitter as low as possible results in audible improvements in reproduction through the latest and greatest DACs.
And it is certainly true - clock jitter at the clock input of either a DAC or an ADC can cause audible distortion. When designing a DAC, clock jitter used to be a serious headache - especially when the clock had to be recovered from the source material, as with S/PDIF. Not anymore - it is now trivial to have an essentially jitter-free clock. It seems clock jitter has gone the way of the boogie-man (it doesn't really exist, but people are scared to death of it anyway).
There should be almost zero
clock jitter in a recording signal chain, as the clock is derived from a quartz oscillator and is used to drive all elements of the digital chain. In my designs, I always start with a quartz oscillator for the master clock (MCLK), and derive the serial clock (SCLK) and the channel clock (LRCLK) from MCLK. I use those clocks to drive all
ADCs and DACs, as well as the DSP, so all components operate in lock-step. It's quite simple, and the result is that jitter is in the parts-per-million range, orders of magnitude lower than what will cause distortion.
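As a rough sketch of that kind of clock tree (the 256x and 64x multipliers below are common conventions for I2S-style interfaces, not necessarily the exact ratios in the author's designs):

```python
# Sketch of a typical audio clock tree. The 256x/64x multipliers are
# common conventions, NOT necessarily the author's exact design.
fs = 48_000               # sample rate (LRCLK), Hz

mclk = 256 * fs           # master clock from the quartz oscillator
sclk = 64 * fs            # serial (bit) clock: 32 bits x 2 channels per frame
lrclk = fs                # channel (word) clock: one cycle per sample frame

# Because SCLK and LRCLK are integer divisions of MCLK, every converter
# driven from this tree runs in lock-step with the one oscillator.
assert mclk % sclk == 0 and sclk % lrclk == 0
print(mclk, sclk, lrclk)  # 12288000 3072000 48000
```

The key point is that everything divides down from a single quartz reference, so there is no clock recovery step where jitter could creep in.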
. . . However, I wonder how jitter associated with each track interacts as you go through a mixing process. When did recording setups get this kind of gear? Does something like Pro Tools propagate error?
Good question. But first, let me explain that the audio tracks will not have jitter - jitter is only in the clock. If a clock with excessive jitter drives an ADC or DAC, then the audio will contain distortion
as a result of the clock-jitter. The distortion will then propagate throughout the chain, as the tools would not know to remove it. There wouldn't be any interaction between channels, however.
Just to be anal-retentive, let me say now that jitter itself cannot be heard, as it exists only in the (inaudible) clock. It is the distortion that the jitter might cause that could be audible, and that distortion behaves like any other distortion, including being just as difficult to remove.
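A toy numerical illustration of that point (my own sketch, not the author's analysis): if we sample a sine wave at time instants perturbed by clock jitter, the jitter itself never appears in the data - what gets baked into the track is the amplitude error it causes at each sample instant:

```python
import numpy as np

# Toy illustration: sampling a 1 kHz full-scale sine with a jittered clock.
rng = np.random.default_rng(0)
fs, f, n = 48_000, 1_000.0, 48_000
t_ideal = np.arange(n) / fs
sigma = 1e-6                      # 1 microsecond RMS jitter (deliberately huge)
t_jittered = t_ideal + rng.normal(0.0, sigma, n)

clean = np.sin(2 * np.pi * f * t_ideal)
jittery = np.sin(2 * np.pi * f * t_jittered)
error = jittery - clean           # this error is what ends up in the audio

# For small jitter, RMS error is roughly 2*pi*f*sigma / sqrt(2)
rms = float(np.sqrt(np.mean(error ** 2)))
print(rms)                        # about 0.0044 of full scale
```

Note that the error scales with both the jitter magnitude and the signal frequency: picosecond-level jitter on real converters produces errors many orders of magnitude below this deliberately exaggerated example.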
The Sheffield engineer in the video-magazine report was trying to capture a vital recording by going direct to disk without mixing (with great attention to the recording setup), asserting that his high-resolution ADC was the magic needed... a very different philosophy from how most music is put together. He is a huge proponent of the high-resolution 24-bit/192kHz standard.
Although it is questionable whether we can hear the higher resolution that 24-bit/192kHz affords, there are very good reasons to record
at high resolution, even if you later decimate down to a lower resolution. It is partially an issue of what the analog front-end can do, and partially an issue of what the DSP algorithms are capable of. For those reasons, many CDs are mastered at 352.8kHz (8x) and then decimated down to 16-bit/44.1kHz Redbook.
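A minimal sketch of that 8x decimation step (my own illustration, using a crude windowed-sinc low-pass; mastering-grade decimators are far more sophisticated):

```python
import numpy as np

# 8x decimation sketch: 352.8 kHz -> 44.1 kHz Redbook rate.
# Low-pass below the new Nyquist frequency, then keep every 8th sample.
fs_hi, factor = 352_800, 8
fs_lo = fs_hi // factor           # 44_100

# Windowed-sinc anti-alias filter with cutoff at the new Nyquist (fs_lo / 2)
taps = 255
m = np.arange(taps) - (taps - 1) / 2
h = np.sinc(2 * (fs_lo / 2) / fs_hi * m) * np.hamming(taps)
h /= h.sum()                      # unity gain at DC

x = np.sin(2 * np.pi * 1_000 * np.arange(fs_hi) / fs_hi)  # 1 s of 1 kHz tone
y = np.convolve(x, h, mode="same")[::factor]              # filter, then decimate

print(fs_lo, len(y))              # 44100 44100
```

The anti-alias filter before the sample drop is the whole game: without it, content between 22.05 kHz and 176.4 kHz would fold back into the audible band.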