Originally Posted by Sheer Lunacy
I always wonder about the use of an external master clock with a single-unit transport-DAC combination.
An external DAC ought to generate its internal clock from the input signal with a high degree of accuracy
There seems to be a massive misunderstanding here of how modern systems actually operate and which clock domains are present.
Internally, the vast majority of modern "DAC" units I've seen look like:
input signal -> sampling rate converter -> DSP/microcontroller -> DAC -> output filters.
Under this structure, the input signal's clock generally clocks nothing but the incoming data stream. The SRC or DSP determines the sampling rate and bit depth of the input either from embedded header information (as in many formats) or by empirical analysis of the signal (as with raw PCM). The DSP/microcontroller then sets up the clocks for the actual DAC based on that information. The clock source for the DSP/microcontroller, the SRC, and ultimately the DAC is an on-board oscillator; the input clock does not influence the system clock at all.

Of course, you could build a system that runs off the recovered input clock, but there is little to no reason to do so unless you are truly in dire need of saving $0.20 on build cost by leaving the crystal/oscillator off the board. The existence of modern SRCs and controllers has made such designs completely unnecessary.
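To make that concrete, here's a toy sketch of what the SRC stage does. Everything here is hypothetical (made-up rates, linear interpolation instead of a real polyphase filter), but it shows the key point: the number and timing of output samples is set entirely by the local clock, not by whatever clocked the input stream.

```python
import math

def asrc_linear(samples, in_rate, out_rate):
    """Toy asynchronous sample-rate converter: out_rate comes from the
    'on-board oscillator', not from the input stream's clock."""
    out = []
    step = in_rate / out_rate      # input samples consumed per output sample
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # linear interpolation between neighbouring input samples
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += step
    return out

# a 1 kHz sine sampled at 44.1 kHz by the source...
src = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(441)]
# ...replayed on the DAC's own 48 kHz clock
dac_feed = asrc_linear(src, 44100, 48000)
print(len(src), len(dac_feed))  # output length follows the local clock
```

A real ASRC chip does this with long interpolation filters and continuously estimates the rate ratio, but the clock-domain separation is the same.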
The only place where this "external clock" concept makes any sense is extremely low-latency recording, since running multiple clock domains generally implies buffers, however small they may be. This was of course mentioned some 30 posts ago by ap1, but appears to have been ignored.
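Just to put a number on what those buffers cost: the depth below is a made-up but plausible figure, and the arithmetic is the whole point.

```python
buffer_depth = 64   # samples held to absorb clock-domain drift (hypothetical depth)
fs = 48_000         # DAC sample rate in Hz
latency_ms = buffer_depth / fs * 1000
print(f"{latency_ms:.2f} ms added by the buffer alone")
```

Negligible for playback, but it stacks up fast if you're monitoring yourself through the chain while recording, which is why pro gear syncs everything to one master clock.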
If you want to be constructive in this thread, how about someone provide a detailed analysis of what an external clock is supposed to do for sound quality?
Honestly, I'd be really, really interested in what these devices even do with an external clock. I wouldn't be at all surprised if half of them don't even use it, and the difference you see in performance is just a change in system noise from having two similar, but not identical, clocks bouncing around. If you think jitter is a problem, try sending a data stream and its clock through two completely different cables of different lengths at moderate speed. There is a reason encoding schemes exist that include the clock in the data stream.
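For anyone wondering what "include the clock in the data stream" means: biphase-mark coding (the scheme S/PDIF uses) guarantees a transition at every bit boundary, so the receiver recovers the clock from the same wire that carries the data, and there's no separate clock cable to skew. A toy encode/decode, just to show the mechanism:

```python
def bmc_encode(bits, level=0):
    """Biphase-mark coding: a transition at every bit boundary carries
    the clock; an extra mid-bit transition encodes a 1."""
    out = []
    for b in bits:
        level ^= 1          # boundary transition: always present -> clock
        out.append(level)
        if b:
            level ^= 1      # mid-bit transition only for a 1 -> data
        out.append(level)
    return out

def bmc_decode(cells):
    # a bit is 1 iff its two half-cells differ
    return [int(cells[i] != cells[i + 1]) for i in range(0, len(cells), 2)]

data = [1, 0, 1, 1, 0]
line = bmc_encode(data)
assert bmc_decode(line) == data
```

The real S/PDIF frame adds preambles and channel-status bits on top of this, but the self-clocking trick is exactly the above.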