Originally Posted by rider
I'm probably being dense here (and apologies to others if this is fixating too much on detail)...but I'm not seeing how all data on the HDMI received from the source - including the clock - is not perfectly intact prior to decoding. I understand various jitters introduced in successive stages of decoding and D/A conversion, but isn't that all produced by the processor?
As I said, it is non-intuitive, so naturally it is a bit hard to follow.
Assume the HDMI clock has a 1% timing error.
Assume the HDMI receiver can handle up to 5% timing error. Anything less than that means the data can be captured perfectly, so we are able to recover all the samples.
Now, we need to output those samples. We can't just send them out as fast as we can. If the source sample rate is 48 kHz (typical for movies), we need to output 48,000 samples per second. That is where the clock comes in.
Where do we get a clock? Well, we could make our own and, while at it, make it highly accurate. But this won't work. Why? Because "48 kHz" is only the nominal sampling rate of the input audio. In reality, the source could be off by a sample or two every few seconds. In other words, the data rate of the audio stored on the BD disc might be 47.9995 kHz, not exactly 48 kHz.
If we use our local clock, we would be outputting samples too fast when the input is really 47.9995 kHz instead of 48 kHz. By doing so, the audio gets ahead of the video and we lose sync.
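To put rough numbers on it, here is a little back-of-the-envelope sketch. The rates and the 40 ms lip-sync threshold are just illustrative assumptions, not anything from a spec:

```python
# Hypothetical numbers for illustration: a fixed local 48 kHz output clock
# consuming audio that actually arrives at 47.9995 kHz.
local_rate = 48_000.0     # samples/s our local clock outputs
source_rate = 47_999.5    # samples/s actually arriving (47.9995 kHz)

# Extra samples we output (and eventually run out of) per second:
drift_per_second = local_rate - source_rate   # 0.5 samples/s

# Time until audio leads video by 40 ms (a commonly cited point
# where lip-sync error becomes noticeable; assumed here):
lead_target_s = 0.040
samples_needed = lead_target_s * local_rate
seconds_to_skew = samples_needed / drift_per_second
print(f"audio leads video by 40 ms after ~{seconds_to_skew:.0f} s")
```

So even a tiny 0.5 sample/s mismatch accumulates into a visible sync error within about an hour of playback, which is why the free-running accurate clock is a non-starter.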
The simple solution then is to not create a local clock, but to use the HDMI clock and derive our sample rate from that. Since HDMI "speaks the truth" about how fast the samples are arriving, we stay in lockstep with the video.
With me so far?
OK, now if the HDMI clock has a 1% timing error, a percentage of that winds up on our output audio clock because we are deriving our clock from it. Since the data was not digitized with that same timing error, we have now introduced distortion into our audio stream (sampling theory only works if the two timings are identical).
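You can see the distortion mechanism in a toy simulation: the samples were captured on a perfect time grid, but playing them out at slightly wrong instants is equivalent to hearing a slightly different waveform. The tone frequency and the 2 ns RMS jitter figure below are made-up illustrative values:

```python
import math, random

random.seed(0)
fs = 48_000.0          # nominal sample rate, Hz
f = 1_000.0            # test tone, Hz (assumed for illustration)
n = 48_000
period = 1.0 / fs

# Samples were captured on a perfect grid...
captured = [math.sin(2 * math.pi * f * (i * period)) for i in range(n)]

# ...but are played out at jittered instants (say, 2 ns RMS jitter).
jitter_rms = 2e-9
play_times = [i * period + random.gauss(0.0, jitter_rms) for i in range(n)]

# What the listener effectively hears is the value the waveform should
# have had at the jittered instant; the difference is the injected error.
heard = [math.sin(2 * math.pi * f * t) for t in play_times]
err_rms = math.sqrt(sum((a - b) ** 2 for a, b in zip(captured, heard)) / n)
sig_rms = math.sqrt(sum(a * a for a in captured) / n)
print(f"jitter-induced error is about {20 * math.log10(err_rms / sig_rms):.0f} dB relative to the tone")
```

Note that the error scales with both the jitter amount and the signal frequency, which is why clock quality matters more than it first appears.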
More sophisticated solutions exist. One can use a PLL (phase-locked loop), which allows a clock circuit to vary while tracking a precise source for its core frequency. This filters some of the jitter from the source. Alas, a single PLL can't filter it all, so another solution calls for using two PLLs which, together, can get rid of much of the jitter. Good circuit design expertise is needed, though, to get this right.
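Conceptually, a PLL acts like a low-pass filter on the incoming clock's phase: fast phase wander (jitter) is attenuated while the long-term average rate is preserved, so we keep A/V sync. Here is a minimal first-order model of that idea; the 100 Hz loop bandwidth and unit-variance phase noise are assumptions for the sketch, not values from any real receiver design:

```python
import math, random

random.seed(1)
fs = 48_000.0                        # update rate of the model, Hz
loop_bw_hz = 100.0                   # loop bandwidth (assumed)
alpha = 2 * math.pi * loop_bw_hz / fs  # per-sample smoothing factor

# Input phase noise (jitter) on the incoming clock, unit RMS for scale.
jitter_in = [random.gauss(0.0, 1.0) for _ in range(48_000)]

# The recovered clock's phase tracks the input through a low-pass loop.
out = 0.0
jitter_out = []
for j in jitter_in:
    out += alpha * (j - out)         # first-order low-pass on phase
    jitter_out.append(out)

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

print(f"jitter RMS in: {rms(jitter_in):.3f}, out: {rms(jitter_out):.3f}")
```

Cascading a second loop filters the residual wander further, which is the intuition behind the two-PLL approach: the first cleans up the bulk of the HDMI jitter, the second runs with a narrow bandwidth off a much cleaner reference.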
Hope this is clearer now.