Originally Posted by MichaelJHuman
I did some googling, but I did not see anything concrete on this topic. HDMI seems to use a single clock for all three TMDS channels, and the clock denotes the "video pixel rate." The clock itself seems to use TMDS, which implies that the clock is sent over a balanced pair of wires (I am not an engineer; I am speculating based on my interpretation of the HDMI document).
None of that is very helpful. Firstly, jitter is one of those highly debated topics. Many people think it's irrelevant with properly made cables and electronics. Other people are spooked by it and have pursued various options, such as buying expensive aftermarket DACs. Secondly, I suspect few of us really understand the engineering. HDMI, in particular, would have to build up some sort of audio clock from the HDMI clock - there can't be too many people who understand the ramifications of that process.
I've worked with both HDMI and S/PDIF, so I'll give it a try!
Comparing jitter in HDMI (and its effect on the audio information) with jitter in S/PDIF coax or optical is a bit like comparing apples and oranges.
The HDMI clock is sent on one of the four TMDS channels; the other three are the data channels. Ten bits of information are sent on each data channel for every clock cycle. The receiver must lock onto the clock and, using a phase-locked loop (PLL), generate an internal clock that is 10 times faster than the clock coming over the TMDS line; it then uses this x10 clock to capture the three data streams.
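If it helps to picture that 10:1 relationship, here's a rough sketch in Python. The clock frequency, the names, and the deserializer are my own illustration, not anything from the HDMI spec or from real receiver silicon:

```python
# A minimal sketch (illustration only, not real receiver behavior) of the 10:1
# relationship between the TMDS clock and the per-channel bit rate.

TMDS_CLOCK_HZ = 148_500_000        # e.g. a 1080p60 pixel clock (assumed example value)
BITS_PER_CLOCK = 10                # each clock carries one 10-bit TMDS character per data channel

bit_clock_hz = TMDS_CLOCK_HZ * BITS_PER_CLOCK   # the x10 clock the receiver's PLL must produce

def deserialize(bits):
    """Group a recovered bit stream into 10-bit TMDS characters, one per clock."""
    return [bits[i:i + BITS_PER_CLOCK]
            for i in range(0, len(bits) - BITS_PER_CLOCK + 1, BITS_PER_CLOCK)]
```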
The audio information sent over HDMI travels as data packets in what the spec calls "Data Island Periods." These are little islands of otherwise unused time during the horizontal and vertical blanking (sync) periods. The newer HDMI standards simply defined a protocol that makes use of this idle time by injecting audio packets into it.
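The basic idea looks something like this sketch. The packet size and structure here are made up purely to show the concept; the real HDMI packet format is far richer:

```python
# A loose sketch: pixels go out during active video, and queued audio samples are
# flushed as small packets during the blanking "islands". Sizes are illustrative.
from collections import deque

audio_queue = deque()

def queue_audio(samples):
    audio_queue.extend(samples)

def send_scanline(active_pixels, samples_per_island=8):
    pixels = list(active_pixels)                        # active video period
    island = [audio_queue.popleft()                     # data island during blanking
              for _ in range(min(samples_per_island, len(audio_queue)))]
    return pixels, island
```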
For the audio to be heard, the receiver must extract these packets and perform any error correction that may be needed. Since the packets arrive in high-speed bursts, the data must be buffered; if it's a compressed format, it must be decompressed; it must be routed to the proper channels (stereo, 5.1, 7.1); and it must then be delayed (more buffering) by the right amount so it is decoded in sync with the video, especially if any kind of video scaling is being done.
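A rough sketch of those receiver-side steps, heavily simplified (real packets carry headers and error-correction data, and a compressed format would need an actual decoder):

```python
# Simplified receiver-side audio path: buffer the bursts, route to channels,
# then add delay so the audio lines up with the (possibly scaled) video.
def hdmi_audio_path(data_island_packets, lip_sync_delay_samples, channels=2):
    buffered = []
    for packet in data_island_packets:                  # bursts arrive and get buffered
        buffered.extend(packet)                         # (error correction omitted here)
    routed = [buffered[c::channels] for c in range(channels)]   # de-interleave to channels
    silence = [0] * lip_sync_delay_samples              # extra delay for lip sync
    return [silence + ch for ch in routed]
```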
So while the HDMI link itself is very susceptible to jitter, the results show up as data errors: sparkles, blue screens, loss of HDCP, complete loss of audio, that sort of thing.
Since HDMI audio is sent in packets (similar to streaming audio over the net), any jitter in the sample-rate clock is (as stated earlier) an engineering problem on the receiver side, completely independent of the HDMI clock. It's just like streaming audio: the exact timing of the TCP/IP packets doesn't matter, because as long as the data is there before you need it, there won't be any gaps in the audio.
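Here's one way to picture that, as a little sketch. The function names and the 48 kHz figure are just assumptions for illustration:

```python
# Why packet-arrival timing doesn't set audio timing: the DAC side pulls samples
# on its own steady clock, and only an empty buffer (an underrun) would be audible.
from collections import deque

fifo = deque()

def on_packet_burst(samples):        # bursty, jittery arrival over the HDMI link
    fifo.extend(samples)

def on_dac_tick():                   # steady local sample-rate clock (e.g. 48 kHz)
    if fifo:
        return fifo.popleft()        # output timing comes from the local clock, not the link
    return None                      # underrun: the only case where arrival timing matters
```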
S/PDIF, on the other hand, sends its data as a continuous stream, and the clock embedded in that stream is in sync with the sample rate. A S/PDIF decoder locks onto the data stream's clock, and as each word is received it is sent to a DAC, with the clock recovered from the S/PDIF stream used to latch the data into the DAC. So if there is jitter in the S/PDIF clock, then depending upon the design, there can be jitter in the latching of each word into the DAC.
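A toy illustration of what that means in practice. The 200 ps RMS figure is an assumption I picked just to show the idea, not a measurement of any real gear:

```python
# If each word is latched into the DAC on the clock recovered from the S/PDIF
# stream, clock jitter becomes timing error at the conversion instant.
import random

FS = 44_100                          # CD sample rate, Hz
JITTER_RMS_S = 200e-12               # assumed RMS jitter on the recovered clock, in seconds

def conversion_instants(n_samples):
    """Ideal instant n/FS plus a small random error from recovered-clock jitter."""
    return [n / FS + random.gauss(0, JITTER_RMS_S) for n in range(n_samples)]
```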
So which one is better from a jitter point of view? It's hard to say. S/PDIF jitter can be affected by the source's jitter, whereas HDMI audio jitter is independent of the HDMI clock and depends on how well the audio decoder was designed.
For my ears, I'm much more concerned about compression than jitter. The loss of high frequencies and dynamic range in compressed audio (in general) drives me crazy. My CD collection is ripped in FLAC (a lossless format). I have a Sirius sat. receiver in my truck, but I mostly listen to the comedy channels and NPR; the music channels sound horrendous, but I'll bet their internal jitter is as stable as a quartz crystal!