Is HDMI asynchronous (does clock/jitter matter)? - AVS Forum
Old 12-09-2010, 07:40 AM - Brucemck2 (Thread Starter)

Is the HDMI interface/standard asynchronous, or does it operate more akin to standard TosLink or S/PDIF protocols?

My reason for asking is to better understand to what degree outbound jitter matters compared to "standard" legacy digital outputs, and correspondingly, to what degree higher-precision clocks affect HDMI sonics.

Please, no diversions around "legacy format jitter doesn't matter because receiver chips overcome that."

Old 12-09-2010, 08:13 AM - edorr

Quote:
Originally Posted by Brucemck2 View Post
Is the HDMI interface / standard asynchronous, or, does it operate more akin to standard Toslink or Spdif protocols?

My reason for asking is to better understand to what degree outbound jitter matters in comparison to "standard" legacy digital outputs, and correspondingly, to understand to what degree higher precision clocks impact HDMI sonics.

Please, no diversions around "legacy format jitter doesn't matter because receiver chips overcome that."
In what seems a lifetime ago I had a Denon 2500 HDMI transport and an Onkyo 885 SSP. This sounded like crap on BR, although my Sony XA5400 sounded very good DSD-direct over HDMI into the same receiver. I thought upgrading the clock in the Denon could solve the problem, so I sent the Denon out for a $1000 superclock upgrade. While the Denon was out for the upgrade I did some more research on the subject and ran into information explaining that since HDMI is asynchronous, the clock upgrade would be of little if any benefit. I had it shipped back without the upgrade, bought a Pioneer BDP-09 that I ran analog into the Onkyo, and never looked back (I sold the Denon for $200). As should be obvious from this response, I have absolutely no idea how these protocols really work or what the theoretical benefits (or lack thereof) of a clock upgrade for HDMI would be.

Old 12-09-2010, 04:05 PM - Greg_R

Quote:
Originally Posted by edorr View Post

While the Denon was out for the upgrade I did some more research on the subject and ran into some information that explained since HDMI is asynchronous

You need to go back and do some more research. HDMI specifically includes a TMDS clock as a pixel clock (look at the pin-out of the connector). However, the standard uses ECC, so any timing errors or data loss will either be corrected or result in an error; i.e., you aren't going to get a "soft" image or loss of audio content, you're going to get a complete failure. Additionally, different sections (control, video, audio) are sent in different packets, so it's not really a continuous stream of audio or video (audio is stuffed into the gaps in the video transmission).

If your signal is dropping & you have a long HDMI route then there are numerous options (Ethernet repeater, fiber, thicker gauge wire, etc.).
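For audio specifically, the sink derives its audio clock from that TMDS clock using the spec's Audio Clock Regeneration mechanism: the source transmits N and CTS values and the sink reconstructs 128·fs = f_TMDS·N/CTS. A minimal sketch (the N/CTS pair shown is the spec's recommended value for 48 kHz audio at a 148.5 MHz TMDS clock; the function name is mine):

```python
# HDMI Audio Clock Regeneration:
#   128 * f_s = f_tmds * N / CTS
# The source transmits N and CTS; the sink multiplies/divides its
# recovered TMDS clock to get the audio master clock. Any jitter on
# the recovered TMDS clock passes straight into the audio clock
# unless the sink's PLL filters it out.

def regenerated_audio_rate(f_tmds_hz: float, n: int, cts: int) -> float:
    """Audio sample rate the sink derives from the TMDS clock."""
    return f_tmds_hz * n / (128 * cts)

# Recommended values for 48 kHz audio with a 148.5 MHz TMDS clock
# (e.g. 1080p60): N = 6144, CTS = 148500.
fs = regenerated_audio_rate(148.5e6, 6144, 148500)
print(fs)  # 48000.0
```

This is why the thread's question is subtler than "is there a clock": there is one, but it is a video-domain clock from which the audio timing must be rederived.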

Old 12-09-2010, 04:57 PM - Brucemck2 (Thread Starter)

Greg_R, if I follow correctly, then the key to improved sonics is the quality of the PLL implementation in front of the DACs in the receiver?

Old 12-09-2010, 08:15 PM - edorr

Quote:
Originally Posted by Greg_R View Post

You need to go back and do some more research. HDMI specifically includes a TMDS clock as a pixel clock (look at the pin-out of the connector). However, the standard uses ECC so any timing errors or data loss will be corrected or an error will result i.e. you aren't going to get a "soft" image or loss of audio content; you're going to get a complete failure. Additionally, different sections (control, video, audio) are sent in different packets so it's not really a continuous stream of audio or video (audio is stuffed in between the gaps in the video transmission).

If your signal is dropping & you have a long HDMI route then there are numerous options (Ethernet repeater, fiber, thicker gauge wire, etc.).

I never had any signal-dropping issues. I just did not like the sound on BR and initially thought a clock upgrade could help. Your comment seems to confirm that this was indeed a misconception and that spending $1000 on a Superclock 4 would have been a waste of money.

Old 12-10-2010, 09:39 AM - Mark_H

Meridian went to a lot of effort to clean HDMI audio of jitter in their 621 interface...

"Meridian's highest-performance FIFO buffering technology removes HDMI jitter, consistently delivering a significant improvement in audio quality from all your HDMI sources."

http://www.meridian.co.uk/product-mo...processor.aspx

My cinema: The Cave!

My kit: 15' 2.35:1 Screen Research CP2 4-way mask, Sony vw1000es, Lumagen 2144, Meridian 861/621/7x5500/2xSW5500


Old 12-10-2010, 12:39 PM - Glimmie

Quote:
Originally Posted by Mark_H View Post

Meridian went to a lot of effort to clean HDMI audio of jitter in their 621 interface...

"Meridian’s highest-performance FIFO buffering technology removes HDMI jitter, consistently delivering a significant improvement in audio quality from all your HDMI sources."

http://www.meridian.co.uk/product-mo...processor.aspx

Any HDMI receiver implementation, even a $179 HT-in-a-box, needs an input FIFO prior to processing. And a FIFO is nothing more than a basic shift register, which these days is just a macroblock in an ASIC/FPGA design.

So this claim by Meridian, while true, is just marketing hype.

Glimmie's HT Page
Being redone - coming soon!


Old 12-14-2010, 10:01 AM - ex0du5

As I'm researching digital gear, I'm absolutely puzzled by what I'm hearing. Now, I'm a computer engineer, and though I have little experience in hardware/interface design (I'm more of a software developer now), I still understand how things should work.

People are talking about jitter being transported over S/PDIF and HDMI. How is this possible, exactly? I would have assumed that any decent digital output would buffer and perform error correction on the receiving end, like Ethernet does. If not, then what's the point of using a digital transfer? Assuming the receiving device buffers the input, that should completely eliminate the sending device's jitter, should it not? The only way jitter would be preserved is if the buffer is strictly FIFO with no ECC, but that sounds absurd to me.

Old 12-14-2010, 10:14 AM - edorr

Quote:
Originally Posted by ex0du5 View Post

As I'm researching digital gear, I'm absolutely puzzled by what I'm hearing. Now, I'm a computer engineer, and though I have little experience in hardware/interface design (more of a software developer now), I still understand how things should work.

People are talking about jitter being transported over spdif and HDMI. How is this possible, exactly? I would have assumed that any decent digital output would buffer and perform error correction on the receiving end, like ethernet does. If not, then what's the point of using a digital transfer? Assuming the receiving device buffers the input, then that should completely eliminate the sending device's jitter, should it not? The only way jitter would be preserved is if the buffer is strictly FIFO with no ECC, but that sounds absurd to me.

Everything that can be said on the issue has been said (although a lot of folks get great pleasure out of rehashing it, so you may actually get a response to your question). You can literally find 1000s of posts on the subject. Suffice it to say that source digital technology matters (otherwise we would all be using $50 CD-ROM drives), DACs matter, and protocols matter. I would focus your energies on getting good recommendations on architectural options, and ultimately on what to buy given your requirements (design goals) and budget.

Old 12-14-2010, 10:26 AM - Mark_H

Quote:
Originally Posted by Glimmie View Post

Any HDMI receiver implementation, even a $179 HT-in-a-box needs an input FIFO prior to processing. And a FIFO is nothing more than a basic shift register which these days is just a macroblock in an ASIC/FPGA design.

So this claim by Meridian while true, is just marketing hype.

From what I understand, Meridian delayed this box for quite some time until they were happy with the audio performance over HDMI. Yes, there's undoubtedly an element of hyperbole in their press release but I believe their commentary on the problems of audio over HDMI and the sincerity of their claims to have resolved the issues.


Old 12-14-2010, 10:32 AM - ex0du5

Quote:
Originally Posted by edorr View Post

Everything that can be said on the issue has been said (although a lot of folks get great pleasure out of rehashing it so you may actually get a response to your question). You can literally find 1000s of post on the subject. Suffice it to say that source digital technology matters (otherwise we would all be using $50 CD Rom drives), DACs matter and protocols matter. I would focus your energies on getting good recommendations on architectural options, and ultimately what stuff to buy given your requirements (design goals) and budget.

Yes, I understand that these matter. I want to understand the science behind the madness. Sorry to say, but I don't trust most audiophiles when it comes to digital gear. I have worked with and designed digital gear and synchronous communication, and I understand that it is entirely possible to eliminate jitter at a given stage by simply storing the information in memory. That is why I'm puzzled that there should be any jitter whatsoever introduced into the DAC by the transport. Now, I understand it's completely possible if the protocol was badly designed, and that very well may be the case. I'd like some clarification here because, although I have worked in the field, I haven't worked specifically with HDMI or S/PDIF.

I'm focusing on jitter at the moment because it's obviously the main problem that can be introduced into the system when moving from a high-end CD player to a digital player -> DAC combo.

Now, back on topic... since HDMI has error correction, it needs a buffer on the receiving end. As such, I can't see the jitter from the sending device being preserved, since I assume an ECC buffer would need to be randomly accessible, not FIFO (unless it's FIFO + addressable, I guess). This is, again, purely assumption, but I assume that the data is stored in the buffer and essentially retransmitted to the device's processor at that point.
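ex0du5's intuition can be illustrated with a toy model (purely illustrative, not the actual HDMI receiver design): samples arrive at jittered instants, are queued in a FIFO, and are read back out, with the payload unaffected by arrival timing. The question the rest of the thread turns on is the quality of the clock doing the reading.

```python
import collections
import random

random.seed(0)

# Toy model: the transport delivers samples at jittered instants and
# the sink queues them in a FIFO. Arrival jitter changes *when* a
# sample shows up, never *what* the sample is, so the payload read
# back out is bit-exact. Whether it then reaches the DAC with clean
# timing depends entirely on the sink's read clock.

samples = list(range(100))                 # stand-in audio payload
fifo = collections.deque()

t = 0.0
arrival_times = []
for s in samples:
    t += 1.0 + random.uniform(-0.3, 0.3)   # nominal spacing + jitter
    arrival_times.append(t)
    fifo.append(s)

read_out = [fifo.popleft() for _ in range(len(fifo))]
assert read_out == samples                 # data survives bit-exact
```

So buffering does recover the data perfectly; the remaining problem, as the next posts explain, is generating the read-out clock.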

Old 12-14-2010, 04:20 PM - amirm

Quote:
Originally Posted by ex0du5 View Post

Yes, I understand that these matter. I want to understand the science behind the madness.

I will give a brief answer. For more, search for my name and Glimmie and you will find a huge thread where he took your role and I took the opposite.

Quote:


Sorry to say, but I don't trust most audiophiles when it comes to digital gear. I have worked with and designed digital gear and synchronous communication, and I understand that it is entirely possible to eliminate jitter at a given stage by simply storing the information in memory. That is why I'm puzzled that there should be any jitter whatsoever introduced into the DAC by the transport.

As one engineer to another, let me say there is more to the story than the part of the chain described above.

Quote:


Now, I understand it's completely possible if the protocol was badly designed, and that very well may be the case. I'd like some clarification here, because as I have worked in the field, I haven't worked specifically with HDMI or SPDIF.

As a digital transport, nothing is wrong with either. Well, I take that back. HDMI sucks at that also, but that is for another topic, unrelated to jitter and fidelity.

Quote:


Now, back on topic...since HDMI has error correction, it needs a buffer on the receiving end. As such, I can't see the jitter from the sending device being preserved, since I assume an ECC buffer would need to be random accessible, and not FIFO (unless it's FIFO + addressible, I guess). This is, again, purely assumption, but I assume that the data is stored in the buffer, and essentially retransmitted to the device's processor at that point.

All correct. Let's make this very simple and assume that all digital data is extracted from the link perfectly. You now have audio samples (or bit stream from the compressed audio) ready to be played.

What happens next? As you know, a DAC requires a clock. Where do you get that clock? You have two choices:

1. Use any old oscillator. You take the sampling rate of the source and run the clock at that frequency. Well, this won't work! Why? Because the stated sampling rate is the nominal value of the audio samples, not the actual one. When content is encoded, for example, one could put out 47,999 samples/sec instead of 48,000 and still be correct. If you clock the audio one sample per second faster than the source, you drift over time, and pretty soon the audio is no longer in sync with the video.

2. Derive a clock from the HDMI source. You use a PLL and lock your frequency to the source. This gives you the correct data rate since you now are in sync with the incoming samples. But now, you have a performance issue. The HDMI clock is designed typically to be good enough for you to recover the data samples. Once there, designers think they are finished. Yet, we now need a very high precision clock to drive the DAC.
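The drift in option 1 is simple arithmetic. A sketch using the 47,999 vs 48,000 samples/sec example above:

```python
# If the DAC free-runs at exactly 48,000 Hz but the source actually
# produces 47,999 samples/s, the sink consumes one sample per second
# more than it receives, and audio slides out of sync with video.

source_rate = 47_999      # actual samples per second from the source
dac_rate    = 48_000      # free-running local DAC clock

deficit_per_sec = dac_rate - source_rate          # 1 sample/s short
drift_after_1h  = deficit_per_sec * 3600           # samples short
desync_seconds  = drift_after_1h / dac_rate

print(drift_after_1h)     # 3600 samples
print(desync_seconds)     # 0.075 s of A/V desync after one hour
```

A 75 ms audio/video offset after an hour is well past the threshold where lip-sync error is noticeable, which is why the free-running option is a non-starter.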

How precise? For 16-bit audio at 20 kHz, you need to achieve 500 picoseconds of accuracy or you lose the low-order bit. That is challenging to maintain in a typical system. You have the HDMI clock itself varying to some extent due to clock instability, cable-induced jitter, etc. You have interference inside the receiver causing its own jitter.
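A common back-of-envelope behind figures like this: the worst-case slew of a full-scale sine at frequency f is 2πfA, so keeping the sampling-instant error below one LSB of a 16-bit sample requires Δt < 1/(2^16·π·f). A sketch (this yields roughly 243 ps, the same order of magnitude as the 500 ps figure above; the exact threshold depends on the criterion used):

```python
import math

# Worst-case slew of a full-scale sine of amplitude A at frequency f
# is 2*pi*f*A. A timing error dt therefore produces an amplitude
# error of up to 2*pi*f*A*dt. Requiring that to stay under one LSB
# (2A / 2**bits peak-to-peak) gives dt < 1 / (2**bits * pi * f).

def max_jitter_seconds(bits: int, f_hz: float) -> float:
    return 1.0 / (2**bits * math.pi * f_hz)

dt = max_jitter_seconds(16, 20_000)
print(f"{dt*1e12:.0f} ps")   # prints "243 ps"
```

Note how the requirement tightens with both bit depth and signal frequency, which is why 24-bit audio makes the problem dramatically harder still.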

There are solutions to this problem of course but they get complex and expensive since you only want to filter the jitter, but not true changes in source rate. Some use double-PLL circuits. Others use proprietary techniques.

To avoid the next phase of this discussion, and what makes people frustrated with these arguments, let's avoid talking about what is audible and what is not. Instead, let's agree that audio reproduction is part digital, part analog. Audio samples are digital in value. The timing of the samples is an analog event which must be in sync with the source. And to add insult to injury, we have high-precision sample values, which means jitter has to be quite small for transparent reproduction.

Hope this gives you an overview. Now you are armed to do the above search and read through the rest of the arguments.

Amir
Retired Technology Insider
Founder, Madrona Digital
"Insist on Quality Engineering"

Old 12-14-2010, 05:12 PM - welwynnick

Briefly, yes, HDMI suffers from jitter, but storing data in buffers in itself makes no difference to jitter, as jitter is carried on the audio clock, not the audio data.

It is what happens to the clock while the data is in the buffer that matters. Buffering reduces jitter by allowing the clock to be low-pass filtered, and the lower the filter's corner frequency, the more jitter is removed. Such filters are slow to react to varying data rates, so short-term data storage is needed to accommodate the over- and under-runs.
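The low-pass idea above can be sketched as a one-pole filter applied to the measured periods of the incoming clock (a toy illustration with made-up jitter numbers; real designs use PLLs with carefully chosen loop bandwidths):

```python
import random

random.seed(1)

# One-pole low-pass on the measured period of the incoming clock.
# A small alpha = heavy filtering = more jitter removed, but slower
# tracking of genuine rate changes, hence the FIFO needed to absorb
# the transient over- and under-runs.

nominal = 1 / 48_000                       # 48 kHz sample period (s)
jittered = [nominal + random.gauss(0, 2e-9) for _ in range(10_000)]

alpha = 0.01                               # filter coefficient
smoothed, est = [], jittered[0]
for p in jittered:
    est += alpha * (p - est)               # est tracks the mean period
    smoothed.append(est)

def rms_jitter(periods):
    mean = sum(periods) / len(periods)
    return (sum((p - mean) ** 2 for p in periods) / len(periods)) ** 0.5

print(rms_jitter(jittered))                # ~2 ns RMS in
print(rms_jitter(smoothed))                # far less RMS out
```

The trade-off welwynnick describes falls straight out of alpha: shrink it and the output clock gets cleaner, but the filter takes longer to follow a genuine change in the source rate, so the buffer must ride out a larger transient.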

Having said all that, jitter is never eliminated by anything; it can only be minimised. Meridian's implementations are among the most successful, and the HD621 seems to make the most of a wide range of sources.

What has been most surprising to me this year is how much impact the quality of HDMI sources has on system quality, and expensive players like the Pioneer 09 and Denon A1 are pretty successful IMHO.

Nick

Edit - Ha!

Old 12-15-2010, 08:04 AM - baddgsx

Quote:
Originally Posted by Glimmie View Post
Any HDMI receiver implementation, even a $179 HT-in-a-box needs an input FIFO prior to processing. And a FIFO is nothing more than a basic shift register which these days is just a macroblock in an ASIC/FPGA design.

So this claim by Meridian while true, is just marketing hype.
I totally agree!

- Independent Dreams - IDStudios

Old 12-15-2010, 08:38 AM - ex0du5

Thanks for the post amirm.

I think I see where I might be going wrong here. I had assumed that audio data received at the DAC, once stored into memory, should be identical to the source data. What I guess I must not have realized is that the source is decoding time-safe data into a time-dependent audio/video stream?

I'd like to walk through an example, if we can:

Say we have 48 kHz music. Let's imagine a sound clip with 48,000 samples (1 s long). Does the source send exactly 48,000 samples? What I'm getting at is:

1) Does the jitter add variability to the amount of time it takes to transfer the samples?
2) Does the jitter add variability to the amount of samples that are transmitted (meaning some samples might get dropped altogether)?
3) Am I completely wrong in my assumption of how the data is transmitted? I had assumed the data would be a series of samples that is transmitted from the source to the DAC. Is there a conversion step I'm missing here?

If it's case #1, I would have thought that the samples, once received by the DAC's buffer, should then be identical to the source samples. Therefore, the DAC could have its own, more precise clock to drive the conversion.

Old 12-16-2010, 09:16 PM - ddean

Note that there are exactly two places in the chain where jitter matters: at the ADC and DAC chips. (I say chips, but this applies equally to MSB's Platinum DAC modules built from discrete resistor ladders, etc.)

Unfortunately, the SPDIF interface was designed to be cheap: the data and clock are multiplexed over a single pair of wires. Separating the clock and data with very high degrees of both precision and accuracy is hard: we're talking about tens to hundreds of picoseconds. In the case of HDMI, you have to derive the audio clock from a video clock -- again, with very high accuracy and precision.

Amir gave a good primer on the synchronization problem. Some DACs (e.g. the Chord DAC64) do use memory as a buffer, but those are relatively rare. You'd have to do some math to figure out how much buffering is required (and how much variation in input sample rates you can support) to ensure that you don't have a buffer overrun (i.e., the buffer fills up) or underrun (i.e., the buffer becomes completely empty) in the course of playing a CD (or DVD, Blu-ray, etc.; streaming audio is yet another challenge). Needless to say, this gets even more challenging in an A/V system -- you'd have to buffer the video as well...
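The buffer math ddean alludes to can be sketched directly. Assuming a hypothetical worst-case 100 ppm mismatch between the source and a free-running DAC clock over an 80-minute disc (both numbers are illustrative, not from any spec):

```python
# How much buffer does a free-running DAC clock need so that a
# worst-case clock mismatch never under- or over-runs it during
# one continuous program?

def buffer_samples_needed(fs_hz: float, ppm_mismatch: float,
                          program_seconds: float) -> float:
    """Worst-case sample surplus/deficit accumulated over the program."""
    return fs_hz * (ppm_mismatch * 1e-6) * program_seconds

# 44.1 kHz audio, 100 ppm total mismatch, 80-minute CD:
n = buffer_samples_needed(44_100, 100, 80 * 60)
print(n)             # ~21168 samples
print(n / 44_100)    # ~0.48 s of audio
```

Half a second of audio buffer is cheap; the catch in an A/V system, as ddean notes, is that the video would need to be delayed by the same amount to stay in sync.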