AVS › AVS Forum › Display Devices › Ultra Hi-End HT Gear ($20,000+) › How high is the jitter level on HDMI vs S/PDIF on a High-End Processor?

How high is the jitter level on HDMI vs S/PDIF on a High-End Processor?

post #1 of 164
Thread Starter 
Given the focus on HDMI over the last year, even in high-end processors: can HDMI even provide us with something that surpasses S/PDIF?

I heard reports that audio over HDMI has 10 times the jitter of regular S/PDIF. Is this true, and why are we screaming for HDMI in high-end processors in that case?

Don't we want an interface that is equal to or better than our current S/PDIF?

PS. In Europe we have been fed the SCART interface for combined picture/audio, but no one would ever use the audio portion of it.
post #2 of 164
Ah, jitter has been disproven over and over; it is no longer a problem in audio and never has been.
post #3 of 164
Quote:
Originally Posted by bluray_1080p View Post

Ah, jitter has been disproven over and over; it is no longer a problem in audio and never has been.

No, it hasn't been disproven. Yes, it was a problem, and it remains so. Glad I could help.


But to answer the OP's question, I don't have any raw numbers, but the Arcam engineer I spoke with at CEDIA pointed to this very problem as one of the main hurdles they need to squash before they do full HDMI audio implementations. In a nutshell he said they had figured out a "unique way of solving the problem".
post #4 of 164
Quote:


No, it hasn't been disproven. Yes, it was a problem, and it remains so. Glad I could help.


But to answer the OP's question, I don't have any raw numbers, but the Arcam engineer I spoke with at CEDIA pointed to this very problem as one of the main hurdles they need to squash before they do full HDMI audio implementations. In a nutshell he said they had figured out a "unique way of solving the problem".

And higher-end, high-priced cables help reduce this, right? Jitter is not a problem, since receivers and other sources have been using it for quite some time without any problems. Just look at your computer: if jitter were a problem for us, a lot of data would go missing on a day-to-day basis.
post #5 of 164
Your remark about computers not having problems with jitter is, unfortunately, completely irrelevant to the discussion. There is plenty of jitter in PC clocks, by audio standards at least, but it's maintained at a low enough level so as not to cause catastrophic bit loss. Thankfully for computer manufacturers that's not a tough level to meet. Clock skew---where clock signals are offset by fractions of a cycle at different points in a system---is a far more important problem for them to address these days.

But jitter in the audio context is not an issue of lost bits at all. It is entirely a D/A clocking issue---the effect of clock variations on the analog signal.
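A toy simulation makes this concrete (all parameter values below are made up for illustration): sample the same sine wave once with an ideal clock and once with a jittered clock. Every bit arrives intact either way; the damage shows up as noise in the reconstructed waveform.

```python
import math
import random

# Sample a 1 kHz sine at 48 kHz twice: once on an ideal clock, once on a
# clock whose edges carry Gaussian timing error. No bits are lost; the
# converter simply latches the analog value at slightly wrong instants,
# and that timing error becomes noise in the output.
FS = 48_000          # sample rate (Hz)
F0 = 1_000           # test tone (Hz)
JITTER_RMS = 5e-9    # hypothetical 5 ns RMS clock jitter
N = 4_800            # 0.1 s of audio

random.seed(0)

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

ideal, jittered = [], []
for n in range(N):
    t = n / FS
    ideal.append(math.sin(2 * math.pi * F0 * t))
    t_err = t + random.gauss(0.0, JITTER_RMS)   # the mistimed sampling instant
    jittered.append(math.sin(2 * math.pi * F0 * t_err))

noise = [j - i for i, j in zip(ideal, jittered)]
snr_db = 20 * math.log10(rms(ideal) / rms(noise))
print(f"jitter-limited SNR: {snr_db:.1f} dB")
```

The noise scales with the signal's slew rate (2·pi·F0 × amplitude × timing error), so the same jitter hurts more at higher frequencies---which is exactly why it is a D/A problem and not a bit-integrity problem.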

I have personal experience, through theoretical analysis, simulation, and physical measurement, with the effects of jitter on wireless communication equipment. Obviously the clock rates are quite a bit higher in that case, so jitter is made worse. But the data rates involved were quite reasonable, and yet jitter was the single most dominant source of internal noise in our system. We could quiet everything else down below the jitter level, but getting a more precise oscillator was simply too expensive to justify. (Second place, incidentally, was power amp nonlinearity. Don't overdrive your amps!)

The only relevant question is just how audible jitter is. I tend to be quite skeptical about its audibility, but because I have first-hand experience with its deleterious effects in other contexts, I am unwilling to judge too quickly. Still, I think it is a bit overblown, especially since there are simple ways to effectively eliminate source-induced jitter, such as buffered reclocking and master-clock topologies.

It really is a shame, bluray---I think you and I think quite alike about many of these issues. You just have this unfortunate lack of social grace about it.
post #6 of 164
Quote:


I heard reports that audio over HDMI has 10 times the jitter of regular S/PDIF. Is this true,

I don't know. It may be that they're quoting the jitter of a full-bandwidth HDMI signal. But that's really not fair because for the audio data you can downsample the HDMI clock and clean out a good chunk of that. I don't know if people have actually considered that or if they're seeing the raw jitter number and freaking out.
Quote:


why are we screaming for HDMI in high-end processors in that case?

I can answer this question in two ways. First of all, we don't yet have any standard method for transmitting high-resolution multichannel audio data from source device to prepro. S/PDIF can't do it unless you compress it down pretty heavily using DD or DTS. DTS-HD and DD+ needed a new digital transmission format. Firewire was certainly an option, but I'm not sure that it has any better jitter characteristics.

Secondly, as I alluded to above, I honestly believe that source-to-prepro jitter can be made a non-issue with the right kind of designs. I know I am a jitter skeptic, but even if I wanted to be conservative about it (in case I'm wrong) I could still do a fair amount of buffering of the HDMI data in order to dejitter it.
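A sketch of what that buffering might look like in principle (a simplified model made up for illustration, not any vendor's actual design): samples arrive with jittery spacing, a FIFO absorbs the timing error, and a clean local clock drains the FIFO at a fixed rate, so the DAC side never sees the source jitter.

```python
from collections import deque
import random

# Incoming samples carry per-edge timing jitter; outputs occur on a
# clean local clock. The FIFO's occupancy soaks up the difference, so
# the DAC is clocked without any of the arrival-side jitter.
random.seed(1)
PERIOD = 1.0        # nominal sample period (arbitrary units)
JITTER = 0.3        # peak per-edge arrival jitter, fraction of a period

fifo = deque()
depth_seen = []
next_out = 0.5      # clean output clock runs half a period behind

for n in range(1000):
    arrival = n * PERIOD + random.uniform(-JITTER, JITTER)
    fifo.append(n)
    # release every sample whose clean output instant has now passed
    while fifo and next_out <= arrival:
        fifo.popleft()
        next_out += PERIOD
    depth_seen.append(len(fifo))

print("max FIFO depth needed:", max(depth_seen))
```

With pure per-edge jitter a tiny buffer suffices; a real design must also absorb slow frequency drift between the source and local clocks, which is what sets the actual buffer size (or forces an asynchronous sample-rate converter).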
post #7 of 164
http://www.rtcmagazine.com/home/article.php?id=100178

There is jitter in the PC world, and many different types of it, but they obviously found ways to solve this long ago, much like in the communication world.

Quote:


It really is a shame, bluray---I think you and I think quite a like about many of these issues. You just have this unfortunate lack of social grace about it.

I will agree that I have the personality of a wet dishcloth, and I snap when someone reads a high-end audio article written by old analog-record guys saying how horrible digital audio is.

But that is the problem with the home audio industry: nothing is all that regulated or tested, and it all comes down to "I heard a difference". That's why there is no silly business in the pro audio world, because either something makes a difference or it doesn't, and that can be measured and proven.
post #8 of 164
Quote:


There is jitter in the PC world, and many different types of it, but they obviously found ways to solve this long ago, much like in the communication world.

Again, it depends on the context. Jitter that is so large as to produce catastrophic bit loss does indeed have to be addressed in an all-digital system. And it regularly is. And indeed, in audio jitter is successfully held down well below the threshold where it is going to cause regular bit errors. In the bit error sense it is a non-issue.

As for the communication world, I just got done telling you that jitter is a problem in that context. It is not a problem in every application, but it certainly is in ones that require wireless transmission and push the spectral efficiency envelope. Is there a reason I should trust your words over my study and experience?
Quote:


I will agree that I have the personality of a wet dishcloth, and I snap when someone reads a high-end audio article written by old analog-record guys saying how horrible digital audio is.

Will peer-reviewed AES articles do? There are a number of such articles on the audibility of jitter.

As I've said, I'm a skeptic---not in the sense that I think jitter is totally inaudible, but in the sense that I think keeping jitter below audible thresholds is achievable---and is achieved more often than believed. I also think that jitter is a scapegoat for other audible differences, real or perceived. I believe it was Chu Gai who recently noted that Julian Dunn, who has published studies on the audibility of jitter, suggested that power supply imperfections may actually be the cause of some of the audible differences often attributed to jitter.
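For a sense of the magnitudes involved, here is a back-of-the-envelope calculation (my own arithmetic, not a figure from those papers): for a full-scale sine of amplitude A and frequency f, a timing error dt produces a voltage error of at most 2·pi·f·A·dt (the peak slew rate times dt). Requiring that error to stay under one 16-bit LSB at 20 kHz gives the often-quoted sub-nanosecond jitter budget.

```python
import math

# How small must sampling-clock jitter be before the worst-case error
# on a full-scale 20 kHz sine stays under one LSB of a 16-bit word?
f = 20_000                         # worst-case audio frequency (Hz)
bits = 16
lsb = 1 / 2**(bits - 1)            # one LSB relative to full-scale amplitude
dt_max = lsb / (2 * math.pi * f)   # solve 2*pi*f*A*dt = lsb*A for dt
print(f"max jitter for <1 LSB error: {dt_max * 1e12:.0f} ps")
```

Whether jitter at that level is audible is a separate, psychoacoustic question; this only bounds where it becomes measurable at that bit depth.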

As for "snapping", that's really the problem I and many others have with you. "Snapping", at least with the frequency that you do it, is simply unacceptable. We are trying to reshape this forum into a more civil place for discussion and debate---even about objectivist/subjectivist issues. You are more than welcome to participate in that reshaping by improving your tone. But as of now you remain one of the very bad apples we're trying to weed out.
post #9 of 164
Quote:
Originally Posted by Michael Grant View Post

First of all, we don't yet have any standard method for transmitting high-resolution multichannel audio data from source device to prepro.

I would say that before HDMI was adopted, we had some very good, secure, industry-standard, high-definition digital audio interfaces already in place.

I-Link and Denon Link. There are others, but not really for HD audio.

These achieved much the same thing, but using different physical interfaces, and USB might do the same. My understanding is that they carry the digital (amplitude) data from the source to the amp, but not the timing info / clock, which is what actually carries the jitter.

Read any description of HATS or PQLS etc, and it will say that the amp generates the clock and feeds it back to the player over the link, so the optical drive is effectively slaved to the DAC, rather than the other way round. This is a great way of doing it, but unfortunately HDMI does it the bad old way. And uses a video clock, instead of an audio clock, to boot.

HDMI has a trick up its sleeve that MAY help it get out of this position, but that may be a little way off.

regards, Nick
post #10 of 164
Quote:


Is there a reason I should trust your words over my study and experience?

I am talking about the big guys with an unlimited budget; they have solved all jitter problems.

Quote:


Will peer-reviewed AES articles do? There are a number of articles on the audibility of jitter.

I have seen and heard some cheap systems over the years and I have never experienced problems with jitter. Jitter would cause bit problems before it would become "audible". But are those articles written by people who think they can hear a difference between power cords?
post #11 of 164
Quote:


I am talking about the big guys with an unlimited budget; they have solved all jitter problems.

I was unaware that Cisco Systems was a small company with an insufficient development budget. Bluray, you simply do not know what you are talking about. Give it up while you are ahead.
Quote:


Jitter would cause bit problems before it would become "audible"

Absolutely false.
Quote:


are those articles written by people who think they can hear a difference between power cords?

They are typically (though not unanimously) written by people who study the real science, including properly controlled listening tests, before rendering their verdict on the issue. Unfortunately you do not seem to follow that practice.
post #12 of 164
Quote:


They are typically (though not unanimously) written by people who study the real science, including properly controlled listening tests, before rendering their verdict on the issue. Unfortunately you do not seem to follow that practice.

Are you sure about that? Because when it comes to the high end, there seems to be a lack of proof.

Quote:


I was unaware that Cisco Systems was a small company with an insufficient development budget. Bluray, you simply do not know what you are talking about. Give it up while you are ahead.

So Cisco could not buy a proper oscilloscope? Maybe it's just American technology?
post #13 of 164
Quote:


Are you sure about that? Because when it comes to the high end, there seems to be a lack of proof.

I didn't point you to the "high end", I pointed you to peer-reviewed research. I have many of the same beefs with high-end companies you do.
Quote:


So Cisco could not buy a proper oscilloscope? Maybe it's just American technology?

We had some pretty sweet equipment, actually. I mean, I figure Cisco wouldn't have bought us for $180 million if they weren't willing to invest a few more to make sure we were fully outfitted.

And that was precisely why we knew exactly how much jitter we were dealing with. It also helps that every single oscillator manufacturer in that product space quotes jitter figures---though they often call it "phase noise" in that application.

Keep digging, bluray. Every new post is one closer to getting you tossed off this forum. I will not shed a tear when it happens.
post #14 of 164
Quote:


We had some pretty sweet equipment, actually, and that was precisely why we knew exactly how much jitter we were dealing with. It also helps that every single oscillator manufacturer in that product space quotes jitter figures---though they often call it "phase noise" in that application, so that may be one of the many reasons why this knowledge escapes you.

And jitter can be caused by several problems, so what is your point? Since jitter has been solved in the computer world, which is far more complex, why would jitter still only affect the simple world of home audio?
post #15 of 164
Quote:


Keep digging, bluray. Every new post is one closer to getting you tossed off this forum. I will not shed a tear when it happens.

And why would that happen? Because I question what people say?
post #16 of 164
It is sort of humorous to see someone try to approach objectivism by using snake oil as a vehicle... I'm referring to bluray_1080p, of course (in case he doesn't realize it...).
post #17 of 164
Off Topic a bit...

But I am adding a Radiance processor and was going to purchase new HDMI cables for my digital devices.

I have been looking into HDMI cables and see that 1.3b Cat 2 is the latest spec.

I don't know what the sonic impact of a 'good' HDMI cable is, but in researching, I have come across some 'active' HDMI cables with great specs, as well as passive ones.


What is the thinking on this?

I bought a Monoprice cable but found the connector to be loose and garbagey.
post #18 of 164
Quote:


And jitter can be caused by several problems, so what is your point? Since jitter has been solved in the computer world, which is far more complex, why would jitter still only affect the simple world of home audio?

Because the levels of jitter that are audible, despite your uninformed protestations to the contrary, are well below the levels of jitter that would cause catastrophic bit loss.
Quote:


And why would that happen? Because I question what people say?

No, it is your tone and your attitude. You have been warned before, and your posts have been deleted before. If you are banned, it is not because you didn't know it was coming.
post #19 of 164
Quote:
Originally Posted by thebland View Post

I bought a Monoprice cable but found the connector to be loose and garbagey.

The connector issue is an annoying one. They (the female/male connectors) really are not designed well to latch securely into components...
post #20 of 164
Quote:


Because the levels of jitter that are audible, despite your uninformed protestations to the contrary, are well below the levels of jitter that would cause catastrophic bit loss.

And other forms of jitter do not come into play in the home audio market because they do not occur in a simple audio system. If anyone could prove jitter in home audio is audible then it would be fixed but it is not audible and not a problem.
post #21 of 164
Don't feed the troll.

Nick
post #22 of 164
Quote:


And other forms of jitter do not come into play in the home audio market

This is a failure of terminology. It is not about different "forms" of jitter---it is about different consequences of jitter.
Quote:


If anyone could prove jitter in home audio is audible then it would be fixed but it is not audible and not a problem.

Circular logic at its best! And it ignores cost concerns, too.

Gotta go now, life calls. Troll feeding shall cease.
post #23 of 164
Quote:
Originally Posted by bluray_1080p View Post

If anyone could prove jitter in home audio is audible then it would be fixed but it is not audible and not a problem.

The guy gave you everything you need to find that proof yourself. Go read recent AES papers on it (AES = Audio Engineering Society). All you are doing is making yourself look like an idiot at this point.
post #24 of 164
Doc, I admire your patience. I would have given up long ago. What is it they say? Something about leading a horse to water?
post #25 of 164
Quote:
Originally Posted by Ron Party View Post

Doc, I admire your patience. I would have given up long ago. What is it they say? Something about leading a horse to water?

"You can lead a 'horticulture,' but you can't make her think."

-- Dorothy Parker
post #26 of 164
Proof and opinion seem to be one and the same in the high end, but every time I prove something the first thing you guys do is run to the mods.
post #27 of 164
Quote:
Originally Posted by bluray_1080p View Post

Proof and opinion seem to be one and the same in the high end, but every time I prove something the first thing you guys do is run to the mods.

The next infraction you receive will result in a ban/suspension, so do be careful from here on in, please. Enough.
post #28 of 164
Quote:
Originally Posted by bluray_1080p View Post

Proof and opinion seem to be one and the same in the high end, but every time I prove something the first thing you guys do is run to the mods.

That's the funny thing, you say a lot about what you think about things, but you haven't proved anything yet.
post #29 of 164
Thread Starter 
bluray_1080p, from your posts I can tell that you don't quite grasp what jitter does to the signal. It's not loss of bits; it's timing and drift of the clock. It's not something you solve with a CRC. It causes trouble for the analog parts, not the digital parts. There are some excellent posts with depth in the archives, both here and on the net; please RTFM.

Now if we could go back on topic.

Welwynnik: "HDMI has a trick up its sleeve that MAY help it get out of this position, but that may be a little way off."

Could you elaborate please?
post #30 of 164
Let's get back on track.