Quote:
Originally Posted by
antiacid
are we talking about picoseconds here? You kind of have to understand wtf you are talking about before putting on your "surprised, shocked and annoyed" face.
I hate to say this, but he is quite right to be alarmed. Your post is the classic pitfall of applying layman logic to something that is mathematically far more complex than it looks. See below.
Quote:
Here's a link to wikipedia for those who don't know their units:
http://en.wikipedia.org/wiki/Picosecond
Basically, the highest quoted figure of 7660psec (A picosecond is one millionth of one millionth, or one trillionth of a second (0.000 000 000 001 seconds) amounts to 7.7ns or 0.000077ms (If I counted my zeros properly...)
So the true question is: Does the human body hear/see this difference?
I'll guess no.
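For what it's worth, the unit conversion in the quote is easy to check in a couple of lines of Python, and the quoted ms figure actually drops a zero: 7660 ps is 0.0000077 ms, not 0.000077 ms.

```python
picoseconds = 7660  # highest jitter figure quoted in the review

nanoseconds = picoseconds / 1e3   # 1 ns = 1,000 ps
milliseconds = picoseconds / 1e9  # 1 ms = 1,000,000,000 ps

print(nanoseconds)    # 7.66
print(milliseconds)   # 7.66e-06
```

But as I explain below, whether a nanosecond-scale number "sounds small" is the wrong question to ask in the first place.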
The right reference would be an authoritative paper on the topic of jitter such as this one presented at AES:
http://www.nanophon.com/audio/jitter92.pdf
Using the formula in that paper, and assuming a jitter spectrum simpler than the real one (i.e. NOT a worst-case scenario), we can compute the level of jitter needed to obliterate the lowest-order bit of a 16-bit audio sample:
"For sinusoidal jitter of amplitude J=500ps, a 20 KHz maximum level tone will produce sidebands at -96.1 dB relative to the input tone."
Just expanding on that: a 16-bit audio sample has roughly 16 × 6 ≈ 96 dB of dynamic range (about 6 dB per bit), meaning the signal can sit 96 dB above the noise floor. If the gap between signal and noise floor shrinks, you get a corresponding reduction in effective sample resolution (i.e. it drops below 16 bits).
According to the math, then, 500 ps peak-to-peak (250 ps relative to zero) is the maximum jitter that can be tolerated if we want to preserve the full 16-bit value. Beyond that, the sidebands (distortion) created by the jitter modulating the signal rise above -96 dB relative to a full-scale source.
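The paper's number is straightforward to reproduce. Sinusoidal jitter phase-modulates the sampled tone with peak phase deviation β = 2π·f·J_peak, and for small β each sideband sits at 20·log10(β/2) relative to the tone. Here is a quick sanity check in Python, treating the 500 ps figure as peak-to-peak (my reading of the paper's convention, not a quote from it):

```python
import math

def sideband_db(f_tone_hz, jitter_pp_s):
    """Level of each jitter-induced sideband (dB) relative to the tone,
    for sinusoidal jitter of peak-to-peak amplitude jitter_pp_s.
    Small-angle phase-modulation approximation: sideband amplitude = beta/2."""
    beta = 2 * math.pi * f_tone_hz * (jitter_pp_s / 2)  # peak phase deviation, rad
    return 20 * math.log10(beta / 2)

print(round(sideband_db(20_000, 500e-12), 1))  # -96.1, matching the paper's figure
```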
As you go higher in sample resolution, the tolerable jitter must be even lower.
So assuming you are playing CDs and not higher-resolution audio formats, none of the units tested, save the Pioneer, has low enough jitter on its HDMI input to fully resolve the input samples.
And again, keep in mind that the math is only this easy because we assumed a simple sinusoidal jitter spectrum. Real-world jitter is usually far more complex and can do more harm than the math above shows (although the reverse is also sometimes true: some jitter spectra are harder to hear than others).