Originally Posted by kellybob
I think somewhere you explained this, but I do not know where. Anyway, the issue is HDMI jitter for LPCM versus bitstream.
The below is from here: http://www.avsforum.com/t/1439524/of...s-thread/10740
With a properly engineered AVR you will hear no audio difference between 720p output and 1080i, 1080p or even 4K (if the AVR can accept that).
Try this: Set HDMI Audio Bitstream and play a track with Bitstream content. Do you hear an improvement when you select 720p output? If so, then your AVR is definitely at fault as there is no such thing as "jitter" when sending a Bitstream over HDMI (because the Bitstream data gets decoded in the AVR and jitter can only happen AFTER that point and before the data reaches the AVR's DACs)...."
Apologies to all for the long absence. The AVS Forum apparently changed their software and when they did they disabled my notifications. I had no idea that I was not receiving any private messages, nor that any posts were made to this thread.
As far as sending PCM versus Bitstream goes, we deliberately chose to send PCM for a very simple reason - no matter what your AVR's capabilities, it will be able to handle PCM. The player can decode all of the existing formats. That was a very easy decision.
As far as the jitter goes, I disagree with the claims made in the post you quoted.
The sad truth is that all of the new AVRs use asynchronous sample rate converters (ASRC) for all inputs. They measure wonderfully, and they sound absolutely wretched. We have tried this experiment at Ayre and I was shocked at the audible degradation introduced by an ASRC. In a way it is not surprising, as an ASRC literally throws away ALL of the original data and reconstructs what it thinks the data would have been if there had been no jitter. This "thinking" is called an "algorithm" to make it sound scientific, but it is actually just a guess.
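To make the point concrete: an ASRC does not pass the incoming samples through to the DAC. It re-times the stream by computing a brand-new sample value at each output clock tick, interpolated from the neighboring input samples. Real ASRC chips use high-order polyphase filters; the toy linear interpolator below (function name and structure are mine, purely for illustration) just shows that every output value is a computed estimate rather than a copy of the original data.

```python
def resample_linear(x, ratio):
    """Toy asynchronous resampler.

    Reads input samples x at a fractional step 'ratio' (output
    clock rate relative to input clock rate) and linearly
    interpolates a new value at each output instant. Note that
    no input sample is simply passed through unless the phase
    happens to land exactly on one.
    """
    out = []
    t = 0.0
    while t < len(x) - 1:
        i = int(t)          # index of the sample just before time t
        frac = t - i        # fractional position between samples
        out.append(x[i] * (1 - frac) + x[i + 1] * frac)
        t += ratio          # advance by the output-clock period
    return out

# Resampling a ramp at half the input rate: every output value
# between input samples is an interpolated guess.
print(resample_linear([0.0, 1.0, 2.0, 3.0], 0.5))
# → [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
```

A production ASRC replaces the linear interpolation with a long FIR filter and continuously estimates the input/output rate ratio, but the principle is the same: the output is a reconstruction, not the original sample values.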
If we take ASRC out of the equation, the master audio clock has to come from somewhere. If we want the audio to be synchronized with the video, the clock HAS to come from the transport. The problem with HDMI is there is NO audio master clock transmitted! HDMI only sends a video master clock, along with some information about how to derive an audio master clock from the (generally unrelated) video clock. The result is a massive amount of jitter, and a degradation of sound quality compared to even S/PDIF (which is already a poor method of sending the clock, as it is mixed in with the word clock, bit clock, and audio data).
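For reference, the "information about how to derive an audio master clock" is HDMI's Audio Clock Regeneration scheme: the source periodically sends two integers, N and CTS, and the sink must reconstruct 128 × fs from the TMDS (video-rate) clock as fs = f_TMDS × N / (128 × CTS). The sketch below (function name is mine) plugs in the spec-recommended values for 48 kHz audio at a 148.5 MHz TMDS clock; note that the sink's PLL has to divide and multiply the video clock to get there, which is exactly where the jitter creeps in.

```python
def recovered_fs(f_tmds_hz, n, cts):
    """Audio sample rate the HDMI sink derives from the TMDS clock.

    Per HDMI Audio Clock Regeneration: 128 * fs = f_tmds * N / CTS,
    so fs = f_tmds * N / (128 * CTS). The sink only receives N and
    CTS; no actual audio master clock is transmitted.
    """
    return f_tmds_hz * n / (128 * cts)

# Spec-recommended N/CTS for 48 kHz audio with a 148.5 MHz
# (1080p) TMDS clock: N = 6144, CTS = 148500.
print(recovered_fs(148_500_000, 6144, 148500))
# → 48000.0
```

The audio clock is thus a fraction of an unrelated video clock, regenerated in the sink from periodically transmitted integers - quite different from a dedicated, continuously transmitted audio master clock.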
But believe it or not, even the jitter from HDMI will sound better than the mangled audio that results from ASRC...
The bottom line is that the poster you quoted does not know what he is talking about.
Ayre Acoustics, Inc.