Originally Posted by ILoveTeufelSub
I bet most people wouldn't even hear the difference between bitstream and LPCM with their home theater setup
LPCM over HDMI has more jitter than over coax, and jitter (direct, or caused by reflections from impedance mismatches, which is why cables shorter than 1 m are not recommended) is the only thing that can affect the sound, all other things (the DAC) being equal.
In theory, if the processor reconstructs the LPCM from the bitstream, it can lead to less jitter.
In practice, this can be heard only on high end systems, and good processors are also good at handling jitter.
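For a sense of scale, the standard formula for the SNR ceiling that sampling-clock jitter imposes on a full-scale sine wave is SNR = -20*log10(2*pi*f*tj). A quick sketch (the jitter values below are illustrative round numbers, not measurements of any particular interface or device):

```python
import math

def jitter_snr_db(freq_hz: float, rms_jitter_s: float) -> float:
    """Theoretical SNR ceiling for a full-scale sine at freq_hz
    with the given RMS sampling-clock jitter, using the standard
    formula SNR = -20*log10(2*pi*f*tj)."""
    return -20.0 * math.log10(2.0 * math.pi * freq_hz * rms_jitter_s)

# Illustrative jitter figures (hypothetical, not measured values):
for tj in (1e-9, 100e-12, 10e-12):  # 1 ns, 100 ps, 10 ps RMS
    print(f"{tj * 1e12:6.0f} ps jitter -> "
          f"{jitter_snr_db(20_000, tj):5.1f} dB SNR ceiling at 20 kHz")
```

At 20 kHz, 1 ns of RMS jitter already caps the SNR near 78 dB, while 10 ps pushes the ceiling past 118 dB, which is why a processor that reclocks well can make the transport's jitter irrelevant.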
So, the difference between bitstream and LPCM with a good enough processor should be null.
That's just my opinion; I've actually never heard a good processor that can decode advanced audio codecs. Meridian gear, for example (again, my subjective opinion), sounds awful despite its high-end price and status in the high-end audio community.
Here is an interesting article that shows that HDMI can have worse objective measurements than coax, because of jitter:
Whether or not it's possible to hear that in a blind test is another matter. People generally fail blind audio tests, myself included.
My subjective, scientifically unproven opinion is that subtle differences can only be felt over long periods of listening, not by switching constantly between A and B. I have no proof of that, just 40 years of using high-end gear and reaching, several times, the conclusion that different gear sounds vastly different. Again, no proof, just consistency over a long period of time.
Unfortunately, there is so much snake oil in high-end audio that this background noise, combined with my subjective opinion that blind A/B tests are inappropriate because they don't reproduce the sensation of prolonged real-life listening, makes it very difficult to sort things out.
But something tells me all this is vastly off-topic and an extremely controversial subject for this thread.