Originally Posted by Bob Pariseau
The recommendation is primarily to keep people from constantly calling to say they don't see the TrueHD and DTS-HD MA lights turning on in their AVR. Seriously, you'd be shocked how many folks get worried about that.
Secondarily, proper handling of HDMI LPCM input requires buffering and re-clocking the LPCM digital audio input stream. Most receivers do that properly as a matter of course -- in general, if your receiver lets you apply lip-sync adjustment to HDMI LPCM input then it must be buffering, and so it just comes down to whether that implementation has a bug. Some OLDER HDMI receivers do NOT buffer and re-clock HDMI LPCM input. The trick is, such receivers are usually also old enough that they don't accept lossless Bitstream input EITHER. If you have a receiver from the transition -- one that DOES accept lossless Bitstreams but does NOT properly buffer and re-clock LPCM input -- then the odds favor the Bitstream sounding better. That's still not a guarantee, as processing the Bitstream is complicated, and a receiver with the LPCM processing weakness is more likely to be of a vintage that screws up Bitstream input TOO!
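If it helps to picture what "buffer and re-clock" means, here's a minimal Python sketch -- toy code, not anybody's actual firmware, and the buffer depth is an arbitrary illustrative choice. Samples get pushed in with whatever timing jitter the source imposes, and pulled out on the AVR's own stable clock:

```python
from collections import deque

class ReclockBuffer:
    """Toy jitter buffer: absorb irregular arrival timing, release
    samples on a steady local clock instead."""

    def __init__(self, depth: int = 512):
        self.fifo = deque(maxlen=depth)   # depth is an illustrative choice

    def push(self, sample: int) -> None:
        """Called whenever a sample happens to arrive off the HDMI link."""
        self.fifo.append(sample)

    def pull(self) -> int:
        """Called once per tick of the AVR's OWN low-jitter sample clock."""
        return self.fifo.popleft() if self.fifo else 0   # underrun -> silence

buf = ReclockBuffer()
buf.push(1234)                   # arrival timing: whenever the link delivers
print(buf.pull(), buf.pull())    # release timing: the local clock -> 1234, then 0
```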
The bottom line is that for most people LPCM and Bitstream input will sound the same -- assuming the AVR is set to handle the two of them identically.
Sound quality aside, it is more complicated for the AVR to start up a Bitstream. The problem is that it first has to validate that the incoming stream is well formatted -- that it isn't just garbage. Now it should be doing that for startup of an LPCM stream as well, but validating LPCM is easier and faster.
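To make that asymmetry concrete, here's a toy Python sketch -- nothing like a real decoder, though the AC-3 (0x0B77) and DTS core (0x7FFE8001) sync words are the genuine published values. Validating a Bitstream means hunting for sync patterns and parsing frame headers; validating LPCM is little more than sanity-checking the stream parameters:

```python
def looks_like_valid_bitstream(buf: bytes) -> bool:
    """An AVR has to hunt for a known sync pattern, and real code would
    then parse frame headers and confirm the frame sizes line up before
    daring to start decoding."""
    AC3_SYNC = b"\x0b\x77"          # genuine AC-3 sync word, 0x0B77
    DTS_SYNC = b"\x7f\xfe\x80\x01"  # genuine DTS core sync word, 0x7FFE8001
    return buf.find(AC3_SYNC) >= 0 or buf.find(DTS_SYNC) >= 0

def looks_like_valid_lpcm(num_channels: int, bits: int, rate: int) -> bool:
    """LPCM has no framing to hunt for: the samples ARE the payload.
    Validation is mostly a sanity check on the advertised parameters."""
    return (num_channels in range(1, 9)
            and bits in (16, 20, 24)
            and rate in (32000, 44100, 48000, 88200, 96000, 176400, 192000))

print(looks_like_valid_bitstream(b"\x0b\x77\x00\x00"))   # True
print(looks_like_valid_lpcm(6, 24, 48000))               # True
```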
Understand that if you send a Bitstream to an AVR, what it does with it is decode it into LPCM -- everything else happening in the AVR stays LPCM until it finally converts that to Analog for output to your speakers. So the LPCM stuff (other than the buffering and re-clocking at the point of the HDMI Input) is happening ANYWAY in the AVR -- even if you send Bitstream. Bitstream requires ADDITIONAL processing to decode it into LPCM.
So why does the AVR accept Bitstream at all? Marketing reasons.
To follow that requires some background. (For folks who get bored easily, this is the point to skip to the next post.)
Bitstream formats are packing formats. They take the master audio (which is LPCM) and convert it into something that takes up less space on disc -- and, perhaps even more importantly, takes a smaller bit-rate to read OFF the disc! The reduction in file size and necessary bit-rate is referred to as the "compression factor".
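To put rough numbers on that -- the bit-rates below are typical ballpark figures, not measurements from any particular disc -- here's the arithmetic in Python:

```python
# Illustrative arithmetic only: ballpark bit-rates, not from any real disc.
channels, sample_rate, bit_depth = 6, 48_000, 24    # a 5.1 LPCM master
lpcm_bps = channels * sample_rate * bit_depth       # 6,912,000 bits/sec
print(f"Raw 5.1 LPCM: {lpcm_bps / 1e6:.3f} Mbps")   # ~6.9 Mbps

dd_bps = 640_000          # lossy Dolby Digital at its Blu-ray maximum
lossless_bps = 3_500_000  # a plausible TrueHD/DTS-HD MA average (assumed)
print(f"Lossy compression factor:    {lpcm_bps / dd_bps:.1f}x")        # ~10.8x
print(f"Lossless compression factor: {lpcm_bps / lossless_bps:.1f}x")  # ~2.0x
```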
LPCM digital audio is best thought of as the "simplest" form of digital audio. By the way, the "L" in that does not stand for "Lossless". It is "Linear Pulse Code Modulation" -- one flavor of the more general family of digital audio called simply Pulse Code Modulation, or PCM. There is one LPCM stream per speaker channel. LPCM is "simple" but it is not compact, and you also have to manage a SET of LPCM streams for multi-channel audio. The Bitstream formats pack a set of LPCM streams into a single Bitstream. But processing of digital audio happens as LPCM. So before you can hear a Bitstream, it first has to be decoded BACK into LPCM.
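If you've never seen it spelled out, here is LPCM reduced to its essence in a few lines of Python: sample a waveform at a fixed rate, store each sample as a plain integer, one stream per channel. The tone frequencies are arbitrary examples:

```python
import math

SAMPLE_RATE = 48_000                      # samples per second
BIT_DEPTH = 16                            # bits per sample
FULL_SCALE = 2 ** (BIT_DEPTH - 1) - 1     # 32767 for 16-bit signed

def lpcm_samples(freq_hz: float, n: int) -> list[int]:
    """Quantize a sine wave into signed 16-bit LPCM samples."""
    return [round(FULL_SCALE * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
            for i in range(n)]

left  = lpcm_samples(440.0, 48_000)   # one second of a 440 Hz tone
right = lpcm_samples(554.4, 48_000)   # a second channel is just a second stream
print(left[:5])                       # nothing packed, nothing compressed
```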
Traditional, lossy Bitstream formats (DD and DTS) achieve a higher compression factor by the trick of discarding portions of the audio that tests show are hard to hear. They are "lossy" in the sense that the LPCM that comes out of the decoder is not bit for bit identical to the LPCM that went INTO the encoder when the studio created the Bitstream track. Again, this works because what is "lost" is cleverly chosen to be hard to hear.
The new, "Lossless" Bitstream formats for Blu-ray (Dolby TrueHD and DTS-HD MA) are "lossless" in the sense that what comes out of the decoder IS bit for bit identical to the original LPCM. Nothing is lost. Note carefully that "Lossless" doesn't equate to "high quality". If the original LPCM master audio was crappy, then selling it in the form of a "Lossless" Bitstream ain't going to make it any better.
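Here's the defining property in runnable form. zlib is obviously not TrueHD or DTS-HD MA, but the ONLY thing "lossless" promises is exactly what zlib delivers: decode(encode(x)) == x, bit for bit. The "lossy" stand-in below just throws away low bits, which is crude, but it shows the round trip failing:

```python
import zlib

original = bytes(range(256)) * 100            # stand-in for LPCM master audio

packed = zlib.compress(original)              # stand-in "lossless Bitstream"
assert zlib.decompress(packed) == original    # bit-for-bit identical
print(f"lossless: {len(original)} -> {len(packed)} bytes, round trip exact")

# Crude stand-in for a lossy codec: discard the low 4 bits of every byte.
lossy = bytes(b & 0xF0 for b in original)     # detail discarded for good
assert lossy != original                      # round trip is NOT exact
print("lossy: what comes back is not what went in")
```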
Because of the way the Lossless Bitstreams work, they cannot achieve as high a compression factor as the Lossy Bitstreams. Fortunately, Blu-ray has both higher disc capacity than SD-DVD and a much higher maximum disc read bit-rate, which is why the Lossless Bitstreams work for Blu-ray. Indeed, Blu-ray is SO capacious in these two specs that even NON-compressed, raw, high-bit-rate, multi-channel LPCM tracks work!
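Some back-of-the-envelope arithmetic shows the point -- the spec ceilings below are approximate round numbers, used only for illustration:

```python
# Why raw multi-channel LPCM is viable on Blu-ray but never was on SD-DVD.
dvd_max_mbps, bd_max_mbps = 10.08, 48.0      # approximate total A/V read-rate limits

channels, rate, depth = 8, 96_000, 24        # 7.1 LPCM at 96 kHz / 24-bit
lpcm_mbps = channels * rate * depth / 1e6    # 18.432 Mbps

print(f"Raw 7.1/96/24 LPCM needs {lpcm_mbps:.1f} Mbps")
print(f"  exceeds the ENTIRE SD-DVD budget ({dvd_max_mbps} Mbps)? "
      f"{lpcm_mbps > dvd_max_mbps}")
print(f"  on Blu-ray ({bd_max_mbps} Mbps) it leaves "
      f"{bd_max_mbps - lpcm_mbps:.1f} Mbps for the video")
```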
Note that I've talked only in terms of capacity on disc and bit-rate to read the disc. What about transmission over HDMI? Well, there the advantage of Bitstream no longer applies!
How can that be? Well, there's no such thing as a separate HDMI Audio signal. HDMI Audio is embedded in the "blanking intervals" of HDMI Video. Always. Even if the only video being transmitted is a static, black image.
That is, each "frame time" of video has a portion set aside to carry the audio. A percentage of the "video bandwidth" is actually reserved as "audio bandwidth". There are a couple of consequences of this design.
First, you can't use the Lossless Bitstreams, or multi-channel, high-bit-rate LPCM either, with video that's lower than 720p resolution. Why? The percentage of space left over in the blanking intervals (i.e., between frames) is not big enough. So instead you get what's called "compatibility" audio -- i.e., you get a LOSSY Bitstream, which is smaller and thus fits. Every Blu-ray disc is required to have a compatibility audio track for just such purposes -- the same applies if you use Optical or Coax S/PDIF outputs. The compatibility track may not be visible in the disc's audio selection menus, but it is there nonetheless. (For DTS-HD MA, the compatibility, lossy DTS track -- called the core audio -- is actually embedded inside the DTS-HD MA file. For TrueHD and raw high-bit-rate LPCM tracks, the compatibility audio is a separate file called the associated audio.)
Second, for 720p video resolution and higher, the space set aside for HDMI Audio is big enough REGARDLESS of which format you use! By design. I.e., the supported formats (on Blu-ray and for the HDMI spec) are CHOSEN so that they will fit!
The way HDMI works, data gets transferred in "pixel clock" chunks. The data is always flowing, even during the "blanking intervals" between frames. So it doesn't matter whether the audio portion of each "frame time" carries Bitstream, LPCM, or even silence (no audio) -- the same number of bits, the same number of "pixel clock" chunks, gets transmitted.
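Here's a very rough way to see how much more room 720p blanking offers than 480p blanking. This deliberately ignores HDMI's real packetization overhead (TERC4 coding, guard bands, preambles, InfoFrames), which eats a lot of the blanking and is exactly why the practical floor for the big audio formats is the 720p-class pixel clock:

```python
def blanking_clocks_per_sec(total_w, total_h, active_w, active_h, fps):
    """Pixel clocks per second that fall inside the blanking intervals."""
    return (total_w * total_h - active_w * active_h) * fps

# Standard CEA-861 frame timings for 480p60 and 720p60
p480 = blanking_clocks_per_sec(858, 525, 720, 480, 60)     # ~6.3 million/s
p720 = blanking_clocks_per_sec(1650, 750, 1280, 720, 60)   # ~19.0 million/s

print(f"480p blanking clocks/s: {p480 / 1e6:.1f}M")
print(f"720p blanking clocks/s: {p720 / 1e6:.1f}M  ({p720 / p480:.1f}x the room)")
```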
So there is no advantage to Bitstream over LPCM on the HDMI cable itself. There MAY be an advantage for LPCM when viewed from the standpoint of the HDMI transmitter and receiver chips, because LPCM is "simpler". But by this point, this is pretty much commodity technology, so not even worth worrying about.
Now, if the disc track you are playing is a Bitstream track, it has to get decoded into LPCM at some point before it gets converted to Analog to go to the speakers. If the decoding is being done in the player, then that complexity is in the player. As far as Blu-ray is concerned, the Bitstream has already done its job -- reducing the bit-rate needed to read the disc. The HDMI connection doesn't care. So if the player knows how to decode, there is nothing lost by letting the player do the decoding. If, on the other hand, the AVR is doing the decoding then that complexity moves to the AVR. The AVR has the ADDED complexity of verifying that the incoming digital audio stream is well formatted. (The Blu-ray player doesn't have to deal with that since what it sees as Bitstream or LPCM comes out of the disc reading portion of the player -- i.e., it is properly formatted by definition.)
Now, with that technical background there's one other piece of MARKETING background you need.
The original idea for Blu-ray was that decoding would be done IN THE PLAYER. ALWAYS. There are portions of the Blu-ray spec which depend upon this simplification -- Secondary Audio Mixing being the big one. Secondary Audio Mixing can't happen before the Bitstream is decoded, and the HDMI cable doesn't provide a way to transmit TWO audio streams (primary and secondary). So Secondary Audio Mixing can ONLY happen inside the player -- and that MUST be preceded by decoding.
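A toy illustration of why mixing forces decoding: you can only add two audio streams together once BOTH are plain LPCM samples. The gain value and clipping below are illustrative choices, not anything from the Blu-ray spec:

```python
FULL_SCALE = 32767   # 16-bit signed sample range

def mix(primary: list[int], secondary: list[int], sec_gain: float = 0.5) -> list[int]:
    """Sum decoded primary and secondary LPCM, clipping to the sample range."""
    return [max(-FULL_SCALE - 1, min(FULL_SCALE, p + round(sec_gain * s)))
            for p, s in zip(primary, secondary)]

movie_track   = [1000, 2000, 30000, -15000]   # decoded primary LPCM (toy data)
director_talk = [500, 500, 10000, 10000]      # decoded secondary LPCM (toy data)
print(mix(movie_track, director_talk))        # ONE stream out -- fits on one HDMI link
```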
The idea was simple: Bitstream audio is a technology for providing compression ON DISC -- just like the various video compression formats (e.g., MPEG2 or VC1). Nobody expects an AVR to have to handle the decoding of the VIDEO compression formats. OBVIOUSLY the player should be responsible for that. So why should audio be different? Well, the answer is, it shouldn't. And so Bitstream audio -- the compression format for audio -- is to be decoded INSIDE THE PLAYER. The HDMI cable then carries only LPCM audio.
OK, cool idea, so what happened? Well, marketing happened.
First of all, the AVR makers were already shipping AVRs that accepted HDMI LPCM audio. Even high bit-rate, multi-channel LPCM audio. So wait, now that these new Bitstreams are coming out for Blu-ray, how is that going to make people want to throw away their perfectly good, current, HDMI-capable AVR and buy a new one? Well they won't! So the AVR makers insisted that THEY TOO be allowed to offer Bitstream decoding! Of course that meant folks would be paying for it twice -- once in the player and once in the new AVR -- but that's just fine. Getting people to pay for things, preferably multiple times, is the whole idea behind the consumer electronics industry.
So the AVR guys were lobbying hard for Lossless Bitstream over HDMI -- with decoding in (new) AVRs. The HDMI spec would have to be changed to accommodate that, but HDMI.ORG is a creature of the industry, so no problem.
And that's when the DTS screw-up threw a monkey wrench into the Blu-ray plans!
DTS was late with DTS-HD MA. That meant decoding chips were late. And behold! Once DTS-HD MA became "known", the amount of processing power needed to decode its over-the-top complexity was beyond what had already been designed into the players! Pioneer, for example, lost a whole model cycle of players when they found the processing power they had built into them couldn't actually handle what DTS now required.
But AVRs have processing power up the wazoo! They've got DSP chips that have to handle LOTS of problems -- with the cost amortized over ALL the things the AVR is supposed to do.
The upshot was that DTS-HD MA decoding showed up in AVRs FIRST.
At which point the Blu-ray guys had no choice but to cave on the design goal that Bitstream decoding ALWAYS happens IN THE PLAYER.
Fast forward to today, and of course now you have Lossless Bitstream decoding as a given in players. But it is also still in AVRs. Folks pay for it twice. (Applause from the manufacturers!)
But it's just as with video de-interlacing before it: do you pay for it in a "progressive" SD-DVD player? Or in your AVR? Or in your TV? ANSWER: You pay for it in all THREE! And so it goes with Bitstream decoding as well.
Aren't you glad you asked?