Originally Posted by BIslander
Lossless codecs use variable bitrates, taking whatever bandwidth they need at any given moment.
Also worth mentioning that lossy codecs have bitpools, and while the overall bitrate may stay constant, the activity in each channel never does. So whichever channels need bandwidth at any given moment can take the majority of the bitpool. Even in the extremely rare case where all 6 channels need maximum bandwidth (virtually impossible given the limited frequency range of the LFE), you still have only a ~3:1 compression ratio with DTS Core - a very mild compression ratio that studies have shown to be indiscernible from lossless.
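If you want to sanity-check that ~3:1 figure, here's a quick back-of-the-envelope calculation in Python. The 16-bit/48 kHz 5.1 PCM source is just my assumption for the math; actual masters vary:

channels = 6
sample_rate_hz = 48_000
bit_depth = 16

pcm_kbps = channels * sample_rate_hz * bit_depth / 1000   # uncompressed: 4608 kbps
dts_core_kbps = 1509
print(f"PCM {pcm_kbps:.0f} kbps -> ratio {pcm_kbps / dts_core_kbps:.1f}:1")
# prints: PCM 4608 kbps -> ratio 3.1:1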
Not necessarily. There's a rather steep improvement slope at the lower bitrates that levels off as the resolution increases. At some point, increasing the bitrate offers no audible improvement. When does that occur? Some say the maximum lossy bitrates (DD at 640 kbps and DTS at 1509 kbps) are about as good as it gets.
IMO DTS Core is significantly better than DD 640, but DTS-MA/TrueHD is not significantly better than DTS Core. DTS Core has just enough bandwidth to ensure reference sound even though it is technically lossy. That said, DD 640 still sounds damn good, and I wouldn't avoid a title simply because it only has DD 640 - though DTS Core or better is preferable.
You are comparing more than bitrates there. 448 kbps is the maximum Dolby rate on DVD. 1.5 Mbps is the rate used for the DTS core tracks on Blu. The fact that the DTS core on BD sounds better than standard DD 5.1 on DVD does not mean that lossless is going to sound better than the DTS core.
It is also worth mentioning that DTS and DD tracks usually have different mixes.
Originally Posted by Dan Q
My player (pio 51) won't decode DTS-HDMA over analog yet, so I haven't finalized my opinion. Does anyone know the difference in data rates between the DTS Core and the DTS-HDMA rate for the same sound track? I thought DTS-HDMA was made up from DTS Core + extension to make the DTS-HD Master track bit-for-bit accurate.
So if DTS-HDMA = (DTS Core) + (DTS extension), does it make sense that if the typical DTS-HDMA rate is ~3.8 Mbps and the DTS Core is 1.5 Mbps, the DTS extension is about 2.3 Mbps of additional detail? That "seems" like it should be noticeable to me.
You sort of have it, but not exactly. The actual DTS-MA data stored on disc is the DTS Core plus an extension packet. But the final DTS-MA stream is a totally separate entity created on the fly from the data in both the core and the extension; this is why it is such a pain to "decode" - create is probably a better word.
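If it helps, here's a purely conceptual sketch in Python of the core + residual idea. This is not the actual DTS algorithm (that's proprietary); it just shows why the lossless output only exists once the decoder recombines the two parts:

original = [1000, -2003, 512, 77]        # hypothetical PCM samples
core_decoded = [998, -2001, 510, 80]     # what a lossy core decode might give back
residual = [o - c for o, c in zip(original, core_decoded)]   # roughly what an extension would carry

reconstructed = [c + r for c, r in zip(core_decoded, residual)]
assert reconstructed == original          # bit-for-bit only after recombining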
And yes, lossless does take up a lot more space, DTS in particular because of the way it was designed to maintain backwards compatibility in a single bitstream. But that extra space does not translate into an equivalent difference in sound quality, especially considering the percentage increase in bitrate.
For example, take a Blu-ray Disc video encode. You see a big difference between video encoded at 5 Mbps and 10 Mbps, a decent difference between 10 Mbps and 15 Mbps, and a little difference between 15 Mbps and 20 Mbps. By the time you compare 25 Mbps and 30 Mbps, that same 5 Mbps means a whole lot less than it did in the first example (5 Mbps vs 10 Mbps). The same thing happens in audio: you reach a point of diminishing returns. With lossy audio that point is generally around a 5.5:1 compression ratio - once you hit that magic number, the differences you hear as you allocate more and more bitrate become less and less audible. So when you are talking about DTS Core (~3:1) versus lossless, the difference is generally inaudible or so minor you'd never notice it while watching a movie.
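To put some rough numbers on that diminishing-returns point, here's a small Python snippet using the same assumed 4608 kbps (16-bit/48 kHz 5.1) PCM reference as above; the ~3.8 Mbps lossless average is the figure quoted earlier in the thread, not a measurement of mine:

pcm_kbps = 6 * 48_000 * 16 / 1000        # assumed uncompressed reference: 4608 kbps
rates = [("DD 448", 448), ("DD 640", 640), ("DTS Core 1509", 1509), ("lossless avg ~3800", 3800)]
for label, kbps in rates:
    print(f"{label:>20}: {pcm_kbps / kbps:4.1f}:1 compression")
# DD 448 -> 10.3:1, DD 640 -> 7.2:1, DTS Core -> 3.1:1, lossless avg -> 1.2:1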