
1 - 17 of 17 Posts

Registered · 17 Posts
Discussion Starter · #1 ·
Perhaps somebody out there can help me cut through the confusing world of audio formats.


So I've got a new Sony Blu-ray player which I've mated to my Kenwood AV receiver using SPDIF and 6ch analogue....


Been thinking, though, what config ought to give me the best audio performance?


If the software has a 24/48 PCM track I guess that's a no-brainer--use the 6ch out to 6ch in on the Kenwood?


However, what if the software only offers 16/48 PCM? DTS claims transparency @ 1.5 Mbps. So which format is a better representation of the 24/48 master: lossless 16-bit PCM (dithered down) or lossy 24-bit 1.5 Mbps DTS?


What if there is no DTS option on the disc? Dolby claims transparency @ 640 kbps (I believe)--so how does AC3 @ 640 compare to 16-bit PCM--both imperfect representations of the lossless master?
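Doing the arithmetic myself (just a rough sketch--the DTS and DD figures are the nominal rates I've seen quoted, assuming a 6-channel 5.1 track):

```python
# Back-of-envelope bitrate arithmetic for a 6-channel (5.1) track -- illustrative only
def lpcm_bitrate(sample_rate_hz, bits_per_sample, channels):
    """Raw uncompressed LPCM bitrate in bits per second."""
    return sample_rate_hz * bits_per_sample * channels

pcm_16_48 = lpcm_bitrate(48_000, 16, 6)  # 4,608,000 b/s (~4.6 Mbps)
pcm_24_48 = lpcm_bitrate(48_000, 24, 6)  # 6,912,000 b/s (~6.9 Mbps)

# Nominal lossy rates for comparison
dts_core = 1_509_000   # the "1.5 Mbps" DTS core rate
dd_ac3 = 640_000       # max-rate Dolby Digital

print(f"16/48 LPCM: {pcm_16_48 / 1e6:.1f} Mbps")
print(f"24/48 LPCM: {pcm_24_48 / 1e6:.1f} Mbps")
print(f"DD 640k is a {pcm_16_48 / dd_ac3:.1f}x reduction from 16/48 LPCM")
```

So even the "transparent" lossy rates are a 3x-7x data reduction relative to 16/48 LPCM.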


What I further don't understand is: if DTS 1.5 Mbps and AC3 640 kbps are transparent, what is the point of DD+ and DTS-HD HR--both still lossy but somehow 'more transparent' or something??


I guess the other question is am I better off using the Sony's processor or the Kenwood's? When I'm using the 6ch input on the Kenwood, I lose the ability to tell the receiver I have a 'small' center speaker. So say I'm playing an LPCM track (or a DTS track using the Blu-ray player to decode)--I'm feeding my center speaker the full 20 Hz-20 kHz (or whatever)--is that bad for it?

Also, my front and rears are better, but hardly "full range" so again, if I use the 6ch input I can't tell the Kenwood to divert bass to the sub...


Anyway...lots of questions from a relative noob. Perhaps somebody can help me out here...
 

Registered · 6,288 Posts

Quote:
What I further don't understand is: if DTS 1.5 Mbps and AC3 640 kbps are transparent, what is the point of DD+ and DTS-HD HR--both still lossy but somehow 'more transparent' or something??

Maybe DTS recanted their 1.5Mbps transparency claims.
I guess their MA is even more transparent-er.
 

Registered · 3,656 Posts
Are there no bass management settings in the player? That would be odd, especially since at least rudimentary bass management has been available since, oh, 1998. If Sony did not include it, they are shortchanging their customers.
 

Premium Member · 3,133 Posts
The PCM track should ALWAYS be better than the DD or DTS track. TrueHD and DTS-HD MA may be the same or better depending on how they are produced on the disc. However, I don't think your Sony BD can handle those formats anyway.


As Ovation said, the Sony player should have bass management settings for the analog outs.
 

Premium Member · 3,133 Posts

Quote:
Originally Posted by Ovation /forum/post/12886836


Are there no bass management settings in the player? That would be odd, especially since at least rudimentary bass management has been available since, oh, 1998. If Sony did not include it, they are shortchanging their customers.

Tell that to the Samsung 5000 owners....well, Samsung gives you very rudimentary bass management but no distance or level settings. doh!
 

Registered · 3,656 Posts
I don't have an HDMI receiver (and I won't upgrade just for that--I like it too much as is), so whichever Blu-ray player I get will have to have level-matching settings at the very least (I can apply my bass management/time alignment settings on my MCH input, but not levels).
 

Registered · 1,025 Posts

Quote:
Originally Posted by shinookk /forum/post/12892417


I have an HDMI 1.1 receiver and I feel analogue multichannel sounds better than PCM.

By this, I assume you mean that you prefer the sound of your player's digital/analog converters over the sound of your receiver's DACs... correct?


Lee
 

Registered · 16,876 Posts

Quote:
Originally Posted by Ovation /forum/post/12886836


Are there no bass management settings in the player?

This is quite common in these HD players as they simply assume everyone will be using an HDMI connection (or toslink/coax). Problematic now, but in a couple of years, there won't even be any multichannel analog outs, so...............
 

Registered · 17 Posts
Discussion Starter · #10 ·

Quote:
The PCM track should ALWAYS be better than the DD or DTS track. TrueHD and DTS-HD MA may be the same or better depending on how they are produced on the disc. However, I don't think your Sony BD can handle those formats anyway.


An analogy comes to mind:


I'm given a 3000*2000 8bit/ch (6MP) TIFF file (my 'master') and told I have to reduce the size.


I can use LZW or zip to losslessly compress it yielding a perfect copy. (This is like DTS Master Audio)


I can interpolate down to 3MP and re-save as TIFF (lossless, but information is thrown away--just like 16-bit LPCM--no?)


I can leave the image @ 6MP but save as a JPEG with a mild compression ratio (maybe 3:1)--lossy, yes, but I'm betting a more accurate representation of the original than downsampling the TIFF.


Bottom line: 16-bit LPCM is surely as 'lossy' a representation of the 24-bit original as 24-bit DTS is--no? My question: if I had to choose between the two options, which is better? Most photographers I know--given an 8MP DSLR--would choose to save 8MP JPEGs over 4MP TIFFs.
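If it helps, here's how I picture the 'dithered down' step--just a toy sketch assuming TPDF dither, not anyone's actual mastering chain:

```python
import random

def dither_24_to_16(sample_24, rng=random.random):
    """Requantize a 24-bit sample value to a 16-bit value with TPDF dither.

    One 16-bit LSB corresponds to 2**8 = 256 at 24-bit scale; TPDF dither is
    the sum of two uniform random variables spanning +/-1 LSB of the target.
    """
    lsb16 = 256
    dither = (rng() + rng() - 1.0) * lsb16
    q = round((sample_24 + dither) / lsb16)
    return max(-32768, min(32767, q))  # clamp to the 16-bit range

random.seed(0)
print(dither_24_to_16(1_000_000))  # ~3906 (1_000_000 / 256), give or take a couple of LSBs
```

The low 8 bits really are thrown away; the dither just turns the quantization error into benign noise instead of distortion.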
 

Premium Member · 3,133 Posts
Your analogy is a poor one. A TIFF resize/compression is not perception-based like DD/DTS codecs are. DD/DTS don't just randomly throw out data, they are scientifically based compression schemes.


Also, I'm not sure where your 24-bit to 16-bit comparison is coming from. Most soundtracks are not mastered in 24-bit in the first place. Another thing: less compression is better. I would not be surprised if a DD 640 kbps encode from a 16-bit master sounded as good as or better than the same encode from a 24-bit master.


Even in your 16 vs 24-bit scenario, do you honestly think a 640 kbps or 1.5 Mbps track is going to sound as good as or better than a several-Mbps PCM track? The only time a compressed codec may be better than PCM is if there is a TrueHD or DTS-HD MA track from a 24-bit master vs a 16-bit downrezzed PCM track.


Of course, why bother posting here and debating when you could just listen for yourself and see?
 

Registered · 17 Posts
Discussion Starter · #12 ·

Quote:
Originally Posted by rynberg /forum/post/12900676



Even in your 16 vs 24-bit scenario, do you honestly think a 640 kbps or 1.5 Mbps track is going to sound as good as or better than a several-Mbps PCM track? The only time a compressed codec may be better than PCM is if there is a TrueHD or DTS-HD MA track from a 24-bit master vs a 16-bit downrezzed PCM track.


Of course, why bother posting here and debating when you could just listen for yourself and see?

I'm not interested in 'debating' --I'm just trying to understand this stuff. I don't have a background in audio. In general, I just like to know how stuff 'works'.


It doesn't seem so far-fetched to me that lossy 24-bit could sound better than lossless 16-bit (when coming from a 24-bit master). I think most people would choose a 16-bit AAC rip from a CD (@ 128 kbps) over 8-bit LPCM (@ ~706 kbps). Isn't this why we use perceptual codecs--because they make better decisions about what to throw away and what to keep (vs. simply downsampling)?


Granted, there is a lot in this field I don't know but it can't be that absurd to wonder whether a higher bit-rate lossy codec could actually outperform lower bit-rate uncompressed audio.
 

Registered · 1,857 Posts
As I understand it, going from 16 to 24 bits results in an increase in the S/N ratio, but people have pointed out that many if not most people at home cannot 'hear' this. And that is before we get to the issue of lossy compression. Also, the difference between 16 and 24 bits isn't necessarily the same as between 8 and 16 bits.
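To put rough numbers on that S/N point--using the usual ~6.02 dB-per-bit rule of thumb for an ideal quantizer:

```python
def dynamic_range_db(bits):
    """Approximate dynamic range of ideal PCM quantization: ~6.02 dB per bit.

    (The full textbook figure for a full-scale sine adds about 1.76 dB on top.)
    """
    return 6.02 * bits

print(f"8-bit:  ~{dynamic_range_db(8):.0f} dB")   # ~48 dB -- why 8-bit audio is audibly noisy
print(f"16-bit: ~{dynamic_range_db(16):.0f} dB")  # ~96 dB
print(f"24-bit: ~{dynamic_range_db(24):.0f} dB")  # ~144 dB
```

So 8-to-16 bits takes you from clearly audible noise to roughly the limits of a quiet room, whereas 16-to-24 bits adds headroom most home playback can't reveal.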


According to the Unofficial Blu-ray Audio and Video Specifications Thread, there is only one BD that has LPCM 7.1 6144Kbps 16/48 and DTS-ES 6.1 1509Kbps 24-bit. There is another BD that has LPCM 5.1 4608Kbps 16/48 and DTS-HD High Resolution 5.1 3018Kbps 24-bit. There isn't a BD with LPCM 5.1 16/48 and DTS 1.5Mbps. So I'd not really be too bothered with the LPCM v. lossy DTS issue.


For DD, LPCM 5.1 16/48 is 4608Kbps whereas DD AC3 5.1 is 448 or 640Kbps, so it's not too hard to imagine that with such a drastic reduction in bitrate--a seven-fold reduction in the case of DD 640Kbps--DD is nowhere near transparent. In the graphics analogy, you went from a 6MP TIFF to 3MP--are we talking about a similar magnitude of compression for any meaningful comparison?


In any case, movie soundtracks aren't always music, so I'd have thought the differences between LPCM and DD would vary during the course of the movie. In quiet or silent scenes perhaps there won't be any discernible differences.


Back to the graphics analogy, the effect of jpeg compression varies with the degree of compression but it also depends on what is in the graphic. So it might also be the case that some music and soundtracks compress better than others.


I haven't seen anyone in the Blu-ray & HD DVD areas arguing about whether DD 640Kbps is transparent to LPCM--if indeed there is anything left to say about it. People have instead discussed 16-bit v. 24-bit LPCM and LPCM v. advanced lossless codecs.


If you're still in doubt, why not ask people with authority, like Roger Dressler from Dolby and FilmMixer from the Hollywood studios? They are around in the HD forums but hardly come to the Surround Music forum. In fact, threads like this (on the HD movie audio formats) should belong there, not here. I thought this forum was for surround music, as in SACD and DVD-A, not movie audio tracks, but we're getting more threads on the latter.
 

Premium Member · 3,133 Posts

Quote:
Originally Posted by sjblakey314 /forum/post/12903135


I'm not interested in 'debating' --I'm just trying to understand this stuff. I don't have a background in audio. In general, I just like to know how stuff 'works'.


It doesn't seem so far-fetched to me that lossy 24-bit could sound better than lossless 16-bit (when coming from a 24-bit master). I think most people would choose a 16-bit AAC rip from a CD (@ 128 kbps) over 8-bit LPCM (@ ~706 kbps). Isn't this why we use perceptual codecs--because they make better decisions about what to throw away and what to keep (vs. simply downsampling)?


Granted, there is a lot in this field I don't know but it can't be that absurd to wonder whether a higher bit-rate lossy codec could actually outperform lower bit-rate uncompressed audio.

My point was, you have the player, try both and see for yourself!


My other point was that you seem to think that 24-bit masters with 16-bit PCM tracks on the disc are common, and that isn't so. Again, your audio analogy doesn't apply: there is a WORLD of difference between a 16 vs 24-bit comparison and a 16 vs 8-bit comparison. A WORLD of difference.


Without trying for yourself (if that is even possible, as kilian points out), you can debate all day long. But I'll put money down that a 16-bit uncompressed PCM track sounds better than a DD/DTS track, whether it's from a 16 or 24-bit master.
 

Registered · 326 Posts
Basically it all depends on what sounds good to YOU. Every format that you mentioned -- including DTS -- meets or exceeds the capabilities of the mix facility in which the original was made. DTS waveforms, when analyzed and compared with the original waveform input, match almost bit-for-bit.


The PCM is the original. A first generation DTS or DD data compression is nearly a perfect representation of the PCM.


Yeah, you can compare it to JPEG compression, but only if you were to use JPEG at a level at which you can't see any visible difference between the original and the result.


At 384, 448, and 1 meg data rates, the actual difference in SOUND is inaudible. And yes, I have some of the best ears in the business.


-John
 

Registered · 29,988 Posts

Quote:
Originally Posted by sjblakey314 /forum/post/12899648


16-bit LPCM is surely as 'lossy' a representation of the 24-bit original as 24-bit DTS is--no?

Yes it is. PCM tracks are uncompressed, but not always lossless (those two terms are often conflated). For example: the Blu-ray of 'Casino Royale' has a 16-bit PCM track, which likely came from a 24-bit original. However, if they created a 16-bit encoding master for the home video PCM track, then they probably used the same encoding master for the lossy and lossless compressed tracks. So if you see a 16-bit PCM track accompanied by a 24-bit lossy track, chances are that the 24-bit words are 16-bit with some zeros added. From everything I've read, studios typically don't create two encoding masters for home video release.
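To illustrate the 'zeros added' point (a sketch of the idea, not what any authoring tool literally does):

```python
def pad_16_to_24(sample_16):
    """Place a 16-bit sample in a 24-bit container by zero-filling the low byte."""
    return sample_16 << 8

def unpad_24_to_16(sample_24):
    """Dropping the low byte recovers the original exactly -- nothing was gained."""
    return sample_24 >> 8

s = 0x1234
assert unpad_24_to_16(pad_16_to_24(s)) == s  # round-trips losslessly
print(hex(pad_16_to_24(s)))  # 0x123400 -- the low byte is all zeros
```

A '24-bit' track built this way carries exactly the same information as the 16-bit one, just in bigger words.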
Quote:
If I had to choose between the two options which is better?

As others have mentioned, you'd have to listen to compare. Difficult to predict which will sound better on your system, to your ears, when so many variables exist (degree of compression, efficiency of codec, fidelity of original, etc).
Quote:
I guess the other question is am I better off using the Sony's processor or the Kenwood's?

Unless a listening comparison demonstrated huge differences, I would continue to use the S/PDIF connection you're currently using. For me, the convenience of transmitting the signal digitally (and using all of the receiver's features) would outweigh the possibly better sound quality from the 6-channel analogue outputs.


Sanjay
 