
·
Registered
Joined
·
921 Posts
Discussion Starter · #1 ·
Could someone explain to me why DD 5.1 and LPCM 5.1 with the same receiver, same playback device, and same source material would result in significantly different sound mixing?


My playback device is a PS3, my receiver is a Sony STR-DN1000 with HDMI audio, and my source is a demo for the game Castlevania, which has a native DD 5.1 soundtrack.


I can change the PS3 audio settings so that game will either output in DD 5.1 or LPCM, per my receiver's display.


In LPCM mode, I have a stronger bass sound. In DD 5.1 mode, the bass is weaker, but the surround channels are mixed considerably more loudly and "cleanly". I have switched back and forth with the same results. All other settings are left exactly the same between tests. Shouldn't they sound identical? We are talking about 100% digital data here!
 

·
Registered
Joined
·
18,923 Posts
DD has dialnorm. If memory serves, dialnorm in a properly implemented system will almost always result in a reduction in volume versus what you would hear if there were no dialnorm (or if it were set such that no adjustment would occur).


PCM would not have this.


If your receiver can display the dialnorm value, you can probably figure out how to level match PCM to DD (raise the volume for DD or lower it for PCM,) and then try to compare them.
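If the receiver does show a dialnorm value, the offset is simple arithmetic. Here is a minimal sketch in Python (assuming the usual convention that dialnorm is stated in dBFS from -1 to -31, with -31 meaning no attenuation):

Code:
def dialnorm_attenuation_db(dialnorm_dbfs: int) -> float:
    """How much a decoder turns the track down for a given dialnorm value."""
    if not -31 <= dialnorm_dbfs <= -1:
        raise ValueError("dialnorm must be between -31 and -1 dBFS")
    return dialnorm_dbfs + 31.0  # -31 -> 0 dB, the common -27 -> 4 dB

# Example: a track flagged at -27 dBFS is attenuated 4 dB by the decoder,
# so raising the master volume 4 dB for the DD bitstream (or lowering it
# 4 dB for PCM) should roughly level match the two.
print(dialnorm_attenuation_db(-27))  # 4.0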
 

·
Registered
Joined
·
921 Posts
Discussion Starter · #3 ·
Fascinating... does dialnorm have anything to do with dynamic range compression? I fiddled with those settings, but they really didn't make a difference to my findings above.
 

·
Registered
Joined
·
921 Posts
Discussion Starter · #4 ·
Reading up on dialnorm a bit... if the original source was DD 5.1 encoded, but is being decoded by my PS3 as opposed to my receiver, wouldn't the PS3 respect the original dialnorm information as the receiver seems to be doing? Or am I missing something?
 

·
Registered
Joined
·
219 Posts
I believe what you are saying is that when the PS3 decodes the DD 5.1 to LPCM it sounds different than when the PS3 bitstreams the DD 5.1 and your receiver decodes it, correct?


If so, it may be that your receiver only applies your level settings and bass management to the bitstreamed data?
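If bass management really is applied to one input type and not the other, that alone could explain the weaker bass on one path. A minimal sketch of what bass management does (Python, assuming an 80 Hz crossover and all mains set to "small"; slopes and gains vary by receiver):

Code:
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48_000         # sample rate, Hz
CROSSOVER_HZ = 80   # a common default crossover point

# 4th-order Butterworth split; real receivers typically use
# Linkwitz-Riley slopes, but this is close enough to illustrate.
hp = butter(4, CROSSOVER_HZ, btype="highpass", fs=FS, output="sos")
lp = butter(4, CROSSOVER_HZ, btype="lowpass", fs=FS, output="sos")

def bass_manage(mains: dict, lfe: np.ndarray):
    """Redirect lows from 'small' speakers to the sub and add the LFE channel."""
    managed = {name: sosfilt(hp, ch) for name, ch in mains.items()}
    sub = lfe * 10 ** (10 / 20)  # LFE gets +10 dB in-band gain
    sub = sub + sum(sosfilt(lp, ch) for ch in mains.values())
    return managed, sub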
 

·
Registered
Joined
·
18,923 Posts
Seems I noticed the same thing in the past - that the PS3 was louder with one than the other. My only guess was that the PS3 was not properly applying dialnorm when it did the decoding. Interesting question, but I really have no answer.


I have my PS3 set to do all decoding, by the way, because then lossless audio is decoded. Seeing how my PS3 can't send lossless bitstream, it's the simplest solution. And IMO, it works just fine, even though there's apparently a difference in levels (if you compare them at the same level, by compensating with the volume control, you should not hear any real difference.)
 

·
Registered
Joined
·
1,211 Posts

Quote:
Originally Posted by fullhorn /forum/post/19555715



If so it may be that your receiver only applies your level settings and bass management to the bitstreamed data?

I bet it has to do with this for sure. If I remember correctly, many receivers wouldn't apply the room correction system to a non-bitstreamed signal.
 

·
Registered
Joined
·
30,788 Posts

Quote:
Originally Posted by the rick /forum/post/19556245


I bet it has to do with this for sure. If I remember correctly, many receivers wouldn't apply the room correction system to a non-bitstreamed signal.

Not so. Many receivers will not apply room correction to analog signals (unless they can re-digitize them). AFAIK, all receivers that do room correction can do it with all digital signals.
 

·
Registered
Joined
·
18,923 Posts

Quote:
Originally Posted by the rick /forum/post/19556245


I bet it has to do with this for sure. If I remember correctly, many receivers wouldn't apply the room correction system to a non-bitstreamed signal.

Back in 2006 or 2007 that was true of some receivers. The opposite is now more often true - some receivers can't apply some processing to lossless bitstream as their DSP power is not up to that task.
 

·
Registered
Joined
·
1,857 Posts
The limitations in processing that I've read of only apply to higher sample rates like 192kHz; I very much doubt it is of any relevance to the OP's case here as his source is only DD, which isn't likely to be even 96kHz.


Just try another BDP first to see if it's PS3-specific, or have a look in the PS3 forum. It isn't that difficult to investigate issues like this in a methodical manner.
 

·
Registered
Joined
·
14 Posts
As someone who works in the audio post production field and gets to hear the mixes coming off the console, I can tell you that there is a huge difference between the 6-channel 24/48 stream and the one that is made during the Dolby printmaster.


Here are a few reasons why:


1. Dolby Digital, with all of its good points, is a very lossy format. Think of the difference between a CD and a regular MP3 from iTunes. Not only does this make it sound worse, it can affect dynamic range, again, much like MP3.



2. Also, during this final Dolby printmaster process, in many dynamic scenes a 'container' or limiter is applied to prevent clipping (a rough sketch of what that does to peaks follows this list). This is controlled by the Dolby engineer and applied at their discretion. This could change many times throughout the film.


3. What's on the PCM and DD streams is not consistent from disc to disc. That PCM audio may be the actual 6 files from the mix, meant for theatrical release. It might have been massaged for the home consumer by a third party. The DD stream might be the same, or it might be treated differently. It depends on budget, time, and the whims of the producers... This might make things sound different...
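On point 2, a crude stand-in for what that protective limiter does to peaks (real limiters use look-ahead and attack/release smoothing rather than a hard clamp; this is only to show why loud passages come out of the printmaster with less peak dynamic range):

Code:
import numpy as np

def hard_limit(x: np.ndarray, ceiling_dbfs: float = -0.1) -> np.ndarray:
    """Clamp samples to a ceiling so nothing clips downstream."""
    ceiling = 10 ** (ceiling_dbfs / 20)   # convert the dBFS ceiling to linear
    return np.clip(x, -ceiling, ceiling)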


As to the comments about dialnorm: it is simply a metadata 'flag' that tells the decoder how loud the program material is (decided by a proprietary Dolby algorithm during the mixing and output process, based on dialog level) and how loud it should be played. It is meant to help keep program volumes roughly the same from DVD to DVD and from TV movie to TV movie. It is a single number for the entire piece. It is for broadcast, DVD, and Blu-ray only, not for theater playback. Think of it as a piece of paper that someone hands you: the movie will be ..... loud. You say, well, if that's the case, I'll put my output at .... level and it should match other material. It should not affect the end user in any way.


Good ears for the catch; the vast majority of consumers miss this kind of stuff...
 

·
Registered
Joined
·
4,819 Posts
miilleman,


thanks for all the info, but that really doesn't apply here since the same DD 5.1 track is being played; it is just a matter of decoding in the player or in the receiver. So in this case there should be no difference.
 

·
Registered
Joined
·
1,807 Posts

Quote:
Originally Posted by miilleman /forum/post/19566393


As someone who works in the audio post production field and gets to hear the mixes coming off the console, I can tell you that there is a huge difference between the 6-channel 24/48 stream and the one that is made during the Dolby printmaster.


Here are a few reasons why:


1. Dolby Digital, with all of its good points, is a very lossy format. Think of the difference between a CD and a regular MP3 from iTunes. Not only does this make it sound worse, it can affect dynamic range, again, much like MP3.

Keep in mind that most of your experience w/ Dolby Digital is, ostensibly, w/ theatrical Dolby Digital, which is limited to one bit rate: 320 kbps. For a primary audio track on BD, Dolby Digital can utilize a multitude of bit rates but is typically maxed out at 640 kbps, sometimes 448 kbps, rarely 384 kbps. And critical listening tests have shown that the audible difference between high bit rate Dolby Digital and PCM is marginal, at best. For example, see this oft cited article:

http://www.hemagazine.com/node/Dolby...compressed_PCM
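For a rough sense of the data rates involved (back-of-the-envelope only, assuming a 5.1 LPCM track at 48 kHz / 24-bit):

Code:
channels = 6            # 5.1 = five full-range channels plus LFE
sample_rate = 48_000    # Hz
bit_depth = 24          # bits per sample (BD LPCM is 16 or 24)

lpcm_kbps = channels * sample_rate * bit_depth / 1000
dd_kbps = 640           # typical ceiling for Dolby Digital on BD

print(f"LPCM 5.1: {lpcm_kbps:.0f} kbps")     # 6912 kbps
print(f"DD 5.1:   {dd_kbps} kbps, roughly {lpcm_kbps / dd_kbps:.0f}:1 smaller")  # ~11:1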


AJ
 

·
Registered
Joined
·
14 Posts
Sorry, I thought the original poster was referring to the sound of DD versus the 6 (if it's 5.1) channels of raw LPCM audio that are on many Blu-ray discs themselves. If the original poster was referring to that, it would explain a lot, I would think, as the playback of those six files would represent a huge difference in audio.


I guess in this case, it goes to show that decoders can be very different.
 

·
Registered
Joined
·
12,417 Posts

Quote:
Originally Posted by Flavius /forum/post/19554874


Could someone explain to me why DD 5.1 and LPCM 5.1 with the same receiver, same playback device, and same source material would result in significantly different sound mixing?


My playback device is a PS3, my receiver is a Sony STR-DN1000 with HDMI audio, and my source is a demo for the game Castlevania, which has a native DD 5.1 soundtrack.


I can change the PS3 audio settings so that game will either output in DD 5.1 or LPCM, per my receiver's display.


In LPCM mode, I have a stronger bass sound. In DD 5.1 mode, the bass is weaker, but the surround channels are mixed considerably more loudly and "cleanly". I have switched back and forth with the same results. All other settings are left exactly the same between tests. Shouldn't they sound identical? We are talking about 100% digital data here!

There is no reason these two playback methods should sound any different.


Is the PCM mode reading as 5.1 into the AVR? Stereo?


Might be worth trying the same test with a movie DVD.
 

·
Registered
Joined
·
14 Posts

Quote:
Originally Posted by WiWavelength /forum/post/19566527


Keep in mind that most of your experience w/ Dolby Digital is, ostensibly, w/ theatrical Dolby Digital, which is limited to one bit rate: 320 kbps. For a primary audio track on BD, Dolby Digital can utilize a multitude of bit rates but is typically maxed out at 640 kbps, sometimes 448 kbps, rarely 384 kbps. And critical listening tests have shown that the audible difference between high bit rate Dolby Digital and PCM is marginal, at best. For example, see this oft cited article:




AJ

Obviously, this has to be thought of, and I'm sure bit rates have been discussed and debated ad nauseam. But no matter what media you are discussing, the DD that comes off the DMU, or that is encoded on the Blu-ray, could be an issue for the OP and is worth mentioning.
 

·
Registered
Joined
·
1,807 Posts

Quote:
Originally Posted by Kilian.ca /forum/post/19560073


The limitations in processing that I've read of only apply to higher sample rates like 192kHz; I very much doubt it is of any relevance to the OP's case here as his source is only DD, which isn't likely to be even 96kHz.

Roger Dressler can probably confirm this, but I believe that Dolby Digital encoders accept a variety of word lengths & sample rates but Dolby Digital decoders always output 16 bit 48 kHz. I come to this conclusion mostly through empirical evidence, though, as I have just never encountered a Dolby Digital 24 bit or 96 kHz (or, for that matter, 44.1 kHz) track in the wild.


The OP uses the PS3. I am not that familiar w/ the PS3, as I do not use a video game system to play BDs. But if what I recall reading is accurate, the PS3 offers several sample rate conversion options. Could the OP have the PS3 set to decode Dolby Digital and upsample to 96 kHz or 192 kHz? If that is possible, the high sample rate could be beyond the Sony AVR's DSP capability to apply room correction, EQ, etc.


AJ
 

·
Registered
Joined
·
9,003 Posts
First, dialnorm is not a factor here. The PS3 decoder applies the dialnorm offset just the same as the receiver. Besides, dialnorm simply turns down the master volume a few dB. It would not cause the differences described here. In a similar vein, I've seen lots of posts over the years observing that the volume of the PCM from a PS3 is lower than the same track decoded by a receiver. But, that's easily fixed by turning up the volume a bit.


Beyond that, there should be no difference when the same DD 5.1 track is decoded in the player or the receiver as long as the receiver does the same processing to PCM and bitstream inputs.


I suggest you get a calibration disc with test tones and bass sweeps and spend some time measuring and listening to the outputs to see if there really is any difference when you decode in your player vs. your receiver.
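If you can capture the receiver's output (or a line-level tap) while the same test tone plays through each path, the level comparison is only a few lines. A sketch in Python, with hypothetical file names for the two captures:

Code:
import numpy as np
import soundfile as sf   # scipy.io.wavfile would also work

def rms_dbfs(path: str) -> float:
    """Average RMS level of a capture, in dB relative to full scale."""
    x, _ = sf.read(path)
    if x.ndim > 1:
        x = x.mean(axis=1)               # fold to mono for a single number
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

# Hypothetical captures: the same tone, decoded by the PS3 vs. the AVR.
pcm = rms_dbfs("ps3_decodes_lpcm.wav")
dd = rms_dbfs("avr_decodes_dd.wav")
print(f"PCM {pcm:.1f} dBFS vs DD {dd:.1f} dBFS -> offset {pcm - dd:.1f} dB")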
 

·
Registered
Joined
·
12,417 Posts

Quote:
Originally Posted by WiWavelength /forum/post/19566717


Roger Dressler can probably confirm this, but I believe that Dolby Digital encoders accept a variety of word lengths & sample rates but Dolby Digital decoders always output 16 bit 48 kHz. I come to this conclusion mostly through empirical evidence, though, as I have just never encountered a Dolby Digital 24 bit or 96 kHz (or, for that matter, 44.1 kHz) track in the wild.

Dolby Digital encoders do not have sample rate conversion, so they must be fed at the desired sample rate for the encoded signal--32, 44.1, or 48 kHz. DVDs and DTV are always 48 kHz.


The encoder internally treats all sources as 24-bits, but of course it's lossy encoding. The decoded wordlength is a function of the specific decoder, but any modern decoder outputs 24-bit words. Back in the 90s there were 20-bit and even 16-bit decoders.
 