Originally Posted by darklordjames
"What I dispute is that there is no way to encode discrete information and have it come out as discrete in a matrix decoder (at least the Dolby variety we are talking about...)
Again, you are very simply, and very provably incorrect. This isn't a debatable point.
First page: "Although it is similar in design to the standard Dolby Surround system, there are significant improvements in the newer version. In addition to a much-improved steering action leading to more stable imaging, the surround channels are full bandwidth and stereo. This makes the system a 5-2-5 encoding system."
Oh look. Discrete math for matrixing and decoding discrete surround channels. What a surprise. I truly am shocked. Really.
"Regardless of the output of the decoder (5.1) you are always hampered by only having 2 channels of input as source... the logic is only so good, thus bleed and steering errors will occur.
Errors will occur? Yup. You know where else errors occur? Dolby Digital streams. Pretty much every DVD up until about 2000 was 384k Dolby Digital, produced with pretty crap mixing hardware and littered with nasty aliasing artifacts. Would you now like to argue that those aren't proper surround tracks either, simply because they were bit-starved? It would make about as much sense as your current line of reasoning.
ProLogic II encoding out of the Wii, PS2 and Gamecube is good enough. As I have said many times here, take similar material such as Resident Evil 4 on Wii spitting out a PLII encode and Silent Hill 4 on the original Xbox spitting out a discrete DD5.1 encode, and your results will be very similar. Metroid Prime versus Halo 2 also gives a very similar surround experience. The thing holding back Wii audio was that it was a Wii. There was simply not enough processing power available for newer effects such as proper occlusion and whatnot. ProLogic II as the transport stream for surround though? Perfectly serviceable and, more importantly for your original claim, true, proper 5.1 surround. On all of the 480p consoles, ProLogic II encodes exceed the console's ability to make believable sound. There are certainly deficiencies in the audio, but the matrixed encode is not the restricting factor.
James.. what happens when you play a 2 channel source and apply a matrix decoder to it?
The common information ends up in the center and the out-of-phase information goes to the surrounds.. it's unavoidable, even if you placed that 2-channel source into a 5.1 container pre-encoding.
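A minimal sketch of why that happens, assuming nothing more than a passive matrix decode (this is my own illustration in Python, not Dolby's actual steering logic): the sum of the two channels feeds the center, the difference feeds the surround.

```python
# Passive matrix decode sketch (simplified; real Pro Logic adds
# active steering on top of this). Sum = common information,
# difference = out-of-phase information.

def passive_decode(left, right):
    """Decode one stereo sample pair into (L, R, C, S)."""
    center = 0.5 * (left + right)    # in-phase (common) content
    surround = 0.5 * (left - right)  # out-of-phase content
    return left, right, center, surround

# Identical signal in both channels -> it all lands in the center:
print(passive_decode(1.0, 1.0))   # (1.0, 1.0, 1.0, 0.0)

# Out-of-phase signal -> nothing in the center, all in the surround:
print(passive_decode(1.0, -1.0))  # (1.0, -1.0, 0.0, 1.0)
```

Feed any plain stereo mix through that and the steering follows the phase relationships of the mix, whether you wanted it to or not.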
The leaps over DPL were great... they reduced crosstalk (especially on center-channel dialog), enhanced steering, gave users some usable customization, dropped the filtering on the surrounds, etc..
I'm not sure what your definition of discrete is.. if you believe that you can send a 5.1-channel input into a PLII encoder and the output will retain the discrete positioning of all elements, you are mistaken..
That is not debatable, and I've never heard anyone on the production side claim otherwise..
The results will be pleasing in most cases, but it is always a compromise.. you run out of headroom fairly quickly (2 channels trying to reproduce 6), there is no discrete LFE channel, etc..
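To put a number on the information-loss point, here's a rough Python sketch of a Surround-style downmix. The coefficients are the common -3 dB mix levels, and I'm deliberately omitting the 90-degree phase shifts the real PLII encoder applies to the surrounds, so treat this as an illustration only. The takeaway: two very different 5-channel layouts can collapse to the identical Lt/Rt pair, so no decoder can recover discrete placement.

```python
import math

G = 1 / math.sqrt(2)  # -3 dB mix coefficient

def encode(L, R, C, Ls, Rs):
    """Collapse five channels into a 2-channel Lt/Rt pair.
    Simplified sketch: the real encoder phase-shifts the surrounds,
    and note there is no LFE input at all -- the .1 has nowhere to go."""
    lt = L + G * C - G * Ls
    rt = R + G * C + G * Rs
    return lt, rt

# Content panned hard left and right at -3 dB...
a = encode(L=G, R=G, C=0.0, Ls=0.0, Rs=0.0)
# ...versus content placed only in the center...
b = encode(L=0.0, R=0.0, C=1.0, Ls=0.0, Rs=0.0)

# ...produce exactly the same two-channel signal:
print(a == b)  # True
```

Once two different inputs map to the same output, "discrete" is off the table by definition; the decoder can only make a best guess.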
For full disclosure, I've got 22+ years in the sound production business, and I've consulted for Dolby in the past (however not regarding the matrix products.) I'm not sure where your knowledge base comes from.
Maybe you're missing my point, maybe we're debating semantics.. I don't know.
I've made PLII encodes of well over 20 films I've mixed (and I've mixed over 150...), so I know what happens to the surrounds and the L/R when you do it...
It's not always "unpleasant", but it can hardly be described as a discrete representation of what went into the encoder..
It is, to me at least, easily discernible as different from the 5.1 masters it comes from.
For you to claim that artifacts would be induced into a 384k DD stream via "crap mixing hardware" tells me you don't have first-hand production knowledge of such things..
If that were the case (and I don't agree with your position that it applied to "pretty much every" 384kbps DD title until 2000), it would be due to less-than-ideal source tracks, incompetence, or faulty encoding equipment.. mixing equipment wouldn't factor into the reasons why..
The errors induced by matrix encoding are wholly different from what happens when you lower the bitrate in a CBR encode... don't try to equate the two.
Of course I fully agree with your assessment of why some console hardware is much more capable than others in how they handle audio..
The real-time APIs for doing DD and DTS mixing on the fly take a bit of power to run on top of everything else...
My recollection of what Retro and others were doing "back in the day" with PLII encoding was that it applied only to "pre-rendered" elements, and was not real-time encoding driven by the user's interaction..
I don't think the Wii handles it any differently.. (i.e. there is no real-time matrix encoding going on..)
If you have some detailed resources about its capabilities I'd be interested in reading them.
Bonus points for deleting your last post with its erroneous content.
You are correct that it contained a bunch of erroneous information in regard to my response to you... I was on my phone, misread your statement, and replied accordingly.. the information was pertinent to what I was talking about, and not to what you had posited.
Since you didn't quote it, I thought it had little relevance to the conversation and I deleted it.
Sorry to all the others for the thread derail..
And to DLJ.. have a good weekend.