Originally Posted by FilmMixer
Not to open this can of worms again...
But until you've done a proper level-matched A/B test, you aren't really "hearing the difference..."
You might prefer it (and in this day and age, there is no reason not to have a lossless encode), but that doesn't mean you'd reliably be able to pick out the master vs. lossless vs. lossy when all things are equal (including the master, dialog normalization and, most importantly, level).
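Since level matching is called out as the most important variable, here's a minimal sketch of what it means in practice (my own illustration, not a description of FilmMixer's procedure): scale one clip so its RMS level equals the other's before comparing, because even a fraction of a dB of extra loudness tends to be perceived as "better."

```python
import math

def rms(samples):
    """Root-mean-square level of a list of sample values."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def match_level(reference, test):
    """Scale `test` so its RMS level equals the reference's."""
    gain = rms(reference) / rms(test)
    return [s * gain for s in test]

# Toy signals: the second is the same waveform, 6 dB hotter.
ref = [0.5, -0.5, 0.5, -0.5]
loud = [1.0, -1.0, 1.0, -1.0]
matched = match_level(ref, loud)
print(round(rms(matched), 3))  # → 0.5, same RMS as the reference
```

Real comparisons would use loudness-weighted measurement rather than raw RMS, but the principle is the same: equalize level first, then listen.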
In addition, the placebo of knowing what you are listening to is way too powerful...
And remember these tests were performed at the respective companies, in tuned and calibrated rooms.. while they may have been trying to prove the superiority of their respective lossy codecs, these tests, and those I've been involved in, produce almost unanimous results.... that unless you created/mixed the material yourself, a high bit rate lossy codec (e.g. 640 kbps DD or 1509 kbps DTS) will be very difficult to distinguish from the master most of the time.
And in regards to LFE, this is the area least touched by encoders in terms of data reduction... what is done to the signal by the encoder (i.e. roll off, crossover, filtering) is another matter...
I have a long standing offer:
Come visit me on my mixing stage in Hollywood... if you can reliably pick out the lossy DD encode vs. the master (I'll make the percentage 50% of the time) there's a Morton's steak dinner in it for you..
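For context on what a percentage like that means statistically, here's a rough sketch (my addition, with made-up trial counts): in a blind ABX test, 50% correct is exactly what coin-flip guessing produces, which is why demonstrating a real audible difference takes well above 50% over many trials.

```python
from math import comb

def p_value(correct, trials):
    """One-sided binomial tail: probability of scoring at least
    `correct` out of `trials` by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 8/16 (the 50% mark) is indistinguishable from guessing:
print(round(p_value(8, 16), 3))   # → 0.598
# 12/16 (75%) is strong evidence of an audible difference:
print(round(p_value(12, 16), 3))  # → 0.038
```

In other words, hitting 50% wins nothing in statistical terms; a listener would need roughly 12 of 16 correct before chance becomes an unlikely explanation.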
This, of course, is only my opinion, and I am in no way being dismissive of anybody else's... in this day and age, almost all Blu-ray material is lossless, as is D-Cinema (which is exponentially growing in its adoption...)
As a mixer, I am happy to move away from lossy encodes on our theatrical presentations (and they are, in regards to DTS and Dolby, much more of a compromise than what is available even on DVD...)
It's just that, in my experience, it's a conclusion most make without any kind of objective conditions or controls, a highly subjective one, clouded by the rationale that lossless must sound better.