Originally Posted by Jeremy Anderson
I never said it's a huge technical challenge. I'm saying no one's going to bother, because movies don't need them. Period. It's not supported by default because there's absolutely no reason to support 9.1 PCM. There is no 9.1 PCM content, nor any plan on the part of content creators to create such. So you want a game console to basically implement support for 9.1 or 11.1 discrete channels when no existing equipment supports it? And you want AVR manufacturers to then spend time implementing that for your one niche application that will eventually be made obsolete by object-oriented sound anyway, and which movies will never use? Because no offense... that's kinda' putting the chicken before the egg.
AVRs and A/V devices are chock full of niche features that have nothing to do with movies: multi-zone output, dual HDMI out, various DSP modes, 24-bit/192kHz support, just tons upon tons of stuff to one-up the competition in the numbers game. Movies haven't been begging for 192kHz support, and yet it's there. They'll add the higher-channel discrete LPCM just to say they have it, even if nothing ever uses it. I'm sure BD players will follow suit with their own on-board matrixing and output over the discrete channels, just so they can say their new player supports 11.1.
Originally Posted by Jeremy Anderson
If you're not mixing source files at the same resolution as the output, then what's the point? So you can reduce distortion above 48kHz... that no monitor can even produce? Because that's basically what you're talking about... a sample rate that lets you reduce distortion up to an inaudible frequency range. Even people into audio mastering think that's overkill. 24/48 covers you up to 24kHz and human hearing, short of people who have never heard a sound before, tends to run to about 22kHz. So again... WHY? If we were talking 16-bit vs 24-bit, maybe I'd be with ya' on this... but do you seriously see this as some kind of deficit? If so, I guess there really is no satisfying you.
No, it reduces distortion within the audible range. For a single track it doesn't make a difference; like you say, humans can't hear that high. But mix multiple tracks at the same time and the errors at the high end of the frequency range start to compound, and that becomes audible. Games can mix 100+ tracks at once to create the final output, and all those little errors add up. At a 96kHz mix rate, the junk that per-track processing generates mostly lands in the inaudible 24-48kHz band, where the final downsample's low-pass filter removes it, instead of folding back down into the audible range the way it does at 48kHz. It's one of the reasons movies tend to sound more detailed than games: they're mixing at higher sample rates. At the very end they can downsample the entire mix to 48kHz to suit human hearing limits and save disc space, and the final result is much better than if they had mixed at 48kHz all the way through. The only reason I'd want it to be able to output at 96kHz is that the downsampling is just an unnecessary extra processing step, since there's no need to save disc space in a real-time mix. This isn't something that requires any future support from any industry body; it's just a higher-quality mix that everyone gets the benefit of.
Plain and simple: if you want the highest-quality 7.1 48kHz sound on the A/V gear you're using today, you'd want to see them mixing internally at 96kHz. Not that mixing at 48kHz sounds terrible... it's merely "good enough."
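If it helps, here's roughly what that workflow looks like, sketched in Python with numpy/scipy. The track count, the rates, and the pitch-shift-by-resampling stage are just my own illustrative stand-ins, not anything from an actual console's audio pipeline:

[code]
# Rough sketch of "mix high, downsample once at the end."
# Track count, rates, and the pitch-shift stage are illustrative
# assumptions, not any real console's audio pipeline.
import numpy as np
from scipy.signal import resample_poly

FS_MIX = 96_000                       # internal mix rate
N_TRACKS = 100
rng = np.random.default_rng(0)
t = np.arange(FS_MIX) / FS_MIX        # one second at 96 kHz

mix = np.zeros_like(t)
for _ in range(N_TRACKS):
    tone = np.sin(2 * np.pi * rng.uniform(100.0, 8_000.0) * t)
    # Per-track processing (here, a pitch shift done by resampling) is
    # where the error products show up; at a 96 kHz mix rate most of
    # them land above the audible band.
    up = int(rng.integers(9, 12))                  # ~0.9x to 1.1x shift
    shifted = resample_poly(tone, up=up, down=10)
    shifted = np.pad(shifted, (0, max(0, len(t) - len(shifted))))[:len(t)]
    mix += shifted / N_TRACKS

# One anti-aliased downsample at the very end, which also filters out
# whatever junk accumulated between 24 kHz and 48 kHz.
out = resample_poly(mix, up=1, down=2)             # 96 kHz -> 48 kHz
[/code]

Point being, the final decimation filter does the cleanup for free; do everything per-track at 48kHz and that junk folds straight back into the audible band instead.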
Originally Posted by Jeremy Anderson
No, I really don't. It's easy to make that parallel now, but when you consider the time when the 360 was designed and released - a full year before PS3 was out or Blu-ray mattered - Dolby Digital was GENEROUS. That's especially true when you consider that the 360 didn't come with HDMI until a later revision, so it's not like you could add the support for PCM output with the later hardware revision for HDMI without completely breaking the encoding standard of the original product. They would have needed one HELL of a crystal ball to release the 360 with HDMI and 5.1 PCM support.
The PS3 had HDMI a year after the release of the 360. You really think they needed a crystal ball to see 12 months out? They managed to get HDMI video working at pixel-perfect 1080p, despite component supporting a max of 1080i. Even the internal video scaler in the 360 was able to support scaling to 1080p; they didn't have any trouble "breaking that standard." My guess is that they simply don't put a premium on audio quality; Dolby Digital was considered "good enough."
Originally Posted by Jeremy Anderson
HRTF comes with its own issues, because acoustic modeling aside, not everyone's head is the same. Older implementations in PC audio let you select from a few preset models, but... how much do you see HRTF being used right now? Now compare that to how much you see technology like Dolby Headphone being used for consoles to provide surround to headphone users. Seems pretty obvious why they're going that route.
I see HRTFs being used all the time... in Dolby Headphone. They're just doing it after the fact, and doing a worse job of it. The original Xbox supported on-board HRTFs, but both Microsoft and Sony decided to skip sophisticated sound processors on the current gen, opting to do most of the work on the CPUs. Before the days of Xbox Live and PSN you didn't see many people using headsets for gaming, but now they're all over the place. So now you've got the demand (tons of headset users), and MS is back to using a dedicated sound processor... and yet they didn't take the opportunity to build HRTFs in so games could present full-blown, high-quality 3D audio instead of mediocre channel virtualization. Why? My guess is again that they simply don't care about achieving the best final sound quality, because Dolby Headphone and the like are "good enough."
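For anyone wondering what full-blown HRTF rendering actually means, here's a toy sketch in the same Python/numpy vein. The HRIRs here are fabricated stand-ins (an interaural delay plus a crude head-shadow filter); a real renderer would convolve with measured responses like the MIT KEMAR or CIPIC sets:

[code]
# Toy sketch of HRTF-based binaural rendering: convolve each mono
# source with a left/right head-related impulse response (HRIR) for
# its direction. These HRIRs are fabricated stand-ins, not measured data.
import numpy as np
from scipy.signal import fftconvolve

FS = 48_000  # everything here is assumed to be at 48 kHz

def toy_hrir(azimuth_deg, ear):
    """Fake HRIR: positive azimuth = source to the listener's right."""
    itd = 0.0007 * np.sin(np.radians(azimuth_deg))   # up to ~0.7 ms ITD
    base = 0.001                                     # keep delays positive
    delay = base - itd / 2 if ear == "right" else base + itd / 2
    h = np.zeros(256)
    h[int(round(delay * FS))] = 1.0
    # Head shadow: smear and attenuate the ear facing away from the source.
    if (ear == "left") == (azimuth_deg > 0):
        h = np.convolve(h, np.ones(8) / 8.0)[:256] * 0.6
    return h

def render_binaural(sources):
    """sources: list of (mono_signal, azimuth_deg) pairs -> stereo array."""
    n = max(len(s) for s, _ in sources) + 256
    out = np.zeros((n, 2))
    for sig, az in sources:
        for ch, ear in enumerate(("left", "right")):
            y = fftconvolve(sig, toy_hrir(az, ear))
            out[:len(y), ch] += y
    return out

# Example: a short noise burst rendered 60 degrees to the right; the
# right channel leads and the left arrives later and duller.
burst = np.random.default_rng(1).standard_normal(4800) * np.hanning(4800)
stereo = render_binaural([(burst, 60.0)])
[/code]

Dolby Headphone does something conceptually similar, but only after the mix has already been collapsed down to 5.1/7.1 channel feeds, which is exactly the "after the fact" part.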
Just like I always say, the X1's graphics will still look great to most people despite being weaker than the PS4, and I'm sure the same will go for the audio. It's just that it's not a very aspirational product when it comes to A/V quality. Keep in mind where you're posting. :p