Originally Posted by Ryan1
I find it hard to believe that people still think that they can hear a difference between high-bitrate lossy and lossless, if all other things are equal (such as listening to the identical mix, volume matched).
I guess it's the same people who hear differences between 16 and 24 bits, and who wax poetic about "spacious, wide, detailed sound" from Pass Alephs and Krells compared to other properly performing amps, or can hear the difference between silver and copper speaker wire. There wouldn't be a whole "audiophile" industry if such people didn't exist.
Think about it: if in double-blind tests people cannot tell the difference between 192 kbps AAC and lossless, in stereo and in a controlled environment, why do you think they can tell the difference when multiple channels are involved, most carrying ambient sounds, in living rooms where the average noise floor is generally much higher than what you get with a good pair of headphones or in a studio?
It is entirely possible that sometimes the content provider has a different mix for streaming than on disc, or that imperceptible differences in amplitude affect one's preferences -- humans will perceive an otherwise identical track played at as little as 0.1 dB higher amplitude as "better."
This post is both:
A) Factually inaccurate. NUMEROUS examples exist of people telling the difference between 192 kbps lossy files and lossless files (see: McGill University, Subjective Evaluation of MP3 Compression for Different Musical Genres). The difference is stark. The lines only start to get blurry (law of diminishing returns) for most people when you get to 256 kbps.
B) Completely off-topic and mixing apples and oranges. The part that you may fail to understand while citing various references for 2-channel tests at xyz bitrate is that on the "Networking Media Servers and Content Streaming" forum the topic almost always being discussed is xyz bitrate being split not two but five (or more) ways. If my memory serves me right, Dolby Digital tops out at 448 kbps on DVD (640 kbps on Blu-ray), and that's split 5 (not 2) ways... and I believe Netflix uses a much lower-bitrate flavor of Dolby Digital for its streaming these days (and it sounds like it too, to those of us who aren't using soundbars). See the back-of-envelope numbers after this list.
C) Beyond that, and going back to 2-channel music (again, generally off-topic on this forum, but just to have the discussion), disc space is CAF these days, and I have no idea why so many people worry about/go out of their way justifying why they don't NEED the extra 1's and 0's. If your point is that sampling rates beyond the CD standard of 44.1 kHz are a great big waste of time for everyone in your house except your dog (44.1 kHz sampling already captures everything up to 22.05 kHz, past the edge of adult hearing), I don't disagree. But most of the time, when someone goes way out of their way to try to PROVE what others can't hear/are placebo'ing as better sound, it turns out they either don't own/use "better/higher-end equipment" (2-channel music played through a 3-foot-wide soundbar they bought for $200 at Best Buy vs. planar magnetic headphones connected to a class A amplifier), or they don't listen to much "well-mastered music that has a lot of dynamic range" (a lot of Red Hot Chili Peppers' Californication and not a lot of Norah Jones' Come Away With Me), or usually both.
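To put rough numbers on the apples-and-oranges point in B): the killer isn't the headline bitrate, it's what each channel is left with once you split it up. Here's a back-of-envelope sketch in Python; the totals are just illustrative assumptions, and real AC-3/DD+ encoders share bits dynamically between channels (and I'm ignoring the LFE channel), so treat this as a crude average only:

    # Crude per-channel bitrate estimate for a multichannel lossy stream.
    # Real encoders allocate bits dynamically across channels; this is only
    # an average, meant to show the scale of the problem.
    def per_channel_kbps(total_kbps, full_range_channels):
        return total_kbps / full_range_channels

    print(per_channel_kbps(640, 5))   # Blu-ray-style DD 5.1    -> 128 kbps per channel
    print(per_channel_kbps(448, 5))   # DVD-style DD 5.1        -> ~90 kbps per channel
    print(per_channel_kbps(192, 5))   # low-bitrate 5.1 stream  -> ~38 kbps per channel
    print(per_channel_kbps(384, 2))   # a middling stereo lossy -> 192 kbps per channel

The exact figures vary by encoder and service, but the point stands: spread a few hundred kbps across five-plus channels and each one gets far less than even the 192 kbps stereo files those 2-channel listening tests were built around.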
It blows my mind why people spend so much money (many thousands of dollars) on something like a TV or an audio system, and then *pucker* at the idea of (literally) spending a couple of dollars (at most) more on the Blu-ray or the CD compared to the HD digital download or MP3 digital download... ESPECIALLY when the Blu-ray and CD *come with* the Vudu digital redemption key and the MP3 AutoRip... seems so incredibly short-sighted to me.
I absolutely hear a difference between Dolby Digital 5.1 and Dolby TrueHD surround, and between 256 kbps MP3 and 16-bit/44.1 kHz uncompressed CD, on my gear, and I can prove it to myself and to anyone who wants to administer the A-B test on me at my house... I've done so many times.
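For anyone who wants to run that kind of test at their own place, the honest-scoring part is easy to script. Below is a minimal sketch, assuming level-matched sources and a second person doing the switching; the 16-trial count and the p < 0.05 cutoff are just the usual conventions, and none of the names here come from any particular tool:

    import random
    from math import comb

    # Minimal blind A/B scoring sketch. A proctor plays source A or B per the
    # generated order; the listener writes down one guess per trial; we then
    # check how unlikely that many correct guesses would be by pure chance.
    TRIALS = 16  # illustrative trial count

    def p_value(correct, trials):
        # One-sided binomial probability of getting >= `correct` right by guessing.
        return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

    playback = [random.choice("AB") for _ in range(TRIALS)]
    print("Proctor only -- playback order:", "".join(playback))

    guesses = input(f"Listener: enter your {TRIALS} guesses (e.g. ABBAB...): ").strip().upper()
    correct = sum(p == g for p, g in zip(playback, guesses))
    print(f"{correct}/{TRIALS} correct, p = {p_value(correct, TRIALS):.3f}")
    # Rule of thumb: p < 0.05 (12 or more correct out of 16) before claiming an audible difference.

It's not as slick as a proper ABX plugin that handles the switching and logging for you, but it keeps the math in front of you instead of relying on "I'm pretty sure I heard it."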