Originally Posted by A9X-308
The article was interesting, but rather than listening to the removed content, what is far more relevant to the average person, is can they tell the difference between 16/44 and 320kbps when listening to actual musical material? For most people, most of the time, that's no.
This is perhaps the more balanced view. Lossy compression is not a fixed set of conditions, because it involves perceptual coding that responds to changes in the input signal. It is, therefore, variable by nature. That means under some conditions it's going to leave tracks, under others none at all, and frankly everywhere in between. It's simply not a black/white question, not at 320kbps or most other bit rates. However, statistically, if it's acceptable or even transparent to most listeners most of the time, you have a win most of the time. Can you say it's always 100% transparent? Nope, but most of the time it doesn't matter. It's just not a yes/no, black/white, good/bad answer.
There are some parallels with visual compression, but they don't always hold. Up the thread a ways, Stereodude drew one: "By the same standard JPEG and MPEG compression throw away most of the visual information too and at much higher ratios, but in general I don't see people worked up about those lossy compression algorithms like they do with lossy audio compression algorithms." With video compression, most people never actually attempt compression of an uncompressed source, so you don't hear the griping much. But if you actually do that, you're in for a real rough ride.
You don't hear people getting as upset about video compression because a lot of the compression tweaks have been done for them, optimizing the results, and they never have access to the uncompressed original. However, speaking as someone who has authored a few DVDs, where visual compression must take place, I can tell you we often go to extreme lengths to coax the compressor into doing a more transparent job: dropping markers into the material at critical moments so the compressor works harder right there, carefully choosing maximum and minimum bit rates and then adjusting them manually scene by scene, applying various types of pre-processing. Then the compressor is allowed to make two or more passes through the material to fine-tune itself. All of that lets us create compressed content that looks largely transparent...so long as you don't have the uncompressed original to compare to, which most people don't. But if you had the original, the degradation is almost always detectable at some point.
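For the curious, that kind of workflow still exists outside dedicated DVD-authoring suites. Here's a minimal sketch of the two-pass, capped-bit-rate approach using ffmpeg and libx264 as a stand-in; the tool choice, file names, and rate numbers are my own illustrative assumptions, not what any particular authoring package does.

```shell
# Pass 1: analyze the whole program so the encoder can budget bits
# scene by scene instead of spending them evenly.
ffmpeg -y -i master.mov -c:v libx264 -b:v 5M -pass 1 -an -f null /dev/null

# Pass 2: encode within a min/max bit-rate window, analogous to setting
# the DVD compressor's minimum and maximum rates by hand.
ffmpeg -i master.mov -c:v libx264 -b:v 5M -minrate 2M -maxrate 8M \
       -bufsize 8M -pass 2 -c:a copy out.mp4
```

The first pass writes a log of how hard each stretch of video is to compress; the second pass reads it back and shifts bits toward the difficult scenes, which is exactly the "work harder on that moment" idea, just automated.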
With still images, cameras have all been pre-set for optimal image compression and only offer the user a few settings, each of which has already been optimized for its target resolution and file size on that camera. So the hard work has been done. But if you want to hear people gripe about compression, pop into the pro photo world. No .jpgs being shot there at all. None. All camera raw (uncompressed). Why? Because .jpg compression beats up your image. And for folk working in a world of grey scales, they're pretty black and white about that. For everyone else, the "most/most" concept works just fine, and they shoot nothing but .jpg images.
When you run audio through an .mp3 coder you have only a single bit rate to select for the entire file. You can't manually tweak it on the fly. You can choose either fixed or variable rate, but how variable it is, and how and when it varies, are pre-set and not optimizable. There may be a tiny fistful of other tweaks, but it's mostly a blanket codec: you just take what you get for a given bit rate, and there's not a lot you can do about it. And people rip their own uncompressed originals all the time, so it's an easy comparison to make. If you don't like what you get, your only real option is to up the bit rate or find a better codec.
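To make the contrast with the DVD workflow concrete, here's roughly what those few knobs look like with the LAME encoder as one example; file names are placeholders and LAME is just one coder, not the only one.

```shell
# Constant bit rate: one rate for the whole file, no per-passage control.
lame --cbr -b 320 input.wav output_cbr.mp3

# Variable bit rate: you pick a quality target (-V 0 is highest), but how
# and when the rate actually varies is decided by the encoder's built-in
# psychoacoustic model, not by you.
lame -V 0 input.wav output_vbr.mp3
```

That's essentially the whole menu: a rate or a quality target, take it or leave it, which is why the only escape hatch is a higher bit rate or a different codec.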
That's why I think you don't hear as much moaning about visual codecs, at least not in an audio forum.
Just because...I used to squash things into .mp3/320kbps files. Unfortunately, I got exposed to the "mp3 swish" decades before the .mp3 file was popular, and once you've heard it and recognized it, you become sensitized and hear it everywhere quite easily, even at high bit rates. So, even in my 320k files I could often hear the swish and warble. 128kbps is just agony. That got me to AAC, 320kbps, which is mostly artifact-free. And for me, that hit the most-of-the-time mark. I also do lossless (on the computer HDD), and that's where I've stayed, except for portable devices with tiny little brains that must have squashed files or they just won't hold enough media.