Originally Posted by David Susilo
The point is they don't anymore. Why? Because they can be sued for false advertising since both technically (scientifically) and legally, they are not visually transparent to Blu-ray. You are the only person who is adamant about that.
Not to mention the lack of lossless audio... which I already know you will soon say no human can hear the difference with. LOL!!!
Visually transparent to Blu-ray at what distance, on what material, and on what equipment?
Yes, 10 Mbps < 35 Mbps. But also 20 kHz < 100 kHz. I have seen old ads for amps that claimed to go higher than 20 kHz. So what, unless you are a bat?
Same with this debate. Again, you either see the pixels from where you sit, or you don't. If you do not, making the pixel density greater will not have a visible effect for you.
As manufacturing capabilities improve, going to 4k is an easy and cheap way for manufacturers to create hype and generate upgrade sales. Nobody has ever lost money by underestimating the public.
But if you do not see pixels sitting 10' from your 55" 2k screen now, you will gain no resolution benefit from replacing it with a 55" 4k screen.
You may well get picture quality improvements with the new set, such as better motion handling or better color rendition. But the increased resolution will have nothing to do with it, since your eyes couldn't resolve the 2k pixels in the first place.
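If you want to sanity-check that claim, here's a quick back-of-envelope calculation. My assumptions: the 55"/10' numbers above, a 16:9 panel, and the ~1 arcminute figure commonly quoted for 20/20 acuity.

```python
import math

# Assumed numbers: 55" 16:9 panel, 1920 horizontal pixels (2k), 10 ft viewing distance.
diagonal_in = 55.0
distance_in = 120.0          # 10 feet
h_pixels = 1920

width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~47.9" wide
pixel_pitch_in = width_in / h_pixels               # ~0.025" per pixel

# Angle one pixel subtends at the eye, in arcminutes
pixel_arcmin = math.degrees(pixel_pitch_in / distance_in) * 60

print(f"pixel pitch: {pixel_pitch_in:.4f} in")
print(f"angle per pixel: {pixel_arcmin:.2f} arcmin (20/20 limit is ~1 arcmin)")
```

That works out to roughly 0.7 arcminutes per pixel, already below the ~1 arcminute a 20/20 eye can resolve, so quadrupling the pixel count just pushes further below what you can see from that seat.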
For the vast majority of consumers, 120"+ screens placed 10' from the couch are not in the cards in the next year or two. That means 4k will have a great marketing impact, but minimal real-world benefit. It also means that whatever minor advantages physical media has today over streaming will likely be gone by the time we all have 150" 4k screens hanging on the wall.
As to lossless audio..., yep, most ABX tests I've seen show that for music, statistically reliable differences between lossless and lossy compression start to disappear at about 256kbps and pretty much completely disappear at 320kbps (for MP3). Of course, the results may vary depending on the material and codecs used, but at some point, say 320kbps, lossless and lossy cannot be reliably distinguished.
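By "reliably distinguished" I mean in the usual ABX-scoring sense: count the correct answers and check whether they beat guessing with a binomial test. A quick sketch, with made-up trial counts just to show the arithmetic:

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: probability of doing at least this well by pure guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical sessions of 16 ABX trials each
print(abx_p_value(12, 16))   # ~0.038 -> better than chance at the usual 5% level
print(abx_p_value(10, 16))   # ~0.227 -> statistically indistinguishable from guessing
```

Around 320kbps, most listeners in the tests I've seen land in that second camp.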
Also keep in mind that lab testing is generally done under controlled conditions, while normal home and theater experiences are often subject to ambient noise, varying light, unknown original material sources and so on.
I have always found it ironic that 18-year-olds with "perfect" hearing are happy listening to their iPods, but older guys who probably can't hear much above 8 kHz insist on lossless, or argue heatedly that they can hear differences between mainstream amps, or speaker wire, or power cords... Ideally, there should be a difference between a hobby and a religion.
For the record, I rarely use my Blu-ray player anymore, since it's too much hassle and Netflix is more than sufficient for me in terms of content, as well as quality (60" Kuro plasma viewed from about 9-10').
I have also pretty much stopped listening to my 2GB lossless audio collection, since I discovered MOG, which streams at 320kbps -- on my Gallo Solos I really cannot reliably tell the difference in the vast majority of cases (and in the few cases where I think I can, I bet dollars to donuts that the mastering has more to do with it than the compression).
For the most part, my physical media is in forgotten boxes in a couple of closets, together with my film camera.