AVS › AVS Forum › News Forum › Community News & Polls › Is the End of Physical Media Inevitable?

Is the End of Physical Media Inevitable? - Page 10

Poll Results: Is the End of Physical Media Inevitable?

 
  • 2% (23)
    Yes, physical media will quickly disappear altogether
  • 18% (198)
    Yes, physical media will slowly disappear altogether
  • 34% (369)
    No, new physical formats will continue to be developed
  • 44% (470)
    No, but physical media will become a niche market for enthusiasts
1060 Total Votes  
post #271 of 920
Quote:
Originally Posted by fritzi93 View Post

I should think that putting a number on "good enough" is just asking for an argument. It depends on too many external considerations. Nevertheless, anyone who has experience with x264 encoding knows that content varies in compressibility. A grainy source will need a ton more bitrate than, say, very clean animation.

As to HEVC, if it's true that it's approximately twice as efficient as AVC, then yeah, you're going to need roughly double Blu-Ray bitrates for 4K, assuming nothing else changes.

Some titles can compress visually perfectly to a lot less than 10mbps. I just know that HDX is visually the same as Blu-Ray, and it tops out around 10mbps. So really, the 38mbps they're claiming for HEVC is extremely generous; we should be able to see 4K video with HEVC compressed to around 18-20mbps if they do as good a job of tuning it as VUDU has done.
Quote:
Originally Posted by BartMan01 View Post

Which leaves 1/2 of the US out. There are still a large number of people with nothing close to broadband (some still on dial-up for at least one direction) outside of major population centers. Until there is 100% broadband coverage at affordable prices, there will always be someone needing physical media and someone willing to sell it to them.
Quote:
Originally Posted by David Susilo View Post

And that's just the USA. Don't forget the rest of the world. BiggAW seems to forget that the world exists outside USA too. It's cheaper for me to buy 2nd hand blu-ray discs than paying for internet to watch as little as 6 HDX-quality movies each month

Well, if you read what I wrote, you would know my point is that there are enough people who can get 50+mbps connections NOW that there is a big enough market IN THE US to start streaming 4K video, even if it's compressed at a high bitrate like 38mbps HEVC. Certainly there are a lot of people in the US and globally who have inadequate access to good broadband, and that is a major political issue in both Canada and the US which neither country's government seems to care much about fixing.

The lack of broadband at all (or slow DSL lines) is a valid argument for why physical media will still be around in 10 years, but it's not a valid argument as to why they won't start streaming 4K video soon. Heck, my connection could handle streaming 4K video with H.264, even if it was bloated up to 40-50mbps, and if it was compressed like VUDU HDX, which would work out to about 36mbps, I'd be able to run another two 1080p streams at the same time.
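The bitrate arithmetic traded back and forth in these posts can be sketched as back-of-the-envelope math. Every number here is one of the posters' assumptions (the 10mbps HDX ceiling, a 4x pixel count for 4K, a 2x HEVC-over-AVC saving), not a measured figure:

```python
# Illustrative only: scaling factors are the thread's claims, not measurements.
HDX_1080P_AVC = 10.0   # Mbps, claimed VUDU HDX ceiling for 1080p AVC
PIXEL_RATIO_4K = 4.0   # (3840*2160) / (1920*1080)
HEVC_SAVING = 2.0      # assumed "twice as efficient as AVC"

# Naive linear scaling: 4K AVC at HDX-like quality
uhd_avc = HDX_1080P_AVC * PIXEL_RATIO_4K   # 40 Mbps
# Halve it under the HEVC assumption
uhd_hevc = uhd_avc / HEVC_SAVING           # 20 Mbps

print(f"4K AVC  (HDX-like quality): {uhd_avc:.0f} Mbps")
print(f"4K HEVC (HDX-like quality): {uhd_hevc:.0f} Mbps")

# A 50 Mbps line could then carry the 4K stream plus two 1080p HDX
# streams: 20 + 2*10 = 40 Mbps, with headroom to spare.
budget = uhd_hevc + 2 * HDX_1080P_AVC
print(f"Total for 4K + two 1080p streams: {budget:.0f} Mbps")
```

Under these assumptions the "18-20mbps" figure above falls out directly, and the claimed 38mbps looks conservative; in practice, bitrate does not scale linearly with pixel count, so real encoders may need less.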
post #272 of 920
Once again, you keep pushing your subjective observation that HDX is visually transparent to Blu-ray. Give it up; it's subjective, and there is no scientific fact behind it.
post #273 of 920
Quote:
Originally Posted by David Susilo View Post

Once again, you keep pushing your subjective observation that HDX is visually transparent to Blu-ray. Give it up; it's subjective, and there is no scientific fact behind it.

Nor is there any scientific fact behind the theory that a human can perceive the difference between HDX and Blu-ray, or that most TVs can even reproduce it. So until someone does a double-blind study, I guess we'll have to leave it at visually equivalent, as no one has proven VUDU wrong.
post #274 of 920
Scientific fact: 10 mbps < 35 mbps. If studios could get away with a max bitrate of only 10 mbps, why would they even bother with a 50 GB disc?

Even Vudu does NOT claim their HDX to be equal to Blu-ray.
post #275 of 920
PS: imagic has already proven that HDX is NOT equivalent to Blu-ray.
post #276 of 920
Looks like I will have to admit I was wrong...local techs tell me 100 Mbps (Megabits) DSL will be up and running by September. Currently running tests at the local university. Also, no more copper wires either above or below ground anymore. It's all fibre from here on out.
post #277 of 920
Quote:
Originally Posted by David Susilo View Post

Scientific fact: 10 mbps < 35 mbps. If studios could get away with a max bitrate of only 10 mbps, why would they even bother with a 50 GB disc?

Even Vudu does NOT claim their HDX to be equal to Blu-ray.

They did at one point. Yes, there is more data on a Blu-Ray. That doesn't mean a human can see it.
post #278 of 920
The video may be getting close on HDX, but I can still see advantages with the Blu-ray version. Take a look at the Skyfall Blu-ray; it was very impressive, and we should be so lucky that they all look that good. What about HDX audio? It's not bad, but there is no comparison with DTS-HD Master Audio. Getting closer, but no cigar yet.
post #279 of 920
Not that I thought that was the best Bond movie, but you have to admit the video quality was really good.
post #280 of 920
Quote:
Originally Posted by BiggAW View Post

They did at one point. Yes, there is more data on a Blu-Ray. That doesn't mean a human can see it.

The point is they don't anymore. Why? Because they could be sued for false advertising, since both technically (scientifically) and legally, they are not visually transparent to Blu-ray. You are the only person who is adamant about that.

Not to mention the lack of lossless audio... which I already know you will soon claim no human can hear the difference with. LOL!!!
post #281 of 920
Quote:
Originally Posted by David Susilo View Post

The point is they don't anymore. Why? Because they could be sued for false advertising, since both technically (scientifically) and legally, they are not visually transparent to Blu-ray. You are the only person who is adamant about that.

Not to mention the lack of lossless audio... which I already know you will soon claim no human can hear the difference with. LOL!!!

Visually transparent to blu-ray at what distance, on what material and on what equipment?

Yes, 10 mbps < 35 mbps. But also 20kHz < 100kHz. I have seen old ads for amps which claimed that they go higher than 20kHz. So what, unless you are a bat?

Same with this debate. Again, you either see the pixels from where you sit, or you don't. If you do not, making the pixel density greater will not have a visible effect for you.

As manufacturing capabilities improve, going to 4k is an easy and cheap way for manufacturers to create hype and generate upgrade sales. Nobody has ever lost money by underestimating the public.

But, if you do not see pixels sitting 10' from your 55" 2k screen now, you will gain no resolution benefit from replacing it with a 55" 4k screen.

You may well get picture quality improvements with the new set, such as better motion handling, or better color rendition, etc.. But the increased resolution will have nothing to do with it, since your eyes couldn't resolve the 2k pixels in the first place.

For the vast majority of consumers, 120"+ screens placed 10' from the couch are not in the cards in the next year or two. Which means that 4k will have a great marketing impact, but minimal real world benefits. It also means that whatever minor advantages physical media has today over streaming, will likely be gone by the time we all have 150" 4k screens hanging on the wall.

As to lossless audio... yep, most ABX tests I've seen show that for music, statistically reliable identification of differences between lossless and lossy compression starts to disappear at about 256kbps and pretty much completely disappears at 320kbps (for MP3). Of course, results may vary depending on the material and codecs used, but at some point, say 320kbps, lossless and lossy cannot be reliably distinguished.

Also keep in mind that lab testing is generally done under controlled conditions, while normal home and theater experiences are often subject to ambient noise, varying light, unknown original material sources and so on.

I have always found it ironic that 18 year-olds with "perfect" hearing are happy listening to their iPods, but older guys who probably can't hear much above 8kHz insist on lossless, or argue heatedly that they can hear differences between mainstream amps, or speaker wire, or power cords.... Ideally, there should be a difference between a hobby and religion.

For the record, I rarely use my BR anymore, since it's too much hassle and Netflix is more than sufficient for me in terms of content, as well as quality (60" Kuro plasma viewed from about 9-10').

I have also pretty much stopped listening to my 2GB lossless audio collection since I discovered MOG, which streams at 320kbps -- on my Gallo Solos I really cannot reliably tell the difference in the vast majority of cases (and in the few cases where I think I can, I bet dollars to donuts that the mastering has more to do with it than the compression).

For the most part, my physical media is in forgotten boxes in a couple of closets, together with my film camera.
Edited by Ryan1 - 4/19/13 at 6:21pm
post #282 of 920
You gotta love the theory in that, but don't tell me I can't pick out an MP3 versus a well-mastered disc with 3-4 times the bitrate. Just because 18-year-olds listening to lossy downloads over their earbuds can't tell the difference in formats doesn't mean I can't through my humble but very revealing setup. As far as seeing the difference between 4K and 2K: if people who are in the field and have nothing to gain tell me there's a difference, then I'd believe them over someone who probably hasn't experienced it.
post #283 of 920
That's exactly the thing. Most 4K naysayers unfortunately either never had the experience or, worse, don't even bother trying to experience it because they're blinded by theories and hearsay from other naysayers who have never experienced 4K.

As for me? I had the chance to live with the Sony 4K projector for almost 6 months and just want to set the record straight. I have absolutely nothing to gain since I'm neither in retail nor in marketing.
post #284 of 920
Quote:
Originally Posted by comfynumb View Post

Not that I thought that was the best Bond movie, but you have to admit the video quality was really good.

It's hard to judge video quality when the content is so horrible. That series really went downhill after The World Is Not Enough.
Quote:
Originally Posted by David Susilo View Post

The point is they don't anymore. Why? Because they could be sued for false advertising, since both technically (scientifically) and legally, they are not visually transparent to Blu-ray. You are the only person who is adamant about that.

Not to mention the lack of lossless audio... which I already know you will soon claim no human can hear the difference with. LOL!!!

They can't be sued over visual equivalence, as they have visual equivalence. They just have to be careful with wording and not say "same quality as Blu-Ray", but something like "offers the same experience as Blu-Ray" or "you see the same quality as Blu-Ray" or something like that. What sort of audio bitrate are they using? As we know from computer audio, once you get past about 128kbps AAC per channel (so 256kbps for stereo music), there is absolutely no audible difference. And there is probably no audible difference at 96kbps per channel AAC (192kbps stereo).
Quote:
Originally Posted by Ryan1 View Post

Visually transparent to blu-ray at what distance, on what material and on what equipment?

Yes, 10 mbps < 35 mbps. But also 20kHz < 100kHz. I have seen old ads for amps which claimed that they go higher than 20kHz. So what, unless you are a bat?

Same with this debate. Again, you either see the pixels from where you sit, or you don't. If you do not, making the pixel density greater will not have a visible effect for you.

As manufacturing capabilities improve, going to 4k is an easy and cheap way for manufacturers to create hype and generate upgrade sales. Nobody has ever lost money by underestimating the public.

But, if you do not see pixels sitting 10' from your 55" 2k screen now, you will gain no resolution benefit from replacing it with a 55" 4k screen.

You may well get picture quality improvements with the new set, such as better motion handling, or better color rendition, etc.. But the increased resolution will have nothing to do with it, since your eyes couldn't resolve the 2k pixels in the first place.

For the vast majority of consumers, 120"+ screens placed 10' from the couch are not in the cards in the next year or two. Which means that 4k will have a great marketing impact, but minimal real world benefits. It also means that whatever minor advantages physical media has today over streaming, will likely be gone by the time we all have 150" 4k screens hanging on the wall.

As to lossless audio... yep, most ABX tests I've seen show that for music, statistically reliable identification of differences between lossless and lossy compression starts to disappear at about 256kbps and pretty much completely disappears at 320kbps (for MP3). Of course, results may vary depending on the material and codecs used, but at some point, say 320kbps, lossless and lossy cannot be reliably distinguished.

Also keep in mind that lab testing is generally done under controlled conditions, while normal home and theater experiences are often subject to ambient noise, varying light, unknown original material sources and so on.

I have always found it ironic that 18 year-olds with "perfect" hearing are happy listening to their iPods, but older guys who probably can't hear much above 8kHz insist on lossless, or argue heatedly that they can hear differences between mainstream amps, or speaker wire, or power cords.... Ideally, there should be a difference between a hobby and religion.

For the record, I rarely use my BR anymore, since it's too much hassle and Netflix is more than sufficient for me in terms of content, as well as quality (60" Kuro plasma viewed from about 9-10').

I have also pretty much stopped listening to my 2GB lossless audio collection since I discovered MOG, which streams at 320kbps -- on my Gallo Solos I really cannot reliably tell the difference in the vast majority of cases (and in the few cases where I think I can, I bet dollars to donuts that the mastering has more to do with it than the compression).

For the most part, my physical media is in forgotten boxes in a couple of closets, together with my film camera.

Yup.
Quote:
Originally Posted by David Susilo View Post

That's exactly the thing. Most 4K naysayers unfortunately either never had the experience or, worse, don't even bother trying to experience it because they're blinded by theories and hearsay from other naysayers who have never experienced 4K.

As for me? I had the chance to live with the Sony 4K projector for almost 6 months and just want to set the record straight. I have absolutely nothing to gain since I'm neither in retail nor in marketing.

Well if you put it on a 10' screen and sit 5' away from it, of course you are going to see the difference. That's not a normal home viewing use case, however. Most people don't build mini IMAX theaters in their basements.
post #285 of 920
Are you daft or something? In multiple posts I've written that even on the 85" area of my screen viewed from 9'-10' away, I and a whole bunch of people (including people from Technicolor and THX) can clearly see the difference between 2K and 4K.

I've experienced 4K; you haven't. Who has more authority on what works and what doesn't?
post #286 of 920
Quote:
Originally Posted by BiggAW View Post

They did at one point. Yes, there is more data on a Blu-Ray. That doesn't mean a human can see it.

And yet you keep saying that so many people can get streaming 4K. If people can't tell the difference between Blu-ray and your HDX compression, then why bother, right?

I am not jumping onto 4K for my theater room for a long time. I just don't have the finances to do it, plus my current projector (last year's JVC RS45) only has 500 hours on it, but I can tell a big difference between any blu-ray vs. alternative compression that I've ever seen. Maybe not every scene, but at some point there is a little blocking, softness, blur, grain, or whatever that isn't there on the blu-ray.

Now, I used to convert a lot of my DVDs to .AVI so that I could travel with my movie library on my laptop without taking a bunch of discs, and I thought that a 1.4GB file was a perfect reproduction of the image. Of course it looked great on a 15.6" laptop, but even at the time, if I played them back on my 104" screen, I could see the difference. Now at 138", with anything short of the physical Blu-ray disc or a 1:1 copy of that disc (which I don't do, since I don't have a media server set up), I can tell the difference pretty easily.

I do love my lossless audio too.

I think the big problem is that we live in a world that generally does not care about quality, just about things being (perceived as) easy. Being able to stream some movie with a couple of clicks on a remote sure is enticing for people who don't plan far enough ahead to rent a disc and pick it up at their closest RedBox (yeah, I know, some are stripped-down discs), or who like that they don't have to actually own anything, or whose Netflix queue didn't have the movie they wanted to watch already sent to them. Click, click, click and there is the movie.

Then again, look at music. I remember reading a survey of young adults aged something like 16-28 about which format has the best music quality, and the answers leaned very heavily toward .MP3s. This was years ago, when quality varied across the board, and the whole point of an .MP3 is that you are stripping nuances out of an audio track (usually starting from a CD, itself mixed down from a studio master). The perception was that the latest, easiest-to-share format had the best quality. It wasn't true. The article went on to do blind tests where the participants ended up either not being able to tell the difference or picking the CD as better when they didn't know the source. The speculation at the time was one of psychology and our human desire to have easy at the expense of better. There were other theories, but I just don't remember them.

Just remember this simple fact: any time you lossily compress something, you are stripping out data that the algorithm believes you cannot see (or hear). I would argue that on any quality system (not even the super-expensive stuff) the average person could tell in a close side-by-side inspection. Again, there are exceptions. If you have a small display, or just stereo speakers, then you would be limited in your ability to see the benefits, but even I, with my very modest system compared to many here on AVS, can tell the difference.
post #287 of 920
Quote:
Originally Posted by David Susilo View Post

Are you daft or something? In multiple posts I've written that even on the 85" area of my screen viewed from 9'-10' away, I and a whole bunch of people (including people from Technicolor and THX) can clearly see the difference between 2K and 4K.

I've experienced 4K; you haven't. Who has more authority on what works and what doesn't?

What were you using for a 2k source? I don't believe that at that size and distance, you could tell the difference between "mastered in 4k" that's actually 1080p on Blu-Ray/HDX and actual 4k video.
post #288 of 920
I use a video server from Sony preloaded with movies, demos, trailers, and slideshows.
post #289 of 920
Quote:
Originally Posted by David Susilo View Post

Are you daft or something? In multiple posts I've written that even on the 85" area of my screen viewed from 9'-10' away, I and a whole bunch of people (including people from Technicolor and THX) can clearly see the difference between 2K and 4K.

I am just curious: what exactly is the clearly visible difference?

In other words, are you claiming that on your 85" screen you can resolve 2k pixels at 10'?

If so, this would make you highly unusual, but it is not impossible. It would, however, be very, very, very unusual for everyone in a larger group of people to be able to resolve 2k pixels at such a distance.

But, you have not mentioned pixels, as far as I have read your posts. You simply claim that you see a difference. If you are speaking of an overall picture quality difference then in all likelihood it has nothing to do with the density of the pixels, but with other factors.
Quote:
Originally Posted by David Susilo View Post

I've experienced 4K, you haven't. Who got better authority in what works and what doesn't?

I suppose this is what the Flat-Earth Society members would say: Look! It looks flat and I am not falling off, so I know what I am talking about!

Without a double-blind test, such statements are at best misleading, particularly since you are not even claiming "ultra-vision." You may in fact be seeing all sorts of differences, but they will likely have nothing to do with resolution. You are most likely attributing them to 4k simply because you know that "it must be better."

It's the same with lossless: under controlled conditions people cannot reliably hear a difference between relatively high bitrate compressed music (for MP3 it's about 256kbps to 320kbps) and lossless.

The resolution charts are valid, since they are generally developed by testing subjects under controlled conditions.

As to streaming vs BR, there are a whole bunch of threads here comparing freeze-frames of various movies: most show some visible differences, but they are unlikely to be noticeable unless the viewer is pixel-peeping at freeze-frames. Seriously, while streaming Downton Abbey from Netflix (ATV on a 60" screen viewed from 9'), not once did I have the urge to rush out and grab a BR because of poor PQ...
post #290 of 920
Like I said in an earlier post, before you go and argue audio/video with someone, you should at least try to look into their background. My recommendation would be not to argue with someone who is THX-certified, unless you are yourself. Just saying.
post #291 of 920
I give up. It is literally like me claiming that the earth is round to the flat-earth theorists way back when.

All I can say (again) is: live with 4K yourself, then feel free to give as much negative opinion as you want, but not before living with 4K, or at least before spending a couple of hours with it.
post #292 of 920
Quote:
Originally Posted by David Susilo View Post

I give up. It is literally like me claiming that the earth is round to the flat-earth theorists way back when.

All I can say (again) is: live with 4K yourself, then feel free to give as much negative opinion as you want, but not before living with 4K, or at least before spending a couple of hours with it.

It's really not a difficult question to answer: were you seeing pixels when viewing 1080p from the same distance, or not?

If you were not seeing the individual pixels from 10 feet, why would doubling the pixel count make the picture look better?

There is no magic in this, but you guys keep talking around the issue.

Human ability to resolve detail is fairly well understood, so pardon me if I do not care about statements which do not make sense, be it "personal experience" with 4k, or with a seance.

If comfynumb wants expert opinions which actually make sense within current understanding of visual acuity, here is an excerpt from Dr. Raymond M. Soneira, the President of DisplayMate Technologies Corporation:
Quote:
1920x1080 HDTVs: On the other hand, the average viewing distance for living room HDTVs in America is around 7 to 10 feet, depending on the screen size. So to appear "perfectly" sharp with 20/20 Vision like the iPhone 4 Retina Display, HDTVs only need a proportionally much lower PPI in order to achieve "Retina Display" status and have the HDTV appear "perfectly" sharp and at the visual acuity limit of your eyes.

Existing 40 inch 1920x1080 HDTV is a "Retina Display" when viewed from 5.2 feet or more
Existing 50 inch 1920x1080 HDTV is a "Retina Display" when viewed from 6.5 feet or more
Existing 60 inch 1920x1080 HDTV is a "Retina Display" when viewed from 7.8 feet or more

Since the typical HDTV viewing distances are larger than the minimum distances listed above, the HDTVs appear "perfectly" sharp and at the visual acuity limit of your eyes. At the viewing distances listed above the pixels on a 1920x1080 HDTV will not be visible by a person with 20/20 Vision in exactly the same way as the Retina Displays on the iPhone 4, new iPad 3, and MacBook Pro at their viewing distances. So existing 1920x1080 HDTVs are "Retina Displays" in exactly the same way as the existing Apple Retina Display products. If the HDTVs had a higher PPI or a higher pixel resolution your eyes wouldn't be able to see the difference at their proper viewing distances.....

4K HDTVs and Projectors: Some manufacturers are introducing HDTVs with resolutions that are at least double the existing standard 1920x1080 resolution - 3840x2160 or more. They are often called 4K displays. Some reviewers have already claimed dramatically improved picture quality and sharpness - but that is impossible unless they have significantly better than 20/20 Vision or are watching from an absurdly close viewing distance. However, the higher resolutions are important for Digital Cinematography and cinema projectors that have large 10 foot or more screens.
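Soneira's distance figures follow from simple trigonometry: 20/20 vision resolves roughly 1 arcminute, so a pixel stops being distinguishable once its pitch subtends less than that angle. A minimal sketch, assuming the standard 1-arcminute criterion and 16:9 panels, reproduces the quoted table:

```python
import math

ARCMIN = math.radians(1 / 60)  # 20/20 acuity: ~1 arcminute per pixel

def retina_distance_ft(diagonal_in, h_pixels=1920, aspect=(16, 9)):
    """Distance in feet beyond which individual pixels subtend less
    than 1 arcminute, i.e. become indistinguishable to 20/20 vision."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width
    pixel_pitch = width_in / h_pixels               # inches per pixel
    return pixel_pitch / math.tan(ARCMIN) / 12      # inches -> feet

for size in (40, 50, 60):
    print(f'{size}" 1080p is "retina" beyond {retina_distance_ft(size):.1f} ft')
# Matches the quoted 5.2 / 6.5 / 7.8 ft figures.
```

The same function with `h_pixels=3840` shows why 4K only pays off up close or on very large screens: the threshold distance halves, so the viewer must sit half as far away before the extra pixels become resolvable.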
post #293 of 920
Of course I can see the pixel structure of the 85" 16:9 area from my 9'-10' seating distance! I hardly know anybody who can't. Even within my family and relatives, of 14 people (myself included), 13 can see the pixel structure right away. My clients (ranging from an ophthalmologist to TV anchors, post-production engineers, lawyers, the RCMP -- Canada's FBI equivalent -- and printing company owners) and my friends (people who work for Technicolor, Pinewood Studios Toronto, etc.) can all see the pixel structure of 2K, and when I switch to the 4K projector, the improvement is noticeable.

But soon enough, I know, one of the flat-earth believers will claim that my claim is impossible, that I'm lying, etc. Hence this is my final post in this thread. The world is round, guys; accept it.
post #294 of 920
Quote:
Originally Posted by Ryan1 View Post

It's really not a difficult question to answer: were you seeing pixels when viewing 1080p from the same distance, or not?

If you were not seeing the individual pixels from 10 feet, why would doubling the pixel count make the picture look better?

There is no magic in this, but you guys keep talking around the issue.

Human ability to resolve detail is fairly well understood, so pardon me if I do not care about statements which do not make sense, be it "personal experience" with 4k, or with a seance.

If comfynumb wants expert opinions which actually make sense within current understanding of visual acuity, here is an excerpt from Dr. Raymond M. Soneira, the President of DisplayMate Technologies Corporation:

Exactly. What the industry needs to work on is getting the quality of 1080i/720p content up to par, not this 4k stuff.
post #295 of 920
Quote:
Originally Posted by BiggAW View Post

Exactly. What the industry needs to work on is getting the quality of 1080i/720p content up to par, not this 4k stuff.

What a hypocrite! Up to par with what? You're happy with a max bitrate of 10mbps!!
post #296 of 920
Quote:
Originally Posted by David Susilo View Post

Of course I can see the pixel structure of the 85" 16:9 area from my 9'-10' seating distance! I hardly know anybody who can't. Even within my family and relatives, of 14 people (myself included), 13 can see the pixel structure right away. My clients (ranging from an ophthalmologist to TV anchors, post-production engineers, lawyers, the RCMP -- Canada's FBI equivalent -- and printing company owners) and my friends (people who work for Technicolor, Pinewood Studios Toronto, etc.) can all see the pixel structure of 2K, and when I switch to the 4K projector, the improvement is noticeable.

But soon enough, I know, one of the flat-earth believers will claim that my claim is impossible, that I'm lying, etc. Hence this is my final post in this thread. The world is round, guys; accept it.

Statistically, this makes no sense.

Even if you personally have 20/10 vision, this would put you in a distinct minority (I will remain polite and not ask you how old you are, or if your vision has been recently tested).

That most of your friends are all eagle-eyed is even more statistically implausible. And lawyers, really?! Most lawyers are blind as bats from life-long eye strain.

But let us give you the benefit of the doubt on the eagle-eye bit and assume that you and your friends are all better than "normal", say at approximately 20/15. According to a standard visual acuity calculator, you still need to sit about 8 feet or so from your 85" screen to get the full benefit of 4k (and this, mind you, is with a perfectly illuminated, high-contrast still image). At 10 feet, your screen needs to be closer to 105" to get the benefit of 4k resolution (for the same perfectly illuminated, high-contrast still image).

BTW, if you are like the rest of us with "perfect" 20/20 vision, you need to sit within 6.3 feet of your 85" screen, or, at 10 feet, you need a screen of just about 140" to get the full benefit of 4k resolution.

Since the glass is cheap enough, 4k will be the standard in a couple of years, I agree. But unless screen size increases correspondingly, 4k by itself will be as beneficial in most homes as "by the pound" 1000wpc separates and helium-filled cables.

And as far as max bitrates, the upcoming H.265 video encoding standard is designed to provide the same perceived quality as H.264 at half the bitrate. So 10mbps should be just fine for those of us with "normal" vision.

Cheers.
Edited by Ryan1 - 4/21/13 at 1:10am
post #297 of 920
Quote:
Originally Posted by David Susilo View Post

What a hypocrite! Up to par with what? You're happy with a max bitrate of 10mbps!!

The gold standard of 19mbps MPEG-2 for 1080i/720p, or the MPEG-4 AVC equivalent of about 9mbps. That's the range where human eyes actually can tell the difference. I am certainly not a hypocrite for wanting to bring the quality of other sources up. Today's HDTV signals, encoded in real time, are equivalent to maybe 6 or 7mbps MPEG-4 AVC, and if they were encoded offline would probably be 5 or less. I've seen offline-encoded MPEG-4 720p at 4mbps that looks better than most cable HD channels. 10mbps MPEG-4 AVC is the realm of VUDU HDX in 1080p, where you're getting visual equivalency with Blu-Ray.
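The codec equivalences quoted throughout the thread all hinge on a rough rule of thumb: each codec generation delivers similar perceived quality at about half the bitrate of the last (MPEG-2 to AVC, AVC to HEVC). Treating each step as exactly a 2x saving (an approximation, not a measured figure) reproduces the numbers posters are using:

```python
# Rule-of-thumb efficiency relative to MPEG-2; the 2x-per-generation
# factors are the thread's assumption, not benchmark results.
EFFICIENCY = {"MPEG-2": 1.0, "AVC": 2.0, "HEVC": 4.0}

def equivalent_bitrate(mbps, src, dst):
    """Bitrate in codec `dst` giving roughly the same quality as
    `mbps` in codec `src`, under the 2x-per-generation assumption."""
    return mbps * EFFICIENCY[src] / EFFICIENCY[dst]

# 19 Mbps broadcast MPEG-2 ~ 9.5 Mbps AVC (the "about 9mbps" above)
print(equivalent_bitrate(19, "MPEG-2", "AVC"))
# 10 Mbps AVC (the claimed HDX ceiling) ~ 5 Mbps HEVC
print(equivalent_bitrate(10, "AVC", "HEVC"))
```

Real encoder gains depend heavily on content and settings, so these are order-of-magnitude estimates rather than guarantees.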
post #298 of 920
Personally, physical media has been completely dead to me for about 5 years, and the shift really started about 10 years ago. The only physical media I spend money on is vinyl, or albums from local bands that you can only get at the show. Other than that, I can stream or download any video I want to see in less time than it takes me to go to a rental location and come home. Internet caps will have to be raised, and Google Fiber is going to force our luddite ISPs into stepping up their game. Once Google Fiber is available near me, downloading a 4K movie will be trivial. Traditional TV is already on its way out as it is, so streaming and downloading are the natural progression.
post #299 of 920
As many have said, streaming quality is poor at best. REALLY DISAPPOINTING. It's great for Dora the Explorer for the kids, but for main movie watching it's FAR, FAR behind.
post #300 of 920
Of course physical media will disappear, but not for at least another 10 years.