
Registered · 2,428 Posts · Discussion Starter · #1
Simple question - is watching a Blu-ray movie (audio & video) in 1080i superior to viewing the exact same movie in 1080i with DD audio on DirecTV PPV? DirecTV now releases new movies on the same day the DVD hits the retail market, and before they're released by Netflix, Blockbuster, etc.


I welcome all comments.
 

Banned · 365 Posts
Let's say you watch Dexter on Showtime OnDemand. Its bitrate will be about 6.75 Mbps. On the Netflix Blu-ray, it's roughly 29 Mbps.

D* tops out around 13 Mbps, while Blu-ray can go up to 40 Mbps. If a bit of blocking or a few dropped frames don't bother you, go for VOD/PPV. You already know what you're getting.
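To put those numbers in rough perspective, here's a quick back-of-the-envelope sketch (just the approximate bitrates above, with a two-hour movie assumed):

```python
# Back-of-the-envelope data-per-movie comparison, using the rough bitrates
# quoted above (video only) and assuming a two-hour runtime.
runtime_s = 2 * 60 * 60  # two hours, in seconds

bitrates_mbps = {
    "Showtime OnDemand": 6.75,
    "DirecTV HD (max)": 13.0,
    "Blu-ray (typical)": 29.0,
    "Blu-ray (max video)": 40.0,
}

for source, mbps in bitrates_mbps.items():
    gigabytes = mbps * runtime_s / 8 / 1000  # megabits -> gigabytes (decimal)
    print(f"{source:20s} ~{gigabytes:5.1f} GB per movie")
```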
 

Banned · 20,735 Posts
I have yet to see anything on broadcast, digital cable, satellite, or VOD that comes anywhere close to matching the video quality offered on any decent blu-ray.


That may change in the future, but at this point there remain fairly significant differences.


How significant those differences are for you depends on how critical you are, the quality of your video system, and the viewing ratio you're watching at.


You could easily view both, see how they differ, and judge for yourself how much it matters to you. For some people it matters a lot, and for others it matters not at all.


Certainly there is also a convenience advantage with VOD in not having to deal with discs.
 

Registered · 2,428 Posts · Discussion Starter · #4
Initially I, among many others, went the HD-DVD route and loved both the audio and video quality. Since I am finally about to get a quality BR player, it is refreshing to learn I can expect a degree of superior quality over DTV.


Thanks for your help.
 

Banned · 20,735 Posts

Quote:
Originally Posted by RWetmore /forum/post/19599782


Chris,


This is a little off topic, but what is your view or opinion about 10 bit color? Do you think it's necessary or is 8 bit transparent? Any blind studies you know that have been done?

Well, hopefully not actually blinded!



It depends. I think that 8-bit content is fully capable of being sufficient without visible banding or visible bit-depth issues, but it comes with some serious caveats.


First is that we are currently viewing on displays with relatively limited dynamic range. And at this point, gamma-corrected 8-bit, when handled properly, is IMO sufficient. It is not sufficient when you are dealing with displays of much higher dynamic range, where it no longer provides enough steps. But we aren't there with current displays and viewing.


Second is the "when handled properly" caveat. 8-bit is delivered nonlinearly, and a lot of displays don't have enough bit-depth in linear space even to keep up with 8-bit content. So you need to be able to de-gamma into basically 12-bit linear just to match gamma-corrected 8-bit. A great deal of the banding we see today is because a lot of displays, particularly affordable displays, don't have enough bit-depth and add banding of their own, regardless of how good the 8-bit source looks.
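To put rough numbers on that (my own toy illustration, assuming a plain 2.2 power-law gamma rather than any particular display's actual transfer function):

```python
# Rough sketch of the "de-gamma into ~12-bit linear" point above: near black,
# adjacent 8-bit gamma-coded values correspond to tiny differences in linear
# light, so a linear pipeline needs far more than 8 bits to keep them distinct.
# Assumes a simple power-law gamma of 2.2; real transfer functions differ.
import math

GAMMA = 2.2

def to_linear(code, bits=8):
    """Decode an integer video code to linear light on a 0..1 scale."""
    return (code / (2 ** bits - 1)) ** GAMMA

for code in (1, 8, 16, 32, 64, 128, 254):
    step = to_linear(code + 1) - to_linear(code)
    # Smallest uniform (linear) quantizer that still resolves this step:
    bits_needed = math.ceil(math.log2(1.0 / step))
    print(f"code {code:3d}: linear step {step:.2e} -> ~{bits_needed} linear bits")
```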


The last part is what happens along an 8-bit chain. I consider 8-bit sufficient, but barely sufficient. Anything less and you run into problems. Any kind of remapping, gamma tweaks, or adjustment of black and white points or ranges, such as moving between video and graphics levels, has to be done in more than 8-bit. And what happens frequently in the digital domain, particularly in the home, is that you take 8-bit content and move things around in 8-bit, or maybe you do it in 10 or 12, but the output is only 8-bit, so practically it's the same thing: you've got a bottleneck. So 8-bit is great if it starts great and then basically isn't touched again. But there are a zillion reasons why stuff gets messed with after it's already been turned into 8-bit, and it gets messed with in a way that leaves you lots of banding. Just look at anything on cable/satellite/broadcast: you see banding all over the place. Commercials are often the worst. So I would prefer to have 10-bit content, not because I believe it would actually look any different when handled properly, but because of how much more resilient it is to being handled improperly.
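As a toy example of that kind of bottleneck (just an illustrative levels round trip of my own, nothing specific to any particular player or display):

```python
# Toy illustration of the 8-bit bottleneck described above: squeezing a
# full-range ramp down to video levels and back entirely in 8-bit integers
# leaves missing codes (banding), while carrying the intermediate result at
# higher precision and rounding once at the end does not.
import numpy as np

ramp = np.arange(0, 256, dtype=np.float64)      # smooth full-range gradient

# Step-by-step in 8-bit: full range -> video levels (16-235) -> back.
video_8bit = np.round(ramp * 219 / 255 + 16)                          # re-quantized to 8-bit here
back_8bit = np.clip(np.round((video_8bit - 16) * 255 / 219), 0, 255)

# Same two operations carried in floating point, quantized only once at the end.
back_float = np.clip(np.round((ramp * 219 / 255 + 16 - 16) * 255 / 219), 0, 255)

print("unique output codes, 8-bit chain:", len(np.unique(back_8bit)))   # ~220 -> gaps, visible steps
print("unique output codes, float chain:", len(np.unique(back_float)))  # 256 -> smooth
```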


And this is exactly why you see such high bit-depths in the production world, because you need to be able to mess with everything and not cause a whole lot of problems.


So in an idealized situation, I am not convinced that with current LDR displays 10-bit really provides any significant or visible benefit over 8-bit (assuming we aren't making any compromises to get to 10-bit and we're holding everything else exactly the same).


But with the proliferation of video processing all over the place, it's really easy to start doing stuff in your playback chain before an 8-bit bottleneck, and start doing damage, because 8-bit is sufficient, but just barely.


That's my perspective at this point. I'm willing to be convinced otherwise, but handled properly I've never seen 8-bit content have visible banding issues that are inherent to the fact that it was 8-bit, assuming we're looking at current low-dynamic-range displays. 8-bit is grossly insufficient with HDR displays. As the DR of displays increases, I think that's where you'll see a move towards greater bit-depths, and of course in the professional world, the production world, and things like D-cinema where you have the bandwidth available to deliver it. But as far as today goes, I simply am not convinced that there would be any visible impact with DVD or BD on current systems in moving to 10-bit.


Which is not to say that there aren't frequently all kinds of banding problems and bit-depth limitations going on, but that is not because 8-bit is insufficient; it's because things are being authored poorly, handled poorly, or crippled in other ways that yield effective performance below 8-bit.
 

Registered · 4,737 Posts
Thanks, Chris.


Have there been any scientific studies on 8 bit vs. 10 bit color - meaning, has anyone ever successfully distinguished between the two assuming all other things are equal?


Is it true generally that 10 bit would not need to be dithered? If yes, shouldn't that in theory provide purer color resolution?


Also, someone mentioned here that it is possible to add 10 bit color to the blu-ray spec with backward compatibility to 8 bit players. The only caveat is that those with 8-bit-only players would get 8 bit undithered.
 

Banned · 20,735 Posts

Quote:
Originally Posted by RWetmore /forum/post/19629209


Thanks, Chris.


Have there been any scientific studies on 8 bit vs. 10 bit color - meaning, has anyone ever successfully distinguished between the two assuming all other things are equal?

I'm not aware of such a study.

Quote:
Is it true generally that 10 bit would not need to be dithered? If yes, shouldn't that in theory provide purer color resolution?

I don't really know. It is my understanding that 8-bit can be helped with some dithering, but mainly to help hide compression artifacts. I don't know if 10-bit would benefit from similar use of dither; my guess is that if it's to hide compression artifacts, dither would still help in 10-bit. If dither is used simply to keep 8-bit transitions from becoming visible, then I would think 10-bit would eliminate the need for dither there. But I'm not sure that 8-bit needs dither for that in the first place.
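For whatever it's worth, here's a toy sketch of the "masking quantization steps" use of dither being asked about (my own illustration, just uniform noise added before rounding a shallow gradient to 8-bit):

```python
# Toy sketch of dither hiding quantization steps: rounding a very shallow
# gradient straight to 8-bit leaves long flat bands, while adding a little
# noise first breaks the bands up into fine grain.
import numpy as np

rng = np.random.default_rng(0)
gradient = np.linspace(0.40, 0.44, 1920)    # shallow ramp across a 1920-pixel row

plain = np.round(gradient * 255)                                            # straight 8-bit rounding
dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, gradient.size)) # dithered rounding

def mean_run_length(codes):
    """Average length of runs of identical adjacent codes (long runs read as bands)."""
    changes = np.flatnonzero(np.diff(codes)) + 1
    runs = np.diff(np.concatenate(([0], changes, [codes.size])))
    return runs.mean()

print("mean band width without dither:", mean_run_length(plain))      # wide, visible steps
print("mean band width with dither:   ", mean_run_length(dithered))   # broken into grain
```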

Quote:
Also, someone mentioned here that it is possible to add 10 bit color to the blu-ray spec with backward compatibility to 8 bit players. The only caveat is that those with 8-bit-only players would get 8 bit undithered.

Beyond my knowledge. And not sure about that. I also seem to recall something like that being discussed, but I don't remember if that was a claim that it was possible, or just someone who wanted 10-bit and was imagining a way to do it.
 

Registered · 4,737 Posts

Quote:
Originally Posted by ChrisWiggles /forum/post/19629878


I don't really know. It is my understanding that 8-bit can be helped with some dithering, but mainly to help hide compression artifacts. I don't know if 10-bit would benefit from similar use of dither; my guess is that if it's to hide compression artifacts, dither would still help in 10-bit. If dither is used simply to keep 8-bit transitions from becoming visible, then I would think 10-bit would eliminate the need for dither there. But I'm not sure that 8-bit needs dither for that in the first place.

That's very interesting. I assumed that the dithering was used to hide 8 bit transitions. I didn't know it was used primarily to mask compression artifacts.


Anyway, instinctively I've suspected that 8 bit may be too close to the threshold of what can be distinguished and that 10 bit has a much "safer" amount of headroom (and perhaps is even visibly slightly better under ideal display/viewing conditions). Of course, I'm not sure, which is why I was curious if any blind scientific studies have been done.

Quote:
Originally Posted by ChrisWiggles /forum/post/19629878


Beyond my knowledge. And not sure about that. I also seem to recall something like that being discussed, but I don't remember if that was a claim that it was possible, or just someone who wanted 10-bit and was imagining a way to do it.

I don't recall the details either, but I think it was Amir who said it was feasible to do, though he wasn't convinced there was enough benefit for it to be worth it.
 

Registered · 6,970 Posts

Quote:
Originally Posted by ChrisWiggles /forum/post/19598141


I have yet to see anything on broadcast, digital cable, satellite, or VOD that comes anywhere close to matching the video quality offered on any decent blu-ray.


That may change in the future, but at this point there remain fairly significant differences.


How significant those differences are for you depends on how critical you are, the quality of your video system, and the viewing ratio you're watching at.

I wish I could agree with you, but I can't, at least from an OTA HD comparative perspective, using my untrained layman's eyes. I've only had a blu-ray player for a short time, but so far... I'll take much of the OTA HD I see over the few blu-ray titles I've seen so far. At worst, it's a wash in terms of quality. This is viewing on an admittedly smallish 42" plasma panel from about 5.5 to 6 ft away from the screen.


Titles viewed thus far: The Descent, Gone with the Wind, Hellboy II, Amelie (the new version), La Femme Nikita and I Am Legend, all of which have solid reputations for image quality, though only a couple might approach what is considered "reference" quality.


I'm not sure how to account for such divergent assessments, but I see many dramatically different opinions just like this; i.e., "incredible, nothing can touch it" vs. "meh, nice I guess but not much different from a good dvd version".


I suspect that in my case, 42" is simply too small a display for blu-ray to really impress, even seated within the recommended proximity range for that size. Also, the panel could use a good calibration, though my OTA viewing operates under the same non-ideally calibrated conditions.
 

Registered · 4,658 Posts
Quote:
Originally Posted by CruelInventions
I wish I could agree with you, but I can't, at least from an OTA HD comparative perspective, using my untrained layman's eyes. I've only had a blu-ray player for a short time, but so far... I'll take much of the OTA HD I see over the few blu-ray titles I've seen so far. At worst, it's a wash in terms of quality. This is viewing on an admittedly smallish 42" plasma panel from about 5.5 to 6 ft away from the screen.


Titles viewed thus far: The Descent, Gone with the Wind, Hellboy II, Amelie (the new version), La Femme Nikita and I Am Legend, all of which have solid reputations for image quality, though only a couple might approach what is considered "reference" quality.


I'm not sure how to account for such divergent assessments, but I see many dramatically different opinions just like this; i.e., "incredible, nothing can touch it" vs. "meh, nice I guess but not much different from a good dvd version".


I suspect that in my case, 42" is simply too small a display for blu-ray to really impress, even seated within the recommended proximity range for that size. Also, the panel could use a good calibration, though my OTA viewing operates under the same non-ideally calibrated conditions.
I'd have to disagree with you.


As Chris said, "decent blu-ray" consistently blows away most any HD TV. That includes Dish Network sat-based HD and OTA ATSC on my setup, which is a 133" projector. The OTA and sat stuff will break up into macroblocks regularly, whereas I only occasionally get a BR disc that exhibits any such issue.


My buddy's DirecTV HD channels are similar on his 60" LCD.


It's not as noticeable on my smaller 37" screen, so maybe that's what's leading to your experience.


-sc
 

Registered · 6,970 Posts
Good point about macroblocking and such. Though I don't encounter it nearly as much with my OTA HD as you seem to, I hadn't considered that aspect when making my previous post.
 