
1 - 5 of 5 Posts

Registered · 46 Posts · Discussion Starter #1 (Edited)
Just how important is the video bitrate of a Blu-ray disc to image quality? While doing some research recently, I found something that intrigued me. For example, the film Avatar came to Blu-ray in its original, single-disc version and, shortly thereafter, as a 3-disc Extended Collector's Edition set. Now, here's what intrigued me: the original release has an average video bitrate of around 28 Mbps, while the 3-disc version, where they had to fit 16 additional minutes on a single disc, has a video bitrate of around 22-23 Mbps. Likewise, I found similar differences with Django Unchained. The US Blu-ray release of that film has a bitrate of 28 Mbps, compared to the European version's 23 Mbps. And I'm sure these are not the only two such cases.
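The bitrate drop on the extended cut follows directly from disc capacity and runtime. Here's a rough back-of-the-envelope sketch in Python; the disc capacity, video payload budget, and runtimes below are illustrative assumptions, not figures taken from the actual discs:

```python
# Rough sanity check: why 16 extra minutes on the same disc lowers the
# average video bitrate. Assumptions (not from the actual releases):
# a BD-50 holds ~50 GB, and I assume roughly 30 GB of that is left for
# the video stream after lossless audio tracks and disc overhead.

def avg_video_bitrate_mbps(video_budget_gb, runtime_min):
    """Average video bitrate (Mbps) for a given payload and runtime."""
    bits = video_budget_gb * 8e9   # GB -> bits (decimal gigabytes)
    seconds = runtime_min * 60
    return bits / seconds / 1e6    # bits/s -> Mbps

VIDEO_BUDGET_GB = 30  # assumed space available for the video stream

theatrical = avg_video_bitrate_mbps(VIDEO_BUDGET_GB, 162)  # ~162 min cut
extended   = avg_video_bitrate_mbps(VIDEO_BUDGET_GB, 178)  # +16 minutes

print(f"theatrical: {theatrical:.1f} Mbps")  # ~24.7 Mbps
print(f"extended:   {extended:.1f} Mbps")    # ~22.5 Mbps
```

With a fixed video budget, adding 16 minutes of footage mechanically shaves a couple of Mbps off the average, which is in the same ballpark as the 28 vs 22-23 Mbps difference above.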

But my question is: does this necessarily mean that those lower-bitrate versions have slightly worse image quality? What can technically be done to keep image quality the same, while reducing the bitrate by 5 Mbps (when, as far as I know, Blu-ray already pushes codecs to their limits)?

P.S.: I don't have both versions of the above mentioned films to do a comparison myself.
 

Registered · 587 Posts
But my question is: does this necessarily mean that those lower-bitrate versions have slightly worse image quality? What can technically be done to keep image quality the same, while reducing the bitrate by 5 Mbps (when, as far as I know, Blu-ray already pushes codecs to their limits)?
It's possible that whoever mastered the first edition learned from the experience and found a way to reduce the average bit rate without reducing the amount of data in critical scenes. Progress is made by people who push the boundaries beyond what was previously thought possible.
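That "lower average without starving critical scenes" idea is essentially what variable-bitrate (VBR) allocation does. Here's a toy sketch; the scene "complexities" and rates are made-up numbers, not from any real encode:

```python
# Toy sketch of VBR bit allocation: the average bitrate can drop without
# hurting the hard scenes if the encoder stops over-spending on easy ones.
# All per-scene rates below are invented for illustration.

# assumed minimum rate (Mbps) each scene needs for transparent quality
scene_needs = [12, 15, 18, 35, 40, 14, 16, 38]

def naive_encode(needs, flat_rate):
    """A constant-ish encode: every scene gets at least flat_rate."""
    return [max(n, flat_rate) for n in needs]

def smart_encode(needs):
    """Give each scene only what it actually needs."""
    return list(needs)

naive = naive_encode(scene_needs, flat_rate=24)
smart = smart_encode(scene_needs)

avg = lambda xs: sum(xs) / len(xs)
print(f"naive average: {avg(naive):.1f} Mbps")  # 29.1 Mbps
print(f"smart average: {avg(smart):.1f} Mbps")  # 23.5 Mbps
# The demanding scenes (35/40/38 Mbps) get the same bits either way;
# only the easy scenes are trimmed, so the average falls with no
# visible loss on the hard material.
```

So, at least in principle, two encodes with averages of ~29 and ~23 Mbps could look identical in the scenes that matter.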

OTOH the vendor may have made a conscious decision to let quality suffer to make room for all the extra fan content. Perhaps the people who buy these versions aren't as picky about getting top bit rates...
 

AVS Forum Special Member · 11,139 Posts
If there's a visible PQ difference between the two versions, even a subtle one, it could perhaps be measured on screen by comparing specific image details to multiburst patterns. This 2014 post shows measurement results from a filmed, 4k-DI movie, with a sublink to one multiburst Blu-ray source test pattern. -- John


Edit: Just watched "The Theory of Everything" (TOE) on a FIOS premium channel and briefly measured a hair of Hawking's wife in bright sunlight, a scene similar to the 4k-DI movie above. Recall that in that film the wife's hair measured ~2mm wide, while it measured ~3mm in the TOE production. The table in the linked post has a rez conversion--and of course more than bit rates factor in. Maybe Wheaties vs a UK breakfast. ;-)
 

Registered · 19 Posts
Interesting to know. I always go for Blu-ray over rips/streams whenever possible, because whilst the general public are basically sheep and chase resolution over bitrate and such, I can tell the difference.
 

AVS Forum Special Member · 11,139 Posts
Here's an interesting avsforum thread discussing Blu-ray (AVC) bitrates and the rates that will be used in upcoming 4k Blu-rays (HEVC). The thread includes graphs and photos from a presentation at last year's NAB show. I also came across a tech paper, from a 2014 conference, by the same author (John Pallett) covering similar material in the Oct. 2015 SMPTE "Motion Imaging Journal." -- John


EDIT: Notice the Motion Imaging article contains graphs comparing PQ in addition to those comparing signal-to-noise ratios (avs thread).
 