Originally Posted by khronikos
Sorry, let me clear things up. You did take that quote before I edited it. I mean average in that I think most blurays are decent enough. I was not WOWED by this as I have watched many blurays already that have been as good or better. This one has a decent bitrate.
Are you watching the video or are you watching the bit rate meter?
I realize that you're new here, but this argument about bit rates has been beaten into the ground with a sledgehammer.
Bit rate, in and of itself, is not an accurate indicator of quality. As much as you seem to believe otherwise, there is no direct linear relationship between bit rate and picture quality. It is equally possible to have a crappy picture with a high bit rate and an excellent picture with a low bit rate, depending on the specific video content and how well the compression is performed. Some images do not require many bits to be displayed transparently, and cranking up the bit rate on them is an exercise in diminishing returns.
The Thin Red Line is a slow-moving film built from long, static shots without much movement in them. Scenes like that don't have the same bit rate requirements as, say, the spastic quick-cut shakycam action in Transformers 2.
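The point about static shots versus frantic action can be shown with a toy sketch. This uses lossless zlib compression as a stand-in for a real video codec (which it is not): a flat, repetitive "frame" needs a tiny fraction of the bits that a noisy, detail-heavy one does, even though both are stored perfectly.

```python
# Toy illustration, NOT a video codec: lossless zlib compression standing in
# for the idea that bit requirements depend on content complexity.
import random
import zlib

FRAME_SIZE = 100_000  # bytes per fake "frame"

# Flat gray frame: like a long static shot with little detail or motion.
static_frame = bytes([128]) * FRAME_SIZE

# Pseudo-random frame: like heavy grain or chaotic motion, little redundancy.
random.seed(42)
noisy_frame = bytes(random.randrange(256) for _ in range(FRAME_SIZE))

static_size = len(zlib.compress(static_frame))
noisy_size = len(zlib.compress(noisy_frame))

print(f"static frame compresses to {static_size} bytes")
print(f"noisy frame compresses to {noisy_size} bytes")
# Both frames are reproduced exactly, yet the static one needs only a few
# hundred bytes -- throwing more bits at it would buy nothing visible.
```

A real encoder running in constant-quality mode does the same thing: it spends bits where the picture demands them and starves the scenes that don't, so the bit rate meter naturally sags during quiet shots without the picture getting any worse.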
As easy as it would be to say, "Look, the bit rate meter just spiked. That's great picture quality!" -- and as much as that may make any uneducated Joe with a PS3 feel like he's suddenly an expert in video engineering based on nothing more than watching the bit rate meter go up and down -- it just doesn't work that way in the real world.
It would also help to have a working knowledge of the grain and color properties of different 35mm film stocks, the optical characteristics of different camera lenses (especially anamorphic camera lenses), and the aesthetics of motion picture photography in general.
Picture quality is a much more complex topic than some people realize, and cannot be boiled down to, "Ooooh, high bit rate. So shiny!"