Originally Posted by Dan Hitchman
It would be nice if WB dropped their outdated VC-1 encoders for the newest AVC and stopped with the low bitrate dips.
I think you're missing the key difference between a codec and an encoder. A codec can't be changed once the format specification is finalized, or older players would no longer be able to decode the movie. The encoder, on the other hand, can keep improving, and that has little to do with which codec is used.
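To illustrate the distinction, here is a minimal Python sketch: both runs below produce spec-compliant H.264 that any compliant decoder can play, yet the encoder settings differ and change the quality per bit. It assumes ffmpeg with libx264 on the PATH, and master.mov is a hypothetical source file.

[CODE]
# Same codec (H.264), two different encoder configurations. Any compliant
# H.264 decoder plays both outputs; only the encoding effort and the
# quality/size trade-off differ.
import subprocess

for preset, crf, out in [("veryfast", 23, "quick.mkv"),
                         ("veryslow", 18, "careful.mkv")]:
    subprocess.run(
        ["ffmpeg", "-y", "-i", "master.mov",
         "-c:v", "libx264", "-preset", preset, "-crf", str(crf), out],
        check=True,
    )
[/CODE]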
Originally Posted by Dan Hitchman
I've noticed that many of their Blu-rays are indeed softer than those from other studios... and lo and behold... their bitrates actually do dip below 10 Mbps in scenes you would think would be encoded at around 20 or so.
Apples and oranges, since every movie is unique. But which scene do you think needs 20 Mbps?
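For a sense of scale, a rough Python calculation of what 10 vs. 20 Mbps buys per frame at 1080p24 (illustrative averages only; real encoders give I-frames far more bits than B-frames, so dips below the average on easy scenes are normal):

[CODE]
# Average bit budget per frame and per pixel at common Blu-ray film rates.
WIDTH, HEIGHT, FPS = 1920, 1080, 24

for mbps in (10, 20):
    bits_per_frame = mbps * 1e6 / FPS
    bits_per_pixel = bits_per_frame / (WIDTH * HEIGHT)
    print(f"{mbps} Mbps -> {bits_per_frame / 8 / 1024:.0f} KiB/frame, "
          f"{bits_per_pixel:.3f} bits/pixel")
[/CODE]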
As an example, they didn't use a DI (digital intermediate) for the entire movie on Inception, only for some scenes; that puts the master a couple of generations down from the negative, compared to a lot of other movies that use a DI for the whole film.
Originally Posted by Dan Hitchman
Using lower average bitrates forces them to apply heavier doses of high-frequency filtering to make it easier on the encoder.
Why would they need to use high-frequency filtering? If you want to encode at a lower bitrate, you just encode at a lower bitrate; the coarser quantization that comes with it already discards fine high-frequency detail on its own.
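A minimal Python sketch of why, using numpy/scipy: transform coders quantize DCT coefficients, and at lower bitrates the quantizer step grows, zeroing the small coefficients. In real footage the small coefficients are mostly the high-frequency ones, so coarse quantization is already an implicit high-frequency filter. The 8x8 block is synthetic and the quantizer steps are made up for illustration.

[CODE]
# Coarser quantization (larger qstep ~ lower bitrate) zeroes more DCT
# coefficients, which in real images means dropping high-frequency detail.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
block = rng.normal(128.0, 20.0, (8, 8))      # synthetic 8x8 luma block

coeffs = dctn(block, norm="ortho")           # forward 2D DCT

for qstep in (4, 16, 64):
    q = np.round(coeffs / qstep)             # uniform quantization
    recon = idctn(q * qstep, norm="ortho")   # dequantize and invert
    print(f"qstep={qstep:3d}: {np.count_nonzero(q):2d}/64 nonzero coeffs, "
          f"mean abs error {np.abs(recon - block).mean():.2f}")
[/CODE]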