Originally Posted by Supermans
Amir, you still haven't answered the question in the first post of this thread... I believe you have said VC-1 wasn't tweaked to benefit from higher bitrates while AVC was...
No, I said the reverse. VC-1 was designed from day one for HD and high-bitrate work. It was AVC that was designed entirely for SD applications and lower (i.e., "the Internet"). The HP profile added some quick fixes, but there was not enough time to redesign the codec to make it fully competitive.
What has happened is that people try to cover up the faults by turning things like the loop filter off, which is deadly at sub-25 Mbit/sec; at higher rates the blockiness it causes is probably not too severe (although I have seen some bad examples in PotC).
Does that mean AVC handles higher bitrates better than VC-1? That is what this thread is asking...
Actually, the whole assertion is mistaken with respect to both codecs. Yes, tests have shown that AVC at higher rates has lower performance, but I think that is an aberration (yes, I said AVC, not VC-1; I can dig up the report if you like).
The thing is that both codecs achieve 90% of their quality by the time you get to 10-12 Mbit/sec, so the curve is highly exponential: it rises steeply and then flattens out. Once you push data rates above, say, 20-25 Mbit/sec, you are chasing the long tail, where adding bits doesn't increase visual quality. That is why we say you don't need more bits, not that the codecs lose performance at higher rates.
By the way, MPEG-2 has the same kind of curve, except that its slope is not as steep as AVC/VC-1's and it reaches its asymptote much later. So it can keep putting more bits to use and producing better quality, but that is only because it is darn inefficient.
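To make the shape of those curves concrete, here is a toy sketch in Python. The saturating formula and all of the constants are purely illustrative assumptions for this post, not measurements from either codec; the constant tau just controls how quickly a curve approaches its ceiling.

```python
# Toy model only: quality approaches an asymptote as bitrate grows.
# All constants here are hypothetical, chosen so the AVC/VC-1 curve
# reaches ~90% of its ceiling around 10-12 Mbit/sec and the MPEG-2-like
# curve saturates much later. They are not real measurements.
import math

def quality_fraction(bitrate_mbps: float, tau: float) -> float:
    """Fraction of asymptotic quality reached at a given bitrate (toy model)."""
    return 1.0 - math.exp(-bitrate_mbps / tau)

TAU_AVC_VC1 = 5.0    # hypothetical: ~90% of ceiling near 11.5 Mbit/sec
TAU_MPEG2   = 15.0   # hypothetical: needs roughly 3x the bits to get there

for rate in (5, 10, 12, 20, 25, 40):
    q_new = quality_fraction(rate, TAU_AVC_VC1)
    q_old = quality_fraction(rate, TAU_MPEG2)
    print(f"{rate:>2} Mbit/sec   AVC/VC-1 ~{q_new:.0%}   MPEG-2 ~{q_old:.0%}")
```

In this toy model the first curve has essentially flattened past roughly 20-25 Mbit/sec (adding bits buys almost nothing), while the MPEG-2-like curve is still climbing, which is the sense in which MPEG-2 "can use more bits" despite being less efficient.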