Originally Posted by benwaggoner
"Tends to" is my qualification. Corser grain will survive more than finer grain, and there's less of an effect at high bitrates.
Grain preservation simply wasn't a major design goal of AVC. Video conferencing was a much higher priority than HD movie delivery, for example. The High Profile had to be added after the initial spec was done to get it even in the ballpark of VC-1 for HD film reproduction.
I wasn't at Microsoft then, but yes, this was definitely something I remember folks there being proud of circa 2003.
Less so. They were focused on other scenarios. I'm sure some thought about it, but it wasn't something I remember being talked about much at MPEG events and such.
Thanks for the answers, man.
So when creating VC-1, the team made a conscious decision to treat grain differently from how MPEG-2 does. Is there an effort underway, or planned, to educate early adopters and reviewers about this difference? That is, that this is not a flaw of VC-1 but a strength, one that should be welcomed by anyone who wants a transparent encode?
I really think some consumer education would help VC-1's cause (assuming, of course, that VC-1 is the most accurate at encoding grain). I say this because it's going to take some visual evidence against the film master to convince people (like me) who have watched nothing but MPEG-2 for home theater.
Btw (if you know), when creating the AVC High Profile, what design choices led to poor grain preservation at bitrates lower than what VC-1 would need for a proper grain encode? (If I understand you correctly, you're saying that AVC HP handles grain fine at bitrates higher than VC-1 needs.)
Also, does MPEG-2 ever produce a non-exaggerated grain encode, say at high enough bitrates? Or are you saying that MPEG-2 is incapable of accurate grain preservation?
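Just to make concrete what I mean by grain "surviving" quantization, here's a toy Python sketch. It's purely illustrative (a 2D DCT plus uniform scalar quantization on synthetic noise, with made-up qstep values as stand-ins for bitrate), not any real codec's pipeline. But it matches the intuition above: coarse grain outlives fine grain, and the gap shrinks as the quantizer step gets small, i.e. at high bitrates:

import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)

def make_grain(n=8, sigma=10.0, coarse=False):
    """Toy 'grain': white noise; the coarse variant is low-pass blurred
    (bigger clumps) and rescaled back to the same overall amplitude."""
    g = rng.normal(0.0, sigma, (n, n))
    if coarse:
        # crude blur pushes the energy toward lower spatial frequencies
        g = (g + np.roll(g, 1, axis=0) + np.roll(g, 1, axis=1)) / 3.0
        g *= sigma / g.std()  # same total energy, just redistributed
    return g

def quantize_block(block, qstep):
    """Uniform scalar quantization of the block's DCT coefficients
    (no quant matrix, no deadzone -- a toy model, not any real codec)."""
    coeffs = dctn(block, norm="ortho")
    return idctn(np.round(coeffs / qstep) * qstep, norm="ortho")

for qstep in (4, 16, 64):  # stand-ins for high / medium / low bitrate
    for coarse in (False, True):
        ratios = []
        for _ in range(2000):
            g = make_grain(coarse=coarse)
            # fraction of the grain's energy that survives quantization
            ratios.append(np.var(quantize_block(g, qstep)) / np.var(g))
        label = "coarse" if coarse else "fine"
        print(f"qstep={qstep:2d}  {label:6s} grain energy kept: {np.mean(ratios):4.2f}")

Fine grain spreads its energy evenly across all DCT coefficients, so at a coarse quantizer step most of them round to zero; coarse grain concentrates its energy in a few large low-frequency coefficients that clear the quantization threshold.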