Intel IGP HD2000/3000 can't be used for cable channels that are compressed with H.264 and require DRM copy-protection. The driver just gives up when you try those channels. Luckily, such channels only exist in a few Cox markets with special Plus packages. But H.264-compressed (instead of MPEG-2) cable channels will surely become more common as cable providers look to free up bandwidth. That's the only drawback I've encountered with Intel IGPs so far.
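As a rough illustration of why providers want the switch: assuming a 256-QAM cable channel carries about 38.8 Mbps of payload, and using ballpark per-stream bitrates (roughly 15 Mbps for MPEG-2 HD vs. 8 Mbps for H.264 HD; these figures are my own assumptions, not from this thread), the bandwidth savings look like this:

```python
# Rough illustration of why H.264 frees up cable bandwidth vs. MPEG-2.
# All bitrates here are ballpark assumptions, not measured figures.

QAM256_MBPS = 38.8     # approx. usable payload of one 256-QAM cable channel
MPEG2_HD_MBPS = 15.0   # typical-ish MPEG-2 HD stream (assumption)
H264_HD_MBPS = 8.0     # typical-ish H.264 HD stream (assumption)

mpeg2_per_qam = int(QAM256_MBPS // MPEG2_HD_MBPS)
h264_per_qam = int(QAM256_MBPS // H264_HD_MBPS)

print(f"MPEG-2 HD streams per QAM channel: {mpeg2_per_qam}")  # 2
print(f"H.264 HD streams per QAM channel:  {h264_per_qam}")   # 4
```

Under those assumed bitrates, switching a QAM channel from MPEG-2 to H.264 roughly doubles the number of HD streams it can carry, which is exactly the incentive driving providers that way.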
This will change... and possibly soon. The broadcast companies are lobbying very hard to require copy-protection on ALL transmitted TV material. They are even pushing it for OTA. I pray it doesn't happen, but they are effectively trying to kill OTA in the US. I refuse to pay for cable TV (prices are 5 to 10x what they should be), so I'll resort to a 100% "Internet DVR" if they get their way.
It seems to me that the differences between AMD, NVidia, and Intel post-processing capabilities boil down to what kind of content is being viewed. The AnandTech article indicates that madVR pushes even the HD4000 hard for some kinds of content. It sheds little light on the impact of using Intel, AMD, or NVidia under Media Center!
What is not clear is what differences exist between the HD2000, HD3000, and HD4000 for playback of 1080i content through Media Center. Ceton and SiliconDust CableCARD tuners are widely used and require Media Center. In the U.S. we have 720p and 1080i for our broadcast and cable TV. If someone could focus on how well the HD4000 in Media Center handles deinterlacing 1080i vs. AMD and NVidia, it could shed some light on how well it handles content that is widely viewed by HTPC users.
I have not used the Intel HD3000 or HD4000 for deinterlacing, but here's some AMD & NVidia info...
AMD HD6450: I have done a lot with the AMD HD6450, testing all sorts of material on it at the highest setting (Vector-Adaptive). It works VERY well for everything except OTA 1080i, on which it drops frames; it's just not quite powerful enough. From reading many user posts and reviews, it seems the HD6570 and higher handle 1080i perfectly.
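A back-of-envelope pixel-rate comparison (my own arithmetic, not from any of the posts above) shows why OTA 1080i is the case that trips up weaker GPUs: a good deinterlacer has to synthesize a full 1920x1080 progressive frame for each of the 60 fields per second, so the output pixel rate is well over double that of 720p60:

```python
# Back-of-envelope pixel rates showing why 1080i deinterlacing is demanding.
# Vector-adaptive deinterlacing must emit a full progressive frame per field.

def output_pixel_rate(width, height, fps):
    """Pixels per second the GPU must produce after deinterlacing."""
    return width * height * fps

rate_1080i = output_pixel_rate(1920, 1080, 60)  # 60 fields/s -> 1080p60 out
rate_720p = output_pixel_rate(1280, 720, 60)    # already progressive

print(f"1080i60 deinterlaced to 1080p60: {rate_1080i / 1e6:.1f} Mpx/s")  # 124.4
print(f"720p60 passthrough:              {rate_720p / 1e6:.1f} Mpx/s")   # 55.3
print(f"ratio: {rate_1080i / rate_720p:.2f}x")                           # 2.25x
```

And that is only the output side; vector-adaptive algorithms also analyze motion across several neighboring fields per output frame, so the real workload scales even faster than the raw pixel count suggests.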
NVidia: I would do more testing with my current NVidia 880M, but based on my research (many web searches and this article), NVidia does not allow any control except on/off for Inverse Telecine. The posts/articles I have read say NVidia does Motion-Adaptive, which is not as good as Vector-Adaptive. IMO it sucks that NVidia allows no control... it makes it harder (sometimes impossible) to tweak to optimal settings and do comparisons. AMD gives you total control, and I've played with the AMD settings... you can see a huge difference on different material depending on what you choose. Though, the de-interlacing with the card I have, the 880M (equal to the ), does a good job from what I can tell... it just falls down in other areas of PQ when dealing with scaling, etc. And my 880M has Feature Set C, which has all the latest decoding features from NVidia, except 4K. Ref 1 Ref 2
Here's a good techie (non-pro) review and discussion on GTS450 vs HD5670 vs HD3000 ... lots of good technical discussions and comparisons. NVidia does well here but they do note HD3000 was OK too.
... this is a videophile thread... it is about getting the 'best' quality. Though, if you have a display smaller than 50-55" you are not likely to notice much difference no matter which of the GPUs we are talking about.