Originally Posted by Livin
In the same review I linked to, GPUs are compared in one section based only on deinterlacing, and then with denoise, etc. (on otherwise unadulterated images), and there are clear differences in what each GPU is able to render and with what accuracy.
Denoising is another post-processing step that, IMHO, is not really necessary for Blu-ray videos or even most broadcast content. It might be necessary only for camcorder content (and even in that case, most camcorders do plenty of denoising before actually encoding the video). The HQV test clip is an artificial one, created just to show the denoising algorithms in action.
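To make the denoising step concrete, here is a rough Python/NumPy sketch of the simplest possible temporal denoiser (plain frame averaging). This is purely illustrative and motion-naive, not any vendor's actual algorithm; real GPU denoisers are motion-adaptive so they don't smear moving objects:

Code:
import numpy as np

def temporal_denoise(prev_frame, cur_frame, strength=0.5):
    # Blend the current frame toward the previous one so random
    # temporal noise averages out. Being motion-naive, this would
    # ghost/smear anything that moves -- real denoisers compensate.
    prev = prev_frame.astype(np.float32)
    cur = cur_frame.astype(np.float32)
    blended = strength * prev + (1.0 - strength) * cur
    return np.clip(blended, 0, 255).astype(np.uint8)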
Deinterlacing in current-day GPUs: different GPU vendors use different algorithms. While AMD's vector adaptive (VA) deinterlacing used to be very good, it appears that the other vendors have caught up. For non-artificial test material, as long as the cadence is detected properly, it is almost impossible to tell the difference between the deinterlaced outputs of the various GPUs. Please look at MissingRemote's reviews, where Andrew uses a football test clip. Recently, at AnandTech, we have started using some 480i TV broadcast content with particularly nasty ticker combing artifacts. Using that clip and madVR deinterlacing (which uses the HW deinterlacer in the GPU), I can't visually find any difference when playing it back on GPUs from any of the three vendors.
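For readers wondering what "cadence detection" involves: with 3:2 pulldown, one same-parity field repeats once in every five-frame cycle, and detectors look for that signature. A toy Python/NumPy sketch of the idea (the threshold is an arbitrary assumption, and real detectors are far more robust than this):

Code:
import numpy as np

def field_diff(frame_a, frame_b):
    # Mean absolute difference between the top fields (even lines)
    # of two weaved frames; near-zero suggests a repeated field.
    return np.abs(frame_a[::2].astype(np.int32) -
                  frame_b[::2].astype(np.int32)).mean()

def looks_like_32_pulldown(frames, threshold=2.0):
    # Crude 3:2 cadence check over a five-frame window: telecined
    # film repeats exactly one same-parity field per cycle, so we
    # expect exactly one near-zero difference among the four pairs.
    diffs = [field_diff(frames[i], frames[i + 1]) for i in range(4)]
    return sum(d < threshold for d in diffs) == 1

Once the cadence is locked, the deinterlacer can simply weave the original film frames back together, which is why properly detected material ends up looking essentially identical across vendors.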
Also, in the pictures in the discrete HTPC GPU shootout piece, note that different GPU vendors have different default contrast enhancement settings / output levels / RGB or YCbCr output. To discuss things on an equal footing for the average consumer, we left everything at default. It is possible that a reader might like the default config of one GPU vendor better, but that doesn't make that output right for everyone.
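As an example of why the output-levels default alone can change what you see: converting between limited-range (16-235) and full-range (0-255) RGB is simple arithmetic, but if the GPU and display disagree on which range is in use, blacks turn grey or shadow detail gets crushed. A minimal Python/NumPy sketch of the standard expansion:

Code:
import numpy as np

def limited_to_full_range(rgb):
    # Expand limited-range (16-235) 8-bit RGB to full range (0-255).
    # Applying this when the source is already full range crushes
    # shadows and clips highlights; skipping it when it is needed
    # washes out blacks -- hence vendor-default differences matter.
    x = rgb.astype(np.float32)
    full = (x - 16.0) * (255.0 / (235.0 - 16.0))
    return np.clip(full, 0, 255).astype(np.uint8)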
Originally Posted by Livin
For those of you who only use 1080p material, it likely does not matter which GPU you use, as long as it can handle the base processing... BUT if you use other material like 1080i, 720p, 480p, etc., AND have a large screen where you'll notice 'PQ inconsistencies', AND want the cleanest scaling, deinterlacing, denoising, deblocking, etc., then the GPU matters... a lot. I'm in all of these categories.
For H.264, deblocking is not optional in the decoder path; it is an in-loop filter that every compliant decoder must apply. Hence, you should be able to get the same output from all the GPU decoders. If you use madVR for scaling, the output will again be the same across GPUs. EVR / EVR-CP use driver APIs for scaling, so the algorithm used may not be the same. Unfortunately, I don't think you can choose which algorithm gets used in that step on any of the GPUs. I already covered the denoising aspect in the first part of this post.
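To illustrate how much the scaler choice alone can matter, here is a short Python sketch using Pillow to upscale the same frame with two common algorithms and diff the results (the file name is a made-up placeholder, and these are generic resamplers, not the actual filters any particular driver uses):

Code:
from PIL import Image
import numpy as np

# Placeholder file name for a 480p source frame.
src = Image.open("frame_480p.png").convert("RGB")
target = (1920, 1080)

# Two generic scalers: which one EVR ends up using depends on the
# driver, while madVR applies its own algorithm regardless of GPU.
bilinear = src.resize(target, Image.BILINEAR)
lanczos = src.resize(target, Image.LANCZOS)

# A nonzero difference shows the two scalers do not produce
# identical output from the same decoded frame.
diff = np.abs(np.asarray(bilinear, dtype=np.int32) -
              np.asarray(lanczos, dtype=np.int32))
print("max per-channel difference:", diff.max())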
Originally Posted by Livin
As I mentioned before... I moved from an AMD HD6450 (a low-end card) to an Nvidia Quadro 880M and immediately noticed much more macroblocking. I've been trying to tweak the Nvidia settings as best I can, but there seems to be FAR LESS ability to do so than with AMD. Thus, PQ is noticeably much lower. AND, this comes from the fact that I did no post processing with the HD6450 other than vector adaptive deinterlacing... so I must conclude the AMD is much better at upscaling and deinterlacing by default.
The Quadro 880M is a GT2xx part, as per this page. Further, it is a professional workstation card, and it is not immediately obvious whether the Quadro drivers have all the video post-processing steps enabled. GT2xx was never a great HTPC candidate (except for its Linux VDPAU support). So, I believe comparing the Quadro 880M and the AMD 6450 is like comparing apples and oranges. For a fair comparison against the 6450, I suggest using a GT430. PQ-wise, you will get very similar results.
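If you want to move beyond eyeballing such comparisons, one option is to capture the same frame from each GPU's output and score it against a reference with a simple objective metric like PSNR. A minimal Python sketch (the file names are hypothetical placeholders, and the captures must be pixel-aligned and the same size for the numbers to be meaningful):

Code:
import numpy as np
from PIL import Image

def psnr(reference, test):
    # Peak signal-to-noise ratio between two same-sized 8-bit frames;
    # higher means the test capture is closer to the reference.
    ref = np.asarray(reference, dtype=np.float64)
    tst = np.asarray(test, dtype=np.float64)
    mse = np.mean((ref - tst) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)

ref = Image.open("reference_frame.png").convert("RGB")
print("Card A:", psnr(ref, Image.open("card_a_capture.png").convert("RGB")))
print("Card B:", psnr(ref, Image.open("card_b_capture.png").convert("RGB")))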
Originally Posted by Livin
I was hoping to hear from other videophiles who might have experience with different GPUs, but it seems there are not many out there who have done evaluations.
What sort of further evaluation are you looking for? We will try to address this in future coverage.