I've seen hardware-accelerated decoding vs. 100% software decoding discussed elsewhere on this forum from time to time. Often, people say they see a PQ improvement, usually with the hardware decoding.
I find this difficult to believe, since as far as I know the actual decoding process (using various math functions and algorithms, like inverse discrete cosine transforms, etc.) would be identical in each case. With hardware-accelerated decoding some of these operations would be done in hardware (on the video card) rather than by the CPU. Thus, less CPU load like you said. But the end result should be the same. DVD decoding is just crunching numbers, isn't it?
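To illustrate the point that the decode math itself is deterministic, here is a minimal sketch (not any vendor's actual code) of the 8x8 inverse DCT used in MPEG-2 decoding, computed two different ways: as the direct double sum from the standard's definition, and as the separable row-then-column form that real decoders use for speed. Both produce the same pixel values to within floating-point rounding, which is why the implementation venue (CPU vs. GPU) shouldn't by itself change the picture.

```python
import math

def idct2_direct(F):
    """2-D 8x8 inverse DCT, written as the direct double sum."""
    def c(u):
        return 1 / math.sqrt(2) if u == 0 else 1.0
    out = [[0.0] * 8 for _ in range(8)]
    for x in range(8):
        for y in range(8):
            s = 0.0
            for u in range(8):
                for v in range(8):
                    s += (c(u) * c(v) * F[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[x][y] = s / 4
    return out

def idct2_separable(F):
    """Same transform, computed as 1-D IDCTs on rows, then columns."""
    def idct1(vec):
        def c(u):
            return 1 / math.sqrt(2) if u == 0 else 1.0
        return [sum(c(u) * vec[u]
                    * math.cos((2 * x + 1) * u * math.pi / 16)
                    for u in range(8)) / 2
                for x in range(8)]
    rows = [idct1(list(row)) for row in F]          # IDCT along each row
    cols = [idct1(list(col)) for col in zip(*rows)]  # then along each column
    return [list(r) for r in zip(*cols)]             # transpose back
```

After rounding to integer pixel values, the two paths are bit-identical on any coefficient block, so a conforming hardware IDCT and a conforming software IDCT should agree. (In practice, fixed-point hardware IDCTs are only required to match within a small tolerance, which is a far subtler difference than what people describe seeing.)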
I tried it both ways on my HTPC and I was not surprised, therefore, to see no difference in PQ. I never checked to see if there was a difference in CPU load. I do have a problem with hardware decoding, though: it causes occasional PC lockups. This forces me to use the software setting all the time.
The difference you will probably see is in the de-interlacing. E.g., in hardware mode on my ATI card, when watching video material the card automatically uses progressive de-interlacing, but in software mode it doesn't; it must be selected manually within WinDVD.
I am not sure about this, but the scaling of the image to fit your screen resolution could have an impact too. The hardware mode may have its own way of resizing (something like ffdshow's resizers, but not as complicated).
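A toy sketch of why the de-interlacing stage, unlike the decode math, genuinely changes the picture (these are generic "weave" and "bob" methods, not any specific card's algorithm, and the field data is made up):

```python
def weave(top_field, bottom_field):
    """Interleave the two fields line by line (best for film/progressive sources)."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)
        frame.append(b)
    return frame

def bob_top(top_field):
    """Discard the bottom field and line-double the top (avoids combing on motion)."""
    frame = []
    for t in top_field:
        frame.append(t)
        frame.append(t)  # repeat each line to fill the frame
    return frame
```

On a static scene both methods reconstruct the same frame, but as soon as anything moves between fields they produce different pixels: weave shows combing while bob loses vertical resolution. So two players that decode identically can still look different if one of them picks a different (or smarter, motion-adaptive) de-interlacing mode.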
Originally posted by Radiophile ....DVD decoding is just crunching numbers, isn't it?.....
In a sense. But for a proper explanation a good EE person needs to come in here; I just know the general answer. The decoders and GPUs on a video card are not like general-purpose CPUs. They deal with different types of data (no floating point, IIRC), so their instruction sets are usually much smaller and can be much more heavily optimized. Having a CPU do a graphics or video task therefore takes far more resources than having a GPU do it.
I can't find it now, but an old thread here on the forum argued that ATI's IDCT, which is built into their GPU, was so good that it rivaled a $10K professional DVD player. The person explaining this was "in the business" and was pretty upset because it was his $10K player, but he was also happy to have an alternative that cost about $100 (the price at the time for one of ATI's VIVO cards).
Of course, ATI have improved on their Rage Theater TV-output chip since then too, so the quality may be even better now if the de-interlacing has improved. Or was that only relevant for computer viewing? I can't remember.
I would tend to disagree with the conclusion re: software vs. hardware decoding, and if my memory serves me well, there were several threads where most people concluded that software decoding produces better PQ.
I don't know exactly why, but the difference is very obvious on my system, regardless of whether I use WinDVD 5, NVDVD 2.55, or Sonic filters. I believe that hardware acceleration uses different (standardized) algorithms for decoding than software mode.
The fact that I can see a difference could be down to a combination of several factors: a high-resolution display ([email protected] fed its native resolution), an all-digital path (DVI to monitor), a large monitor (61"), decent display technology (DLP HD2 chip), proper monitor calibration (all forum tweaks + DVE), a powerful video card (9800 Pro with dedicated power supply), use of a superior video renderer (VMR9 vs. overlay mixer), stabilized clean power for the entire HT system (AVS 2000 + HTS 5000), and good bias lighting in the viewing room.
However, each system is unique and if you do not see a difference that is fine. To me this would just indicate that your system has headroom to grow.