
any difference in DXVA between 6600GT and a 7600GT?

2955 Views 12 Replies 5 Participants Last post by kamaleon
Hi folks,


Glad to be here



I am wondering if I should upgrade my current 6600GT PCI-E card to a slightly faster 7600GT.

I am mostly concerned about PureVideo acceleration. Can anybody tell me whether the GeForce 7 series GPUs have an enhanced PureVideo engine compared to the 6 series?

I'm mostly using my card to watch live satellite MPEG2 HD and H.264 channels (I'm in Europe).

I've read that VPU acceleration is dependent on clock speeds, but I wonder if there's any *real* improvement other than a marginal CPU load variation between the two GeForce series.


Any insights are welcome.


Cheers,


kamaleon
1 - 13 of 13 Posts
For HD H.264 and VC1, all you'll get is that marginal CPU difference - neither has anywhere near enough power to do processing like spatial-temporal deinterlacing or bad edit detection (despite NVIDIA's own spec sheet claiming they can).


For MPEG2, the only real difference is HD MPEG2 bad edit detection, and that still doesn't work properly - you're better off using DScaler 5 than PureVideo.
This chart shows the differences...

http://www.nvidia.com/page/purevideo_support.html


That's absolutely true about clock speeds and acceleration. I had to learn this on my own. I was wondering why IVTC wouldn't work on my mom's 6200 card with DVD playback (jumpy every 3 seconds) and eventually disabled it after finding that chart. The low-end 6000 series cards will not assist with IVTC. I think this is only something to worry about if you buy one of the extremely low-end parts. By forcing it in the control panel, I'm guessing it was using her P4 3.0GHz for IVTC.
That chart is totally misleading - it claims IVTC, bad edit detection, spatial-temporal deinterlacing etc. for all HD content. In reality, those cards can only do that for MPEG2 HD, and you need an 8800 to do it for H.264/VC1 HD.


It even claims H.264 acceleration for all manner of cards, when the reality is that if you disable GPU acceleration on a 6600GT it makes absolutely no difference to the CPU usage. Insanely, it even claims H.264 acceleration on integrated 6150 graphics...
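By the way, the way I check this is just to log the CPU readings (from Task Manager or perfmon) during two identical playback runs - acceleration on, then off - and compare the averages. A minimal sketch of that comparison; the sample values below are hypothetical placeholders, not real measurements:

```python
# Compare average CPU load between two playback runs of the same clip,
# one with hardware acceleration enabled and one with it disabled.
# Sample values are hypothetical placeholders, not real measurements.

def average_load(samples):
    """Mean CPU utilisation (percent) over a run's sampled readings."""
    return sum(samples) / len(samples)

accel_on = [16, 18, 17, 15, 19]    # readings taken every few seconds
accel_off = [17, 18, 16, 18, 17]

delta = average_load(accel_off) - average_load(accel_on)
print(f"Average difference: {delta:+.1f} percentage points")
# If the delta is within normal run-to-run noise (a point or two),
# the "acceleration" isn't actually offloading the decode.
```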
Wicked, thanks for the 2 prompt replies

Quote:
Originally Posted by arfster /forum/post/0


For HD H.264 and VC1, all you'll get is that marginal CPU difference - neither has anywhere near enough power to do processing like spatial-temporal deinterlacing or bad edit detection (despite NVIDIA's own spec sheet claiming they can).

So does that mean the VPU is one and the same on the 6 and 7 series, then?

Reading the FiringSquad article (can't post the URL as I don't have the rights yet, still a newbie here):

"While the VPU across the GeForce 6 and 7 line is identical in design, performance is dependent on the core clockspeed. The advantage of this is that you won't have to buy the top-of-the-line GPU to get H.264 decode acceleration. Likewise, as new features are developed for the GeForce 7, it'll often trickle down into the older GeForce 6 models. Still, this also means that the VPU in the GeForce 7950GT at 550MHz is slower than the VPU in the GeForce 7600GT at 580MHz. Of course, this performance difference is purely theoretical."

As English is not my native language, can anyone try to explain to me what this person probably meant by "it'll often trickle down"?
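Also, if I understand the quote right, a rough back-of-the-envelope check of the clock-speed claim would look like this (the linear-scaling assumption is just what the article implies, not something I've measured):

```python
# Rough sketch: if VPU throughput scales linearly with core clock
# (as the FiringSquad article implies), compare two cards' VPUs.
def relative_vpu_throughput(clock_a_mhz, clock_b_mhz):
    """Return card A's VPU throughput relative to card B's,
    assuming identical VPU design and linear clock scaling."""
    return clock_a_mhz / clock_b_mhz

# The quote's example: 7600GT VPU at 580 MHz vs 7950GT VPU at 550 MHz
ratio = relative_vpu_throughput(580, 550)
print(f"7600GT VPU vs 7950GT VPU: {ratio:.3f}x")  # about 1.055x, i.e. ~5% faster
```

So even between those two cards the theoretical difference is only around 5%, which matches the article calling it "purely theoretical".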
Quote:
For MPEG2, the only real difference is HD MPEG2 bad edit detection, and that still doesn't work properly - you're better off using DScaler 5 than PureVideo.

Could you expand on that statement?

Personally, I've found that the odd satellite broadcast I've watched in MPEG2 HD was not displayed as properly with DScaler 5 as with Cyberlink 7 (which I believe uses the PureVideo VPU?), and there was a major CPU load difference - 45% with DScaler 5 and around 17% with Cyberlink 7.

Quote:
Originally Posted by Jarretth /forum/post/0


That's absolutely true about clock speeds and acceleration. I had to learn this on my own. I was wondering why IVTC wouldn't work on my mom's 6200 card with DVD playback (jumpy every 3 seconds) and eventually disabled it after finding that chart. The low-end 6000 series cards will not assist with IVTC. I think this is only something to worry about if you buy one of the extremely low-end parts. By forcing it in the control panel, I'm guessing it was using her P4 3.0GHz for IVTC.

Can you briefly explain what IVTC does, and how it is related to clock speeds?

To tell you the truth, I don't really understand the clock speed issue. Will a faster VPU simply take more load off the CPU, or will it enable more acceleration features - like bad edit detection, IVTC, etc.? How does it work?

Quote:
Originally Posted by arfster /forum/post/0


That chart is totally misleading - it claims IVTC, bad edit detection, spatial-temporal deinterlacing etc. for all HD content. In reality, those cards can only do that for MPEG2 HD, and you need an 8800 to do it for H.264/VC1 HD.


It even claims H.264 acceleration for all manner of cards, when the reality is that if you disable GPU acceleration on a 6600GT it makes absolutely no difference to the CPU usage. Insanely, it even claims H.264 acceleration on integrated 6150 graphics...

I do agree with you, but for me H.264 acceleration on the 6600GT is actually important: CPU load increases hugely without it when using Cyberlink 7.
It restores film playback to 23.976 fps (essentially the original theatrical frame rate) instead of 29.97 fps (the frame rate on the DVD and the rate your TV expects). After IVTC, no frame mixes fields from two different film frames (i.e. no frame contains an odd or even field belonging to the previous one). It's only something you'd notice if you went frame by frame: 23.976 looks smoother since each frame is one odd field plus one even field from the same film frame.
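To make the cadence concrete, here's a toy sketch (my own illustration, not NVIDIA's actual algorithm) of how 3:2 pulldown turns 4 film frames into 5 video frames - two of which mix fields from different film frames - and how IVTC undoes it:

```python
# Illustrative sketch of 3:2 pulldown (telecine) and its inverse (IVTC).
# Film frames are labelled A..D; each has a top (t) and bottom (b) field.

def telecine(film_frames):
    """Apply 3:2 pulldown: every second film frame contributes an
    extra repeated field, stretching 4 film frames to 10 fields."""
    fields = []
    for i, f in enumerate(film_frames):
        fields += [f + "t", f + "b"]
        if i % 2 == 1:
            fields.append(f + "t")  # the repeated pulldown field
    # pair consecutive fields into interlaced video frames
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]

def ivtc(video_frames):
    """Reverse the pulldown: flatten to fields, drop the duplicated
    fields, and re-pair top/bottom fields of the same film frame."""
    fields = []
    for top, bottom in video_frames:
        for f in (top, bottom):
            if f not in fields:     # skip the repeated pulldown fields
                fields.append(f)
    return [fields[i][0] for i in range(0, len(fields), 2)]

video = telecine(["A", "B", "C", "D"])  # 5 video frames; 2 mix fields
print(video)
print(ivtc(video))  # → ['A', 'B', 'C', 'D']
```

The "jumpy every 3 seconds" symptom is what you see when those mixed frames get displayed instead of being reassembled.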
And how and where do you activate those options? I never saw any of that.

Quote:
Originally Posted by arfster /forum/post/0


It even claims H.264 acceleration for all manner of cards, when the reality is that if you disable GPU acceleration on a 6600GT it makes absolutely no difference to the CPU usage. Insanely, it even claims H.264 acceleration on integrated 6150 graphics...

That's not true. I've tested H.264 acceleration with a 6600 on a low-end system and it certainly did make a difference: it brought CPU usage when playing back 1080p H.264 clips down by as much as 20%.


The 6150 has the PureVideo VPU from a GeForce 6600 grafted on, and has a reasonably high clock speed, so it's certainly plausible it can accelerate H.264. I haven't tested it myself.


But I do agree NVIDIA's chart is pretty poor.

Quote:
Originally Posted by kamaleon /forum/post/0


and how and where do you activate those options? never saw any of that.
http://www.tomshardware.com/2007/01/...deo/page4.html

Quote:
Originally Posted by SpHeRe31459 /forum/post/0


That's not true. I've tested H.264 acceleration with a 6600 on a low-end system and it certainly did make a difference: it brought CPU usage when playing back 1080p H.264 clips down by as much as 20%.

Odd, it does nothing on my two machines (Pentium D 3.5GHz, Core 2 3.4GHz). Maybe it's more useful on lower-end systems.


For more details, see the AnandTech tests - even cards faster than the 6600GT make no difference with HD DVD H.264.

Quote:
The 6150 has the PureVideo VPU from a GeForce 6600 grafted on, and has a reasonably high clock speed, so it's certainly plausible it can accelerate H.264. I haven't tested it myself.

I have - it crashes :) Given the 6150 can't even do 1080i MPEG2 without dropping frames, it's asking way too much for it to cope with the same resolution in H.264 (which requires hugely more resources).
Typical THG test:


No indication whatsoever of what kind of deinterlacer they used in the DXVA/AVIVO test (vector adaptive, motion adaptive, or default).


No indication of the various picture-quality issues with the detail enhancement features (both graphics cards are affected).


No indication of the PureVideo problems with DVB-S TV streams (pretty much unusable).


And so on...


Others may chime in.


Best,

mine

Quote:
Originally Posted by SpHeRe31459 /forum/post/0


The 6150 has the PureVideo VPU from a GeForce 6600 grafted on

How do you know this stuff? Where could I find more info about this? Please?


Where could I find info on, or a comparison between, the 6600GT's VPU and the 7600GT's?