
·
Registered
Joined
·
96 Posts
Discussion Starter · #1 ·
Yes, again!


Please reply with your own WIN criteria.



Requirements.

1.- Requires at most 1 x 6-pin supplementary power connector.

2.- Will work with a
3.- Bitstream

4.- UVD 3 or later, or PureVideo VP4 or later

5.- HDMI audio bitstreaming

6.- HDMI 1.4a 3D

7.- Must be able to handle every common encoding at > 60 fps

Contenders
 
GeForce GTS 450
GeForce GTX 550 Ti
Radeon HD 6570
Radeon HD 6670
Radeon HD 6850
Battle Royale Wins
Low heat/noise = ATi (Radeon HD 6570)
Power/Performance ratio = ATi (Radeon HD 6850)
24P = Tie
Software Support (LAV CUVID) = Nvidia
Driver/Control Panel = ATi (has more options, video post-processing)
Game eye candy = ATi (better AA)
PhysX = Nvidia

So my conclusion at this stage is that ATi makes more efficient hardware, and Nvidia has better software support.


But I would really like to see how the cards match up against your own criteria.
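If anyone wants to run the contenders against their own criteria, here is a rough sketch of the kind of weighted scoring I have in mind (Python; the weights and per-card ratings below are placeholders, not measurements, so substitute your own):

Code:
# Weighted scoring sketch for the contenders above.
# All numbers are placeholders -- fill in your own weights (importance)
# and ratings (0-10 per category).
weights = {
    "heat_noise": 3,
    "power_performance": 2,
    "24p": 2,
    "software_support": 3,     # e.g. LAV CUVID / madVR support
    "driver_control_panel": 1,
    "game_eye_candy": 1,
}

ratings = {
    "Radeon HD 6570": {"heat_noise": 9, "power_performance": 7, "24p": 8,
                       "software_support": 6, "driver_control_panel": 8,
                       "game_eye_candy": 7},
    "GeForce GTS 450": {"heat_noise": 6, "power_performance": 6, "24p": 8,
                        "software_support": 9, "driver_control_panel": 6,
                        "game_eye_candy": 6},
}

def score(card):
    """Weighted sum of the category ratings for one card."""
    return sum(weights[cat] * ratings[card][cat] for cat in weights)

for card in sorted(ratings, key=score, reverse=True):
    print(f"{card}: {score(card)}")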
 

·
Registered
Joined
·
3,826 Posts
ATI/AMD is much better at corrupting operating systems and causing fresh installs. It wins big here.


I don't play games. And I like to feel like I don't need to get lucky playing Russian Roulette when a new driver is installed. It is also nice when the bugs you waited to get fixed for months stay fixed and don't come back with the next version of the month.


The GT 545, while a bit pricey, gets my vote as "the" HTPC card: high horsepower, low power usage, single slot, and (I believe) the 5th-gen video engine.
 

·
Registered
Joined
·
527 Posts

Quote:
Originally Posted by gtgray /forum/post/20831498


ATI/AMD is much better at corrupting operating systems and causing fresh installs. It wins big here.


I don't play games. And I like to feel like I don't need to get lucky playing Russian Roulette when a new driver is installed. It is also nice when the bugs you waited to get fixed for months stay fixed and don't come back with the next version of the month.


The GT 545, while a bit pricey, gets my vote as "the" HTPC card: high horsepower, low power usage, single slot, and (I believe) the 5th-gen video engine.

Hmm, ok.


Been using ATI/AMD cards for 15 years now, and I've never seen a driver install corrupt a system and force a fresh install, and I never ever use any type of cleaning process first (frankly, I think they are more dangerous than helpful).


I'd really like to hear about your personal stories, since I'm sure you are not just speaking rhetorically; you must have personally experienced these problems multiple times, otherwise you'd never post something like that here.


It's also good to know that Nvidia is bug-free and you never have to worry about them introducing bugs in their driver releases. No, never.


o_O
 

·
Registered
Joined
·
3,826 Posts

Quote:
Originally Posted by santiagodraco /forum/post/20831704


Hmm, ok.


Been using ATI/AMD cards for 15 years now, and I've never seen a driver install corrupt a system and force a fresh install, and I never ever use any type of cleaning process first (frankly, I think they are more dangerous than helpful).


I'd really like to hear about your personal stories, since I'm sure you are not just speaking rhetorically; you must have personally experienced these problems multiple times, otherwise you'd never post something like that here.


It's also good to know that Nvidia is bug-free and you never have to worry about them introducing bugs in their driver releases. No, never.


o_O

Been there several times, and I was long an advocate of ATI because of their low-power, low-cost parts and early support for HD bitstreaming.


I just don't need the drama anymore. A new WHQL Nvidia driver comes out and I can load it with confidence: no wrong CCC panels, no crashes while the driver is loading. I have seen so much nonsense from these ATI drivers that AMD should be embarrassed. I can't speak to Intel drivers in detail because I have only run HD 2000 stuff a little bit, but it has been robust, at least in the 32-bit environments I have.


I own an HD 2600 XT, an HD 3650, and an HD 5450, and about the end of last year, after two very ugly adventures with ATI drivers, I almost bailed out of HTPCs altogether. The now aging and arguably obsolete ATI stuff has been in a basket for the last year. I use Cetons for HD cable, so I need robust, production-quality display drivers; HD cable is mission-critical for me now with HTPC.


Bought a GT 430 after the second messy issue; it's a bit noisy, but I found it robust. Then I added the GT 545s. Not one bit of excitement with the Nvidia drivers. I update them when I get around to it, and I don't really see any difference between the old and the new ones for the most part.


So depending on the particular box's use, it gets either Nvidia discrete or Intel iGPU. LAV CUVID and madVR just make the case that much more compelling for Nvidia.


I have on a number of occasions had to use restore points after ATI driver installs, and twice nothing but a fresh install repaired the damage. I don't need the drama. I really don't. If there is someone on this forum who is a big fan of ATI drivers, I don't know who you are.
 

·
Registered
Joined
·
2,256 Posts
I use a Zacate HTPC, and before that an HD 5450, and never had any major issues with them; but then again, I only updated drivers once every 2-3 months at most, so I must have missed the "buggy" versions. Which brings me to the question: regardless of which GPU manufacturer is in your HTPC, why the hell do people feel the need to update to every new driver revision every 1-2 weeks? If it works, leave it as it is, goddammit; it's a known fact that messing around with drivers can crap out your system.
 

·
Registered
Joined
·
306 Posts
I'm running an HD 6950 in my HTPC/gaming PC and an HD 3200 IGP in my other PC.


I dread updating the drivers, as I've had plenty of experiences of them not working after doing so (or just CCC not working) and having to uninstall, boot into safe mode, run a driver sweeper, boot back into normal mode, reinstall, and reboot; it all gets a bit tedious.


I don't think I've ever had to restore from a system backup, though. Nonetheless, it seems inexcusable that they haven't come up with an automated update that will do any necessary uninstalling, rebooting, etc. for me.


These days I tend to update just the driver first, then CCC after rebooting, which generally seems to be more successful.
 

·
Registered
Joined
·
20 Posts
My #1 requirement above all of those is that the card is passive.


So I dunno, most of these (or all of them?) would fail.



What happened to the good old GT 430? Too cheap and boring to be included?
 

·
Registered
Joined
·
844 Posts
I'm done using driver sweepers to clean up ATI driver installs. They clean more than needed; one caused me to reinstall W7 x64, and what a pain. All you gotta do is use the CCC uninstaller and follow the directions. Duck soup.
 

·
Registered
Joined
·
113 Posts

Quote:
Originally Posted by crótach /forum/post/20831832


What happened to the good old GT 430? Too cheap and boring to be included?

There are reports that it is a bit underpowered to handle 60p content. A good review by AnandTech showed it maxed out at decoding about 55 fps for some content; see the 'Discrete GPU for HTPC Shootout' on their website (posting from an iPhone, so adding the URL is tricky!).


However, there are now some overclocked GT 430 cards around (or of course you could overclock a stock one). Some report over a 20% increase in performance by doing this. My question is: would an overclocked GT 430 be powerful enough? 20% on top of 55 fps would be about 66 fps. Would there be enough headroom in this card to use madVR at stock settings and LAV CUVID at the same time? Has anyone any experience of an overclocked GT 430? I am stuck with a low-profile case which is limited in length by an HD bay, so this is the only (current) option for me.
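To make that back-of-the-envelope arithmetic explicit (the 55 fps figure is the one I took from the AnandTech chart and the 20% is the reported overclock gain; everything else is just multiplication):

Code:
# Back-of-the-envelope headroom estimate for an overclocked GT 430.
stock_fps = 55.0      # approximate decode rate from the AnandTech chart
oc_gain = 0.20        # reported ~20% gain from overclocking
target_fps = 60.0     # 1080p60 content

oc_fps = stock_fps * (1 + oc_gain)                 # about 66 fps
headroom = (oc_fps - target_fps) / target_fps      # about 10%
print(f"Estimated: {oc_fps:.0f} fps, headroom over 60p: {headroom:.0%}")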


SBR
 

·
Premium Member
Joined
·
16,132 Posts

Quote:
Originally Posted by Sandy B Ridge /forum/post/20831881


There are reports that it is a bit underpowered to handle 60p content. A good review by AnandTech showed it maxed out at decoding about 55 fps for some content; see the 'Discrete GPU for HTPC Shootout' on their website (posting from an iPhone, so adding the URL is tricky!).


However, there are now some overclocked GT 430 cards around (or of course you could overclock a stock one). Some report over a 20% increase in performance by doing this. My question is: would an overclocked GT 430 be powerful enough? 20% on top of 55 fps would be about 66 fps. Would there be enough headroom in this card to use madVR at stock settings and LAV CUVID at the same time? Has anyone any experience of an overclocked GT 430? I am stuck with a low-profile case which is limited in length by an HD bay, so this is the only (current) option for me.


SBR

Are you talking about the second chart of this page (although I don't see "55 fps" there)? The chart is purely for academic purposes.

Quote:
The above testing is only of academic interest, since there is no real 1080p24 content at 110 Mbps.

Similarly there is no real 1080p60 content at 70Mbps or higher (it may be possible to create such a file by using video editing software though). Typical 1080p60 contents shot by camcorders are at 20-40Mbps.


BTW, the main limitation in this test is the Video Processor 4 (VP4) of the GT 430. You won't see any improvement by overclocking.
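If you want a quick sanity check on your own clips, estimating the average bitrate from file size and duration is enough; the ~40 Mbps ceiling below is just the upper end of typical camcorder 1080p60 mentioned above, not a hard VP4 limit:

Code:
# Estimate a clip's average bitrate and compare it with the upper end of
# typical real-world 1080p60 content (~40 Mbps, per the quote above).
def avg_bitrate_mbps(file_size_bytes, duration_seconds):
    return file_size_bytes * 8 / duration_seconds / 1_000_000

# Hypothetical example: a 3.6 GB clip that runs 20 minutes.
rate = avg_bitrate_mbps(3_600_000_000, 20 * 60)
verdict = "well within typical real-world content" if rate <= 40 else "above typical real-world content"
print(f"~{rate:.0f} Mbps -> {verdict}")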
 

·
Registered
Joined
·
113 Posts

Quote:
Originally Posted by renethx /forum/post/20832048


Are you talking about the second chart of this page (although I don't see "55 fps" there)? The chart is purely for academic purposes.

No, this one caught my eye:
http://images.anandtech.com/graphs/graph4380/38188.png



Quote:
In the above graph, we see that the lack of shaders in the GT 520 affects the madVR performance. The madVR steps become the bottleneck in this case. On the GT 430, the VPU remains the bottleneck till the more complicated scaling algorithms (of theoretical interest) are enabled (which are not presented in the graph above).


Quote:
Similarly there is no real 1080p60 content at 70Mbps or higher (it may be possible to create such a file by using video editing software though). Typical 1080p60 contents shot by camcorders are at 20-40Mbps.


BTW, the main limitation in this test is the Video Processor 4 (VP4) of the GT 430. You won't see any improvement by overclocking.

I don't understand this bit about the VP4. The GTS 450 and all the GTX cards are also VP4 and are not limited in frame-rate decoding. The difference is the number of shaders, GPU clock speed, and memory clock speed. Surely upping the shader speed and memory speed by overclocking will make some difference?




SBR
 

·
Premium Member
Joined
·
16,132 Posts

Quote:
Originally Posted by Sandy B Ridge /forum/post/20832226


No, this one caught my eye:
http://images.anandtech.com/graphs/graph4380/38188.png


Quote:
In the above graph, we see that the lack of shaders in the GT 520 affects the madVR performance. The madVR steps become the bottleneck in this case. On the GT 430, the VPU remains the bottleneck till the more complicated scaling algorithms (of theoretical interest) are enabled (which are not presented in the graph above).

The above test was done with a 1080p24 clip, so the minimum required fps is 24, and the GT 430's 55 fps is fast enough. Actually, 1080p24 is the easiest video format for madVR and there is nothing to worry about. Remember that *the bottleneck* means the bottleneck in this *academic* test, not in real-world experience.

Quote:
Originally Posted by Sandy B Ridge /forum/post/20832226


I don't understand this bit about the VP4. The GTS450, and all GTX cards are also VP4 and not limited in the framerate decoding. The difference is number of shaders, GPU clock speed and memory clock speed. Surely upping the shader speed and memory speed by overclocking will make some difference?




SBR

- GT(S/X) 4xx/5xx except for 520: VP4

- GT 520: VP5


VP4 is fast enough to decode normal 1080p60 (see the max-bitrate figures in the second chart). VP5 also supports decoding 4K x 2K video (VP4 does not, of course). In playing back progressive HD content, stream processors play little role (only some light video post-processing tasks such as sharpening and denoising).


Perhaps the GeForce 600 Series (28nm, March 2012) will all have VP5.
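To condense the VP generations mentioned in this thread into one place (this is just my reading of the posts above; check the spec sheet before buying):

Code:
# PureVideo (VP) generation per card, as discussed in this thread.
# GT 520 is the odd one out with VP5; the rest of the GT(S/X) 4xx/5xx line is VP4.
vp_generation = {
    "GT 430": "VP4",
    "GT 545": "VP4",
    "GTS 450": "VP4",
    "GTX 550 Ti": "VP4",
    "GT 520": "VP5",
}

for card, vp in vp_generation.items():
    print(f"{card}: {vp}")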
 

·
Registered
Joined
·
113 Posts

Quote:
Originally Posted by renethx /forum/post/20832508


The above test was done with a 1080p24 clip, so the minimum required fps is 24, and the GT 430's 55 fps is fast enough. Actually, 1080p24 is the easiest video format for madVR and there is nothing to worry about. Remember that *the bottleneck* means the bottleneck in this *academic* test, not in real-world experience.

Sorry, I read this as the decode-speed capability for 1080p H.264-encoded material. So if you throw a 1080p H.264 clip at 60 fps at the card, then it will struggle. Am I misreading this test then?

Quote:
VP4 is fast enough to decode normal 1080p60 (see the max-bitrate figures in the second chart). VP5 also supports decoding 4K x 2K video (VP4 does not, of course). In playing back progressive HD content, stream processors play little role (only some light video post-processing tasks such as sharpening and denoising).

So a non-overclocked GT 430 is equivalent to an overclocked GTX 560 Ti for the purposes of madVR and LAV CUVID (as long as the bitrate is sensible), because they have the same VP4 engine?


Still confused.



Thanks for your help with this. I'm struggling with which bits of the GPU are important for madVR and LAV CUVID respectively. Some posts on this board (not this thread) and on doom9 have alluded to the GT 430 (and GT 520) being insufficient for madVR, so now I am confused as to why!

SBR
 

·
Premium Member
Joined
·
16,132 Posts

Quote:
Originally Posted by Sandy B Ridge /forum/post/20832642


Sorry, I read this as the decode-speed capability for 1080p H.264-encoded material. So if you throw a 1080p H.264 clip at 60 fps at the card, then it will struggle. Am I misreading this test then?

Perhaps you don't know the bitrate of real 1080p60 content (video shot by recent camcorders is the main source): it's 20-40 Mbps. VP4 can handle it easily, as seen in this second chart (the vertical lines are the bitrates of the test clips: 30 Mbps, 40 Mbps, ..., 90 Mbps; 50 Mbps or higher is just for academic purposes).

Quote:
Originally Posted by Sandy B Ridge /forum/post/20832642


So a non-overclocked GT 430 is equivalent to an overclocked GTX 560 Ti for the purposes of madVR and LAV CUVID (as long as the bitrate is sensible), because they have the same VP4 engine?


Still confused.

- LAV CUVID: uses the VP (decoding) and stream processors (deinterlacing for interlaced content)

- madVR: uses stream processors (upsampling/upscaling)


So the number of stream processors (and the speed of the video memory) is important, as well as the generation number of the VP. The GT 430 DDR3-1600 is a good choice (it handles all the video content currently available just fine under CUVID + madVR), and you won't see any difference between it and higher models in real-world experience. On the other hand, the GT 520 has too few stream processors for proper deinterlacing.
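Put another way, a rough suitability check based on the breakdown above (the shader-count threshold is just my reading of this thread: GT 430-class shaders are enough for deinterlacing, GT 520-class are not; it is not an official figure):

Code:
# Rough "will this card cope?" check: LAV CUVID needs the VP block (decode)
# plus stream processors for deinterlacing; madVR needs stream processors
# (and memory bandwidth) for scaling. Thresholds are a reading of this thread.
def htpc_verdict(vp_gen, shader_count, interlaced_content):
    if vp_gen < 4:
        return "VP generation too old for this decode/render chain"
    if interlaced_content and shader_count < 96:   # GT 430 has 96, GT 520 has 48
        return "likely too few stream processors for proper deinterlacing"
    return "should cope with LAV CUVID + madVR at sensible bitrates"

print(htpc_verdict(vp_gen=4, shader_count=96, interlaced_content=True))  # GT 430-like
print(htpc_verdict(vp_gen=5, shader_count=48, interlaced_content=True))  # GT 520-like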


Perhaps my own posts are the source of the confusion. madVR 0.66 or later is less demanding, and I have confirmed that the GT 430 DDR3-1600 is good enough.
 

·
Registered
Joined
·
1,111 Posts
Seems that the GTX 545 (or is it the GT 545? There are many links for both on google.com) would probably be a safe card if it has VP5 (and 1.5GB of DDR3), but they don't seem to be easy to find (I only see one listing for a GT 545, at Fry's).
 

·
Premium Member
Joined
·
16,132 Posts

Quote:
Originally Posted by H8nXTC /forum/post/20832826


if it has VP5

Unfortunately it's VP4. If you want VP5, just wait for GeForce 600 Series in March 2012.


I haven't seen any content that the GT 430 DDR3-1600 can't handle under LAV CUVID + madVR but a higher model can. You are welcome to post the clip at 4shared.com if you find one.
 