I need advice on a PhysX card. - AVS Forum
post #1 of 16 - 07-02-2014, 07:54 PM - Chris5028 (Senior Member, Thread Starter)
I need advice on a PhysX card.

What would you recommend for a dedicated PhysX card? I have a 660 and would like to be able to use PhysX in games like Metro: Last Light without dropping to 15 FPS. I intended to upgrade to a 780, but I think I will be buying another plasma before they are all gone. Any advice is appreciated.

Specs:
i5-3450 @ 3.1GHz
GTX 660 (OC)
16GB DDR3-1600
Corsair Builder Series 750W PSU
2TB 7200RPM HDD

post #2 of 16 - 07-03-2014, 05:08 PM - DaGamePimp (AVS Addicted Member)
A GTX 650 is the usual suggestion for optimal PhysX off-loading.


Jason
post #3 of 16 - 07-05-2014, 04:16 PM - bd2003 (AVS Addicted Member)

Quote:
Originally Posted by DaGamePimp View Post
A GTX 650 is the usual suggestion for optimal PhysX off-loading.

Jason
Even that is probably overkill. I have a 750 I use for PhysX, and I've never seen the GPU load crack 20%. A GT 630-640 would probably be sufficient.

Best advice though? Don't buy a PhysX card. It's wasteful and less cost-effective to split those resources between two cards than to buy one better card, and the second card just sits idle in 90% of games. The only exception is if you're into bitcoin mining and specifically want a 750 for its mining prowess, or if you have an old card lying around. Just don't bother buying one specifically for the purpose.

post #4 of 16 - 07-05-2014, 09:37 PM - moob (AVS Special Member)
Also, from what I remember with Metro: LL, you have to install the newest PhysX drivers separately. I have no idea why those aren't included in the GPU drivers or installed with the game (it would be easy enough for the game to check for updated drivers). A lot of people were getting stuck with performance in the 15-20 FPS range, and that fixed it.
post #5 of 16 - 07-06-2014, 03:35 AM - DaGamePimp (AVS Addicted Member)
Quote:
Originally Posted by bd2003 View Post
Even that is probably overkill. I have a 750 I use for PhysX, and I've never seen the GPU load crack 20%. A GT 630-640 would probably be sufficient.

Best advice though? Don't buy a PhysX card. It's wasteful and less cost-effective to split those resources between two cards than to buy one better card, and the second card just sits idle in 90% of games. The only exception is if you're into bitcoin mining and specifically want a 750 for its mining prowess, or if you have an old card lying around. Just don't bother buying one specifically for the purpose.
I would think a 660 could benefit a great deal from a dedicated PhysX card (if one enjoys the PhysX additions).

However, in that case it might just be more beneficial to seek out a second 660 for SLI.

I can say from personal experience that a 630 cannot handle Borderlands 2 PhysX on High.

Going with too slow a dedicated PhysX card can actually decrease performance (assuming the main GPU is considerably faster and the dedicated PhysX card cannot keep up).

A 550 should also be a very capable PhysX card.


Jason
post #6 of 16 - 07-06-2014, 11:55 AM - bd2003 (AVS Addicted Member)
Quote:
Originally Posted by DaGamePimp View Post
I would think a 660 could benefit a great deal from a dedicated PhysX card (if one enjoys the PhysX additions).

However, in that case it might just be more beneficial to seek out a second 660 for SLI.

I can say from personal experience that a 630 cannot handle Borderlands 2 PhysX on High.

Going with too slow a dedicated PhysX card can actually decrease performance (assuming the main GPU is considerably faster and the dedicated PhysX card cannot keep up).

A 550 should also be a very capable PhysX card.

Jason

I guess I'm underestimating how weak those cards are.

I don't see much of a future for PhysX anyway. The only reason anyone ever used it is that Nvidia paid them to, so there are maybe 2-3 games a year that bother. I'm sure devs will heavily use GPU physics soon, but with the new consoles running AMD hardware, the only reasonable APIs to use are OpenCL or DirectCompute.

So while there's still some potential future for offloading physics to a secondary GPU, I have a feeling you won't need to buy another dedicated GPU; instead, you'll finally be able to put that previously useless integrated GPU to work.
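To make that concrete, here's a minimal sketch of the kind of vendor-neutral GPU physics being described: a naive Euler integrator for a particle system, written against OpenCL via the PyOpenCL binding. The kernel and binding choice are illustrative assumptions, not code from any shipping game.

```python
# Illustrative sketch only: a vendor-neutral GPU physics step in OpenCL
# via PyOpenCL. Any OpenCL device (AMD, Nvidia, Intel, including an
# integrated GPU) can run it unmodified.
import numpy as np
import pyopencl as cl

KERNEL = """
__kernel void integrate(__global float4 *pos,
                        __global float4 *vel,
                        const float dt)
{
    int i = get_global_id(0);
    const float4 gravity = (float4)(0.0f, -9.81f, 0.0f, 0.0f);
    vel[i] += gravity * dt;   // accumulate acceleration
    pos[i] += vel[i] * dt;    // explicit Euler position update
}
"""

ctx = cl.create_some_context()            # pick any available OpenCL device
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL).build()

n = 65536                                 # particle count
pos = np.zeros((n, 4), dtype=np.float32)
vel = np.random.randn(n, 4).astype(np.float32)

mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

# One 60 Hz physics tick for every particle, entirely on the GPU.
prog.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, pos_buf)      # read results back to the host
```

Nothing in the sketch is tied to a vendor, which is the point: the same kernel would run on a console's AMD APU, an integrated Intel GPU, or an Nvidia card.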

post #7 of 16 - 07-06-2014, 12:38 PM - moob (AVS Special Member)
Not to mention that you don't even need a dedicated card to utilize PhysX (at least the kind that isn't hard-coded to the GPU).

I have a 7970 GHz Edition, and with Borderlands 2 I just edited the .ini and offloaded all the work onto my CPU, and PhysX (High) worked just fine. Sure, you take a small hit in performance, and you need a decent CPU, but it's easy enough to do.
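For reference, Borderlands 2 is an Unreal Engine 3 game, and UE3 exposes a stock ini switch for hardware PhysX. Below is a hedged sketch of the kind of edit being described: the key name is the standard UE3 one, but file locations and surrounding settings vary by game, so verify against your own install (for BL2 the file is WillowEngine.ini under My Documents\My Games\Borderlands 2\WillowGame\Config).

```ini
; Sketch, not a verified recipe: the stock UE3 toggle for GPU PhysX.
; Setting it to True disables hardware acceleration and runs all
; PhysX work on the CPU instead.
[Engine.Engine]
bDisablePhysXHardwareSupport=True
```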
post #8 of 16 - 07-06-2014, 12:44 PM - bd2003 (AVS Addicted Member)
I also think it's horrendously unoptimized. When I was monitoring the GPU activity on my secondary card, I noticed the load didn't go back down when there wasn't much happening on screen. It just kept creeping up the further I got into the level, like it was still doing useless number crunching on objects I left behind 10 minutes ago.

I wouldn't put it past Nvidia to intentionally waste cycles and reduce performance like that in order to get you to buy a more expensive GPU.
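That sort of observation is easy to reproduce: nvidia-smi reports per-GPU utilization from the command line, and a few lines of Python can log the secondary card's load over time. A sketch, assuming nvidia-smi is on the PATH and the dedicated PhysX card is GPU index 1:

```python
# Minimal load logger for a secondary (PhysX) GPU. Assumes nvidia-smi is
# on PATH and the dedicated card is GPU index 1; adjust "-i" to match.
import subprocess
import time

while True:
    sample = subprocess.check_output([
        "nvidia-smi", "-i", "1",
        "--query-gpu=utilization.gpu,memory.used",
        "--format=csv,noheader",
    ]).decode().strip()
    # e.g. "18 %, 211 MiB" -- a steadily rising reading while the scene
    # stays static is the "creeping load" behavior described above.
    print(time.strftime("%H:%M:%S"), sample)
    time.sleep(1.0)
```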

post #9 of 16 - 07-06-2014, 01:26 PM - Chris5028 (Senior Member, Thread Starter)
Thanks for the feedback, everyone. I will try the driver. If that doesn't fix it, I will just play PhysX-free till I can save enough pennies for an upgrade. I am getting 50 FPS on Ultra now. Too bad my mobo isn't SLI-friendly; I'd love a second 660.
post #10 of 16 - 07-06-2014, 01:48 PM - moob (AVS Special Member)
Good luck. I was surprised by how well Last Light ran, too, after the poor performance of the original.

Quote:
Originally Posted by bd2003 View Post
I also think it's horrendously unoptimized. When I was monitoring the GPU activity on my secondary card, I noticed the load didn't go back down when there wasn't much happening on screen. It just kept creeping up the further I got into the level, like it was still doing useless number crunching on objects I left behind 10 minutes ago.

I wouldn't put it past Nvidia to intentionally waste cycles and reduce performance like that in order to get you to buy a more expensive GPU.
Sad but true. Nvidia has been alienating me as a customer more and more recently. PhysX is easy enough to ignore, but something like GameWorks has the potential to upset the market. I don't have a preference when it comes to GPUs... I just go for whatever offers the best price/performance. My current GPU is AMD, but I've owned many Nvidia GPUs as well, including my very first dedicated card (a GeForce2 MX). I just want fair competition.
post #11 of 16 - 07-06-2014, 03:05 PM - bd2003 (AVS Addicted Member)
Quote:
Originally Posted by moob View Post
Good luck. I was surprised by how well Last Light ran, too, after the poor performance of the original.

Sad but true. Nvidia has been alienating me as a customer more and more recently. PhysX is easy enough to ignore, but something like GameWorks has the potential to upset the market. I don't have a preference when it comes to GPUs... I just go for whatever offers the best price/performance. My current GPU is AMD, but I've owned many Nvidia GPUs as well, including my very first dedicated card (a GeForce2 MX). I just want fair competition.

Yeah, I have mixed feelings about GameWorks (and PhysX, G-Sync, CUDA, and all their other proprietary stuff). On one hand, it really sucks for consumers when someone tries to split the market like that. On the other, all too often the open standard only comes in response to someone like Nvidia trying to steal it all for themselves. FreeSync, OpenCL, and even DX12 are basically responses to the threat of closed tech.

It's just annoying that with niche things like PhysX or 3D Vision, where the broader market clearly isn't interested yet, they actually succeed in holding the tech hostage. I might go with AMD next round just to spite them, but they're no angels either.

post #12 of 16 - 07-06-2014, 04:21 PM - MSmith83 (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post
I might go with AMD next round just to spite them, but they're no angels either.
Richard Huddy did not convince you enough that AMD is the John McClane to Nvidia's Hans Gruber?

I used to be really interested in Nvidia's hardware-based PhysX, but not anymore. Let's hope that DirectX 12 can help fix this sloppy mess of inefficient, underdeveloped, and fragmented features.
post #13 of 16 - 07-06-2014, 04:51 PM - bd2003 (AVS Addicted Member)
Quote:
Originally Posted by MSmith83 View Post
Richard Huddy did not convince you enough that AMD is the John McClane to Nvidia's Hans Gruber?

I used to be really interested in Nvidia's hardware-based PhysX, but not anymore. Let's hope that DirectX 12 can help fix this sloppy mess of inefficient, underdeveloped, and fragmented features.
Actions speak louder than words. When Mantle, TrueAudio, and their alternative to GameWorks open up to the industry, then I'll believe it.
post #14 of 16 - 07-06-2014, 05:22 PM - moob (AVS Special Member)
I'm willing to give AMD the benefit of the doubt right now. Mantle's still in beta, but they said they'd be releasing a public SDK by the end of the year. Since they were late delivering the Mantle implementation in BF4 in the first place, I'll even give them a few extra months. As for OpenWorks, that needs to come out sooner rather than later.
post #15 of 16 - 07-06-2014, 07:19 PM - bd2003 (AVS Addicted Member)
Quote:
Originally Posted by moob View Post
I'm willing to give AMD the benefit of the doubt right now. Mantle's still in beta, but they said they'd be releasing a public SDK by the end of the year. Since they were late delivering the Mantle implementation in BF4 in the first place, I'll even give them a few extra months. As for OpenWorks, that needs to come out sooner rather than later.
I have zero faith in OpenWorks ever turning into anything. The world doesn't need AMD to create another repository for open-source code, and the last party that should be running one is a GPU vendor anyway. They're only using it to score points with the public against Nvidia (who deserved to be called out), but they'll abandon their support for it as soon as it's convenient. They have something to gain in preventing Nvidia from locking them out with black-box DLLs, but nothing to gain by giving their future code away to the competition.

Nor should anyone support a graphics (or audio) API controlled by their competition. If they were really serious about being open, it would have been an open standard from day one and they'd claim no control over it. The truth is far from that. Why should Nvidia or Intel even consider Mantle over DX12? It's ridiculous for Huddy to even suggest it, so in the end, does it even matter if it's open? No one in their right mind but AMD will ever use it.

So while on the surface they're more "open" than Nvidia (which isn't exactly the most difficult thing to do), they're trying to have their cake and eat it too. Perhaps the lesser of two evils, but still evil.

post #16 of 16 - 07-07-2014, 02:26 PM - moob (AVS Special Member)
Quote:
Originally Posted by bd2003 View Post
I have zero faith in OpenWorks ever turning into anything. The world doesn't need AMD to create another repository for open-source code, and the last party that should be running one is a GPU vendor anyway. They're only using it to score points with the public against Nvidia (who deserved to be called out), but they'll abandon their support for it as soon as it's convenient. They have something to gain in preventing Nvidia from locking them out with black-box DLLs, but nothing to gain by giving their future code away to the competition.

Nor should anyone support a graphics (or audio) API controlled by their competition. If they were really serious about being open, it would have been an open standard from day one and they'd claim no control over it. The truth is far from that. Why should Nvidia or Intel even consider Mantle over DX12? It's ridiculous for Huddy to even suggest it, so in the end, does it even matter if it's open? No one in their right mind but AMD will ever use it.

So while on the surface they're more "open" than Nvidia (which isn't exactly the most difficult thing to do), they're trying to have their cake and eat it too. Perhaps the lesser of two evils, but still evil.
Well, DX12 titles aren't expected to hit the streets until late next year, and if the past is anything to go by, game developers don't exactly move quickly on DX adoption. I'm also guessing we'll have to upgrade to Windows 9 for DX12 support. So there are reasons why developers may want to give Mantle a second look... at least for now. Of course, I also agree that chances are it'll be dead once DX12 becomes commonplace, if it isn't adopted by others.

I know AMD has been touting Linux support, but I have very little confidence in SteamOS as a whole.