Need a video card that will work with a 250W Power Supply - AVS Forum
post #1 of 23 - 01-09-2013, 12:19 AM - dpatel304 (Thread Starter)
Here is my current setup:
MOBO: GIGABYTE GA-H55N-USB3
PSU: APEVIA ITX-AP250W
CPU: Intel Core i3-530 Clarkdale 2.93GHz

I'm currently using the onboard GPU, which has been fine so far, since I've only been using the HTPC for watching videos. When I built the thing I, stupidly, thought there was no way I'd use it for gaming. A couple of years later, I want to run some Steam games. Right now, I've only tried Super Meat Boy, which does run, just slower than it should. I don't plan on running anything too intensive, but I'd like to be able to play arcadey types of games like SMB without any problems.

Is there a good video card I can get that will work with this small PC and not consume too much power? I did some searching and saw some people recommend the Radeon HD 6570. I don't know a thing about video cards; I just want to be able to run middle-of-the-road games well on my HDTV. If need be, I can upgrade my PSU so I have more options.

Also, my case is tiny and overheats easily, so I don't put the cover on. Currently my PSU is just sticking outside the case. If I were to upgrade, could I get a regular-sized PSU and just let it sit outside the case, as long as it has a 24-pin power connector like my MOBO?

Thanks in advance.
post #2 of 23 - 01-09-2013, 01:14 AM - darklordjames
You keep saying "small PC" but don't tell us what the case is. Does it support full-size cards, or is it low-profile only? Also, given that your PSU is hanging out of the case, I would probably recommend a new case and PSU before thinking about a video card.

As for video cards, stick with Nvidia. AMD's drivers are terrible. Hell, I'm about to swap out my Radeon 6870 for a GeForce 660 of roughly the same speed just to get good video card drivers back into my system. In that vein, I'd start by looking at the GeForce GT 640, like this one:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130796

The key is that you're looking for something that doesn't need a supplemental 6- or 8-pin power plug - something that runs purely off the power the PCIe slot provides.
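
To put rough numbers on that idea, here is a minimal power-budget sketch in Python. The 73W TDP is Intel's rating for the i3-530; the 40W allowance for the board, RAM, drives, and fans is an assumption, so check your own parts:

[code]
# Rough budget check for running a slot-powered card on a 250 W PSU.
PSU_WATTS = 250
CPU_TDP_WATTS = 73          # Intel Core i3-530 rated TDP
REST_OF_SYSTEM_WATTS = 40   # board, RAM, drive, fans: an assumed allowance
PCIE_SLOT_MAX_WATTS = 75    # the most a PCIe x16 slot can feed a card
                            # that has no 6/8-pin plug

headroom = PSU_WATTS - CPU_TDP_WATTS - REST_OF_SYSTEM_WATTS
print(f"headroom: {headroom} W")                                # 137 W
print("slot-only card fits:", headroom >= PCIE_SLOT_MAX_WATTS)  # True
[/code]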
post #3 of 23 - 01-09-2013, 11:26 AM - macks
If your case overheats, I would not add a video card... Buy a new case before you even consider this. The biggest problem with video cards in HTPCs is normally noise, so look for the quietest video card you can (for a passive video card you need good case cooling).

As far as AMD not having good video drivers, that is a bogus statement, darklordjames. A 660 is considerably faster than a 6870. The only issues I'm aware of have been on just-released games, and those normally have enough bugs that adding a couple of video driver bugs is the least of your worries.

Back to the OP:

For arcade-style games a 6570 should be sufficient. If you want to be able to play basically any game at lowered settings, I would look at a 7750 or GTX 650.
post #4 of 23 - 01-09-2013, 12:16 PM - dpatel304 (Thread Starter)
Thanks for the responses.

Sorry, this is the case that I have:
APEX MI-008 Black Steel Mini-ITX Tower Computer

I'd rather not upgrade too much else on the unit. It's not critical that I play games on it, but I guess I want to make it the best gaming machine it can be by just adding a video card.
post #5 of 23 - 01-09-2013, 01:20 PM - macks
Please see:

http://www.silentpcreview.com/article905-page5.html

I would at least add a 120mm fan like this guy did...

It looks to me like a 6570 would physically fit, but I would suggest measuring exactly how much room you have and making sure the card will fit before you buy it... You will need a single-slot, full-height card. Hopefully someone with more experience with this case will chime in.
post #6 of 23 - 01-09-2013, 02:11 PM - dpatel304 (Thread Starter)
I've added a fan and removed the cover, so overheating isn't a problem any more.
post #7 of 23 - 01-09-2013, 02:37 PM - Yrd
Have you added the fan since this post, or did you already have it?

Adding a video card to this overheating machine is only going to make it hotter. A 6570 should be fine. I would look for something that is low profile, too. I don't know what type of space you have in the machine, but you should be aware of the size of the heatsink/fan on the card you buy.

post #8 of 23 - 01-09-2013, 02:41 PM - dpatel304 (Thread Starter)
I already had it, and I haven't had overheating issues since. When I first built the machine, it shut down on me a few times. Since then, I've removed the cover, which exposes all sides except the front, and added two fans.
post #9 of 23 - 01-09-2013, 03:39 PM - darklordjames
"As far as AMD not having good video drivers that is a bogus statement darklordjames."

No, they truly are garbage. It's been a year since Nvidia did it, and AMD still hasn't gotten around to implementing the purely software-based adaptive vsync that consoles have been doing since 2006 and that is of utmost value to the range of cards we are talking about. Overscan correction for HTPC use is busted all to hell. Eyefinity is broken in numerous ways. AMD drivers mostly function if all you want to do is render to a single PC monitor in a 2004 style. If you want to do anything even remotely out of the ordinary, like hook the machine up to an HDTV? Then things go downhill real quick. Nvidia's drivers are simply more refined in countless little ways that add up to a much nicer user experience.

dpatel - Adaptive vsync means that on your casual gaming system you can set your vsynced framerate to 30fps, which the system will constantly try to target, giving you a good, clean frame when it hits that target. When your framerate drops below 30fps it will turn vsync off and give you torn frames at a faster framerate. The result is motion that stays consistent. If you go with an AMD card you will have to choose between vsync on or off. With vsync on, your framerate will always be a divisor of 60; in effect it will jump all over the place between 60, 30, 20, 15, and 10fps. The result is terribly juddery motion. Alternatively you can leave vsync off and have a more fluidly fluctuating framerate, but then you will have tear lines throughout your render at all times. Both AMD solutions feel terrible in practice if you can't maintain a perfect 60fps.
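
To make that divisor behavior concrete, here is a toy model in Python; the 30fps target mirrors the description above, and the function names are made up for illustration:

[code]
import math

REFRESH_HZ = 60  # a 60 Hz HDTV

def plain_vsync_fps(render_fps):
    # Double-buffered vsync: each frame is held for a whole number of
    # vblanks, so the delivered rate snaps to a divisor of 60.
    vblanks_per_frame = math.ceil(REFRESH_HZ / render_fps)
    return REFRESH_HZ / vblanks_per_frame

def adaptive_vsync_fps(render_fps, target=30):
    # Adaptive vsync as described above: clean synced frames at the
    # target, torn-but-faster frames whenever the GPU falls below it.
    return target if render_fps >= target else render_fps

for fps in (59, 45, 29, 22):
    print(fps, plain_vsync_fps(fps), adaptive_vsync_fps(fps))
# 59 -> 30.0 vs 30: one missed vblank halves the plain-vsync rate
# 22 -> 20.0 vs 22: adaptive vsync keeps the extra frames, with tearing
[/code]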

That single setting of Adaptive Vsync between an AMD or Nvidia card of similar performance and price will result in a much better user experience for you moment to moment if you pick the Nvidia card that lets you handle vsync correctly.


"A 660 is considerably faster than a 6870."

60fps is 60fps, which I already reliably hit. A 660 gives me Ultra settings over High, which are for the most part visually unnoticeable despite the vastly greater horsepower thrown at them. A 660 will give me roughly the same experience as my 6870, with better drivers and more graceful 60Hz misses.
post #10 of 23 - 01-09-2013, 04:42 PM - macks
dpatel304, I'm sorry I have derailed your thread.

darklordjames, I am sorry that you have had a bad experience, but I will respond to your post with as much information as I can.

Overscan Correction - There's a slider in AMD's control panel, and it works perfectly for my HTPC. I have used it on multiple TVs and never had an issue. You move the slider to the needed position for your TV; easy peasy.

Eyefinity - I have used up to 4 monitors and never had an issue so I don't know where your issues are coming from. Note: This is used as a workstation where games are not played.

Adaptive VSync - This sounds like a really cool feature, but I don't think I would use it, and I will list my reasons. Triple buffering or normal vsync is a better solution if you don't want to deal with screen tearing. The people who don't like triple buffering/vsync due to slight latency issues would be much better served by turning their graphics settings down to the point that they will never fall below 60fps... I would rather graphics companies work on other features than something like adaptive vsync!

In summary, Adaptive VSync is only needed if you are extremely picky about latency, are not willing to set your video settings lower to avoid the latency spikes that vsync can cause, and don't mind dealing with screen tearing.

To me the biggest difference between AMD and Nvidia is simple: HD audio. This is something that your 6870 passes on to your receiver like it should. The 660 that you are buying transcodes the audio to LPCM and sends it to your receiver that way...

I believe the 640 that you recommended does not pass on HD audio correctly... I believe all of the 500-series Nvidia cards below the 560??? do.

Yrd, the case he has can support a full-sized card, so low profile is not needed or recommended. It is hard to tell from pictures if he has enough room lengthwise to fit the 6570. On a different note, the 7750 looks to have very similar dimensions and is considerably more powerful at only 30 bucks more, but it is your money, so spend it how you want. Let us know how it goes... :)
post #11 of 23 - 01-09-2013, 06:51 PM - Yrd
Sorry, I didn't mean low profile in PC-building terms. I was looking at a couple of cards on Newegg, and one said low profile, and that stuck in my head while I was typing. I meant in terms of the cooler: your case only has a single PCI slot, and even some cards that don't have double-slot coolers are wider than a single slot. Make sure your case can hold the extra size if you look at one of those.

post #12 of 23 - 01-10-2013, 12:46 AM - darklordjames
"Overscan Correction - There's a slider in amd's control panel and it works perfectly for my htpc"

It defaults to an underscanned setting, plus it remembers the setting per refresh rate and resolution. You run your desktop at 1080p with 0% overscan? Alright, set the overscan setting in the drivers. Running a game at 1080p/30 to play with some 3D? Welcome back to the overscan settings, as it defaults back to underscanning! Running at 720p/60 for better 3D? Back to that overscan slider. Want to run 1024x600 for more intensive games on that 6570? Ah, the overscan slider missed you. You'd better go pay it a visit.

This is not good driver design.


"Eyefinity - I have used up to 4 monitors and never had an issue so I don't know where your issues are coming from."

Eyefinity does not understand that a giant single monitor sucks for desktop use, as it breaks your maximize button. It also doesn't understand that the user does not want three distinct monitors for gaming. The end result is constantly switching between joined and discrete monitor modes. As an added bonus, the drivers remember which monitor you plugged in to which port on day one. Did you unplug everything to add a new hard drive? I sure hope you marked which specific monitor plugged in to which specific port! If you plug them all back in, but not in exactly the same locations, it puts them back to how they were on day one every time you switch between Eyefinity desktop mode and Eyefinity gaming mode. Coming out of a game and switching back to desktop mode? Bam, your monitors are all in the wrong order now. Reorder your monitors in software and play another game? Hey look, they're in the wrong order yet again.

This is not good driver design.


"Triple buffering or normal vsync is a better solution if you don't want to deal with screen tearing."

Triple buffering is awesome and I love it. It is also largely ignored by game devs, as they mostly default to double buffering with no triple buffer option. Triple buffering also does not do anything to help with the drastic fluctuations in framerate from 15-60fps that our fellow member will see on these lower-end cards. Again, on this casual system Adaptive Vsync will give him a more consistent and pleasant experience moment to moment. And again, this is something that AMD is really late to the game on and there is no indication that they are even planning on showing up.

This is not good driver design.


"To me the biggest difference between amd and nvidia is simple, HD Audio. This is something that your 6870 passes onto your receiver like it should. The 660 that you are buying transcodes the audio to lpcm and sends it to your receiver that way..."

With HDMI the intention is for the source to do all decoding and spit out an uncompressed video and audio stream. HD audio bitstreaming is what happens when the marketing department gets a hold of the standard and decides they want to sell some more receivers to people that already have a receiver that will do 8-channel LPCM. If your complaint is that your HDMI port isn't bitstreaming audio, but instead decoding to LPCM as intended, then why aren't you also whining that the h264 video encode on your Blu-ray isn't bitstreamed out to your TV to decode? Oh right, because that would be ridiculous. ;)

To be perfectly clear, there is no advantage to bitstreaming. Thankfully there isn't much of a downside, so at least you aren't intentionally breaking your experience due to the effects of marketing.


dpatel - My argument is for your experience. This single option of Adaptive Vsync will give you a better experience. In addition, the drivers for the card are generally just better put together with far fewer odd decisions and blatantly broken design elements.
post #13 of 23 - 01-10-2013, 05:36 AM - macks
Overscan Correction - I never change resolutions, so I had not noticed this. You make a good argument here, but it is still fixable within their drivers, although annoying.

Eyefinity - AMD has a tool for your issue on this; I believe it is called Hydravision. This is more of an issue for a desktop, and I believe this tool fixes your complaints thoroughly.

Adaptive Vsync - I still completely disagree with you on this.

Triple buffering - Triple buffering, unlike VSync, has no performance penalty, meaning it is in every possible way better than adaptive VSync or normal VSync. Normal VSync is still a better option if you don't want to see screen tearing. Screen tearing drives me up the wall. To sum up (a quick simulation after this list illustrates the difference):

Adaptive VSync - screen tearing when fps drops.
VSync - no screen tearing, but fps will drop more.
Triple buffering - no screen tearing. On a 60Hz monitor it is possible that you will be 33ms??? behind on occasion. Poor support by game developers; shame on them.
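
Since the two buffering schemes keep coming up, here is a minimal simulation sketch in Python, under the simplifying assumption of a constant render time per frame; it only models frame delivery, not input latency:

[code]
import math

REFRESH = 1.0 / 60.0  # one vblank every ~16.7 ms on a 60 Hz display

def shown_fps(render_time, triple_buffer, seconds=10.0):
    # Collect completion times of rendered frames.
    done, finish = [], render_time
    while finish < seconds:
        done.append(finish)
        if triple_buffer:
            start = finish  # render ahead into the spare back buffer
        else:
            # double buffer: stall until the next vblank swaps the frame
            start = math.ceil(finish / REFRESH) * REFRESH
        finish = start + render_time
    # A vblank shows a new image only if a frame finished since the last.
    new_frames, seen = 0, 0
    for k in range(1, int(seconds / REFRESH) + 1):
        completed = sum(1 for t in done if t <= k * REFRESH)
        new_frames += completed > seen
        seen = completed
    return new_frames / seconds

for fps in (50, 40):
    print(fps, shown_fps(1.0 / fps, False), shown_fps(1.0 / fps, True))
# At a 50 fps render rate, double-buffered vsync shows ~30 fps while
# triple buffering shows ~50 fps: no rate penalty, at the cost of the
# roughly one extra refresh (the ~33 ms mentioned above) of display lag.
[/code]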

HD Audio -

Nvidia - Source -> Bitstream -> Analog to speakers

AMD - Source -> Analog to speakers

Nothing you can post will convince me that the AMD solution is not better.

Lastly, I don't think any of this is relevant to the OP as he should stay above 60fps on the game he listed with the video card mentioned...
post #14 of 23 - 01-10-2013, 02:38 PM - Anthony1
I haven't read thru all this thread, but I just wanted to mention that I recently got a PC with a crappy power supply, and I've heard that the Radeon HD 7750 is the way to go. Make sure to get the low-profile version with the GDDR5 memory.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202002


I'm trying to wait and find this at a better deal. I'm just using the built-in Intel graphics thing; it's kinda crappy, but it's fine for now.
post #15 of 23 - 01-10-2013, 09:25 PM - darklordjames
"HD Audio -

Nvidia - Source -> Bitstream -> Analog to speakers

AMD - Source -> Analog to speakers"


More accurately, your chart would go:

Nvidia: Source Bitstream > LPCM decode > HDMI > Receiver > Post processing > DAC to speakers

AMD: See above, -or- Source Bitstream > HDMI > Receiver > LPCM decode > Post processing > DAC to speakers

You'll note that the only thing that changed is that the LPCM decode happens later in the chain. Being an entirely digital chain, there is no advantage to this. In fact, there is a distinct disadvantage in that you can't do system or menu audio mixing when bitstreaming directly from the disc. This means broken menu audio and broken special features on your fancy Blu-ray discs.

I ask again: Why do you not feel the same way about the video bitstream on that same Blu-ray disc?


"Nothing you can post will convince me that the AMD solution is not better."

It's not a matter of not being better. It's a matter of a feature with such little value that to make any sort of decision based upon it is ridiculous. It's like buying a TV based on whether the item in question is plasma or LCD, not based on feature set, price, looks, and performance.


"Lastly, I don't think any of this is relevant to the OP as he should stay above 60fps on the game he listed with the video card mentioned."

Yes, the only game that he will ever play on his $80 video card is Super Meat Boy. That is exactly why "Right now, I've only tried Super Meat Boy" is directly preceded by "and I want to run some Steam games" in the original post. Right? The word "some" of course means "only this single item" in modern English.
post #16 of 23 - 01-11-2013, 04:09 AM - macks
"Also, my case is tiny and overheats easily, so I don't put the cover on. Currently my PSU is just sticking outside the case. If I were to upgrade, could I get a regular-sized PSU and just let it sit outside the case, as long as it has a 24-pin power connector like my MOBO?"

Sorry, I missed this statement. Generally it is good practice to have your PSU grounded to your motherboard tray; this grounding is very important for your motherboard. If you look at acrylic cases, they *should* have a metal strap included that goes from the power supply to the motherboard tray.

"I ask again: Why do you not feel the same way about the video bitstream on that same Blu-ray disc?"

This must be an Nvidia option that I'm not aware of: adaptive h.264? (Serious answer below.)

You must really want an answer to this. I understand why you are making this comparison, but you are really comparing apples and oranges. Many people will say that no matter how or where you decode a digital signal it is the same, and I agree that this *should* be true. The problem is that it is very rare to find a decode that doesn't in some way transform the signal. This being said, I would prefer that all my audio be decoded within a single device, and likewise all my video. All video coming from my computer is decoded at my computer. All audio coming from my computer is decoded at my receiver. In this way everything is hopefully colored in the same way.

All of this being said, if Nvidia (or whoever they bought the decode from) doesn't color during the decode, then all should be fine, right? Well, almost. The problem here is that my receiver almost certainly treats inbound LPCM and inbound HD audio differently (not applying +10 LFE is the most common issue).

For anyone else reading this thread: LPCM is lossless and is a good alternative to the branded lossless codecs, and I certainly wouldn't suggest buying a new video card just for HD audio. LPCM is less efficient, though; 30% is the number I have heard, and what significance that has, I have no idea.

"Lastly, I don't think any of this is relevant to the OP as he should stay above 60fps on the game he listed with the video card mentioned."

"Yes, the only game that he will ever play on his $80 video card is Super Meat Boy. That is exactly why "Right now, I've only tried Super Meat Boy" is directly preceded by "and I want to run some Steam games" in the original post. Right? The word "some" of course means "only this single item" in modern English."


This just sounds like you are being picky, so I will be picky. A 6570 isn't $80. As stated by the OP, "but I'd like to be able to play arcadey types of games like SMB without any problems." That statement, and the fact that he mentioned a 6570, is why I made the suggestion I did. I also recommended that he spend a couple extra bucks and buy a 7750, as the dimensions appear to be the same. Another alternative would be a 650.

darklordjames - Have I addressed all of the driver shortcomings that you had? I will agree that the overscan correction issue is annoying if you change resolutions often (it takes, what, 10 minutes to completely fix this?). Can we go back to judging video cards based on their performance and not based on extremely minor features (HD audio, adaptive vsync, etc.)?
post #17 of 23 - 01-11-2013, 02:55 PM - darklordjames
"All audio coming from my computer is decoded at my receiver. In this way everything is hopefully colored in the same way."

What are you doing in the Gaming section of the forum then? Games output LPCM to your receiver. If your logic was internally consistent then you would want nothing but LPCM coming out of the PC. The only way that your logic does pass the consistency test is if you never play games.


"The problem here is that my receiver almost certainly treats inbound lpcm and inbound hd-audio differently(not applying +10 LFE is the most common issue)."

This now sounds like you are blaming Nvidia for your receiver being broken.


"LPCM is a less efficient lossless codec though 30% is the number I have heard, and what significance this has? I have no idea."

The answer is "zero significance". The bandwidth for full, uncompressed, 8-channel LPCM is there in HDMI. Bitstreaming to save 30% of that HDMI bandwidth in any particular moment does nothing. It's not like you can save up those 30% increments to later use on a single 4K video stream. ;)
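
For scale, here is the arithmetic as a quick Python sketch; the channel, bit-depth, and sample-rate figures are the usual Blu-ray and HDMI numbers:

[code]
def lpcm_mbps(channels, bits_per_sample, sample_rate_hz):
    # Raw LPCM bandwidth in megabits per second.
    return channels * bits_per_sample * sample_rate_hz / 1e6

# A typical Blu-ray lossless track decoded to 7.1-channel LPCM:
print(lpcm_mbps(8, 24, 48000))    # 9.216 Mbit/s
# The audio ceiling HDMI is specified for (8 channels, 24-bit, 192 kHz):
print(lpcm_mbps(8, 24, 192000))   # 36.864 Mbit/s
[/code]

Either way the pipe is nowhere near full, which is the point: the 30% saved by bitstreaming buys you nothing.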


"Have I addressed all of the driver shortcomings that you had?"

Of course not. A couple of paragraphs written by a forum user don't fix broken driver design, straight driver bugs, or missing features. And no, Hydravision doesn't come close to fixing AMD's poor multi-monitor implementation.


"and not based on extremely minor features(hd audio, adaptive vsync etc."

THAT IS EXACTLY THE POINT. Adaptive Vsync is not a minor feature, and it is of even greater importance the lower-spec the machine is! What you are arguing for is the PlayStation 3 solution of constant vsync, when the Xbox 360 has already spent six years proving that adaptive vsync provides a better user experience. Go read any recent multiplatform title comparison. The short of it usually comes down to "vsync enforcing a drop from 30 to 20fps kinda sucks (the PS3 solution), while dropping vsync below 30 to maintain a ~25fps experience is not ideal but certainly feels better in motion and maintains better input response (the 360 solution)".

Yes, yes, yes, you'll bring triple buffering back up: the feature that the vast majority of games ignore, that you can't force in the drivers for DirectX, and that takes obscure software (D3D Overrider) from 2009 to force at a system level. As an added bonus, that software of course no longer works in Windows 8. Triple buffering is a great concept, but the industry has already taken a pass on it. Arguing in its favor at this stage is pretty pointless.
post #18 of 23 - 01-12-2013, 01:39 PM - dpatel304 (Thread Starter)
Wow, a lot more responses and information than I was expecting. Thanks to everyone who responded.

I think, based on my research and this topic, it's probably best I just get a bigger case, bigger PSU, and open up my options, as far as a video card goes. It's too much of a headache to try and find something that may or may not fit in my current case and may or may not run with my 250W power supply. I also have about 30 Steam games that may or may not run after I upgrade to a low-end video card.

Best just to suck it up and upgrade the case. My case doesn't even work as is, seeing as how I've got the PSU sticking outside of it, and the cover is not on.
post #19 of 23 - 01-12-2013, 02:42 PM - darklordjames
"it's probably best I just get a bigger case, bigger PSU, and open up my options,"

Good plan. I would now target the GeForce GTX 650 or 650 Ti. The vanilla GTX 650 gets you a hell of a lot more video card than the GT 640 I mentioned earlier for about $30 more ($110). It moves you from being a bit better than an Xbox 360/PS3 to a comfortable 1080p at 40-60fps. You likely won't notice much of a difference moving to the 650 Ti for another $30 ($140), as your CPU will be the limiting factor at that point. If plans to upgrade the CPU are in your future to something quad-core at >3.3GHz, then the 650 Ti will give you a pretty solid 60fps, whereas the regular 650 would probably still deliver something in the 45-60fps range.

In either case, the experience will be rather pleasant and significantly improved over the consoles, instead of just "okay, I guess". :)
post #20 of 23 - 01-12-2013, 07:38 PM - macks
The 650 Ti is a solid card. I don't think I would recommend the 650, though; looking at Newegg, the AMD 7770 is only $5 more and is a bit faster. From the 650 Ti to a 7850 is $25 more, and then the 660 is $50 more. Each bump up is a good amount more graphics power, but probably more money than you want to spend.

As far as upgrading the CPU, I'm not sure it's really needed unless you are buying a high-end GPU. Benchmarks showing CPUs limiting fps generally use top-end GPUs at stupidly low resolutions and graphics settings to attempt to make games show as being CPU-bound...
post #21 of 23 - 01-12-2013, 09:49 PM - darklordjames
"Benchmarks showing cpu's limiting fps generally use top-end gpus at stupid low resolutions and graphics settings to attempt to make games show as being cpu-bound..."

http://www.anandtech.com/bench/Product/118?vs=701

dpatel - The following is in no way intended to be taken as making fun of your CPU. We all have our budgets and our needs. It is purely an argument against macks' assertion that a CPU upgrade would be fruitless.

macks - The i3 is dual-core at sub-3GHz. Aside from its basic failure to hold a solid 60fps in just about anything, as it will find things to get stuck on pretty regularly, it will also be dog-slow in anything that is heavily single-threaded, like Torchlight II or Firefall. The vast majority of games these days are designed for the Xbox 360 architecture. As such, they really want a minimum of three cores to run on, preferably with five threads, though four threads tends to be just fine in practice. Anything dual-core in a modern system results in one core running the heavy thread, with every other thread getting stacked on the second core. It becomes a bottleneck real fast.

Yes, paired with a beefy enough video card, the i3 will give average framerates in the 60-90 range. That of course means fluctuations from 20-200fps within any particular second. Moving up to something quad-core in the 3.5-4.0GHz range moves that average to 100fps or so, which really means your fluctuating range is now in the 45-300fps ballpark within that second. The result? Much smoother motion, which means an improved user experience.

http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking
http://techreport.com/review/23246/inside-the-second-gaming-performance-with-today-cpus

Here is some good reading for you. You'll notice that they don't even bother with anything dual-core in the entire breakdown. Hell, there are quad-core CPUs of comparable frequency and instructions-per-clock that struggle to maintain a solid 60fps with beastly video cards. Seriously, if you have not read this pair of articles, go do so now; there is some quality research going on in there. The short of it is that yes, anything modern will give you a reasonably solid 60fps. The fluidity of that 60fps is still very much influenced by the CPU, though. Anything dual-core is not modern.

Just so I understand you, macks. Decoding to LPCM at the PC instead of at your receiver? "Huge, world-ending problem!!!1" Holding an actual fluid 60fps instead of just benchmarking at 60fps? "Not that big of a deal. Who would ever need a newer CPU!?!"

At this point I'm starting to think that you're being contrarian just for the sake of it. To be clear, my goal here is that everybody has the best experience possible with their gear regardless of budget, learning as much as possible in the process. Stick around a bit longer and you'll find that I am ridiculously well researched and have a reputation as such. My other reputation around here is that I have an exceptionally low tolerance for bad advice. Don't be a source of bad advice. :)
post #22 of 23 - 01-13-2013, 05:50 AM - macks
http://www.anandtech.com/bench/CPU/338

http://www.anandtech.com/bench/CPU/340

To the OP: Anandtech Bench is a great place to compare GPUs. Some of their benchmarks need to be re-run with up-to-date drivers, but it still gives you a good idea of how powerful a GPU is compared to others.

Sometimes it just doesn't matter what CPU you are using, even with a high-end GPU (see above links). As you posted, sometimes it does matter with a high-end GPU.

I hadn't noticed these on Anandtech; thanks for pointing them out.

Talking to someone who was looking at a $50 GPU and is now hopefully looking at a $100 one, I don't see the need to upgrade that old i3. However, if a 650 Ti or above is bought, then maybe there is more truth to what you are saying.

Just to reiterate, I mentioned far above that HD audio and adaptive vsync are very minor issues. If adaptive vsync completely removed screen tearing, then my next GPU would be Nvidia without question; unfortunately, this isn't the case.

Can you explain to me how a 99th percentile graph is different from a min. fps graph, other than being a little harder to read and more accurate than Fraps?

Tech Report posted very interesting results with a 7950. The issue is that this GPU is orders of magnitude more powerful than anything the OP is likely looking at. Are these CPUs orders of magnitude more powerful than his?

As far as I'm concerned, max fps or min latency is completely meaningless. Avg fps and min fps (max latency) are all that matter.

"This now sounds like you are blaming Nvidia for your receiver being broken."

Receivers not properly applying +10 LFE to LPCM is very common and is done on purpose. I believe Nvidia and AMD make great GPUs, and I buy whichever one is priced better. Even though I have never had an Nvidia card survive longer than six months, I would buy their cards again in the future.

"What are you doing in the Gaming section of the forum then? Games output LPCM to your receiver. If your logic was internally consistent then you would want nothing but LPCM coming out of the PC. The only way that your logic does pass the consistency test is if you never play games."

Thank you for another hostile response. Yet again I will reiterate that I think it is a very minor issue and really don't care that much either way.

"Stick around a bit longer and you'll find that I am ridiculously well researched and have a reputation as such."

I have definitely seen from other threads that you have a reputation.

Lastly, I should have put down the reason why I didn't think a CPU upgrade was necessary, and I didn't, so I partially understand your hostile response.
post #23 of 23 - 01-13-2013, 02:59 PM - darklordjames
"Can you explain to me how a 99th percentile graph is different from a min. fps graph?"

Minimum fps still accounts for an entire second. For example: a minimum of 70 would imply that the game always sustained over 60fps. In reality, that 70 minimum doesn't tell you whether half of that second was spent at 50fps and the other half at 90fps. In other words, stating a minimum fps of 70 implies perfect motion, but in practice there is still judder as the scene dips below 60fps half the time. End of the world? No. But in the pursuit of accuracy it is an important thing to know.

In a more extreme example, you can have one frame take an entire half of a second to render, meaning frozen motion for that half-second. Then if you render the other half-second at 120fps, the end result is a minimum framerate of 60fps. That magical number of 60 is useless if I was sitting there looking at a frozen screen for half the time! :) Using a specific game: Skyrim is pretty crap in how it loads in new assets. On my current system I render Skyrim at a max of roughly 240fps. If I try real hard I can find areas in cities with enough NPCs and view distance to drag it down to a sustained 45fps, though normally my minimum is around 65fps out in the field, in dungeons, and walking around in cities. I average something in the 110fps range. The game still has occasional halts of probably 3-5 frames, though, when it brings in new assets. My average implies a 9ms render time, with 15ms adventure minimums and 22ms city minimums. Asset loads can cause a momentary 83ms frame, though. To hit a perfect 60fps, every frame needs to be done in 16ms.
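
Running the numbers from that extreme example as a small Python sketch; frame times are in seconds, and the percentile picking is deliberately naive:

[code]
# One second of gameplay: a single 500 ms hitch, then 120 fps.
frame_times = [0.5] + [1 / 120] * 60          # 61 frames, 1.0 s total

per_second_fps = len(frame_times) / sum(frame_times)  # the classic metric
ordered = sorted(frame_times)
p99_frame_time = ordered[int(0.99 * len(ordered))]    # 99th percentile

print(round(per_second_fps))     # 61 "fps" for that second
print(p99_frame_time * 1000)     # 500.0 ms worst-case frame
# The per-second counter reports a healthy ~60 fps; the percentile
# frame time exposes the half-second freeze hiding inside it.
[/code]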

A 60fps average is now pretty cheap and easy to hit on a PC. As such, classic benchmark styles are becoming less useful. This is certainly a good thing, as we've been trying to hit a solid 60fps at a reasonable cost since 1996/1997, when we started seeing the Voodoo 1 and Riva 128 pop up. Now that we're getting close to universally achieving that goal with even integrated graphics, it's time to look at smoothness of motion. Tech Report's Inside The Second series makes a good attempt at doing that. Really, go read it, as I feel you may have glossed over it.

"Are these cpu's orders of magnitude more powerful than his?"

Nope. CPUs have been pretty stagnant since we hit 3.6GHz on a Hyper-Threaded Pentium 4 back in 2004. We aren't looking for a massive jump, though. We're only looking to hit a solid, smooth 60fps. Having an additional pair of real cores instead of a pair of Hyperthreads, plus a bit more clock speed, would get him real close. Again, I said "If plans to upgrade the CPU are in your future to something quad-core at >3.3GHz, then...". At no point did I tell him to upgrade his CPU. I merely said that if a CPU upgrade was in that system's future, then the extra $30 on the GPU now would be the smart move in the long game.