AVS Forum

1 - 15 of 15 Posts

Registered · 10,434 Posts · Discussion Starter #1
Over on FiringSquad right now they just did a preview of the upcoming iteration of CrossFire (ATI's answer to SLI).


Now get this: an Nvidia GeForce 7800 GTX 512MB board (which runs a lot faster than the 256MB boards, so it's really like a 7800 GTX Super Ultra) will set you back... oh... $750. For one video card.


An ATI CrossFire 1800 XT, or whatever it's called, will set you back, once they become available, about $150 less than the Nvidia part (according to FiringSquad's article).


They ran a bunch of games through their benchmarks... 1600x1200 with 8xAA and 16xAF, so really maxing out the cards (in general, I think the ATI parts can push AA even higher)... both as single cards and in dual-card configurations.


With a $1500 Nvidia SLI setup (dual 512MB 7800 GTXs), FEAR at that high rez gets you... 65fps average. It's in the 50s with the CrossFire rig (driver issues; otherwise the cards are about identical in performance)...

WTF..!??!?


I'm all for games pushing hardware, but this is insane. With some of these games, it actually seems like even an EXPENSIVE card like a 7800GTX would be considered "bare minimum" if you want to run them.


My 6800GT is still chugging along, but at lower resolutions and with less AA/AF eye candy... but man, at the time I almost opted for the cheaper 6600GT. Oh man, would I be disappointed now had I done that.


COD2... FEAR... bringing even $1500 combos to their KNEES!?


Come ON, folks... that's just *not right*. A few years ago we were all freaking out that a high-end video card suddenly cost $400! Now they're $700... and you need TWO OF 'EM to run at high rez!?!?!


Also note: they didn't test any WIDESCREEN resolutions. Again... WTF.


Honestly, if this keeps up then yes, you WILL see people abandoning the PC game market in droves due to the high cost of hardware... relative to the performance you get in game.


$400 I can stomach for a good video card... but if it takes $1500 worth of video cards just to break 60fps in some modern games at 1920x1080-ish resolution? F that... seriously.


;)
 

Registered · 3,737 Posts
A single 7800GTX 256MB (~$400) would compete just fine against an Xbox 360. I think you're going way overboard on what's considered typical PQ, typical resolutions, and what hardware it takes to run "bare minimum."


No kidding you need expensive stuff to run 1600x1200 with everything maxed out. It's not like any console can do that. Even in my 9800 Pro days, I couldn't play all games at 1600x1200... and that video card was $399 at launch. I'll gladly take a 7800GTX with over twice the power for the same price...
 

Registered · 5,031 Posts
I was pleasantly surprised by the performance of my new 6600 GT in FEAR. Then again, I had already played through the game on a Radeon M10, so it was a night-and-day difference.


I totally agree with the OP, though. I don't think you're getting anywhere near the bang for the buck in the mid-range or high end that you got 2-3 years ago in video cards, and the SLI push just adds insult to injury.
 

Registered · 10,434 Posts · Discussion Starter #4
It's just kind of insulting that the graphics card industry keeps expecting gamers to dig deeper and deeper.


I mean, we all know we have to upgrade every couple of years anyhow, but this is sort of absurd. I went from $200 video cards to $400 to now... $500? $600?? It's a little beyond nutty.


If you can limit your resolution to 1024x768 or something equivalent in a widescreen format, I agree... a 7800GTX should get you by for the foreseeable future with some AA and AF turned on. But COD2, FEAR... Unreal Tournament 2006 and the other next-gen game engines... I dunno.


I can say this, however: the first game that comes out in the next 12 months that won't do 40fps at 1024x768 on a 7800GTX had damn well better look *astounding*.
 

Registered · 860 Posts
I agree with the OP. In fact, I posted a very similar thought in another thread.

One thing to keep in mind is that SLI has its limitations. It costs twice as much, but you don't get twice the performance.

I am running two 7800 GTs in SLI. When I benchmark with and without SLI, I find that my average and highest FPS increase by about 60% (instead of a full 100%), BUT the big problem is that my minimum FPS stays about the same with or without SLI. That is a huge problem. On the FEAR benchmark I got 235 max FPS with SLI instead of 151 without (who cares), and the average went from 67 without SLI to 93 with (that kind of matters), but my minimum stayed EXACTLY the same at 36 regardless of SLI. This is consistent across the other games I have tried. The place where you really need the GPU power is to increase the minimum, not the maximum!
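For what it's worth, those gains are easy to sanity-check with a throwaway script; the only inputs are the fps figures from my FEAR runs above:

```python
# Sanity-check the SLI scaling figures quoted above (FEAR benchmark).
# A second GPU would ideally double every number (+100%); in practice
# the gains are smaller and very uneven across min/avg/max.

def sli_gain(single_fps, sli_fps):
    """Percent improvement from enabling SLI."""
    return (sli_fps - single_fps) / single_fps * 100

benchmarks = {
    "min": (36, 36),    # minimum fps: single card vs SLI
    "avg": (67, 93),    # average fps
    "max": (151, 235),  # maximum fps
}

for metric, (single, sli) in benchmarks.items():
    print(f"{metric}: {single} -> {sli} fps ({sli_gain(single, sli):+.0f}%)")
```

The minimum, the number that actually determines how a game feels, comes out at a 0% gain, while the headline maximum gains the most.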
 

Registered · 4,273 Posts
HeadRusch, I personally felt you knew better. Yeah, you can make FEAR run like crap on any system. Turn off soft shadows, which are basically indistinguishable when you're playing the game rather than staring at it (and even then the difference is very slight), and you'll get your mad frames back. Then you tweak an option here or there and you are golden. None of these benches ever explain things like that; they just max everything for ease's sake. IMO FEAR runs great with a $200 video card when sane settings are figured in. It's price/performance when you're buying, but it's the visual/performance trade-off that wins when you're playing.


Also, what really is the point of 1600x1200 at 8xAA, other than it's the max for testing's sake? My 2800+ with a 9800 Pro still runs things nicely; tone down a few really high-end settings and boom, I'm in again. Playing the game, it's no different than on my 3700+ 7800GT system.
 

Registered · 570 Posts
I agree with you, Horror, that 1600x1200 with 8xAA is just for benchmarking, so it doesn't matter. But I have to disagree that soft shadows don't make a difference. I agree it's subtle when you turn them on, but as I was playing through the game I really felt the soft shadows add a sense of next-gen to F.E.A.R. that just isn't there without them.
 

Registered · 279 Posts
I just read over at Tom's Hardware about Nvidia's quad-GPU cards! Insane. I think I saw a price of $1600 somewhere. I think I'll stick with my 9800 AIW a while longer.
 

Registered · 378 Posts
Dang! And when DirectX 10 comes out with Vista next year, everyone will want to dump their DirectX 9 cards. Especially since DirectX 10 isn't even backward compatible with DirectX 9 except through software emulation. Imagine running a dual-SLI 7800 GT setup through software emulation to play the first DirectX 10 game that comes out! Ouch.


Nah, I'll buy one card for now, and I won't feel so bad when the new API comes out in a year and I want to use its newest features. The first DirectX 10 games might be 6-8 months after Vista comes out anyway, so no real worries. By then everyone will naturally want to upgrade again.
 

Registered · 860 Posts

Quote:
Originally Posted by Mash
Imagine running a dual-SLI 7800 GT setup through software emulation to play the first DirectX 10 game that comes out! Ouch.


Well, I don't have to imagine. That's the card config I just bought. Actually, after about a week of seeing much-less-than-expected results from SLI, I think I may sell one card on fleebay and just plan on replacing the 7800 when DX10 comes out.
 

Registered · 1,128 Posts
Quote:
Originally Posted by mking2673
I agree with the OP. In fact, I posted a very similar thought in another thread.

One thing to keep in mind is that SLI has its limitations. It costs twice as much, but you don't get twice the performance.

I am running two 7800 GTs in SLI. When I benchmark with and without SLI, I find that my average and highest FPS increase by about 60% (instead of a full 100%), BUT the big problem is that my minimum FPS stays about the same with or without SLI. That is a huge problem. On the FEAR benchmark I got 235 max FPS with SLI instead of 151 without (who cares), and the average went from 67 without SLI to 93 with (that kind of matters), but my minimum stayed EXACTLY the same at 36 regardless of SLI. This is consistent across the other games I have tried. The place where you really need the GPU power is to increase the minimum, not the maximum!
Just thinking out loud here...but is it possible that the minimum fps you are seeing is where the game is CPU limited?
 

Registered · 860 Posts
Quote:
Originally Posted by Jim S
Just thinking out loud here...but is it possible that the minimum fps you are seeing is where the game is CPU limited?
Yeah, I thought about that too. I'm running a 3700+, so it's not a bad CPU. When I lower the graphical settings, the minimum comes up, so I assume it's purely GPU-restricted. It's more likely that my maximum with SLI is CPU-restricted.
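That test (lower the GPU load, watch whether the minimum moves) is basically a bottleneck heuristic. A rough sketch of the logic, where the `likely_bottleneck` helper and the 5% "unchanged" tolerance are purely illustrative, not from any real benchmarking tool:

```python
# Heuristic from the exchange above: re-run a benchmark with reduced GPU
# load (lower resolution/settings) and see whether minimum fps moves.
# If the minimum rises when GPU load drops, it was GPU-bound; if it stays
# put, suspect the CPU (or some other limit).

def likely_bottleneck(min_fps_full, min_fps_reduced, tolerance=0.05):
    """Guess what limits minimum fps by comparing full vs reduced settings.

    tolerance: fractional change below which the fps counts as 'unchanged'.
    """
    change = (min_fps_reduced - min_fps_full) / min_fps_full
    return "GPU-bound" if change > tolerance else "CPU-bound (or other limit)"

# The poster's observation: lowering settings raises the minimum fps,
# so the minimum is GPU-restricted on his rig.
print(likely_bottleneck(36, 48))   # minimum rose at lower settings
print(likely_bottleneck(36, 37))   # minimum barely moved
```

The same comparison explains the SLI maximum too: if the max fps doesn't scale when you add a second GPU, the CPU is likely feeding frames as fast as it can.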
 

Registered · 141 Posts
The key words are 1600x1200 resolution. That's cutting edge, my friend, similar to how you can go out and get a 480p projector for 500 bucks or go out and spend $10k on a nice 3-chip DLP.


You want the best? Be ready to pay for it.


Personally, I just built an $800 rig with a 7800GT, and it runs Call of Duty just fine at the "optimal" levels. They also just came out with a dual-core patch for the game that should help even more.
 

Registered · 10,434 Posts · Discussion Starter #14
Quote:
Originally Posted by Ed B. 1979
Key words are 1600 X 1200 Resolution. That's cutting edge my friend, similiar to how you can go out and get a 480P projector for 500 bucks or go out and spent 10k on a nice 3 chip DLP.
Uh, a 3-chip DLP costs a lot more than $10k, last time I checked... As for 1600x1200 being cutting-edge, I disagree... maybe back in the Radeon 9800 days, but not today. True, I think it's too high a resolution to really game at even with modern cards, but 1600x1200 isn't really what we're talking about.


We're talking about a $500 video card that can't do 60fps in some brand-new FPSes... that's sad. Usually you buy the high-end card to get framerates in the 100s... but now you spend that money just to get barely adequate performance? That's what I'm annoyed with, particularly.

Quote:
You want the best? Be ready to pay for it.
Yes...only in this case, even the best isn't good enough....which is the sad part.

Don't make me shell out $500 for a video card that won't even keep up with the latest game at REASONABLE resolutions like 1280x720 or 1280x1024... hell, even 1024x768.

Quote:
Personally I just built an 800 dollar rig with a 7800GT and it runs Call of Duty just fine at the "optimal" levels. They also just came out with a dual core patch for the game that should help even more.
I agree that the new patch makes a huge difference... but still, a 7800GT isn't a cheap card (though it is a bargain in today's world, it seems), and it doesn't exactly run rings around the game. I myself have had a 6800GT for some time now, and it struggles with the latest and greatest. COD2 probably punishes it the most, although the new patch helps by getting more juice out of my HT P4 processor.


Still, the point of my argument isn't that you need to dig deep to get great performance; it's that today you seemingly need to dig DEEP just to get *barely adequate* performance... forget "great", forget 120fps at 1600x1200...


etc.
 

Registered · 860 Posts
I guess we will have to see, HeadRusch. It may just be that this fall's games are not programmed as well as they should be. Up until this fall, I ran everything fine with a 2400+ and a 9800 Pro, including games like BF2 and rFactor. If games are written like BF2, things can be scaled and still look real nice and play smooth on modest hardware. However, this fall FEAR and Call of Duty are just hammering rigs. Maybe it's just these two games?

I've said it before, but one huge test in the near future will be how well Unreal Engine 3 runs. If it looks as good as it does and runs well, other developers will have to step up. If it's just a hog, then look out.

The other big question in the next year will be what happens with Direct X 10.
 