AVS › AVS Forum › Gaming & Content Streaming › Home Theater Gaming › HTPC Gaming › Nvidia Kepler

Nvidia Kepler - Page 2

post #31 of 142
I think I might wait for the 110; for now I will stick with the triple 5870s I am using, even though I hate AMD's driver support in most games. But who knows, maybe if there is a price drop because of competition with AMD, I might get two 680s for SLI.
post #32 of 142
I have a 295 card. What do you guys think, is it finally time to upgrade to this card? I have been holding out for what seems like forever, but games were still running well, minus DX11 of course.
post #33 of 142
Thread Starter 
If you find yourself wanting to play a lot of the latest games at maximum quality with playable performance, then I say go for it. It will also help a bit with your electricity bill.
post #34 of 142
2D review (video)

3D review (video)
post #35 of 142
Thread Starter 
I only watched a little of the first video, but I noticed Linus mention "coil whine." It can be mitigated by certain power supplies, but by now you would expect a $500 graphics card to be free of it entirely.

SilverStone added a capacitor at the end of the graphics card power connectors on some of its power supplies to prevent extreme cases of electrical whine caused by varying electrical loads with certain cards.
post #36 of 142
I bought one... the EVGA version. Not sure I'm thrilled about it, lol. Coming from a 295 I hope I see some improvements. I mean, duh, I know I will; it's just that I'm broke now, haha.

Also not sure how HTPC will go. I'm using an Asus HDAV Slim to bitstream Blu-ray; I guess I just eliminated the need for that, yeah? I use TMT 5, so I hope it can bitstream through the 680.
post #37 of 142
Quote:
Originally Posted by hdtv00 View Post

I bought one... the EVGA version. Not sure I'm thrilled about it, lol. Coming from a 295 I hope I see some improvements. I mean, duh, I know I will; it's just that I'm broke now, haha.

Also not sure how HTPC will go. I'm using an Asus HDAV Slim to bitstream Blu-ray; I guess I just eliminated the need for that, yeah? I use TMT 5, so I hope it can bitstream through the 680.

The 680 is gonna smoke that 295, don't worry.
post #38 of 142
I just recently (well, 3 months ago) put two GTX 580s in my rig, the 3 GB ones, and haven't looked back. I'm a firm believer in Nvidia after coming from some R6950s. I think you will be very happy with your purchase.
post #39 of 142
Stoked! I will use the two cards for 3D gaming. I figure that with two cards, I'll get frame rates comparable to the 2D 1080p tests I'm seeing, even in 3D.

The main apps I'm worried about anyway are Dirt 3, Forza 4, and HAWX 2, so I'm not concerned about the abysmal frame-rate loads of something like Skyrim.

I have the Microsoft wireless controller adapter that lets me use all the Xbox wireless controllers, the MS wireless racing wheels (two, with pedals), and a full flight sim control setup.

I know exactly which friends are going to be over every week to play these! Lol

I'm pretty sure that I will use Adaptive VSync with these 680s.
post #40 of 142
Thread Starter 
The one aspect that seems to be really holding the 680 back in some games is memory bandwidth.
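As a rough back-of-envelope comparison (assuming the reference specs: a 256-bit bus at 6008 MT/s effective on the 680 versus a 384-bit bus at 4008 MT/s on the 580), the 680 has essentially the same peak memory bandwidth as the 580 despite its much larger shader count:

```python
def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: int) -> float:
    """Peak memory bandwidth: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

gtx_680 = bandwidth_gb_s(256, 6008)  # ~192.3 GB/s
gtx_580 = bandwidth_gb_s(384, 4008)  # ~192.4 GB/s
```

So the extra shader power is fed through roughly the same memory pipe, which is consistent with the card falling back toward the 580 in bandwidth-bound games.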

But the card still thrives in numerous popular games. Someone playing games like Arkham City, Battlefield 3, or Just Cause 2 for the first time on a 680 should be in for a real treat.

Hardware-based PhysX performance so far does not seem to be much of a leap over the 580.

I will soon buy a 680 as it's a wonderful and seemingly well future-proofed card for 1080p gaming. Low power consumption and low noise are a big plus. I'll probably even get a card with a reference cooler, something that I have not done in a long while.
post #41 of 142
I honestly wish PhysX would simply go away. It's a cheap effect rarely worth the horsepower required to run it. Years later it is an effect that still strikes me as "gimmicky," which is sad. I had hoped it would lead to actual games where things blew up into particles in a more realistic manner, but instead all we get are free-flowing capes and some weak bullet-hole-that-produces-30-lbs-of-concrete-dust effects. Honestly, the bullet-hole effect in Max Payne was more realistic and impressive than Mafia II with PhysX...
post #42 of 142
Thread Starter 
I just watched PC Perspective's video review of the GTX 680. In it, Nvidia's Tom Petersen explains the new features.

EDIT: Adaptive VSync is what I thought it was, except it uses a 60 fps baseline as the determining factor regardless of refresh rate: vsync is on when the frame rate is greater than or equal to 60, and off when the frame rate is below 60. It is a good option to have if triple buffering is not working well for whatever reason, or if the frame rate very rarely dips below 60 fps in a particular game.
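That on/off rule is simple enough to sketch (a hypothetical illustration of the behavior described above, not Nvidia's actual driver code):

```python
def vsync_enabled(frame_rate: float, baseline: float = 60.0) -> bool:
    """Adaptive VSync decision: sync only at or above the 60 fps baseline.

    At or above the baseline, vsync caps the frame rate and removes tearing;
    below it, vsync is dropped so the frame rate does not collapse to 30 fps.
    """
    return frame_rate >= baseline

# At 72 fps vsync engages; at 45 fps it is released to avoid stutter.
```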
post #43 of 142
Quote:
Originally Posted by rdjam View Post

Stoked! I will use the two cards for 3D gaming. I figure that with two cards, I'll get frame rates comparable to the 2D 1080p tests I'm seeing, even in 3D.

The main apps I'm worried about anyway are Dirt 3, Forza 4, and HAWX 2, so I'm not concerned about the abysmal frame-rate loads of something like Skyrim.

I have the Microsoft wireless controller adapter that lets me use all the Xbox wireless controllers, the MS wireless racing wheels (two, with pedals), and a full flight sim control setup.

I know exactly which friends are going to be over every week to play these! Lol

I'm pretty sure that I will use Adaptive VSync with these 680s.

You're playing Forza 4 on your computer?!

Quote:
Originally Posted by HeadRusch View Post

I honestly wish PhysX would simply go away. It's a cheap effect rarely worth the horsepower required to run it. Years later it is an effect that still strikes me as "gimmicky," which is sad. I had hoped it would lead to actual games where things blew up into particles in a more realistic manner, but instead all we get are free-flowing capes and some weak bullet-hole-that-produces-30-lbs-of-concrete-dust effects. Honestly, the bullet-hole effect in Max Payne was more realistic and impressive than Mafia II with PhysX...

Agreed. However, I feel that the lack of quality should be blamed squarely on the devs who do not use the technology in a better way, not on the PhysX technology itself.
post #44 of 142
Quote:


Agreed. However, I feel that the lack of quality should be blamed squarely on the devs who do not use the technology in a better way, not on the PhysX technology itself.

I honestly think nobody can really figure out what to do with the technology. For gimmicky stuff like particles that cause generic debris, or 'wavy capes' in the Batman games, it's fine. But when you try to do anything significant with the technology, say deformable environments or what have you, the performance hit becomes too great and the gameplay experience changes so much between those with and without PhysX that devs simply don't bother to mess with it.
post #45 of 142
Quote:
Originally Posted by HeadRusch View Post

I honestly think nobody can really figure out what to do with the technology. For gimmicky stuff like particles that cause generic debris, or 'wavy capes' in the Batman games, it's fine. But when you try to do anything significant with the technology, say deformable environments or what have you, the performance hit becomes too great and the gameplay experience changes so much between those with and without PhysX that devs simply don't bother to mess with it.

Yeah, it's a real shame too, cuz I was promised crazy physics imagery and I want true deformable environments. Besides what has been done with the Frostbite 2.0 engine, there has been very little progression in this technology... ever.

In fact, this reminds me of an old tech video for Force Unleashed which ended up just being a pure lie.

http://www.youtube.com/watch?v=FqNJRxn23NQ
post #46 of 142
Maybe before I die someday there will be a fully destructible GTA-type world where my force powers can level buildings and hookers with one mighty blow.
post #47 of 142
Lol!

One day.... one day.
post #48 of 142
Thread Starter 
If a GPU-based physics standard based on OpenCL had gained traction, then I'm almost sure we would have seen more PC games using these effects much more realistically and with less burden on the hardware.

As it stands now, it appears that Nvidia simply gives a game developer an incentive to use GPU-based PhysX, which ends up being an afterthought implemented with the help of a team of Nvidia software engineers.

I will say that the recent improvements made to the PhysX SDK do look promising.
post #49 of 142
Quote:
Originally Posted by Scott Simonian View Post


You're playing Forza4 on your computer?!

Dang, I see the problem. I love Forza on the Xbox, but when I looked for a Windows version tonight, of course, I came to find it's only available for Xbox.

Lol, hopefully I'll find a really good PC racing program that supports 2 player split screen!
post #50 of 142
Quote:
Originally Posted by rdjam View Post


Lol, hopefully I'll find a really good PC racing program that supports 2 player split screen!

Very doubtful
post #51 of 142
Ok well, REVIEWS!? Some of you have dipped into the piggy bank, I know this.

I have heard one big PITA is the auto-overclocking that happens... it seems like the days of setting a voltage and a speed for shaders/RAM/GPU are *over*. Can anyone confirm?

Does this mean we won't be seeing any more "Superclock" or "Black" edition Nvidia boards if the auto-overclocking is all set by ranges?
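For what it's worth, the "set by ranges" idea can be pictured roughly like this. This is a simplified toy with made-up numbers, not Nvidia's actual GPU Boost algorithm; the real card steps its clock against a power target rather than taking a fixed user-set speed:

```python
def boost_clock_mhz(base_mhz: float, boost_range_mhz: float,
                    power_draw_w: float, power_target_w: float) -> float:
    """Toy GPU-Boost-style clock: float within [base, base + range]
    according to how much power headroom remains under the target."""
    headroom = max(0.0, 1.0 - power_draw_w / power_target_w)
    return base_mhz + boost_range_mhz * headroom

# Light loads leave headroom, so the clock floats above base;
# at the full power target the card sits at its base clock.
```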
post #52 of 142
Quote:
Originally Posted by HeadRusch View Post

Ok well, REVIEWS!? Some of you have dipped into the piggy bank, I know this.

I have heard one big PITA is the auto-overclocking that happens... it seems like the days of setting a voltage and a speed for shaders/RAM/GPU are *over*. Can anyone confirm?

Does this mean we won't be seeing any more "Superclock" or "Black" edition Nvidia boards if the auto-overclocking is all set by ranges?

No. Galaxy for one has a factory OC'd card coming fairly soon.

http://www.pcper.com/category/tags/4gb-gtx-680

EVGA also has plans for "FTW" and "FTW 4GB" cards.

http://www.evga.com/articles/00669/#GTX680FTW

Personally, I am thinking I will dip in when the 4GB cards come out. They should be a good upgrade from my 570s.
post #53 of 142
Yup. Same here. 4 gigs of VRAM sounds right up my alley.
post #54 of 142
You only really need the RAM for three-screen gaming and 3D; otherwise the 680 in general is a beast no matter which you choose. I bet those 4GB cards are gonna be $600 easily, though.
post #55 of 142
Thread Starter 
Quote:
Originally Posted by Threefiddie View Post

You only really need the RAM for three-screen gaming and 3D; otherwise the 680 in general is a beast no matter which you choose. I bet those 4GB cards are gonna be $600 easily, though.

Agreed. 4GB is overkill for regular single screen usage at 2560x1600 and under.

The 10% factory overclock, revamped PCB, and cooler with two 90mm fans on one of the Galaxy cards does look tempting, though.
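For scale, a 10% bump on the 680's 1006 MHz reference base clock works out to roughly:

```python
base_mhz = 1006                          # GTX 680 reference base clock
factory_oc_mhz = round(base_mhz * 1.10)  # ~10% factory overclock
print(factory_oc_mhz)                    # 1107
```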
post #56 of 142
Quote:
Originally Posted by MSmith83 View Post

Agreed. 4GB is overkill for regular single screen usage at 2560x1600 and under.

The 10% factory overclock, revamped PCB, and cooler with two 90mm fans on one of the Galaxy cards does look tempting, though.

That's the one I'm eyeballing to replace the 570s in my NV Surround setup. The 1280 MB of VRAM limits my ability to use AA in newer titles. Even a single 2GB 680 will play at settings where I get single-digit frames with my current setup.
post #57 of 142
Overkill is highly underrated.
post #58 of 142
Scheduled Delivery:
Thursday, 03/29/2012, By End of Day


I can't wait
post #59 of 142
Wow, I want to buy one of these cards and a new 27" monitor, but the wallet doesn't agree.

My cat was sick, so the price of the card went to the vet. I also got busted for a yellow light the same day... expensive day.
post #60 of 142
Found a laptop with a 630M in it. I wonder just how powerful the mobile versions of these chips are...