Nvidia Kepler - AVS Forum

post #1 of 142 - 03-21-2012, 01:57 PM - MSmith83 (Thread Starter)

It looks like the release of the Nvidia GTX 680 is imminent. All current reports point to it being based on GK104 and priced at $500 or $550, with about a 10% performance increase over AMD's 7970. It seems we will have to wait for a more powerful card based on GK110.

Also rumored are a new, much more effective anti-aliasing technique and support for three displays on one card.

Leaked photos show a smallish card given its supposed performance, and it uses just two 6-pin power connectors. It could be a boon for high-end HTPC gaming applications where strong performance, low power draw, and low noise are wanted.

Anyway, the truth about the GTX 680 should soon be available.


EDIT: Some review links below.

http://hardocp.com/article/2012/03/2..._card_review/1

http://www.anandtech.com/show/5699/n...x-680-review/1

http://www.xbitlabs.com/articles/gra...e-gtx-680.html

http://hothardware.com/Reviews/NVIDI...Kepler-Debuts/

http://guru3d.com/article/geforce-gtx-680-review/1

http://www.tomshardware.com/reviews/...mark,3161.html

http://www.hardwareheaven.com/review...roduction.html

post #2 of 142 - 03-21-2012, 03:00 PM - joeblow

Nice rumor summary. Thanks!

post #3 of 142 - 03-21-2012, 04:32 PM - MSmith83 (Thread Starter)

There's an Nvidia marketing video out for the GTX 680 that talks about its power efficiency, new anti-aliasing method (TXAA), PhysX improvements, and support for up to four displays on one card.

post #4 of 142 - 03-21-2012, 04:34 PM - Threefiddie

now if we can just get some new games to come out to play it on... dry couple months coming up....

post #5 of 142 - 03-21-2012, 06:17 PM - HeadRusch

TXAA looks good, the rest is fluff... clock speeds and throughput, that's all we care about, Nvidia. Decreased noise is also important, however, because let's face it, you jam a case up with a couple of these things and you don't want to hear the hairdryers running while you're trying to immerse yourself in, say, Dead Space.

Leaked TOMS HARDWARE review looks like the base unit is getting on average 20+ FPS more than a 580 at stock speeds... wonder what the OC potential on the 680s will be... (and Skyrim barely budges, obviously, since it's CPU bottlenecked)... hmmm, wonder how it runs CRYSIS

post #6 of 142 - 03-21-2012, 07:27 PM - N8DOGG

As much as I want the 680's, I think I'm going to wait to SLI 2 x GK110's. They say second half of the year. I'm betting $700-$800 a pop too.

post #7 of 142 - 03-21-2012, 07:31 PM - MSmith83 (Thread Starter)

Quote: Originally Posted by HeadRusch

Leaked TOMS HARDWARE review looks like the base unit is getting on average 20+ FPS more than a 580 at stock speeds... wonder what the OC potential on the 680s will be... (and Skyrim barely budges, obviously, since it's CPU bottlenecked)... hmmm, wonder how it runs CRYSIS

I was just reading that. Not bad at all, especially at 1920x1080.

I hope a site does PhysX benchmarks in games like Arkham City, and compares it to the 580.

post #8 of 142 - 03-21-2012, 07:37 PM - MSmith83 (Thread Starter)

Quote: Originally Posted by N8DOGG

As much as I want the 680's, I think I'm going to wait to SLI 2 x GK110's. They say second half of the year. I'm betting $700-$800 a pop too.

Given how small the GK104 is, there should definitely be room for significantly better performance with the much larger GK110.

Of course, those of us sticking with 1920x1080 TVs or projectors should be served well by the GTX 680 for quite some time.

post #9 of 142 - 03-21-2012, 08:02 PM - Dashboard

I'm still debating whether to buy the new AMD card or one of these Keplers.

The next couple of months will be interesting. I was about to pull the trigger on a 7870 (and probably a new PSU, since my 460W might be a bottleneck for it), but now I'm thinking about Kepler... am I normal, doctor?

post #10 of 142 - 03-21-2012, 08:09 PM - Threefiddie

Personal opinion here... but Nvidia > AMD

post #11 of 142 - 03-21-2012, 08:29 PM - DaveFi

I'll never buy AMD again simply because of my experience with them now. Drivers are lacking.

That said, from what I'm seeing of Kepler, I don't see any reason to upgrade my tri-Crossfire unclocked 6950s yet. I think I will upgrade on Nvidia's next go-round.

post #12 of 142 - 03-22-2012, 07:09 AM - Dashboard

Since I'm already using an AMD card, and I'm reading a lot of horror stories about driver issues when switching from AMD to Nvidia, I'm not really interested. Also, if the new Kepler is priced around $500+... add a PSU and it's getting expensive. The 7870 should probably suffice for gaming on a single monitor at 1080p... we'll see, I'm not settled yet.

post #13 of 142 - 03-22-2012, 07:18 AM - HeadRusch

Read up on HardOCP's review...it looks like an evolutionary bump up from the 580, not quite revolutionary....10fps here, 20fps there....so maybe 25fps after some overclocking.

But I'm...honestly.....not sure if I'm going to upgrade my 580 at this point. My 580 overclocks to hell and back......Adaptive VSYNC looks pretty good though...we'll see. Maybe one of the expensive overclocked ones...but...hmmmm...not entirely sure.

post #14 of 142 - 03-22-2012, 07:23 AM - Dashboard

Quote: Originally Posted by HeadRusch

Read up on HardOCP's review...it looks like an evolutionary bump up from the 580, not quite revolutionary....10fps here, 20fps there....so maybe 25fps after some overclocking.

But I'm...honestly.....not sure if I'm going to upgrade my 580 at this point. My 580 overclocks to hell and back......Adaptive VSYNC looks pretty good though...we'll see. Maybe one of the expensive overclocked ones...but...hmmmm...not entirely sure.

If you already have a 580, I don't think it will be worth the upgrade (maybe if you game on 3+ monitors).

The new cards are worth upgrading to for people like me... gaming on a 6770 and planning to buy a new 1080p monitor, where the 6770 might not be enough.

post #15 of 142 - 03-22-2012, 08:38 AM - MSmith83 (Thread Starter)

Those who bought a 7970 should not feel bad at all when it comes to performance.

I just wish competition had dictated that this card be priced as the mid-range card it probably should have been, with the more powerful GK110 looming. Now people will have to pay through the nose for the GK110.

Some review links have been added to the original post.

post #16 of 142 - 03-22-2012, 08:51 AM - 257Tony

Quote: Originally Posted by MSmith83

Those who bought a 7970 should not feel bad at all when it comes to performance.

I just wish competition had dictated that this card be priced as the mid-range card it probably should have been, with the more powerful GK110 looming. Now people will have to pay through the nose for the GK110.

Some review links have been added to the original post.

It's as powerful as or more powerful than the 7970, at a lower price, and will probably drive 79xx prices down. A win for the consumer.

The GTX 680 is impressive, outperforming the 7970 at triple-monitor resolutions with less RAM.

I'm seriously contemplating selling my 570s and doing a single 680.

post #17 of 142 - 03-22-2012, 08:54 AM - DaveFi

When is the GK110 supposed to come out? I will probably upgrade to that.

post #18 of 142 - 03-22-2012, 09:07 AM - Dashboard

Those reviews are impressive.

I hope Kepler will in fact drive AMD to cut the 79xx and 78xx prices down.

For a single monitor, I don't know if it's worth having such a powerful card except to be futureproof... 130 FPS in Skyrim at Ultra... giggydy giddydy ya!

I'm torn, I don't know what to buy.

post #19 of 142 - 03-22-2012, 09:14 AM - MSmith83 (Thread Starter)

Quote: Originally Posted by 257Tony

It's as powerful as or more powerful than the 7970, at a lower price, and will probably drive 79xx prices down. A win for the consumer.

The GTX 680 is impressive, outperforming the 7970 at triple-monitor resolutions with less RAM.

I'm seriously contemplating selling my 570s and doing a single 680.

A $100 price drop for the 7970 would be a win for the consumer, but a rather small win in my opinion. The big problem, I think, is that AMD may have set the bar much too low.

While the 680 easily beats the 7970 at 1080p in most games, the gap closes considerably at the highest resolutions, where performance matters most to many high-end gamers.

It just seems that the price-to-performance ratio with 28nm should have been much better.

It will be interesting to see where prices are by the end of this year, when yields are optimal.

post #20 of 142 - 03-22-2012, 10:07 AM - HeadRusch

Quote: Originally Posted by Dashboard

Those reviews are impressive.

I hope Kepler will in fact drive AMD to cut the 79xx and 78xx prices down.

For a single monitor, I don't know if it's worth having such a powerful card except to be futureproof... 130 FPS in Skyrim at Ultra... giggydy giddydy ya!

I'm torn, I don't know what to buy.

Huh, where are you seeing 130fps in Skyrim? Skyrim has been scaling kind of crappily on higher-end GPUs, has it not? Regardless... what is the 110 Kepler you guys are talking about? Is that their single-PCB/dual-GPU solution?

post #21 of 142 - 03-22-2012, 10:16 AM - MSmith83 (Thread Starter)

Quote: Originally Posted by HeadRusch

Regardless... what is the 110 Kepler you guys are talking about? Is that their single-PCB/dual-GPU solution?

The GK110 is a single-GPU solution. It is rumored to have a massive die size of about 550 mm², and is expected to be released later this year.

post #22 of 142 - 03-22-2012, 10:25 AM - HeadRusch

Quote: Originally Posted by MSmith83

The GK110 is a single-GPU solution. It is rumored to have a massive die size of about 550 mm², and is expected to be released later this year.

Thanky, I didn't realize there were "two stops" on the Nvidia roadmap this year... looks like this is a more efficient 580 (yes, I know, different architecture), phase out the Fermis and bring Kepler to bear... but I guess the 110 is the one folks are waiting for?

I'm still tempted to upgrade, if only to see what that adaptive VSYNC is all about.

post #23 of 142 - 03-22-2012, 10:27 AM - rdjam

OK - snagged my two for SLI! Two of the EVGA cards are on their way overnight for my new i7 3930K gaming box.

Reviews look good out there.

This has some new features compared to Fermi, too. They changed the way the texture shaders are built/used and compensated by going with way more CUDA cores, added new anti-aliasing modes (TXAA), and changed how PhysX is handled, eliminating the old PhysX processor.

All in all, I felt it was worth the jump.

post #24 of 142 - 03-22-2012, 10:32 AM - HeadRusch

I think I'm going to skip the generic reference boards and wait for a TWIN FROZR overclocked or superclocked version, then push that a little further. Looks like everything being released today and tomorrow falls under the realm of "stock speeds". Nothing wrong with that; I'm sure the O/Cing potential is there. Can't wait to hear you guys' experiences, ESPECIALLY if you are upping from a 570 or 580 like I would be.

post #25 of 142 - 03-22-2012, 10:46 AM - MSmith83 (Thread Starter)

Quote: Originally Posted by HeadRusch

Thanky, I didn't realize there were "two stops" on the Nvidia roadmap this year... looks like this is a more efficient 580 (yes, I know, different architecture), phase out the Fermis and bring Kepler to bear... but I guess the 110 is the one folks are waiting for?

I'm still tempted to upgrade, if only to see what that adaptive VSYNC is all about.

I would imagine that it's mostly owners of multi-monitor setups who are waiting to see what the GK110 is about.

As for adaptive vsync, I think (I could be wrong) the idea is for vsync to be engaged when the frame-rate gets higher than the display's refresh rate, and then disengaged when the frame-rate is lower than the display's refresh rate. The problem with the latter point is that tearing will still occur (albeit to a lesser degree) as the output will not be perfectly sync'd to the display's refresh cycles. It is a nice option to have, but I'll stick with regular vsync and enable triple buffering when lag (or more specifically, frame-rate halving) is an issue.
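
If I have that right, the per-frame decision would look something like the sketch below. This is purely my own illustration with made-up frame times, not Nvidia's actual driver logic:

Code:
# Rough sketch of the adaptive vsync idea as described above: keep vsync
# on while the renderer can beat the refresh rate, drop it when it can't,
# avoiding the hard fall to 30 fps at the cost of some tearing below 60.
# Illustrative only, not Nvidia's implementation.

REFRESH_HZ = 60.0
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh


def present(frame_time_s: float) -> str:
    """Decide how to present a frame that took frame_time_s seconds to render."""
    if frame_time_s <= FRAME_BUDGET:
        # Renderer is faster than the display: sync to the refresh,
        # capping output at 60 fps with no tearing.
        return "vsync on  -> 60 fps, no tearing"
    # Renderer is slower than the display: show the frame immediately,
    # trading some tearing for smoother pacing instead of dropping to 30 fps.
    return f"vsync off -> {1.0 / frame_time_s:.0f} fps, possible tearing"


if __name__ == "__main__":
    for ms in (10, 15, 20, 25, 33):  # hypothetical render times in milliseconds
        print(f"{ms:2d} ms frame: {present(ms / 1000.0)}")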

post #26 of 142 - 03-22-2012, 10:52 AM - HeadRusch

Quote: Originally Posted by MSmith83

The problem with the latter point is that tearing will still occur (albeit to a lesser degree) as the output will not be perfectly sync'd to the display's refresh cycles. It is a nice option to have, but I'll stick with regular vsync and enable triple buffering when lag (or more specifically, frame-rate halving) is an issue.

I *thought* they said it would match the framerates to the monitor refresh rates, so you wouldn't get the "immediate drop in frames" when you got less than 60fps on your 60Hz monitor... almost like putting some kind of hardware triple buffer in there, so if you are at 45fps you'll get 45fps progressive without tearing.

Obviously... there are MATHS at play. SKYRIM is the last game where tearing drove me insane, due to both high framerates and low...

And not really tearing, more like microstuttering; you engage vsync and have to deal with the input lag (even with forced double or triple buffering, you still get the mouse lag).

post #27 of 142 - 03-22-2012, 11:02 AM - MSmith83 (Thread Starter)

Quote: Originally Posted by HeadRusch

And not really tearing, more like microstuttering; you engage vsync and have to deal with the input lag (even with forced double or triple buffering, you still get the mouse lag).

In my experience with Direct3D games, forcing triple buffering almost always eliminates noticeable mouse lag in games that have it with vsync on. This revelation first came about when I tried it with the game FEAR.

post #28 of 142 - 03-22-2012, 11:05 AM - HeadRusch

Quote: Originally Posted by MSmith83

In my experience with Direct3D games, forcing triple buffering almost always eliminates mouse lag in games that have it with vsync on. This revelation first came about when I tried it with the game FEAR.

I was using D3Doptimizer or whatever to try to force triple buffering with vsync on in Skyrim and always got "mediocre" results. With vsync turned off you get horrific microstuttering. I tried setting the read-ahead frames to 1, but that made things worse. Vsync on is perfect and you just deal with mouse lag AND the painful drop to 30fps when things slow down a bit...

But honestly, Skyrim is a bad offender... many other games don't give me grief about vsync. In fact, I almost always ran with vsync off and never noticed stuttering or tearing... *shrug*.

I am very interested to hear how the 680 changes that equation.

post #29 of 142 - 03-22-2012, 11:10 AM - Dashboard

Quote: Originally Posted by HeadRusch

Huh, where are you seeing 130fps in Skyrim?

In Tom's Hardware review, it hits 138 FPS at high settings for my resolution (the almighty 1680x1050).

These new cards will cost me a lot... card + new PSU + new 1080p monitor.

post #30 of 142 - 03-22-2012, 11:23 AM - MSmith83 (Thread Starter)

Quote: Originally Posted by HeadRusch

I was using D3Doptimizer or whatever to try to force triple buffering with vsync on in Skyrim and always got "mediocre" results. With vsync turned off you get horrific microstuttering. I tried setting the read-ahead frames to 1, but that made things worse. Vsync on is perfect and you just deal with mouse lag AND the painful drop to 30fps when things slow down a bit...

But honestly, Skyrim is a bad offender... many other games don't give me grief about vsync. In fact, I almost always ran with vsync off and never noticed stuttering or tearing... *shrug*.

I am very interested to hear how the 680 changes that equation.

Skyrim does have all kinds of oddities going on.

Forcing triple buffering on top of vsync has made for perfect gameplay with a number of games that were at the time taxing my GPU. Some games use triple buffering by default, and some others give you the option to enable it via menu (such as some older Tomb Raider games). Other Direct3D games, like FEAR and Just Cause 2, require use of D3DOverrider for this function.
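
To put toy numbers on the frame-rate halving part: with double-buffered vsync on a 60 Hz display, the GPU has to sit idle until each flip, so a game that renders a frame in 20 ms (50 fps worth of work) gets pinned to 30 fps, while a third buffer lets the GPU keep drawing and nearly every refresh shows a new frame. Here's a simplified simulation of that, purely illustrative and not tied to any real graphics API:

Code:
import math

REFRESH = 1 / 60   # 60 Hz display: one scanout every ~16.7 ms
RENDER = 0.020     # hypothetical render time: 20 ms per frame (50 fps capable)
SIM_TIME = 1.0     # simulate one second of gameplay


def next_vsync(t):
    """First refresh boundary at or after time t."""
    return math.ceil(t / REFRESH - 1e-9) * REFRESH


def frames_displayed(triple_buffered):
    """Count new frames shown on screen during SIM_TIME with vsync on."""
    shown = 0
    render_start = 0.0
    while True:
        render_done = render_start + RENDER
        flip = next_vsync(render_done)  # vsync on: swap only on a refresh
        if flip > SIM_TIME + 1e-9:
            break
        shown += 1
        # Double buffering: the GPU must wait for the flip before it can
        # reuse the back buffer. Triple buffering: a spare buffer lets it
        # start the next frame as soon as the current one is finished.
        render_start = render_done if triple_buffered else flip
    return shown


if __name__ == "__main__":
    print("double buffering + vsync:", frames_displayed(False), "fps")  # 30
    print("triple buffering + vsync:", frames_displayed(True), "fps")   # ~50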

I am definitely interested in knowing more about adaptive vsync. The chart I saw made it look like a simple algorithm that enables vsync when the frame-rate would otherwise be above the display's refresh rate, and then disables vsync when the frame-rate goes below the display's refresh rate.