My office rig has a Titan Xp on an EK block, and I don't recall the price going up from when I had 3x Titan Maxwells on EK blocks, so I don't see the price going up on them.
This card will be in my HTPC until AIB cards come out, but I am hearing that the PCB layout will be the same. If that is the case, it will be one block to rule them all.
Even if the air cooling solutions are better, one of those will go in the HTPC and the FE will go into the office PC and I will put it in my custom loop.
Definitely curious about the real-world performance of this line of cards, especially how quiet they are in operation under full load with the fans running.
Personally I'm interested in the 2070 (Ti) to potentially replace my 970 and improve 4K playback, gaming, and some video editing. Hopefully my 4790K can tide me over for another year or so.
Nvidia is claiming 1/5 the noise output of current Pascal cards. I am assuming this is compared to the current FE cards with a single fan blower cooler.
My 1080 Ti FTW3 is one of the louder cards, mainly due to the aggressive fan ramp but I certainly wouldn't call it loud.
Your 4790K should be fine. At 4K the GPU is the bottleneck, though maybe Turing will change that. I only really see these new cards making the CPU the bottleneck for lower-end CPUs with low clocks and thread counts, coupled with a high resolution like 4K.
Prices aren't quite as high as I expected. I'll wait for the 3080 Ti or whatever the next model is called, which will probably hit around the time Intel has its feeble discrete GPUs on offer.
I'm also still running a 970 at 1080p with a 4790K, and I have been looking at the EVGA FTW2 and FTW3 cards.
Since this news finally came out I'm thinking about upgrading to a 2080 or 2070 and then getting a 1440p monitor.
The problem now is waiting on reviews, and it looks like an EVGA card with dual BIOS might be 4-5 months away, if they even release a card with dual BIOS. Hope they do.
Yup, they made sure to not make a single statement about rasterized performance.
My buddy who sells graphics cards as part of his business has a theory which actually makes quite a lot of sense.
Oversupply of 10xx-series chips is a real thing (distribution told me that); the rest of this is a guess. The oversupply is due to mining.
The 2080 was ready months ago, but they didn't want to release it because of the oversupply. Hence the third-party cards are all ready to go on launch day from just about every manufacturer.
They waited and waited, but now the Ti is ready... with the 7nm die shrink around the corner (AMD said 7nm Vega is releasing this year).
So now they have no choice but to release it and price it above the 10xx series so they can still get rid of their 10xx-series cards.
It's certainly plausible... I also think they were enjoying the stretch the 1080 gave them. Nvidia can afford to sit on technology, but board manufacturers cannot; they're always marketing the next upgrade, and when they have nothing to sell, they stagnate.
I'm still amazed we are seeing Tis ship on day one. I have to ask: why wouldn't you buy a Ti, knowing full well they usually offer a good uptick of performance for their price premium? They are generally marketed toward the crowd that is already itching for the next upgrade, which is why they always deliver a solid 25-35%+ more oomph than the normal-clocked boards, making them a legitimate upgrade path, not a lateral move.
But when you have the 70, the 80, the Ti, and probably a Titan or three coming as well... that's a lot of choice all at once. You have to ask why they aren't holding the Ti parts back. People need time to digest, then you bring them back to the buffet.
Yeah, that is why his theory makes so much sense. I have bought Titans since the original Kepler generation. The Titan was always the fastest, the earliest. Then the Tis would come out later, usually six to nine months after, and roughly match the performance. Over time that gap closed and the Tis came out even faster.
Now we are at the point where the theory really checks out, that Nvidia was sitting on these cards, which is why the Ti is available at launch and not only the Ti but the 3rd party cards are launching in parallel. (I would typically call them AIBs but I had heard all the PCBs are the same.)
I mentioned earlier that I already have a 2080 Ti preordered for my HTPC, but I am worried that there may not be that 25-35% improvement over my 1080 Ti FTW3 in normal rasterized workloads. It also feels weird to pay the same price for the 2080 Ti that I paid for my Titan Xs, but without the extra 1GB of RAM and launching in tandem with all the other cards. (I realize that sounds extremely superficial, and admittedly it is, but where is the premium pricing coming from!?)
7nm GPU dies are likely around the corner, so they are stuck between trying to dump Pascal cards while making them look like a deal and getting these Turing cards out ASAP before 7nm is a thing. It seems their hands are forced.
All those things combined make me wary of these 20xx cards...
This looks like it's going to be 10-15% faster than a 1080 Ti except for ray tracing, which won't really be supported by much for a long time. I'd wait until sometime in 2019 and see what comes.
The Turing architecture is a much more exciting proposition for general content creators and enterprise. But that won't stop company analysts and TV personalities who don't even game from saying things like "Nvidia's newest chips are a must and the only choice for gamers!" as they kneel before the shrine of Jensen Huang.
We'll see; I'm waiting to find out whether Kyle at HardOCP signs the NDA to get review samples. I expect... honestly? I expect the tick in the tick-tock cycle. So if it's normally a 25% improvement, it's 35%. Now if they blow us all away with 50%+, OK, I like a seasoned mayo with my crow.
There have been some early impressions from demos with ray tracing at Gamescom, but with no benchmarking tools. However, at ResetEra there was a thread discussing how Shadow of the Tomb Raider was running between 30-40 FPS at 1080p with the 2080 Ti. To be fair, that code may not be fully optimized, and if I understand correctly RT will not be included at launch.
That may not bode well for the 2070 in regards to RT performance, and conventional performance metrics will decide whether I go for the 1070 Ti versus the 2070.
The RT tech is really cool, but it will likely be a couple of years before it is widely supported and running well on cards that are not the bleeding edge.
As expected. Even the smallest of ray tracing features will destroy performance. It is indeed much too early to get excited about the tech for gaming purposes. There will be very few graphical "oohs" and "aahs" like those of the Crysis days until full-on ray-traced gaming becomes a reality.
Performance may be better once developers have more time with the hardware and code. The rumor is that the SOTTR and BFV teams have only had a few weeks to implement the ray tracing features, so there is still plenty of opportunity for things to improve, at least for the 2080 cards; the 2070's RT may be a complete nonstarter.
I wonder if Nvidia will release new GTX cards that do not have the RTX silicon to fill the gap below the $600 price point.
Looks like Nvidia is releasing frame rate and frame pacing numbers for their demos. Scuttlebutt is that some games are scaling to 45%+ while others are in the 20% range; so, you know, the kind of variance we see when it's GPU vs. CPU and so forth. If the improvements are closer to 50%, as others are stating online, that would explain the price point, putting it closer to Titan territory.
I guess if you're a gamer desperate to show off the highest FPS number in a bench (or if you produce content and can make use of the real-time ray tracing) they might make sense despite the astronomical price, but for HTPC use I have no reason to upgrade my 1080 Ti until they offer a model with HDMI 2.1 (48 Gb/s of hardware bandwidth vs. the 18 Gb/s of HDMI 2.0). For me that was the deal breaker. I can always use more power with madVR, but these GPUs will be obsolete in 6-12 months at most, once 7nm models are released, bringing down the 285W power draw and bringing HDMI 2.1 support (which I'm looking forward to for RGB 4:4:4 10/12-bit support at 60p and above).
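To put those bandwidth numbers in perspective, here is a rough sketch of why 4K60 RGB at 10 bits per channel doesn't fit in HDMI 2.0's 18 Gb/s link. It assumes the standard CTA-861 4K60 timing (4400 x 2250 total pixels including blanking, a 594 MHz pixel clock) and HDMI 2.0's 8b/10b TMDS encoding; treat it as a back-of-the-envelope estimate, not a spec-exact calculation.

```python
# Rough HDMI link-bandwidth estimate for 4K60 RGB.
# Assumes CTA-861 4K60 timing (4400 x 2250 total, incl. blanking)
# and HDMI 2.0's 8b/10b TMDS encoding (25% on-the-wire overhead).

def tmds_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    pixel_clock = h_total * v_total * refresh_hz       # pixels per second
    data_rate = pixel_clock * 3 * bits_per_channel     # bits per second, RGB
    return data_rate * 10 / 8 / 1e9                    # add 8b/10b overhead

print(tmds_gbps(4400, 2250, 60, 8))    # ~17.8 Gb/s: just fits in HDMI 2.0's 18
print(tmds_gbps(4400, 2250, 60, 10))   # ~22.3 Gb/s: needs HDMI 2.1
```

This is why HDMI 2.0 sources fall back to 4:2:2 or 4:2:0 chroma subsampling for 10/12-bit 4K60, and why the commenter is waiting for HDMI 2.1's 48 Gb/s for full RGB 4:4:4.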
No idea why anyone would pay that price for a GPU (that's 100% more than I paid for my 1080 Ti a year ago), and it's very bad news if people keep their pre-orders and don't send Nvidia the right message: at that price, without HDMI 2.1 or 7nm and with a 285W power draw, stick it where the sun doesn't shine!
I hope these will be a huge failure and that they will come back to more reasonable pricing by the time the next models launch.
This is an entire thread of nerd-dudes with The Spendies. Benchmarks? Puckit, I'll take Two sight-unseen then complain later if they don't live up to my expectations!
Cuz thats how *we* roll, you see....
Also: none of you need this card; you're all getting acceptable framerates today. Admit it. Ultra Grass in GTA V is not a reason to spend a grand. Those benchmarks above show a bunch of ****ty games running twice as fast, a 100% improvement... Ark and PUBG... well hold on there, let me get my wallet.
Now you show The Witcher at a 2x improvement, people gonna shop.
But curiously they aren't showing The Witcher...or any game that, you know, actually struggles to hit 4k/60 let's say on the top tier hardware.
At 4K, Pascal cards frequently drop below 60 fps in any demanding game.
A ~30% improvement at minimum in normal rasterized loads, with the ability to double that performance in DLSS-optimized games, sounds quite enticing from the limited data available.
I was worried that we wouldn't see many DLSS-enabled games, but the fact that Nvidia does all the work and you basically get free performance through machine learning, putting some very high-tech cores on a graphics card to use, is actually really neat!
Currently I don't dare enable anti-aliasing as it is far too expensive, but if I can enable it for free performance-wise, that sounds quite excellent.
I thought I'd be building my PC around this card, but so far it's not exciting me. Especially at that price. Guess I'm waiting to see what AMD can provide. Especially on the CPU side; I think I'm gravitating back to AMD.
Navi won't be out until 2019. Maybe even 2H 2019. It's gonna be a long wait. I might be grabbing Zen 2 though.
I've been wanting to upgrade my system as well (3770K/390), but then the mining craze happened and that put everything on hold. And then Nvidia started acting a fool with their anti-consumer/monopolistic practices (GPP, refusal to support Adaptive-Sync, releasing multiple cards with the same name but different specs on unsuspecting consumers, GameWorks, disabling and re-enabling overclocking) and I'm done with them. Both AMD and Nvidia have done shady things in the past, but lately Nvidia just doesn't seem to give an eff about consumers, and considering people are still lining up to buy their cards (especially the 20xx series with their bloated prices), why would they change? I guess I'm to blame as well since, like probably most of you, I've owned my fair share of Nvidia cards in the past, but until they change their ways I won't be supporting or recommending them.
Still, I'm a PC nerd, and the ray-tracing in BFV did look sexy. A bit too shiny but I can't wait for that level of eye-candy to be available in the mainstream without the massive performance hit.
Just a hunch, but I think if the 20xx series truly had a massive jump in raster performance, Nvidia would have been showing it ever so proudly (even if raster had not been their main push; RT was).
I hope all the skepticism is unfounded, but something just feels a little suspect at this point. The other factor here is the price of admission: the 2080 Ti had better laugh at the 1080 Ti in order to justify the price difference (and not just when it comes to RT). No way I would pre-order just to be 'first'; we know there will be technical issues out of the gate (especially regarding the drivers).
That said, once these are solid and shown to be worth the coin, I'm down for a 2080 Ti (to replace my aging, tired, and worn-out 1080 Ti).
I'm with you; it could really go either way. The great thing about these pre-orders is that you can cancel if it's not up to snuff, nothing lost.
I think they were banking on everyone going nuts for the ray tracing, but it didn't really pan out once we all saw the benches. Hopefully on the 12th we will get to see what's what. If it's not at least a 30% jump over the 1080 Ti, it will be meme'd to death, lol.
I agree that it's going to be a while but this has been tech that's been out of reach for so long that the mere fact we're so close is pretty exciting. Like you mentioned earlier too it'll be interesting how this plays out with the next-gen consoles on the horizon.
That is the whole ballgame, IMO: the AAA devs going hog wild on new RT IP and remastering classics with it. Meaning the console market will have to lead with next gen. If MS and Sony can fully optimize and mod RX 580/Vega-series GPUs the way MS did with the XB1's Jaguar, and deliver RT at a locked 1080p/60 in real time and a locked 4K/30, then it will be Katy bar the door on RT. A decent 4K HDR UHDTV panel will have no trouble upscaling a 1080p/60 RT image to pseudo-4K, or rendering 4K/30, though I suspect the lag would be awful at live 4K. Bottom line: PC gamers want 'perfection' in across-the-board performance as they define it. Console gamers want 'pretty', awesome visual fidelity with tolerable lag on their HDTV panels, period. The RTX 2080 is just not ready for prime time for full 4K RT on PC yet. But it could deliver even more of the visual awe that console gamers look for on a fully loaded 1080p platform.
IMO the 7nm cards in development by Nvidia and AMD (2019-2020) will deliver the goods on 4K RT for PC gamers who demand 4K at 60-100 FPS with 1-5 ms lag (along with unprecedented 1080p-1440p specs with RT). But the RT-modded, sunk-cost AMD 580 and Vega-series GPUs will likely be the architecture MS and Sony deploy for their next-gen consoles, and that will be the catalyst for AAA game devs to throw their full weight behind RT tech (as the showcase technology for the new consoles). That's why I called the RTX 2080 a grossly overpriced (for me) entry-level card for RT. It's just akin to what AMD will do with MS/Sony in the new consoles (at a fraction of the cost). As I said earlier, $1000 for 2 games is waaay too rich for my blood, especially when they will likely look better on the big screen through the XB2X and PS5.
In conclusion, I think it is tremendous that Nvidia blazed the path with these 20xx RT cards. It is just a peek, however, at the goodness they will have coming for PC gamers in their 30xx-50xx cards (remember, 8K is right around the corner). Big Green also prodded MS, Sony, and the game devs to accelerate into next gen, IMO. By aggressively launching RT via the RTX into the PC space (which they dominate), they are sort of pushing AMD to accelerate its development calendar in the console space. A lot of game devs will thus use the RTX 2080 (a modded 1080, as I see it) as the template for developing and optimizing next-gen RT games for the new AMD-based consoles and UHDTVs, while MS and Sony will mod their AMD GPUs to match or crush that spec in their new hardware on a 'fixed' UHDTV platform. NV will move more aggressively upstream in the PC gaming space. That's pretty much it for me on how I see this card and topic.
Having been interested in ray tracing since the first home-computer RT pics were shown in an Amiga magazine back in the late 80s, when even a very simple scene would take many minutes to hours to render, the fact that we are actually getting some decent real-time ray tracing in home computers now seems pretty amazing to me and does deserve the hype.
That said, for games and such, it may not be until the generation after this set of cards that it truly begins to work, especially for sub-x80-level cards (I've heard that this round they won't even bother with RTX on x60-class cards, since gaming seems to dominate the home market these days and it won't be fast enough for that purpose to make it worthwhile).
Looks like normal rasterized workloads are seeing quite a nice boost if these numbers are legit. I was just thinking last night about cancelling my pre-order but now I am holding on...
45% is the average across those 14 titles, not the best case, and as the embargo lifts there will be more numbers to go on, so it could be better or worse. This also only takes into consideration normal rasterized workloads and not games that leverage DLSS, which can double the performance for some titles.
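For a rough feel of what those percentages mean in actual frame rates, here is a toy sketch. The 60 fps baseline is a hypothetical example; only the ~45% average and the "DLSS can double it" claim come from the posts above.

```python
# Toy illustration: translate claimed uplift percentages into frame rates.
# The 60 fps baseline is a made-up example; only the ~45% average and the
# "DLSS up to 2x" claim come from the thread.

def uplifted_fps(baseline_fps, uplift_pct, dlss_factor=1.0):
    """Apply a percentage raster uplift, then an optional DLSS multiplier."""
    return baseline_fps * (1 + uplift_pct / 100) * dlss_factor

base = 60  # hypothetical 1080 Ti result at 4K
print(uplifted_fps(base, 20))                    # low end of the rumored range
print(uplifted_fps(base, 45))                    # the rumored 14-title average
print(uplifted_fps(base, 45, dlss_factor=2.0))   # with DLSS doubling on top
```

The point of the arithmetic: a 45% raster uplift turns a borderline-60 fps 4K title into a comfortable one, and the DLSS multiplier, where supported, is what gets the headline 2x-style numbers.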
I don't think anyone contested the price of the cards, yes?
AVS Forum