Buy 2080Ti now, or wait for holidays?

Knowing Nvidia's typical cadence, it's probably a small die that comes out first. The x80 Ampere card will likely be 10-15% faster than the Titan RTX at $799 FE. The reason the next x80 won't see a 980 Ti to 1080 style jump of ~30% from the die shrink is that the 2080 Ti is gigantic: at 754mm², it's 25% bigger than the 980 Ti's 601mm².

You'll have to wait for 2021 for the big Titan/Ti Ampere card.
Maybe, but the problem is you'll get better-than-2080 Ti performance on a much smaller die. And it's on EUV, so Nvidia can make big dies with roughly 14nm-like defect rates, much better than TSMC 7nm, which is the last node possible on DUV and is now great for small dies but only okay for mid-size ones. It means small and mid-size dies (say, up to 250mm²) will cost Nvidia very little to produce compared to what AMD pays now, and will be in line with the sizes of the Turing chips. On top of a more than 50% performance increase for the same transistor count, there should also be roughly a 1/4 die shrink for Ampere. So a GPU with the same features as the 2080 Ti would be under 200mm² on Samsung 7nm and could run 50% faster at the same TDP, though mind that Nvidia would prefer to drop power consumption instead of preserving the full speed benefit (so they announced), so it may end up around 20-30% better. And mind that at 200mm² this won't even be the middle of the Ampere lineup; it will be more like the replacement for the GTX 1660 Ti. AMD is using a 251mm² GPU on TSMC 7nm for its 5700 XT, and while TSMC's 7nm DUV is not quite as efficient and dense as Samsung's 7nm EUV, that doesn't explain everything: Nvidia's Turing architecture is far ahead of the RDNA 1 used in Navi. AMD needs to show what they can do better with RDNA 2. If it's not much more efficient than RDNA 1, they won't be even close to Ampere.
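As a quick sanity check on that sub-200mm² figure (a rough sketch: the 25 MTr/mm² density for the 12nm-class node is an approximate public number, and the 200mm² target is the claim above, not a confirmed spec):

```python
# Density a sub-200 mm^2 Turing-class die would require (approximate figures).
tu102_area   = 754.0  # mm^2, 2080 Ti die on TSMC 12nm
density_12nm = 25.0   # million transistors per mm^2, rough public estimate
claimed_area = 200.0  # mm^2, the figure claimed above

transistors_m    = tu102_area * density_12nm     # ~18,850 million transistors
required_density = transistors_m / claimed_area  # MTr/mm^2 needed
print(f"Required density: {required_density:.0f} MTr/mm^2")  # ~94 MTr/mm^2
```

That works out to roughly 94 MTr/mm², more than double the ~40 MTr/mm² that shipping 7nm GPUs were hitting, so the sub-200mm² figure assumes very aggressive density scaling.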
In fact Nvidia's Turing architecture is so much better that they could afford to use a much older 16nm+ class process than AMD and still dedicate roughly half the chip to ray tracing and AI hardware that the raster pipeline doesn't use, knowing that the raster part alone, on only half the chip, beats AMD's whole chip devoted to raster on a recent 7nm process. It's a doubly unfair comparison for Nvidia, and Nvidia still beats AMD. And mind that Nvidia makes huge money on its GPU sales, unlike AMD, which leans on its CPU success to prop up its GPU business against Nvidia. And now Nvidia has had three years free since launching Turing to prepare an all-new architecture designed for a cutting-edge node this time.

Not a big fan of any company, but I am very impressed by Nvidia lately. I look at AMD only if it's cheaper, and AMD knows that.
 

Hmm, well I was mainly looking at AMD's progress: they went from a 496mm² die to a 251mm² die with about a 10% performance increase going from 14nm to 7nm.

So I assumed something similar would happen with Nvidia: a pure die shrink would reduce the 754mm² 2080 Ti to around 375mm² on 7nm. Are you saying Samsung's 7nm is closer to a 50% reduction in size than TSMC's 7nm? Otherwise I can't see a pure die-shrunk 2080 Ti being 250mm², much less 200mm².
 

He's just wild ass guessing, and is WAY off the mark.

All the 12nm, 14nm, and 16nm GPUs have about 25 million transistors/mm² density. AMD's 7nm GPUs average about 40 million transistors/mm².

Calculating from the actual transistor densities, a 2080 Ti shrunk to 7nm would be:
754 × 25 / 40 = 471 mm².

So still a big expensive die.
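For anyone who wants to plug in different numbers, the estimate above is just a density ratio (using the approximate densities quoted above, not official specs):

```python
# Scale the 2080 Ti die area by the ratio of transistor densities.
turing_density  = 25.0  # million transistors / mm^2 (12/14/16nm class, approx.)
amd_7nm_density = 40.0  # million transistors / mm^2 (AMD 7nm, approx.)
tu102_area      = 754.0 # mm^2, 2080 Ti die

shrunk_area = tu102_area * turing_density / amd_7nm_density
print(f"7nm TU102 estimate: {shrunk_area:.0f} mm^2")  # ~471 mm^2
```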
 
I was hoping Nvidia would release a 2080 Ti Super as a Star Wars edition round about now, but alas, no.

That they haven't done that, in any shape or form, says to me Nvidia is just sitting on expensive-to-make chips, and until AMD threatens them in any way (and that won't happen till mid next year at the very earliest) they're certainly not gonna drop their prices any time soon.

I now think the Titan Xp Star Wars edition was a rarity, as it came off the back of the crypto boom and I suspect Nvidia had a surplus of full-fat chips they could no longer sell en masse to miners. So they flogged them off as the Star Wars edition cards and, rather crazily, for less than the Titan Xp, which they carried on selling for £250 more during and even after the Star Wars cards had all gone OOS by March. :confused:

They certainly won't be doing that again, not with their expensive process they're currently on.

Those were an odd card, and the crypto boom was still plenty relevant (though yes, trending down) at the time those cards were selling. Heck, people were buying them for $1134 (IIRC) and then reselling them on eBay for $1600 in between inventory fill-ups on the Nvidia site. The only people buying those cards en masse at that time were crypto miners.
 
RTX isn't worth it this gen. Unless you find a good price for a lateral move, like I did, just stay with a 1080 Ti. That's the card that is the value/performance leader right now. My 2080 is barely faster, and I've found zero benefit to RTX.
 
Ditto. Haven't even used RTX yet. I know some people enjoy it which is cool, but it really is a pretty expensive feature right now that will be a lot better next gen.
 
For a few days one model of 2080 Ti slipped close to $1100, which is the lowest it has ever been in these parts (CEU). So I got one. And just in time as well, because the price went back up hours after I ordered.

Gigabyte GV-N208TTURBO-11GC rev 2.0

Yes, it is a blower, but I had never had one before, so I figured let's try it, it can't be that bad. Actually, it is worse than I ever would've imagined. At the slightest stress the fan ramps up and within 5 seconds becomes obnoxiously loud. I read a professional review of this card after I bought it (I didn't know I was getting a rev 2.0 until after I got it) and it turns out it has a more aggressive fan curve than normal, but I doubt it would be much better with the regular curve; it might only take a few more seconds to reach full blast, since despite being loud it also runs hot. Idle is 40-45C and under load 75-80C, with thermal throttling kicking in at 84C, so that's not much headroom.

But I knew I had options; I'm not stupid, I was planning for this exact scenario. So I got an aftermarket cooler for it with the leftover change:

Accelero Xtreme IV

It was a bit of a hassle to install, and since this card has all components on the top side, VRM and memory cooling is not great, as this cooler only has direct-contact cooling for components on the back side of the PCB.

Anyway, this cooler is far better than any factory GPU cooler I've seen. It might not look great, but it is very functional: idle temp is 27-30C, and the highest temp reached running Time Spy Extreme was 56C. The fans won't even start ramping up at that point, not that it would matter, as the max RPM on these fans is less than the idle RPM of the blower. Even with the price of the cooler included, I'm still at least $30 below the next cheapest card, which doesn't come with a 3-year warranty.

And no, replacing the cooler does not void the warranty, as Gigabyte doesn't put stickers on the screws; I can re-mount the stock cooler any time.
 
At this point I would definitely put a 2080 Ti purchase on hold. With Big Fury/Big Vega/Big Radeon VII last year and Big Navi supposedly set for an announcement at CES, it definitely makes sense to wait for CES. It could lead to 2080 Ti discounts...

I would HIGHLY recommend going ahead and buying the 2080 Ti and being happy for a couple more years. It's the same story with the GPU department at AMD every year. If you couldn't afford a 2080 Ti over the last year, chances are you really can't afford it now. And IF, and that's a huge grain of IF salt, the new Navi is more powerful, you're still within the return window.

Play now. There is always something later....
 
Being late in the cycle, I'd wait for the RTX 3080 Ti now. I have two 1080 Tis and am gonna wait myself. Not impressed with the performance/price.
 
Not true, pendragon. All we are saying is: if your card is getting you by OK, save the gotta-have-it itch for the next generation and splurge then. That's all.
 
Yeah, but if they wait for the 3080, then there will be something else new right around the corner, and they will have to wait for that, and on and on and on....
I think much more relevant is that we don’t have any word on when a 3080 (Ti?) might be available. Unless I missed something, it could easily be late 2020 or later before anything new is released, for all we currently know.

I agree with waiting if whatever you have is working, though. Might as well get some more use out of it, and the new cards may get announced in the mean time.
 
If you can find a 2080 Ti in the $700-800 range then get one. If not, then wait, especially if you have a 1080 ti/Titan X Pascal level card.
 
I think I am more monitor/desire limited than video card limited. I would like my next upgrade to be an ultrawide 5K monitor, but 120Hz+ and HDR with 1000 nits minimum, which would go along with a much beefier GPU. At 1440p the 1080 Ti or 5700 XT is doing great. The closest monitor to my ideal is the Samsung CRG9 series C49RG90S, a 49" dual QHD 5120x1440 panel at 120Hz, though I'm not sure I would like the extra width on that monitor. The 2080 Ti was not worth the money even when it first launched, and even less so now. Unless you can pick up a good used 2080 Ti for cheap or have a lower-end card to begin with, I would say only buy if your situation would benefit, or greatly benefit, now; make it a practical decision, not an ePeen one. For some who can now afford one, everyone is different: get what you want, go for it. For others, it might not be a good time to buy.
 
Since AMD made no announcements about RDNA 2 or Navi 21 at CES, and they presumably would have if something were ready to launch, it seems Nvidia doesn't have to rush anything. And they clearly aren't, since they're fully aware of what AMD is doing given they share the same manufacturer, TSMC. They will probably announce something coming on the Ampere architecture for the second half of 2020 at Nvidia's GPU Technology Conference, March 22-26, and AMD will have something to announce at Computex 2020 in Taiwan in June, like Zen 3 for the end of the summer and RDNA 2 GPUs for the end of the year, including the PS5 and Xbox Series X. Nothing more to expect until then. Your fresh 2080 Ti won't be obsolete in a couple of months as I believed it would be.
Sleep well. :yawn:
 
 
I don't think AMD has ever said they will make an "Nvidia killer". They have produced cards to compete with and perhaps outperform Nvidia cards. And as long as there is competition, that is good. If AMD can bring competition to the top-of-the-line video card segment, that is AWESOME for us.
 
Got a Zotac 2080 Ti for around $820 all included, just to play Control :) Probably not worth the $400 premium after selling my 1080 Ti here, but at my 4K UW resolution, the 2080 Ti does make up to a 40% difference over the 1080 Ti in many games now.
 
So my card is dying. It still sort of works, in that I can have a long gaming session, but eventually I will get white squares all over the screen and the computer will lock up most of the time. Sometimes it just gives a driver crash and the OS stays up. I put the card in my old i7 930 system and got the same white squares all over the screen, so I think it is safe to say the card is the problem. It could be my imagination, but I think it has been slightly more stable since I checked "Force Constant Voltage" in MSI Afterburner. What is odd to me is that sometimes I can game for several hours, and then after exiting the game I will get the white squares on the desktop. I ran this card stock for almost a year, then installed an Arctic cooler and overclocked while maintaining pretty good temps, around 60C under load.

This was the most expensive video card I've ever purchased so I am bummed it only lasted 3.5 years (Probably the third video card I've had straight up die or start to die in over 20 years of buying gaming cards). I don't know if I should buy a cheap card as a stopgap until the new cards are released, or buy a 2080 super for similar performance, or just go for the gusto and get a 2080ti and be happy with that for a few years.
 
Nvidia's next card announcement is just a little over a month away. Once again, Big Navi will battle Nvidia's last gen while the 30 series is announced. Keep waiting.......

With luck it will compete with the RTX **70 level card. Hopefully a bit closer to the 80, but we'll see. AMD clearly won't be competing at the top performance levels.

Marketing can make a big difference. If they market it as a 2080 Ti competitor, it won't pan out so well. If they market it as a **70 competitor and it comes out around the same time, it will compete well in that space.
 
That blows, man, sorry to hear it. Usually if they last 3.5 years they are good for a long time. Are you positive the PSU isn't giving up on you?
 
Same symptoms in two systems, and my older X58 system has a newer Corsair AX760 PSU that has not seen much use. I was really hoping it was my PSU or motherboard causing the problem, but I don't think that's the case. I really wish I still had my old GTX 980 that I sold off, because I just can't decide if I should get a card in the $300 range and then upgrade in 6-9 months, or just get a 2080 Ti right now. I had to fight the urge to drive over to Microcenter and grab a 2080 Ti today. The ~$300 stopgap card will be the better solution in the long term, albeit a more expensive one in the end, but a 2080 Ti right now would feel really good.
 
It's a tough spot, I agree. My prediction is we won't get the 3080 Ti (in good supply) until September, so waiting is going to suck. I'd pick up something used in your price range from the local hardforum here; it's where I buy most of my cards now.
 
And that's assuming they launch with a 3080 Ti; they might revert to what they did in previous generations and launch with a Titan and a 3080, with the 3080 Ti following many months later.
 
NVIDIA knows how to play the game.
My bet is that NVIDIA will wait for AMD to play its hand... and then counter.

NVIDIA gains nothing from going early... besides killing off its own Turing sales.
 
If these core counts for the 3070 and 3080 are to be trusted, they are basically midrange cards. So Nvidia isn't really going early, because AMD already played its hand with the 5700 XT; all Nvidia has to do is wipe out AMD in the midrange with their 7nm cards. They can wait for Navi 23 before deploying the 3080 Ti.
 
They can wait for AMD before they do anything... again, they have no need to impact Turing sales.
 
When has NVIDIA ever waited for AMD to show their hand before releasing a new product line?
 
I think the 20 series was an oddity due to RT being new and requiring so much GPU ass to run. I fully anticipate a 3080 Ti six months after the initial 30xx series release, like previous generations. I highly doubt it will be at or close to the initial series release. But who knows? Nvidia, maybe?
 
When has there been an AMD parade to rain on in the GPU sector? They've been beating them since Hawaii.

Look at the recent price change on the RTX 2060... it rained on AMD's parade and forced them into the BIOS debacle (the non-verified 14Gbps memory issue)... NVIDIA still has its eye on the ball.
 
AMD needs to get better at predicting Nvidia's responses. Not only that, Nvidia has much more leeway in pricing. It's almost as if AMD did not want a price war at this time (my opinion).
 
Nvidia right now is in the driver's seat. However, I do not think they are sitting idle; I would not be surprised if there are significant ray tracing efforts going on. While their first debut was not all unicorns and rainbows, it was not something that made me say "well, that is dumb". I personally want to game in a ray-traced world, since the day that translates into VR will be glorious.
 