Buy 2080Ti now, or wait for holidays?

What's the likelihood of the first-gen RTX cards dropping 25-30% in price after the launch of the 3xxx series cards? We haven't really seen a new value alternative (other than what can be argued for the 5700) since the original launch of the 1080/1070-class GPUs.
 
What's the likelihood of the first-gen RTX cards dropping 25-30% in price after the launch of the 3xxx series cards? We haven't really seen a new value alternative (other than what can be argued for the 5700) since the original launch of the 1080/1070-class GPUs.
New? Absolutely none. The used market will depend on demand and what people are willing to spend.
 
New? Absolutely none. The used market will depend on demand and what people are willing to spend.
On the contrary: absolutely yes. It depends on Nvidia's policy.
When Nvidia launched Turing, which had been announced some months in advance, they priced it high so that the remaining Pascal GPUs on the market could sell through.
When Nvidia launched Maxwell, it came as a bit of a surprise. They announced that Kepler would become much cheaper, and all Kepler cards got a huge price drop, with Nvidia saying their drivers wouldn't be supported for new features. So they sold at really good prices, sometimes half price, and disappeared from the market within a month of Maxwell's launch.
Because of the jump in lithography from 16nm+ (12nm) to 7nm+ EUV (maybe even 5nm, we don't know), Ampere will make a huge difference in performance, so there may be a huge price drop on Turing cards before launch as well. It won't play out the same as Maxwell-to-Pascal or Pascal-to-Turing, because of the competition from AMD's RDNA2. So my bet is you may get a Turing card this spring for half price, but even then it may not matter against Ampere. Low-end Ampere may be on par with or better than the 2080Ti. So wait for Ampere whatever happens.
 
Low-end Ampere may be on par with or better than the 2080Ti. So wait for Ampere whatever happens.

If that were the case, Nvidia would sandbag the shit out of it. They wouldn't launch it at those specs.
 
Might be like the Pascal era, where the x70 equaled the previous Ti. That's definitely not a low-end SKU.

The problem is that a Pascal-era 1070 was $379. A Turing-era 2070 launched at $599. Ampere era? I would definitely bet on more than $499, but I wouldn't be surprised at $599 again. Any way you cut it, not a low-end SKU. I would hope that by this point a $299-$349 3060 would beat a 1080Ti, but based on the last launch, not likely.
 
I just don't see it personally. I would bet that the 3070 matches the 2080 at $500 and the 3080 at $799 is ~5-10% better than the 2080Ti. The forthcoming $1199 3080Ti will be 30% better than the 2080Ti.
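For what it's worth, here's a quick back-of-the-envelope value check on those guesses in Python. Every price and performance delta below is speculation from this thread (plus my own assumption that a 2080 lands at roughly 80% of a 2080 Ti), not confirmed specs:

```python
# Back-of-the-envelope price/performance check on the guesses above.
# All numbers are thread speculation, not confirmed specs.
# Baseline: 2080 Ti = 1.00 relative performance at its $1,199 launch price.

cards = {
    # name: (relative performance vs 2080 Ti, speculated price in USD)
    "2080 Ti (launch)": (1.00, 1199),
    "3070 (guess)":     (0.80, 499),   # assumed: 2080-class is ~80% of a 2080 Ti
    "3080 (guess)":     (1.075, 799),  # midpoint of the ~5-10% claim
    "3080 Ti (guess)":  (1.30, 1199),
}

baseline = 1.00 / 1199  # 2080 Ti performance per dollar at launch
for name, (perf, price) in cards.items():
    value = (perf / price) / baseline
    print(f"{name:18s} perf={perf:5.2f}x  ${price:4d}  value vs 2080 Ti: {value:.2f}x")
```

Even under those conservative guesses, the hypothetical 3070 works out to nearly double the 2080 Ti's launch-day performance per dollar, which is why the mid-range price points matter more than the halo card.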

That's a reasonable expectation. I would only expect mild shifts in that, based on what AMD brings to the table. But it probably won't shift things much, because AMD, just like Nvidia, is interested in maximizing profits, not upsetting the apple cart.
 
That's a reasonable expectation. I would only expect mild shifts in that, based on what AMD brings to the table. But it probably won't shift things much, because AMD, just like Nvidia, is interested in maximizing profits, not upsetting the apple cart.

To expand a bit, if Nvidia can get their BOM down versus Turing, they just might launch lower.

One aspect of revenue generation that they've consistently leveraged has been market share, and the RTX cards in general have been noted as poor value for those with previous-generation GPUs. Nvidia would likely just as soon grab the customers who waited, like myself, as simply extend their price/performance curve higher and leave their current lineup as-is.

If they don't go after different customers than they did with RTX (note I'm purposefully leaving the non-RTX Turing GPUs out), they're leaving a wide opening for AMD to grab market share, and putting themselves in a worse position to compete with an upstart Intel a cycle or two down the road.


[note: I expect AMD's Radeon division to get its lunch eaten by Intel. I do hope they manage to carve out a niche for themselves and for the community; however, Intel is likely to work its way through AMD's most profitable graphics segments before it starts to challenge Nvidia at the top end. And supposing Intel gets its fabrication operations back in order, that's a level of competition AMD has yet to see in the graphics space.]
 
I just don't see it personally. I would bet that the 3070 matches the 2080 at $500 and the 3080 at $799 is ~5-10% better than the 2080Ti. The forthcoming $1199 3080Ti will be 30% better than the 2080Ti.
High-end Ampere, the Quadro/pro/Titan version (meaning what Volta/V100 used to be), is supposedly going to be 18 TFLOPS FP64, meaning 36 TFLOPS FP32 at the usual 2:1 ratio.
Because AMD's Big Navi and RDNA2 could challenge Nvidia's leadership, Nvidia is no longer standing by the earlier claim of only 1.5x Turing's performance at half the TDP. The TDP is going to be higher, and the performance too.
Nvidia tends to make its high-end gaming cards very close in gaming specs (FP32) to its high-end pro cards, especially when they share the same lithography (which Volta did not: it used 12nm vs. Pascal's 16nm).
Nvidia has also suggested Ampere will probably have more VRAM and will cost less at the high end, with the rest of the line priced the same as Turing. So maybe high-end gaming will land around 24 TFLOPS FP32.
Nvidia is going to use 7nm EUV vs. 16nm+ (aka 12nm) DUV lithography. This is a huge step forward.
The 2080Ti is only 13.5 TFLOPS FP32. The RTX 2060 6GB is the low-end standard for gaming (with ray tracing) and has 6.5 TFLOPS, so half that of the 2080Ti. The RTX 2060's replacement will have more VRAM, say 8GB, will cost the same, and may sit proportionally closer to the high end, so it will probably be very close to the 2080Ti, maybe even better.
But Nvidia has also said that ray tracing will improve much more than everything else on those cards. So low-end Ampere real gaming cards (meaning starting from what could be the RTX 3060) will be better than the 2080Ti; no doubt about that for gaming.
Not sure how much Nvidia will improve the very low end. Maybe not at all; it may stay on Turing. Remains to be seen. Real gamers don't really look at those cards anyway; they're bought for mixed use: home office, web surfing, and light gaming.
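For anyone wondering where FP32 TFLOPS figures like those come from: theoretical throughput is just shader count × boost clock × 2 FLOPs per clock (one fused multiply-add). A minimal Python sketch; the Turing rows use Nvidia's published reference specs, while the Ampere row is purely hypothetical, back-solved from the 24 TFLOPS speculation above:

```python
# Theoretical FP32 throughput = shaders x clock x 2 FLOPs/clock (one FMA).
# Turing rows use Nvidia's reference specs; the "Ampere?" row is made up
# to match the ~24 TFLOPS speculated in this post.

def fp32_tflops(shaders: int, boost_mhz: float) -> float:
    return shaders * boost_mhz * 1e6 * 2 / 1e12

gpus = [
    # (name, CUDA cores, reference boost clock in MHz)
    ("RTX 2060",        1920, 1680),  # ~6.5 TFLOPS, half a 2080 Ti
    ("RTX 2080 Ti",     4352, 1545),  # ~13.4 TFLOPS
    ("Ampere? (guess)", 6144, 1950),  # hypothetical shader/clock combo -> ~24 TFLOPS
]

for name, shaders, clock in gpus:
    print(f"{name:16s} {fp32_tflops(shaders, clock):5.1f} TFLOPS FP32")
```

Of course, paper TFLOPS don't translate directly into frame rates across architectures, which is why the ray tracing and memory improvements matter as much as the raw number.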
 
What's the likelihood of the first-gen RTX cards dropping 25-30% in price after the launch of the 3xxx series cards? We haven't really seen a new value alternative (other than what can be argued for the 5700) since the original launch of the 1080/1070-class GPUs.
Nothing; it's Nvidia we're talking about. They'll slowly turn off the valve in the supply chain before launching the new ones. You won't be able to buy 2xxx cards after the 3xxx launch except from existing stock, which will cost more, the same way you couldn't buy a new 1080Ti for a reasonable price after the 2080 launched. Used prices will depend on how many people think the switch to 3xxx is worth it.
 
The 3080 Ti will launch at the same price the 2080 Ti launched at, if not higher. They made a lot of money from the 2080 Ti; I doubt they would backtrack. I do see the current stock of 2080 Tis dropping to $800-$900 brand new after launch to clear out old stock, but I honestly don't see any other major drops before the product is EoL.
 
The 3080 Ti will launch at the same price the 2080 Ti launched at, if not higher. They made a lot of money from the 2080 Ti; I doubt they would backtrack. I do see the current stock of 2080 Tis dropping to $800-$900 brand new after launch to clear out old stock, but I honestly don't see any other major drops before the product is EoL.

I think this depends entirely on what AMD brings to the table this year. When the 2080 series launched, AMD had nothing that even came close. A year and a half later, AMD still doesn't have anything that comes close. But big Navi is coming, possibly before the 3-series. If AMD can give us 2080 Ti performance at a hot price, that's obviously going to affect what Nvidia can do. They will remain the performance champ for sure, but they might have to be a bit more reasonable on pricing just to keep people on team green.
 
I think this depends entirely on what AMD brings to the table this year. When the 2080 series launched, AMD had nothing that even came close. A year and a half later, AMD still doesn't have anything that comes close. But big Navi is coming, possibly before the 3-series. If AMD can give us 2080 Ti performance at a hot price, that's obviously going to affect what Nvidia can do. They will remain the performance champ for sure, but they might have to be a bit more reasonable on pricing just to keep people on team green.
If Big Navi is actually a performance king, they are going to price it around the 2080 Ti, not below it. That's my guess, just based on how their desktop lineup is priced at the moment.
 
When the 2080 series launched, AMD had nothing that even came close. A year and a half later, AMD still doesn't have anything that comes close.
This may sound like Nvidia fanboyism but I promise it's a legit question: when was the last time AMD competed with Nvidia at the top end? I honestly don't know - I haven't been a flagship-range customer long enough to have seen it.
 
This may sound like Nvidia fanboyism but I promise it's a legit question: when was the last time AMD competed with Nvidia at the top end? I honestly don't know - I haven't been a flagship-range customer long enough to have seen it.

It's definitely been a minute.

Off the top of my head, 2013? That would put us in 780 Ti vs. 290X territory. The 290X still trailed the 780 Ti, but it was close enough that they were fighting in the same class, and the 290X was cheaper. This at least kept AMD relevant to someone wanting a high-end part. By 2015 it was the 980 Ti vs. the 295X2, but the latter was a dual-GPU card. Where supported, it was faster; in applications where it wasn't supported, the 980 Ti was faster. And this was at a time when we were starting to see the decline of dual GPUs. This meant Nvidia still held the single-GPU crown. AMD has been trying to catch up since.
 
I think the 7970 was a pretty over-engineered card. It lasted a good while.
 
This may sound like Nvidia fanboyism but I promise it's a legit question: when was the last time AMD competed with Nvidia at the top end? I honestly don't know - I haven't been a flagship-range customer long enough to have seen it.
For as long as AMD has had RTG since buying ATI, they've only ever come close on average, with an Nvidia GPU jumping ahead again within a few months, maybe six months at most.

And for most of the existence of ATI and AMD/RTG, that's largely been within the 'wait for the custom coolers / wait for the drivers' time period following ATI/AMD GPU releases.

I will say that AMD has had some competitive releases; they just haven't shown the performance jumps that ATI had versus Nvidia before ATI was acquired. And while neither company has had perfect drivers or QA, it seems that AMD has burned a larger percentage of their potential customer base on those fronts, with a higher occurrence of significant, longer-lasting issues.

As far as recent top-end competition goes, AMD has yet to counter Nvidia's current tactic of producing top-end, compute-lite GPUs for the gaming market that are probably alright for a subset of the compute market too. I don't think anyone really doubts that AMD could do this, but it's clear that they've had other priorities as a company, and they've been doing pretty well in the mid-range, nipping at the heels of the high end.
 
The only way I see RTX 2080Ti dropping in price is if Big Navi utterly destroys it for a similar or lower price

Or if 3080Ti ends up being cheaper
 
The only way I see RTX 2080Ti dropping in price is if Big Navi utterly destroys it for a similar or lower price

Or if 3080Ti ends up being cheaper
I'm still holding out hope that the price bloat on the 20 series was just so Nvidia could justify the cost of their RTX R&D. I'm hoping to upgrade from my 1080ti with the next gen, but that's going to be a tough sell if the flagship stays at $1200.
 
I'm still holding out hope that the price bloat on the 20 series was just so Nvidia could justify the cost of their RTX R&D. I'm hoping to upgrade from my 1080ti with the next gen, but that's going to be a tough sell if the flagship stays at $1200.

The market will never regress to 1080ti flagship pricing.
 
The market will never regress to 1080ti flagship pricing.

Yeah, every generation we have the "cheap-crowd" hoping...and every generation they get disappointed...and somehow forget this every generation.

I am so tired of the same lame argument:
"But THIS time it could happen!!!"

No it will not...get over it.
 
Unless AMD brings out something that crushes Nvidia, I expect prices to keep going up on Nvidia's side. This will cause AMD cards to go up as well. I honestly wouldn't be surprised by a $1500 3080 Ti and a $1000 3080.
 
Unless AMD brings out something that crushes Nvidia, I expect prices to keep going up on Nvidia's side. This will cause AMD cards to go up as well. I honestly wouldn't be surprised by a $1500 3080 Ti and a $1000 3080.

Yup, I think it's more probable that if AMD has an Nvidia-killer card, you can bet it will be priced accordingly. AMD would love to have a $1000+ card.
 
Yup, I think it's more probable that if AMD has an Nvidia-killer card, you can bet it will be priced accordingly. AMD would love to have a $1000+ card.

I think it depends immensely on what Nvidia does this year. Speculation is that whatever AMD is working on is roughly 2080 Ti performance. This means Nvidia will still likely keep their performance crown this year with Ampere. Likewise, AMD indicated at their analyst day that they are eyeing a late 2020 release. As for Nvidia, we know basically nothing other than that Ampere should surface in some form this year. We don't know release dates, price points, performance targets, nothing. We don't know if they are going to come out of the gate with a Ti card again (remember, this is not typical of Nvidia), or keep that in their pocket for now. Will they try to beat AMD to market? A lot of unknowns. Whether AMD will be besting Nvidia or continuing to play catch-up will be important.

I also wonder if the upcoming consoles will have any effect on the rising cost of high-end GPUs. Has anyone ever found a correlation between GPU prices and console generations? The 2080 Ti is still priced into the stratosphere because, a year and a half out, there is still nothing even close to a competitor for people who want 4K 60fps gaming. You want that level of performance, you pay for a 2080 Ti. There is no other option. The same won't be true by holiday 2020. There will be two shiny new game consoles (supposedly) capable of delivering 4K 60fps, likely priced somewhere around $500. I know it's apples to oranges, but that will severely devalue the cost barrier for that type of performance. It'll be hard for Nvidia or AMD to ask people to drop $1200 for a level of performance that can be achieved by a game console at less than half the price.
 
I wouldn't drag consoles and their prices into this.

The majority of the PC gaming enthusiasts I know are people with decent incomes who can easily afford the hobby at the mid-to-high level.
Very few of them are bargain hunters. Most don't blink at the current pricing at all.
 
I wouldn't drag consoles and their prices into this.

The majority of the PC gaming enthusiasts I know are people with decent incomes who can easily afford the hobby at the mid-to-high level.
Very few of them are bargain hunters. Most don't blink at the current pricing at all.

I'd say that's unique to the people you know. We've seen a year and a half of people groaning about the price of the 2080 Ti. All of the people I personally know who play PC games would like one; none of them own one. I'm not here to debate its price, but I do think it's a bit narrow-minded to pretend there isn't a large number of people who think it's too expensive and/or priced outside their means.

In a general sense, PC gamers are PC gamers and console gamers are console gamers. That doesn't mean the two exist completely oblivious to each other, especially as consoles continue to get more and more PC-like. What happens when those lines continue to blur? When consoles have KB/M support, widescreen, variable refresh, 120Hz, etc.?

I'm not saying consoles are going to change the landscape, but I definitely think they are relevant, especially if they can hit reasonable prices at the performance levels being rumored. In the realm of video game hardware, it's still competition.
 
Yeah, every generation we have the "cheap-crowd" hoping...and every generation they get disappointed...and somehow forget this every generation.

I am so tired of the same lame argument:
"But THIS time it could happen!!!"

No it will not...get over it.

The GTX 1080 Ti generation was lower cost than the current generation because the die sizes were much smaller.

I would expect a reduction in prices for Ampere, but not quite down to Pascal levels. Prices will fall because you have a new process node (and thus smaller chips). With Pascal, once yields stabilized, the $700 GTX 1080 launch part dropped to $500, and the GTX 1080 Ti came in at a cheap $700. Because this isn't quite the amazing die shrink Pascal was, I expect that at the 102 launch, the Ampere 104 chip should be $550 and the 102 chip should be $800.

You don't tend to see price drops that massive after launch on a mature process node (Turing, Maxwell), but that doesn't mean the cards themselves weren't a good value at launch. Who can argue with real-time ray tracing on a card that's only $100 more expensive than the Pascal launch version?
 
The GTX 1080 Ti generation was lower cost than the current generation because the die sizes were much smaller.

I would expect a reduction in prices for Ampere, but not quite down to Pascal levels. Prices will fall because you have a new process node (and thus smaller chips). With Pascal, once yields stabilized, the $700 GTX 1080 launch part dropped to $500, and the GTX 1080 Ti came in at a cheap $700. Because this isn't quite the amazing die shrink Pascal was, I expect that at the 102 launch, the Ampere 104 chip should be $550 and the 102 chip should be $800.

You don't tend to see price drops that massive after launch on a mature process node (Turing, Maxwell), but that doesn't mean the cards themselves weren't a good value at launch. Who can argue with real-time ray tracing on a card that's only $100 more expensive than the Pascal launch version?

I think you are doing exactly what he said: setting yourself up for disappointment with over-optimistic expectations.

7nm silicon costs something like 50% more/mm^2, which makes the die shrink benefits negligible on the cost front.
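Here's a rough, play-with-the-numbers sketch of the tradeoff being argued here: a shrink gets you more dies per wafer and better yield per die, while the newer node costs more per mm². Every input below (wafer prices, die sizes, defect density) is an illustrative guess, not a foundry quote, and the conclusion flips depending on what you plug in:

```python
# Rough model of the die-shrink vs. wafer-cost tradeoff.
# All inputs are illustrative guesses, not foundry quotes.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    # Standard approximation: gross dies minus edge losses around the wafer rim.
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float, wafer_cost_usd: float,
                      defects_per_mm2: float) -> float:
    # Simple Poisson yield model: bigger dies are exponentially more likely
    # to catch a defect, which is why shrinks help more than linearly.
    yield_rate = math.exp(-die_area_mm2 * defects_per_mm2)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Hypothetical: a ~545 mm^2 die on 12nm vs. a ~300 mm^2 shrink of it on 7nm,
# with the 7nm wafer ~50% more expensive per mm^2 as claimed above.
print("12nm, 545 mm^2:", round(cost_per_good_die(545, 6000, 0.001)), "USD per good die")
print(" 7nm, 300 mm^2:", round(cost_per_good_die(300, 9000, 0.001)), "USD per good die")
```

At these made-up numbers the shrink still wins on cost per good die, but bump the assumed 7nm defect density or wafer price up and the gap narrows fast, which is the "negligible" argument in a nutshell.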
 
I think the 7970 was a pretty over-engineered card. It lasted a good while.

It was -- Rory Read was being too "conservative" in going for under-1GHz speeds (in order to maximize potential profits by having three different product tiers: the under-1GHz and roughly-1GHz ones, as well as the 7950).
Understandable, but a dumb move when your competition is NVIDIA (AMD, via its partners, had to quickly release the 1GHz 7970s when the GTX 680 turned out to be quite strong vs. the under-1GHz 7970).
 
I think you are doing exactly what he said: setting yourself up for disappointment with over-optimistic expectations.

7nm silicon costs something like 50% more/mm^2, which makes the die shrink benefits negligible on the cost front.


The chips are still cheaper to make than an RTX 2070 (which is much larger than the tiny GTX 1080). Nvidia is taking the hit on the increased engineering costs, with the expectation that the vastly smaller die sizes will make up for it.

I think the only way I'd be unrealistic is if I expected more than a 40% performance increase over Turing. Pascal's 80% jump came from a massive die shrink that will never happen again.

But I do NOT expect that. I expect a 40% performance increase plus reduced die sizes versus Turing.

If 7nm is too expensive for a part like Pascal, why has AMD been shipping both Zen 2 and Navi on it? Those are supposed to be cheaper to make than Polaris. You just need enough mass production to justify the increased engineering expense of a new node, combined with high enough yields. But that means we all pay an extra $10 per unit (not the thousands you're perpetually worried about).

Because we're using nodes for high-end production for 3-4 years, the development costs of a new fab node can be absorbed by longer utilization. It used to be that after the first two years, cutting-edge fabs were used for unimportant production. That's what Intel used to do with chipsets.

But now, we have one year of cutting-edge cell phone chips, then two to four years of processors/GPUs. We're going to be on 7nm for a couple more years of high-margin products.
 
I think it depends immensely on what Nvidia does this year. Speculation is that whatever AMD is working on is roughly 2080 Ti performance. This means Nvidia will still likely keep their performance crown this year with Ampere. Likewise, AMD indicated at their analyst day that they are eyeing a late 2020 release. As for Nvidia, we know basically nothing other than that Ampere should surface in some form this year. We don't know release dates, price points, performance targets, nothing. We don't know if they are going to come out of the gate with a Ti card again (remember, this is not typical of Nvidia), or keep that in their pocket for now. Will they try to beat AMD to market? A lot of unknowns. Whether AMD will be besting Nvidia or continuing to play catch-up will be important.

I also wonder if the upcoming consoles will have any effect on the rising cost of high-end GPUs. Has anyone ever found a correlation between GPU prices and console generations? The 2080 Ti is still priced into the stratosphere because, a year and a half out, there is still nothing even close to a competitor for people who want 4K 60fps gaming. You want that level of performance, you pay for a 2080 Ti. There is no other option. The same won't be true by holiday 2020. There will be two shiny new game consoles (supposedly) capable of delivering 4K 60fps, likely priced somewhere around $500. I know it's apples to oranges, but that will severely devalue the cost barrier for that type of performance. It'll be hard for Nvidia or AMD to ask people to drop $1200 for a level of performance that can be achieved by a game console at less than half the price.


As far as I can remember, consoles have never ever performed at the same level as a PC.
 
You haven't made that case. Not only that, we have no specs to compare to either.

So Zen 2 selling millions of units at a lower price per core than Zen 1 doesn't make the case for you?

The fact that prices keep falling on Zen 2 shows you the truth: once you get yields up, the cost is not higher than the old die. Even though each processor sold now includes a separate die for the memory controller, the die shrink makes enough of a difference!

You can argue about reality "not convincing you" all day, but the rest of us work in the REAL WORLD™.
 
So Zen 2 selling millions of units at a lower price per core than Zen 1 doesn't make the case for you?

Nope. Neither is a large die, and for the most part the price is the same. The 2600 vs. the 3600 had about the same retail price, IIRC.
 