NVIDIA GeForce Ampere GPU Rumors:

erek
[H]F Junkie
Joined: Dec 19, 2005
Messages: 10,785
True or false? 10nm at what die dimensions?

"The flagship GeForce GPU is rumored to be the GA102 and will be the successor to the Turing based TU102 GPU we got to see on the Titan RTX and RTX 2080 Ti graphics cards. The GPU will feature 84 SMs which equals 5376 CUDA cores. This is 16% more CUDA cores than the full-fat TU102 GPU featured on the Titan RTX. The GPU would be able to support up to 12 GB of VRAM across a 384-bit bus interface."

https://wccftech.com/nvidia-geforce-ampere-gpu-rtx-30-series-performance-specs-q4-2020-launch-rumor/
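Quick sanity check on those numbers, assuming the Turing layout of 64 CUDA cores per SM and 1 GB of GDDR6 per 32-bit memory channel (my assumptions, not stated in the article):

```python
# Back-of-the-envelope check of the rumored GA102 figures.
# Assumptions (mine, not the article's): 64 FP32 CUDA cores per SM as on
# Turing, and 1 GB of GDDR6 per 32-bit memory channel.
CORES_PER_SM = 64

ga102_sms = 84   # rumored full GA102
tu102_sms = 72   # full TU102 (Titan RTX)

ga102_cores = ga102_sms * CORES_PER_SM   # 5376
tu102_cores = tu102_sms * CORES_PER_SM   # 4608
uplift = ga102_cores / tu102_cores - 1   # ~0.167 (the article rounds to 16%)

bus_width_bits = 384
vram_gb = (bus_width_bits // 32) * 1     # 12 channels x 1 GB = 12 GB

print(f"GA102: {ga102_cores} CUDA cores, +{uplift:.1%} vs TU102, {vram_gb} GB VRAM")
```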
 
according to that article..."the RTX 3070 is said to deliver 95% performance of the RTX 2080 Ti which could be pushed further ahead with custom models and higher factory overclocks at a retail price of under $500"

if true that sounds like an amazing value...will definitely upgrade from my 1070...hopefully it comes out before Cyberpunk 2077 launches
 
Might be possible. On par with a Ti is what it should be (otherwise it isn't worth upgrading to), but the price has to be right. The x70 cards need to be around $330-450. Shame they're coming so late though. Was wanting to upgrade soon.
 
according to that article..."the RTX 3070 is said to deliver 95% performance of the RTX 2080 Ti which could be pushed further ahead with custom models and higher factory overclocks at a retail price of under $500"

if true that sounds like an amazing value...will definitely upgrade from my 1070...hopefully it comes out before Cyberpunk 2077 launches

Let's be realistic here. It will be $700-900, not 500.
 
according to that article..."the RTX 3070 is said to deliver 95% performance of the RTX 2080 Ti which could be pushed further ahead with custom models and higher factory overclocks at a retail price of under $500"

if true that sounds like an amazing value...will definitely upgrade from my 1070...hopefully it comes out before Cyberpunk 2077 launches
And wouldn't it be funny for Nvidia to troll everyone by making the price of the 3070 the same price as the 2080Ti...




Anyway, weird *rumor* about these, considering Samsung already put out a press release almost a year ago saying that Nvidia had signed a contract for their 7nm node, not 10nm, and Samsung said Nvidia was a launch partner for the 7nm-with-EUV node.
http://www.koreaherald.com/view.php?ud=20190702000692
 
And wouldn't it be funny for Nvidia to troll everyone by making the price of the 3070 the same price as the 2080Ti...

I would hope everyone is expecting exactly this to be true. If NV comes in with 2080 non-Ti pricing instead... fantastic. I think it's a safe bet that NV bumps launch pricing again. They will find something to blame... there is some sort of illness going round or something. Seems like that might be a good excuse to bump launch pricing 20%, to me.

Only way that doesn't happen is if AMD really does have a Navi wunder part to ship around the same time frame. Doesn't sound like Intel is going to have any impact on GPUs for a few more years at best. Good chance NV has one more cycle where they can grind out a little extra profit. With some luck AMD has righted their GPU design process, and Intel might produce something that at least competes at some price points in 2021. So perhaps 5000s is where NV deflates some pricing. lol :)
 
I would hope everyone is expecting exactly this to be true. If NV comes in with 2080 non-Ti pricing instead... fantastic. I think it's a safe bet that NV bumps launch pricing again. They will find something to blame... there is some sort of illness going round or something. Seems like that might be a good excuse to bump launch pricing 20%, to me.

Only way that doesn't happen is if AMD really does have a Navi wunder part to ship around the same time frame. Doesn't sound like Intel is going to have any impact on GPUs for a few more years at best. Good chance NV has one more cycle where they can grind out a little extra profit. With some luck AMD has righted their GPU design process, and Intel might produce something that at least competes at some price points in 2021. So perhaps 5000s is where NV deflates some pricing. lol :)
Or Intel and AMD just raise their pricing as well.
 
lol at anyone that thinks nv is going to lower prices. IF this was real it'll be $750+
we need a rumour forum...
 
$699 reference, $799 FE, $800-950 AIBs.

That's my prediction for 3070 pricing.
 
lol at anyone that thinks nv is going to lower prices. IF this was real it'll be $750+
we need a rumour forum...


Like Pascal: the GTX 1080 was $700 at launch, and then dropped to $500 once yields improved / made room for the 1080 Ti at $700.

The launch prices of Ampere should be similar to Pascal, give or take $50. The much smaller die size of Pascal meant you could pack higher performance than a 980 Ti into a $700 launch 1080.

Then once yields improved, the price of a GTX 1080 (without expensive GDDR5X) was dropped to $450.

All you people who freak out about new process node shrinks and massive price hikes have no idea how silicon manufacturing works. At most, Ampere will be $50 higher than Pascal (for the same-numbered launch parts, so RTX 3080 = $750 and RTX 3070 = $500). The die-size reductions over Turing are going to drop prices, and once yields improve the same Pascal price reductions will happen. And the second gen of RTX will probably see massive efficiency improvements (you can get more done with the same die space).
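Rough sketch of the die-size math, if anyone wants to see why it matters: dies per wafer scale roughly with wafer area over die area (minus edge loss). The wafer cost below is a made-up placeholder purely for illustration, not a real foundry quote; the die areas are the published figures for those chips.

```python
# Rough dies-per-wafer estimate on a 300 mm wafer, using the standard
# approximation: dies ~ pi*d^2 / (4*A) - pi*d / sqrt(2*A).
# WAFER_COST is a made-up placeholder, not a real foundry quote.
from math import pi, sqrt

WAFER_DIAMETER_MM = 300
WAFER_COST = 8000  # placeholder for illustration only

def dies_per_wafer(die_area_mm2: float) -> int:
    d = WAFER_DIAMETER_MM
    return int(pi * d ** 2 / (4 * die_area_mm2) - pi * d / sqrt(2 * die_area_mm2))

# Published die areas (mm^2) for the chips discussed above.
for name, area in [("GP104 (GTX 1080)", 314),
                   ("GP102 (1080 Ti)", 471),
                   ("TU102 (2080 Ti)", 754)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${WAFER_COST / n:.0f} per die before yield loss")
```

Bigger die = fewer candidates per wafer and worse yield, which is the whole reason Turing pricing never came down the way Pascal's did.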
 
Like Pascal: the GTX 1080 was $700 at launch, and then dropped to $500 once yields improved / made room for the 1080 Ti at $700.

The launch prices of Ampere should be similar to Pascal, give or take $50. The much smaller die size of Pascal meant you could pack higher performance than a 980 Ti into a $700 launch 1080.

Then once yields improved, the price of a GTX 1080 (without expensive GDDR5X) was dropped to $450.

All you people who freak out about new process node shrinks and massive price hikes have no idea how silicon manufacturing works. At most, Ampere will be $50 higher than Pascal (for the same-numbered launch parts, so RTX 3080 = $750 and RTX 3070 = $500). The die-size reductions over Turing are going to drop prices, and once yields improve the same Pascal price reductions will happen. And the second gen of RTX will probably see massive efficiency improvements.
Who's freaking out?
$700 plus "give or take $50" = the $750 that I posted.
 
Who's freaking out?
$700 plus "give or take $50" = the $750 that I posted.

Yes, but it doesn't take long for those price reductions to hit home. 9 months after launch, the price of the GTX 1080 dropped below Maxwell prices.

When the GTX 1070 Ti was using the same GDDR5 RAM as the GTX 980, the price fell to $100 less than Maxwell. That was no joke.

Because this is a less impressive die shrink (and the GDDR6 is unavoidable), expect prices equal to Maxwell after yields improve. So yes, that's a massive price drop compared to Turing.

You're stating that Nvidia won't drop prices, and that's just wrong. That only applies to launch-day buyers.
 
Yes, but it doesn't take long for those price reductions to hit home. 9 months after launch, the price of the GTX 1080 dropped below Maxwell prices.

When the GTX 1070 Ti was using the same GDDR5 RAM as the GTX 980, the price fell to $100 less than Maxwell. That was no joke.

Because this is a less impressive die shrink (and the GDDR6 is unavoidable), expect prices equal to Maxwell after yields improve. So yes, that's a massive price drop compared to Turing.

You're stating that Nvidia won't drop prices, and that's just wrong. That only applies to launch-day buyers.
9 months after launch
I'm/we're talking launch price, not 9 months down the road.
 
A price drop could be feasible. The 2000-series RTX cards had enough of a markup that it would have paid for the research costs of the initial tech, and the 3000s would be a refinement of that tech, which is inherently cheaper, so their investment into it is lower. Also, this is going to be launching into what many are predicting will be a recession, with the possibility of strong competition in that space from AMD. Nvidia would have access to the performance numbers from the new AMD supercomputers using the new tech, so they have an idea of what they are up against, but it really depends on when they and AMD launch. I can easily see them doing a higher launch price until AMD gets their parts into market, then dropping prices accordingly.
 
I was just looking at 2080 Tis; I don't see much there in price drops after well over a year. Unless they have to compete on price with AMD, I think you're dreaming.


And just like Maxwell, the large die size and mature process node meant the 980 Ti also saw no price cuts. But they could introduce it at a lower price than the GTX 780.

I think it has more to do with the massive die size. You're just not going to see price reductions on a known node that pushes the reticle beyond any previous product.
 
Nvidia will always be way more expensive...our only hope for a sub $500 RTX card is with AMD's next gen
 
And wouldn't it be funny for Nvidia to troll everyone by making the price of the 3070 the same price as the 2080Ti...

That's exactly what they did with the 2080 compared to the 1080Ti. Same performance...same price!
 
Uh, the 2060 has RTX and is $300 already. You'd be playing at 1080p, of course, in that price range.

yeah I meant a 'real' RTX card that can actually deliver ray-tracing at acceptable frame rates...the 2070 Super can do that at 1440p and is right around the $500 price point
 
Hear us, hear us, oh great AMD. Unshackle and free us from the green chains of greed and performance. Repeat.
 
Considering the 2080 Ti is 29% faster than a 5700 XT AE, and AMD RDNA2 is 1/2 the power for the same performance or 2x (performance?) at the same power... Nvidia killer coming. So much for rumors.

https://www.techpowerup.com/gpu-specs/radeon-rx-5700-xt-50th-anniversary.c3438
That expands to 55% in aggregate at 4K. Some games are twice as fast on the 2080 Ti at 4K. The 5700 XT also uses 220 W on average while gaming, while the 2080 Ti uses 270 W, making performance per watt on the latter 26% better than the former at 4K, or 5% better at 1920x1080, based on your link above. I don't know where you got "1/2 the power for same performance" from.
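For what it's worth, those perf-per-watt numbers fall straight out of the relative-performance and power figures; a quick check using the ~55% (4K) and ~29% (1080p) deltas and the 270 W / 220 W average gaming draw:

```python
# Quick check of the performance-per-watt comparison, using the relative
# performance deltas and average gaming power draw cited above.
perf = {
    "4K":    {"RTX 2080 Ti": 1.55, "RX 5700 XT": 1.00},  # ~55% faster at 4K
    "1080p": {"RTX 2080 Ti": 1.29, "RX 5700 XT": 1.00},  # ~29% faster at 1080p
}
power_w = {"RTX 2080 Ti": 270, "RX 5700 XT": 220}  # avg gaming power draw

for res, scores in perf.items():
    nv = scores["RTX 2080 Ti"] / power_w["RTX 2080 Ti"]
    amd = scores["RX 5700 XT"] / power_w["RX 5700 XT"]
    print(f"{res}: 2080 Ti perf/W is {nv / amd - 1:+.0%} vs 5700 XT")
# -> 4K: +26%, 1080p: +5%, matching the numbers above
```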
 
Considering the 2080 Ti is 29% faster than a 5700 XT AE, and AMD RDNA2 is 1/2 the power for the same performance or 2x (performance?) at the same power... Nvidia killer coming. So much for rumors.

https://www.techpowerup.com/gpu-specs/radeon-rx-5700-xt-50th-anniversary.c3438

Those kinds of averages are deceptive because they take into account many games that are tailored for AMD/NVIDIA and skew results in addition to a large portion being DX 11 which Pascal did well in (103% in that graph) but gets murdered in DX 12 by Turing. As Armenius said, 4K is a big differentiator as well.
 
Those kinds of averages are deceptive because they take into account many games that are tailored for AMD/NVIDIA and skew results in addition to a large portion being DX 11 which Pascal did well in (103% in that graph) but gets murdered in DX 12 by Turing. As Armenius said, 4K is a big differentiator as well.

I don't have a problem with it as long as they don't skew toward one company or the other in the games they pick. My understanding is they just pick their suite of games based on what's popular, etc., and then run their tests. In that scenario, you're going to get an idea of what the general performance is relative to other cards. If all you play is Fortnite, or GTA5, or whatever, then obviously pick the best card for that game. Likewise, if you have a 1080p or 1440p monitor and plan on keeping it for any length of time, then 4K results are meaningless. If they pick games that are well known to skew towards Nvidia or AMD and then try to emphasize the lack of difference or the significant difference using those results, then obviously I'd object. I don't think that's what's happening here.
 
They pick popular games because that is what people are playing... picking some weird off the wall, hardly known game because it favors AMD, yeah, that's dumb.
 
While Nvidia likes to play it safe with a proven manufacturing process, IMO 7nm is pretty much as mature as it can get, so I can't think of a reason why Ampere wouldn't be 7nm.

BTW, I keep reading rumors that cite 8nm and 6nm processes at either TSMC or Samsung, but AFAIK there is no such node at either foundry.
 
Why do people even care how it benchmarks compared to AMD? AMD doesn't even have anything that competes with NVIDIA's top-end cards. If it were a mid-range card then sure, but the top end doesn't matter because AMD is so far behind.

If you're buying top end the only thing you're comparing it with is other NVIDIA cards.
 
They pick popular games because that is what people are playing... picking some weird off the wall, hardly known game because it favors AMD, yeah, that's dumb.

A lot of the games aren't very popular, though; they just happen to be convenient for testing.
 
I don't think we've reached that point. $700 is nowhere near mid-range.

2070 Supers are already in the $500s. If the rumor is true that the 3070 will be just about as powerful as a 2080 Ti, then you can bet they're going to charge at least $700 for it. Nvidia is high on crack with their pricing.
 