NVIDIA Ampere to offer 10-20% IPC increase over Turing, 4x RT performance with minimal FPS impact, up to 2 GHz OC clocks, and an overhauled software stack

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,891
Rumor has it that ..

".. If the above information is indeed true, what this essentially means is that Turing will not age well in comparison to the upcoming Ampere cards. Those who have spent quite a lot for the high-end RTX 2080 Super and RTX 2080 Ti may find that a US$300-350 Ampere card might very well offer a similar performance incentive. Tom's sources say that "Turing doing RT will be like Kepler doing DX12".

Oh, and not to forget, apparently there will be no GTX cards in the entire Ampere lineup with even the lowest spec cards speculated to feature at least a few RT cores. A speculated potential overhaul of the software stack that will now integrate both the GeForce Experience and GeForce Control Panel is also on the anvil.

Tom's sources say NVIDIA is likely to announce Ampere HPC products during the GTC 2020 pre-recorded keynote with the consumer GeForce lineup expected to be announced sometime in September. Needless to say, we are super stoked."


https://www.notebookcheck.net/NVIDI...e-stack-to-take-on-AMD-RDNA-2-0.464033.0.html
 
https://www.nvidia.com/en-us/gtc/

 
Rumor has it that ..

".. If the above information is indeed true, what this essentially means is that Turing will not age well in comparison to the upcoming Ampere cards. Those who have spent quite a lot for the high-end RTX 2080 Super and RTX 2080 Ti may find that a US$300-350 Ampere card might very well offer a similar performance incentive. Tom's sources say that "Turing doing RT will be like Kepler doing DX12".

Oh, and not to forget, apparently there will be no GTX cards in the entire Ampere lineup with even the lowest spec cards speculated to feature at least a few RT cores. A speculated potential overhaul of the software stack that will now integrate both the GeForce Experience and GeForce Control Panel is also on the anvil.

Tom's sources say NVIDIA is likely to announce Ampere HPC products during the GTC 2020 pre-recorded keynote with the consumer GeForce lineup expected to be announced sometime in September. Needless to say, we are super stoked."


https://www.notebookcheck.net/NVIDI...e-stack-to-take-on-AMD-RDNA-2-0.464033.0.html

Hopefully this is an undershoot. I was hoping for a 20-30% increase in raw performance over a high-end 2080 Ti. I am more interested in higher frames at higher resolutions than RT.
 
Turing was hardly an upgrade over Pascal, so basically NVIDIA could have been stalling.
Which is good for them; for the consumer it doesn't matter, because you're at their mercy.
 
Hopefully this is an undershoot. I was hoping for a 20-30% increase in raw performance over a high-end 2080 Ti. I am more interested in higher frames at higher resolutions than RT.


IPC is just instructions per clock, for the same number of SMs. That's the exact same IPC improvement we saw with Turing, which is why the 1660 Ti is 40% faster than the 1060 6GB with only 20% more SMs and similar clocks.

I'd expect SM counts to grow 30-40% (thanks to the die shrink), for a grand total of 40-50% non-RT performance improvement!

This is where I was estimating the performance improvement 6 months back, and it turns out I was right! The massive increase in RT is totally believable as well (first-generation products with completely new tech always get rolled by the second generation).
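That back-of-the-envelope math (an IPC gain compounding with an SM-count gain) can be sketched as follows; the percentages are the rumored figures above, not confirmed specs:

```python
# Back-of-the-envelope GPU scaling: total uplift is roughly the product of
# the per-SM (IPC) gain and the SM-count gain, assuming similar clocks.
def combined_uplift(ipc_gain, sm_gain):
    """Total performance gain from an IPC gain and an SM-count gain."""
    return (1 + ipc_gain) * (1 + sm_gain) - 1

# Turing sanity check: ~17% IPC x 20% more SMs roughly matches the
# 1660 Ti's ~40% lead over the 1060 6GB.
print(round(combined_uplift(0.17, 0.20), 2))  # -> 0.4

# Rumored Ampere: 10-20% IPC x 30-40% more SMs.
print(round(combined_uplift(0.10, 0.30), 2))  # -> 0.43
print(round(combined_uplift(0.20, 0.40), 2))  # -> 0.68
```

Note the 40-50% estimate in the post sits toward the low end of this multiplicative range, which is plausible since clocks and memory bandwidth rarely scale in lockstep with SM count.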

GeForce 256 T&L unit got rolled by the GeForce 2.
GeForce 3 programmable shaders got rolled by the ATI 9700 Pro.
 
Gotta love competition. Next gen will actually be an upgrade and maybe bring prices down to pre-mining craze. Now it depends how big navi performs to see how nvidia responds
 
Turing was hardly an upgrade over Pascal, so basically NVIDIA could have been stalling.
Which is good for them; for the consumer it doesn't matter, because you're at their mercy.

Can you remember more than one launch back?
Turing was inside the norm going back to the G80; that Pascal was an outlier is not a valid argument for anything.

Amazing how people's "memory" conforms to their agenda when not checked...but put the data up and it is revealed as a fallacy.
 
Gotta love competition. Next gen will actually be an upgrade and maybe bring prices down to pre-mining craze. Now it depends how big navi performs to see how nvidia responds

Moore's law is dead and transistor costs keep rising with every node...even AMD states this. There is hope...and then there is the fool's hope.
 
GeForce 256 T&L unit got rolled by the GeForce 2.
GeForce 3 programmable shaders got rolled by the ATI 9700 Pro.

Ahhhh. The good ole` days when we got big performance increases. Not going to lie, I got a lot of extra cash last fall and thought, "treat yourself to the 2080 Ti you wanted." Now, after selling the 1080 Ti, I feel as if I shouldn't have. It made AC: Odyssey and Origins smoother at 4K, but not much else. I've been more excited playing around with this Ryzen 3900X.

Can we just shovel "BIG NAVI" off somewhere? Over in that pile there, next to Big Fury and Big Vega. A generation late.
 
Moore's law is dead and transistor costs keep rising with every node...even AMD states this. There is hope...and then there is the fool's hope.

That has nothing to do with competition bringing prices down.
 
Without it we'll be seeing $2000 cards from nvidia. So no.
We'll only be seeing whatever the market wants to pay. Same as always. A company like NV doesn't price a product arbitrarily with no market support. If the market wants to pay $5000 for a GPU, the price will be $5000.

AMD is in a tough spot because they'd have to produce a product substantially more powerful at the same or lower cost, since Nvidia's marketshare, inertia, and brand halo status factor heavily into buying decisions.
 
I'd expect SM counts to grow 30-40% (thanks to the die shrink), for a grand total of 40-50% non-RT performance improvement!

That is what I'd like, especially with the prices they're charging these days. But I doubt Nvidia would do that. Maybe with a new console generation they'll have to offer more performance bang for the buck, though.
 
Hopefully this is an undershoot. I was hoping for a 20-30% increase in raw performance over a high-end 2080 Ti. I am more interested in higher frames at higher resolutions than RT.

I don’t really need more rasterized FPS on the 2080ti. A 20-30% increase in rasterized and 400% increase in RT would be more than appealing to me.

I hope they are wrong about the RT. A massive increase in that is basically the only reason I’d buy a card. It’d make RT viable for VR too.. which is where it’d really shine. I don’t want to spend more money on hardware so I am hoping it underperforms lol.
 
If it is true that Ampere RT will be 4x Turing's RT, and assuming the RTX 3050 is the lowest tier, let's say the RTX 3050 has (8 GB and) 2x the RT of the RTX 2060, and assuming it releases by October next year.

Then all games released from that point on will be developed for RT performance of at least 2x an RTX 2060. Maybe you can still play RT games after that on the RTX 2060, using DLSS and reducing all settings to the minimum.

https://hardforum.com/threads/nvidi...ecause-of-coronavirus.1994239/post-1044579770

Looks like no new GTX products

Below the RTX 3050 Ti/Super/non-Super it might be the RTX 16xx series only.

https://www.thefpsreview.com/2020/0...e-over-turing-ray-tracing-4x-better-per-tier/
 
We'll only be seeing whatever the market wants to pay. Same as always. A company like NV doesn't price a product arbitrarily with no market support. If the market wants to pay $5000 for a GPU, the price will be $5000.

AMD is in a tough spot because they'd have to produce a product substantially more powerful at the same or lower cost, since Nvidia's marketshare, inertia, and brand halo status factor heavily into buying decisions.
Huh? When AMD couldn't compete with Intel, no one "wanted" to pay obscene prices for a 6-core. People did because even at that time there was a need for it and there was no other option. When there's a monopoly anywhere there is no market force. You either buy one or not, but that's not really a market force.
 
You either buy one or not, but that's not really a market force.

Ummm, that is literally THE market force. Everything else (competition, pricing, etc.) ultimately leads to the same question: do I want to buy this thing?
 
Ummm, that is literally THE market force. Everything else (competition, pricing, etc.) ultimately leads to the same question: do I want to buy this thing?

YOU, as an individual, have no power over the price.
It will be decided by the masses, as always.
 
This rumour sounds believable, because it’s reasonably conservative, unlike the “Ampere will give you a 40% performance boost, feed your cat, fold your laundry, and get you laid” rumours I’ve seen in some other articles.
 
I disagree with people saying we don’t need more shader power. The 2080 Ti still isn’t getting what we need FPS-wise for newer games at 4K. A 3080 Ti, if 40% faster, will help, but it’s still not there.
 
I disagree with people saying we don’t need more shader power. The 2080 Ti still isn’t getting what we need FPS-wise for newer games at 4K. A 3080 Ti, if 40% faster, will help, but it’s still not there.

Yeah, I don't know why anyone would say we don't need more raster power; of course we do! I want at least 50% from Ampere and I think I'll get it. The RT uplift is nice but not critical right now, and I suspect that by the time it really makes a big difference, we'll be past Ampere and Turing will be long forgotten. It's nice to enjoy the occasional DXR/RTX game, but it's certainly not my first priority when gaming either.
 
Ray Tracing: still waiting to be impressed from anything that isn't a tech demo, every game implementation is basically yawnsville when you look at the price you pay to enable it.
 
Ray Tracing: still waiting to be impressed from anything that isn't a tech demo, every game implementation is basically yawnsville when you look at the price you pay to enable it.
It's still new. It will get there. Not sure what the price would be if it didn't have the RT cores.
 
Yay, I can finally put my GTX 1080 FE to rest. Either build a second PC or sell it to cover the cost of a new RTX xx70 or xx80. Whatever these new cards are, I'm damn sure I'll buy one.
 
I disagree with people saying we don’t need more shader power. The 2080 Ti still isn’t getting what we need FPS-wise for newer games at 4K. A 3080 Ti, if 40% faster, will help, but it’s still not there.

How much more do we need for 4K then, Mchart?
 
How much more do we need for 4K then, Mchart?

For me? Probably at least double what my 2080 Ti is doing. That would bring me to around the 80-100 FPS range at 4K with max settings in most newer games. Ideally at least triple the power, because then you'd be at 120+ FPS at 4K max settings in newer titles.

Given that we're already moving to 8K displays, I really disagree with NVIDIA's approach of devoting more die space to RT cores. Also, I'll point out that most of the slowdown from DXR currently is due not to a lack of RT cores, but to the increased demand on the shader side.
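Assuming FPS scales roughly linearly with GPU throughput when fully GPU-bound at 4K, the needed uplift for a target frame rate is just a ratio; the 45 FPS baseline below is an illustrative number, not a benchmark result:

```python
# Required raster speedup to reach a target frame rate, assuming frame rate
# scales roughly linearly with GPU throughput (typical when GPU-bound at 4K).
def required_speedup(current_fps, target_fps):
    return target_fps / current_fps

# If a 2080 Ti manages ~45 FPS in a demanding 4K title (illustrative):
speedup_90 = required_speedup(45, 90)    # 2.0: double the throughput
speedup_120 = required_speedup(45, 120)  # ~2.67: close to triple
```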
 
For me? Probably at least double what my 2080 Ti is doing. That would bring me to around the 80-100 FPS range at 4K with max settings in most newer games. Ideally at least triple the power, because then you'd be at 120+ FPS at 4K max settings in newer titles.

Given that we're already moving to 8K displays, I really disagree with NVIDIA's approach of devoting more die space to RT cores. Also, I'll point out that most of the slowdown from DXR currently is due not to a lack of RT cores, but to the increased demand on the shader side.
I have to slightly disagree; RT cores are what we need them working on for now. More shaders are coming, but not until things are in place for mainstream consumer MCM cards. I don’t think anybody can cram a significant increase in shaders into a monolithic design in a feasible way.

But before MCM cards can make their way to the consumer market, we are going to need a new WDDM from Microsoft and a few new graphics engines on the market. For now the major players can develop those things using the Titan cards, but I think their work won’t really come together until they have access to the next-generation hardware. I expect to see these things once PCIe 4 is mainstream and mature.
 