NVIDIA Could Tease Next-Gen 7nm Ampere at GTC 2019

Megalith

It isn’t clear whether NVIDIA will have any surprises to share at next week’s GPU Technology Conference (GTC), but some speculate the company could reveal aspects of its next-generation architecture, “Ampere,” which will purportedly be built on the 7nm node. TweakTown and TechSpot suggest it could be the right time to do so, as the luster of Volta and Turing continues to fade. The former predicts it won’t be a gaming part, however, suggesting “a new GPU architecture tease that will succeed Volta in the HPC/DL/AI market.”

So far, the Ampere name has been attached to NVIDIA's future 7nm GPUs. If that holds, the Ampere GPUs would bring power-efficiency improvements, higher clock rates, and perhaps higher memory bandwidth. Now would be a good time for NVIDIA to make a big announcement, considering the company just had one of the worst fiscal quarters it's ever had. Consumer and investor faith in the company is slipping, especially since adoption of RTX technology has been much slower than expected.
 
Waiting on a 7nm part to upgrade; feels like this will be one of the last decent jumps in performance as node shrinks get harder and more expensive. Hope to get another 4 years out of my rig. Been running 980s in SLI since circa 2014.

Needs to run all current and near-future titles and next-gen console ports @ 4K60 or better.
 
Since they are FINALLY separating GPUs into true gaming and workstation divisions, maybe this will be the generation that goes all-in on performance, power be damned. It would be beautiful if they dropped the silly tensor/RTX cores and went flush with shaders. Picture a 3GHz Ti with over 2x the cores of the current gen. A gamer can dream~
 
Doesn’t really make sense unless they want to write off Turing as failed.

As soon as they tease 7nm, everyone is just going to camp and wait for it rather than buy Turing.

Heck, a lot of people are doing that anyway, so why the hell not... it would just be a subtle way for nV to admit Turing isn’t doing what they want without spooking Wall Street too badly.
 
Since they are FINALLY separating GPUs into true gaming and workstation divisions, maybe this will be the generation that goes all-in on performance, power be damned. It would be beautiful if they dropped the silly tensor/RTX cores and went flush with shaders. Picture a 3GHz Ti with over 2x the cores of the current gen. A gamer can dream~

Heh, yeah, and from what we saw with the 20XX cards and their pricing, you can expect to have to mortgage your house in order to get one unless the fucking miners snap 'em up first. :rolleyes:

I'm just about done with PC gaming completely. I'm sick of paying to be a beta tester for these companies, and of "escaped samples" that could burn my damn house down if I leave my PC on. I'm old, and I remember when PC gaming really meant something; now it's just another corporate cocksucker cash grab. Just my opinion.
 
Doesn’t really make sense unless they want to write off Turing as failed.

As soon as they tease 7nm, everyone is just going to camp and wait for it rather than buy Turing.

Heck, a lot of people are doing that anyway, so why the hell not... it would just be a subtle way for nV to admit Turing isn’t doing what they want without spooking Wall Street too badly.

Except it's not a secret that 7nm was coming this year.
 
Except it's not a secret that 7nm was coming this year.

Really?

I'd say it's no secret that nVidia would move to 7nm sooner rather than later. But no one has said 2019, especially since they still don't have the full Turing lineup out. Most rumor sites were pegging Ampere for 2020. I still don't see anything in this rumor indicating it will be out this year... just a hint that it could be announced.

Regardless, just an announcement will cause people to sit and wait. That means an already-poor sales cycle gets even worse, but if it can move the stock price needle, that seems to be the game nVidia is playing.
 
NVidia is in no rush to get to 7nm; if they were, they wouldn't be releasing a full stack of 16nm+ Turing parts, from the 2080 Ti all the way down to the 1660 so far, with the 1650 likely soon. They are more power efficient at 16nm+ than AMD is at 7nm, and the process is probably pretty cost effective and very high yield at this point.

They will definitely want to get all the 16nm+ sales exhausted before moving on to 7nm.
 
NVidia is in no rush to get to 7nm; if they were, they wouldn't be releasing a full stack of 16nm+ Turing parts, from the 2080 Ti all the way down to the 1660 so far, with the 1650 likely soon. They are more power efficient at 16nm+ than AMD is at 7nm, and the process is probably pretty cost effective and very high yield at this point.

They will definitely want to get all the 16nm+ sales exhausted before moving on to 7nm.

NVIDIA is on 12nm....
 
Hey nVidia, start with a 20% discount on your GPUs and work your way to higher sales from there...otherwise good luck....I am enjoying the show and eagerly await your next Q results.
 
I really hope these are for the Quadro and Tesla series. I need some upgrades, and GP100s are still $9K a pop in Canada.
 
NVIDIA is on 12nm....

In reality, 16nm+ is a more apt description, since it didn't actually shrink, just like the GF "12nm" that AMD uses, which is really 14nm+ and the exact same size it was the previous year when they called it 14nm.
 
The price of the 2080 Ti in 2019 is not all that ridiculous when compared to the Quantum3D Obsidian X-24 in 1998.
 
Speaking of price... are there any hard numbers on Turing sales yet? I have a feeling the 1660 exists due to poor sales.

It's pretty bad when the most budget-oriented card on the market is over $200. At least it performs very well, but I bet when the next round of consoles drops we'll see yet another decline in PC gaming, and thus in sales. This is becoming cyclical.
 
Heh, yeah, and from what we saw with the 20XX cards and their pricing, you can expect to have to mortgage your house in order to get one unless the fucking miners snap 'em up first. :rolleyes:

I'm just about done with PC gaming completely. I'm sick of paying to be a beta tester for these companies, and of "escaped samples" that could burn my damn house down if I leave my PC on. I'm old, and I remember when PC gaming really meant something; now it's just another corporate cocksucker cash grab. Just my opinion.

I agree and disagree with some of your points. I'm firmly in the camp that the high-end offering should come in around $500-$600, not $1,200. The fact that games haven't really pushed graphics forward over the last 3 years doesn't give most gamers any reason to upgrade beyond a 970-class card. After switching to 4K, I know it looked crisper, but it was still the same games at a little better resolution, nothing game changing.

Regarding your quips about burning your house down, I try not to treat sensationalized media reports as typical scenarios. Anyone who bought a defective card either exchanged it or refunded it. I've had multiple AMD flagship cards arrive DOA at launch and had to return them for another roll. I think we can all attest that Nvidia wanted to maintain their profits following the crypto bubble, and they clearly took a route, to the dismay of many gamers, that made people really consider whether an upgrade offered any better experience for the massive increase in price over older refreshes. As I've done many times before, I got two Tis for SLI, only to sell one off a few weeks later due to lack of compatibility in the games I play most. To be really honest, even one 2080 Ti is more than enough for everything I need right now, and may be for years to come.
 
It gets sensationalized, but it does happen. I had two 8800 GTX cards burn on me under warranty. In reality, the power supply kicked off immediately and the cards were just left with charred VRMs and some smoke, but I'm sure freak accidents and corner cases exist. I can name quite a few cards going back in time that were pretty well known for failure, all the way back to the Voodoo3, which was when people started modding their own fans in.

So I don't think cards with higher failure rates are anything new; what is new is paying $1,200 for a graphics card. So I'm with you all there: it's too expensive, especially these days when the average Joe has way less spending cash than he did in 1999.
 
I agree and disagree with some of your points. I'm firmly in the camp that the high-end offering should come in around $500-$600, not $1,200.

So I don't think cards with higher failure rates are anything new; what is new is paying $1,200 for a graphics card. So I'm with you all there: it's too expensive, especially these days when the average Joe has way less spending cash than he did in 1999.

Why? Why must the price be sticky, and why must the market only have products at prices you desire to pay?


To be clear, I'm not thrilled with the $1,200 price tag either, and thus I don't have one. I'll stick with my 1080 Ti for a while longer.

That said, should the highest-end sports car only cost $50k? Should a high-end home top out at $500k? I understand that in general we all want to buy the best, and we're happy when the price we want to pay for being top dog is aligned with the market. However, why should there be no higher-end products that cost more than I'm willing to pay? It's really just that you grew accustomed to paying that amount for the ego of having the best. Any $500-$600 card you purchase today blows any 5-, 10-, or 15-year-old card out of the water. You're still getting more for the same price point (ignoring inflation), just not the ego of being able to say you have the best.

Also, this has been beaten to death, but inflation really matters. That doesn't make a 2080 Ti relatively inexpensive; rather, it means your fixed $500-$600 won't buy a loaf of bread at some point in the future. It's silly to carry fixed price constructs in our minds.

The real issue is that the exponential scaling of semiconductors has seriously spoiled us. Sort of like the US manufacturing situation after WW2, it was never a long-term sustainable situation... things were always going to change.
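To put rough numbers on the inflation point, here's a quick back-of-the-envelope sketch (assuming a flat 2.5% annual rate for illustration, not actual CPI data):

```python
# What a fixed nominal GPU budget is worth over time, assuming a
# constant 2.5% annual inflation rate (illustrative numbers only).

def real_value(nominal: float, years: int, inflation: float = 0.025) -> float:
    """Purchasing power of `nominal` dollars after `years` of inflation."""
    return nominal / (1 + inflation) ** years

for years in (0, 5, 10, 20):
    print(f"$600 today buys the equivalent of "
          f"${real_value(600, years):.0f} in {years} years")
```

A fixed $600 budget quietly loses about a fifth of its purchasing power over a decade, which is why anchoring on a fixed dollar figure for "the high end" doesn't hold up.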
 
snip..

So I don't think cards with higher failure rates are anything new; what is new is paying $1,200 for a graphics card. So I'm with you all there: it's too expensive, especially these days when the average Joe has way less spending cash than he did in 1999.

Average Joe should be playing consoles and maybe should just stick to his phone for games if that's the case.
 
Hey nVidia, start with a 20% discount on your GPUs and work your way to higher sales from there...otherwise good luck....I am enjoying the show and eagerly await your next Q results.

Says the guy running a $1,200 CPU with other ultra-expensive products to go along with it. LOL.
 
Why? Why must the price be sticky, and why must the market only have products at prices you desire to pay?


To be clear, I'm not thrilled with the $1,200 price tag either, and thus I don't have one. I'll stick with my 1080 Ti for a while longer.

That said, should the highest-end sports car only cost $50k? Should a high-end home top out at $500k? I understand that in general we all want to buy the best, and we're happy when the price we want to pay for being top dog is aligned with the market. However, why should there be no higher-end products that cost more than I'm willing to pay? It's really just that you grew accustomed to paying that amount for the ego of having the best. Any $500-$600 card you purchase today blows any 5-, 10-, or 15-year-old card out of the water. You're still getting more for the same price point (ignoring inflation), just not the ego of being able to say you have the best.

Also, this has been beaten to death, but inflation really matters. That doesn't make a 2080 Ti relatively inexpensive; rather, it means your fixed $500-$600 won't buy a loaf of bread at some point in the future. It's silly to carry fixed price constructs in our minds.

The real issue is that the exponential scaling of semiconductors has seriously spoiled us. Sort of like the US manufacturing situation after WW2, it was never a long-term sustainable situation... things were always going to change.

Nvidia reacted to the over-MSRP pricing of the crypto era by setting MSRPs to new heights, matching what cards were selling for during the peak of the crypto boom. Now that gamers are the ones buying the cards again, the prices don't make any sense. You can complicate it as much as you want, but I just spelled it out for you.
 
Nvidia reacted to the over-MSRP pricing of the crypto era by setting MSRPs to new heights, matching what cards were selling for during the peak of the crypto boom. Now that gamers are the ones buying the cards again, the prices don't make any sense. You can complicate it as much as you want, but I just spelled it out for you.
You completely ignored the question. The GPU market is now far more complex than just gamers and crypto.

The GPU you buy today at any given inflation-adjusted price is faster than any GPU you've EVER been able to buy at that price point. What's changed is that your money doesn't buy you the status it previously did. What's changed is that instead of having the best at a price point you prefer, you must balance other competing requirements to subjectively determine value.
 
You completely ignored the question. The GPU market is now far more complex than just gamers and crypto.

The GPU you buy today at any given inflation-adjusted price is faster than any GPU you've EVER been able to buy at that price point. What's changed is that your money doesn't buy you the status it previously did. What's changed is that instead of having the best at a price point you prefer, you must balance other competing requirements to subjectively determine value.

When pricing is subjective, inflation doesn't equal market price.
 
You completely ignored the question. The GPU market is now far more complex than just gamers and crypto.

The GPU you buy today at any given inflation-adjusted price is faster than any GPU you've EVER been able to buy at that price point. What's changed is that your money doesn't buy you the status it previously did. What's changed is that instead of having the best at a price point you prefer, you must balance other competing requirements to subjectively determine value.

Please explain that when taking a look at a 1080 Ti vs. a 2080.
They have basically the same performance, within margin of error.
The 1080 Ti cost around $700.
The 2080 costs around $700.
How exactly is this "the fastest GPU ever"?

It's stagnation, and the next model up costs _twice_ the amount.
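Put it in perf-per-dollar terms and it's obvious. A toy calculation (the performance indices below are placeholders for illustration, not measured benchmark numbers):

```python
# Toy perf-per-dollar comparison; performance indices are illustrative
# placeholders (1080 Ti rasterization perf = 100), not benchmark data.

cards = {
    "1080 Ti": {"perf": 100, "price": 700},
    "2080":    {"perf": 100, "price": 700},   # roughly the same raster perf
    "2080 Ti": {"perf": 130, "price": 1200},  # faster, but far pricier
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 100:.1f} perf per $100")
```

If those numbers come out flat, or worse, from one generation to the next, that's the stagnation, whatever the extra RT/tensor units are worth to a given buyer.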
 
Heh, yeah, and from what we saw with the 20XX cards and their pricing, you can expect to have to mortgage your house in order to get one unless the fucking miners snap 'em up first. :rolleyes:

I'm just about done with PC gaming completely. I'm sick of paying to be a beta tester for these companies, and of "escaped samples" that could burn my damn house down if I leave my PC on. I'm old, and I remember when PC gaming really meant something; now it's just another corporate cocksucker cash grab. Just my opinion.

Is consumer-card mining still a thing? I can't imagine, with the huge drop in BTC, that it's even profitable at this point.
 
Average Joe should be playing consoles and maybe should just stick to his phone for games if that's the case.

When the now middle-of-the-road "performance" GPU costs more than a mid-level CPU and mobo combined (and in some cases RAM included), you might have a problem with runaway prices. Especially when that CPU will likely last you twice as long as the GPU will.
 
Please explain that when taking a look at a 1080 Ti vs. a 2080.
They have basically the same performance, within margin of error.
The 1080 Ti cost around $700.
The 2080 costs around $700.
How exactly is this "the fastest GPU ever"?

It's stagnation, and the next model up costs _twice_ the amount.
You’re only looking at a subset of capabilities and metrics. The CUDA core/shader performance is about the same for the same money (not surprising based on the specs and Nvidia's announcement focus). However, there are additional units on the 2080 that make it significantly faster than a 1080 Ti in other tests. How much value that provides depends on your individual use case. Turing cards also have dual INT/FP execution that the 1080 Ti does not. There are other changes too.

It's like Intel and AVX-512 support, single or dual issue. It's much faster than AVX2 if you have the right problem; it's about the same if you don't. RTX shouldn't tempt you in this case any more than AVX-512 would.

Any particular subsystem may or may not be faster, and you can cherry-pick tests to make virtually any point you want. However, across all tests, the 2080 is at least as fast, and much faster in some (where RT, tensor, or simultaneous INT/FP provides a benefit).

A 2080 is faster than a 1080 Ti, just not in everything. Nothing new about that in tech, nor in any other market. E.g., for cars, sometimes 0-60 drops, or skid-pad Gs increase, or lap times around some circuits decrease, or whatever. Rarely is every metric significantly improved in a single generation.

Edit: Here's a similar example. More CPU cores don't always make a faster CPU. They can in some tests, but in others they can actually cause a regression. Threadripper is amazing, but it's more expensive than Ryzen and not faster in every test. However, in general, we consider Threadripper to be the "more powerful" CPU even if it's slower in some cases. The option to buy Threadripper, even if it's an expensive CPU, is a great option to have, despite the fact that it's not the right CPU for many and is priced well above average.
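You can demo that effect on any box with a toy sketch (an illustration, not a real benchmark): when each work item is trivially cheap, the overhead of spinning up extra workers swamps the actual work.

```python
# Toy demonstration that more workers isn't always faster: for tiny
# work items, process startup and IPC overhead dominate the runtime.
import time
from multiprocessing import Pool

def tiny_task(x: int) -> int:
    return x * x  # trivially cheap work item

if __name__ == "__main__":
    data = list(range(10_000))

    start = time.perf_counter()
    serial = [tiny_task(x) for x in data]
    print(f"serial:    {time.perf_counter() - start:.4f}s")

    start = time.perf_counter()
    with Pool(processes=8) as pool:
        parallel = pool.map(tiny_task, data)
    print(f"8 workers: {time.perf_counter() - start:.4f}s  (overhead usually wins here)")
```

Crank the work inside tiny_task up and the eight workers eventually pull ahead, which is exactly the Threadripper tradeoff: more parallel hardware only pays off when the workload is big enough to feed it.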
 
When pricing is subjective, inflation doesn't equal market price.
And the point?

Pricing is always subjective in a free market. Inflation just gives us a way to normalize, not for subjectivity, but for time, since the value of currency is not constant.
 