Did nvidia throw a 1-2 punch to AMD with Ampere?

Stoly

Supreme [H]ardness
Joined
Jul 26, 2005
Messages
6,713
I know it's still early to tell, but it seems to me that Ampere is a pretty big blow to AMD on 2 fronts.

Absolute performance blow with the RTX3090: I don't think AMD will have anything that could come anywhere close to it, so once again AMD won't be able to claim the performance crown.

Price/performance blow with the RTX3070, since it's basically a RTX2080Ti for less than half the price. I think this is what's gonna hurt the most, assuming the rumors that NAVI20 is on par with or faster than a RTX2080Ti are true. I'm sure AMD would have loved to release a card that could go head to head with the 2080Ti for cheaper, say $900~$1000, but now it will have to price it BELOW the $499 mark just to compete with the 3070.

Even if NAVI20 is faster than the 3070, AMD still can't price it much higher, as that would get too close to RTX3080 territory.
 
I don't think there were any performance surprises, but likely the pricing was a surprise, as it was to most people.

AMD was probably expecting to drop its 2080 Ti equivalent for $600 and look good doing it. $600 is half the $1200 2080 Ti FE, so they could be the Heroes disrupting 4K...

Now with a $500 3070, they will have to kiss significant margin goodbye, and they will be hugging NVidia pricing to not lose more, so they won't have the Hero story to tell.

Though it's probably better for them that NVidia launched first; they didn't need another 5600 XT-style self-jebaiting shuffle after shipping.
 
If AMD bothered to do any homework whatsoever, they should know that nvidia does this every generation. The 980Ti's performance was roughly matched by the 1070. The 1080Ti was roughly matched by the 2070. And now, gasp, omg, shocker!!111, the 2080Ti is being roughly matched by the 3070. The only reason people are flipping their shit is the sky-high price that nvidia attached to the 2080Ti. The reason they were able to do so? Absolutely no competition from AMD. Under no circumstance was anyone getting $1200 of actual value here.
 
AMD doesn't seem overly stressed - they seem to be sticking to whatever their plan is (they recently teased the 6000 branding in a custom Fortnite map). I'm hoping this is confidence in their product, and not panic.

I've said before I do like AMD's approach; I find their silence better than the drip feed/trash talk of the Raja era (Poor Volta, Vega shirts, etc.). I'm just hoping that with Nvidia showing some of its deck of cards, AMD will give a glimpse of theirs, especially this close to the rumored launch.
 
As someone who wants AMD back on their feet (duopolies are bad, monopolies are worse), I just want AMD to build off their 5000 series and have a strong product stack from top to bottom. They have been missing that for years. Be competitive against nvidia from top to bottom, then try to take the halo crown.
 
The 1080ti was roughly matched by the 2070.

Nope, it was the 2080 (launch $800) that matched the 1080 Ti (launch $700). This was actually a step backwards in perf/dollar.

Now it's the 3070 (launch $500) matching the 2080 Ti (launch $1200). This is a massive gain in perf/dollar.

It's significantly better than Turing; NVidia moved price/performance further than everyone was expecting.
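The perf/dollar shift above can be put in rough numbers. This is a back-of-envelope sketch that treats "matched performance" as exactly equal, which is an approximation, and uses the launch MSRPs quoted in this thread:

```python
# Rough perf/dollar comparison using launch MSRPs from this thread.
# Assumption: each newer card exactly matches the older flagship's
# performance, so relative performance is normalized to 1.0.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance units per dollar (higher is better)."""
    return relative_perf / price

# Turing: 2080 ($800 FE) roughly matched the 1080 Ti ($700)
turing_gain = perf_per_dollar(1.0, 800) / perf_per_dollar(1.0, 700)

# Ampere: 3070 ($500) roughly matches the 2080 Ti ($1200)
ampere_gain = perf_per_dollar(1.0, 500) / perf_per_dollar(1.0, 1200)

print(f"Turing vs Pascal perf/dollar: {turing_gain:.2f}x")  # ~0.88x, a step back
print(f"Ampere vs Turing perf/dollar: {ampere_gain:.2f}x")  # ~2.40x, a big gain
```

Under those assumptions, Turing's flagship-matching card was a slight perf/dollar regression while Ampere's is well over a 2x improvement, which is the contrast being argued here.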
 
Nope, it was the 2080 (launch $800) that matched the 1080 Ti (launch $700). This was actually a step backwards in perf/dollar.

Now it's the 3070 (launch $500) matching the 2080 Ti (launch $1200). This is a massive gain in perf/dollar.

It's significantly better than Turing; NVidia moved price/performance further than everyone was expecting.

https://www.techpowerup.com/gpu-specs/geforce-rtx-2070.c3252

The 2070 (launch $500) is 10% slower than the 1080Ti. The 2070 Super (launch $500) is 1% faster. That's matching performance just fine. And don't bring the perf/dollar argument into this with prices that aren't tied to a competitive marketplace. Anyone who thinks the 2080Ti was worth every penny of that $1200 is a damn fool. Do you not know how diminishing returns work on halo products?
 
https://www.techpowerup.com/gpu-specs/geforce-rtx-2070.c3252

The 2070 (launch $500) is 10% slower than the 1080Ti. The 2070 Super (launch $500) is 1% faster. That's matching performance just fine. And don't bring the perf/dollar argument into this with prices that aren't tied to a competitive marketplace. Anyone who thinks the 2080Ti was worth every penny of that $1200 is a damn fool. Do you not know how diminishing returns work on halo products?

Now you are shifting goalposts. That isn't the normal 2070, and that wasn't part of the Turing launch.
 
I'm not shifting anything. Look at that link. The 980Ti (launch $650) is 8% slower than the 1070 (launch $350). Nvidia does this every generation; the only difference is that this year all the idiots think they are getting some huge new value. You aren't. The fact of the matter is that every single one of these cards in the 2000 and 3000 generations is $100 to several hundred dollars more expensive than previous generations for basically the exact same performance increase. TechSpot actually covered this pretty well in a recent article that pitted the 980ti vs 1080ti vs 2080ti. Their conclusion?

While we could observe a few instances where the RTX 2080 Ti was ~50-60% faster than the 1080 Ti, that’s not quite enough to justify the 70% increase in price, especially when talking about GPUs of different generations. In most games that margin is closer to 20-30%, and frankly we’d have hoped that would be a worse case for Turing vs. Pascal at the same price point.

Is it fair to say Turing was a disappointment? That was my opinion upon release and it’s still my opinion two years later, and I'd have loved to be proven wrong. At one point we thought Nvidia would be forced to replace the GeForce 20 series much sooner than they have, but our mistake was placing too much trust in AMD’s ability to hit them with a range of Radeon 7nm GPUs, Radeon VII anyone?

Cmon man, 11K posts here, you should be smarter than this. Stop putting any type of credence on that $1200 figure; it literally means nothing. But hey, if you want to drink the koolaid that a vendor gives you, feel free man. I own stock in Nvidia and I need people like you to drive the price up after today's and yesterday's Nasdaq slaughter.
 
Reviews at launch had the 2080 and 1080Ti within a few percent of each other in most benchmarks, which made upgrading pointless unless you went for a 2080 Ti, and is why Turing was marked as a bad value unless you believed in ray tracing.

Over time the gap has widened due to optimisation.
 
I believe in the future of RT, but it sucked balls on the 2000s. Maybe it's better with RDNA2 or the 3000s, but I'll wait for reviews.
 
Cmon man, 11K posts here, you should be smarter than this. Stop putting any type of credence on that $1200 figure; it literally means nothing. But hey, if you want to drink the koolaid that a vendor gives you, feel free man. I own stock in Nvidia and I need people like you to drive the price up after today's and yesterday's Nasdaq slaughter.

Smart enough to realize that actually was the price, whether you like it or not; AIB cards were often selling higher than $1200. And the 2070 did not launch with equal performance to the 1080 Ti, as you claimed.

Smart enough to realize every launch is not the same, as you claimed. Pascal was one of the best-ever launches for NVidia, Turing was one of the roughest, and Ampere looks to be shaping up as one of the best...
 
I'm not shifting anything. Look at that link. The 980Ti (launch $650) is 8% slower than the 1070 (launch $350). Nvidia does this every generation; the only difference is that this year all the idiots think they are getting some huge new value. You aren't. The fact of the matter is that every single one of these cards in the 2000 and 3000 generations is $100 to several hundred dollars more expensive than previous generations for basically the exact same performance increase. TechSpot actually covered this pretty well in a recent article that pitted the 980ti vs 1080ti vs 2080ti. Their conclusion?

Cmon man, 11K posts here, you should be smarter than this. Stop putting any type of credence on that $1200 figure; it literally means nothing. But hey, if you want to drink the koolaid that a vendor gives you, feel free man. I own stock in Nvidia and I need people like you to drive the price up after today's and yesterday's Nasdaq slaughter.

Appreciate being called an idiot. Like Snowdog said, the 2070S didn't launch with Turing. The 2070 didn't match the performance of the 1080Ti. That was the 2080. In some cases the 1080Ti was faster than the 2080.

Ampere has a pretty compelling value proposition. Turing didn't.
 
3060 at $400 is twice the average price of previous xx60 gens, with the exception of the 2060. These were traditionally $200-250.
3070 at $500 is more than $100 above previous gens, which were $329-399.
3080 at $700 is $50-200 more than previous gens, which were $500-650.

How is any of this a better value? Yay higher asking prices, I can't believe the deal we are getting! But hey, I'm glad you all think these prices are a good thing. Nvidia stock is down $100 in two days; I need you all to boost it up. Your money, not mine, lmao.
 
3060 at $400 is twice the average price of previous xx60, with the exception of the 2060
2060 launched at $350, so a $50 increase here.
3070 at $500 is more than $100 above previous, which were $329-399
2070 launched at $500. Same price.
3080 at $700 is $50-200 more than previous, which were $500-650
The 2080 launched at $700 ($800 for the FE), making the 3080 the same price, and cheaper for the FE.

How is any of this a better value? Yay higher asking prices, I can't believe the deal we are getting!
 
To add, when Turing released, the 2080 was about the same price, give or take, as the 1080Ti. At the time, it gave you negligible performance gains. Now we have the 3080 around the same price, with something like 60-80% performance gains. That's why this is a much better value than the last generation.
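The same-price-more-performance point can be put in numbers. A minimal sketch, noting that the 60-80% gain figure is the thread's pre-review speculation, not a measured result:

```python
# Value change across generations: perf/dollar scales with the
# performance gain and inversely with the price change.
# The 60-80% Ampere gain is speculative (pre-review rumor).

def value_multiplier(perf_gain_pct: float, price_old: float, price_new: float) -> float:
    """Ratio of new perf/dollar to old perf/dollar."""
    return (1 + perf_gain_pct / 100) * price_old / price_new

# Turing: 2080 at $800 (FE) vs 1080 Ti at $700, ~0% gain at launch
print(f"{value_multiplier(0, 700, 800):.2f}x")   # ~0.88x -- worse value

# Ampere: 3080 at $700 vs 2080 FE at $800, rumored 60-80% gain
print(f"{value_multiplier(60, 800, 700):.2f}x")  # ~1.83x
print(f"{value_multiplier(80, 800, 700):.2f}x")  # ~2.06x
```

Even at the low end of the rumored range, that is roughly double the value proposition Turing offered at the same price point, assuming the rumored gains hold up in reviews.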
 
Exactly. The 2000 series started the increased prices and the 3000 series is continuing it. Let's see how the actual reviews are for performance. An xx60 for $400? That's xx70-level pricing. Prices have pushed all products up a tier.
 
Exactly. The 2000 series started the increased prices
2000 series was a big price jump, but prices had increased before.
and the 3000 series is continuing it. All while bringing performance increases roughly equal to previous gens. Unless one of you wants to make the argument otherwise?

Disagree. See previous posts. That's just not aligned with reality. I don't think you were paying attention to the video card market around the Turing launch; if you had been, you wouldn't be saying these things.
 
3060 at $400 is twice the average price of previous xx60 gens, with the exception of the 2060. These were traditionally $200-250.
3070 at $500 is more than $100 above previous gens, which were $329-399.
3080 at $700 is $50-200 more than previous gens, which were $500-650.

How is any of this a better value? Yay higher asking prices, I can't believe the deal we are getting! But hey, I'm glad you all think these prices are a good thing. Nvidia stock is down $100 in two days; I need you all to boost it up. Your money, not mine, lmao.

I need to keep correcting your mistakes.

We don't have a 3060 price yet. The same rumors that claimed a $400 3060 were $100 high on the 3070 and 3080.
2070 launched at $600 for FE cards, 3070 FE is $500, $100 less.
2080 launched at $800 for FE cards, 3080 FE is $700, $100 less.

On top of launching cheaper than Turing, Ampere is delivering a much bigger performance jump.
 
I highly doubt we are getting a 60-80% perf increase; that's purely speculative. AMD isn't pressuring Nvidia that much.
 
$50 here and there for top end products isn't going to do anything. This same thing happened last gen with that whole 'jebaited' crap
 
BTW there has been no mention of the rumored memory compression technology for Ampere, that would help explain the "low" memory size in the RTX3070/3080 cards.
 
BTW there has been no mention of the rumored memory compression technology for Ampere, that would help explain the "low" memory size in the RTX3070/3080 cards.

Probably because more memory wasn't really necessary, at least that is the official NVidia party line:
https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/
Q: Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
 
Also, it didn't really take that long for people to start calling $499 "cheap". I gotta say, the RTX3070 gives a lot of punch for the money. This would be my very first $499 card ever; my previous top was $369 for my current GTX1070Ti.
 
Also, it didn't really take that long for people to start calling $499 "cheap". I gotta say, the RTX3070 gives a lot of punch for the money. This would be my very first $499 card ever; my previous top was $369 for my current GTX1070Ti.

Most I ever paid was $275 for a 9700 Pro. This time it's going to be MUCH more.
 
I agree to some extent, but I just can't live with 640k anymore :D :D

I was hoping for memory increases everywhere. I didn't figure on G6X, so I thought every price point would get more memory channels and thus more memory.

Though I have said many times, I don't get why they haven't already let AIBs double up on memory if they want to. It really seems like there is a market for more memory, at least until they see the price. :D
 
I was hoping for memory increases everywhere. I didn't figure on G6X, so I thought every price point would get more memory channels and thus more memory.

Though I have said many times, I don't get why they haven't already let AIBs double up on memory if they want to. It really seems like there is a market for more memory, at least until they see the price. :D
Mainly because Micron only has 8Gb chips for now. 16Gb chips are coming, I think early-to-mid 2021.
 
Mainly because Micron only has 8Gb chips for now. 16Gb chips are coming, I think early-to-mid 2021.

I am referring to the past. Regular GDDR5 and GDDR6 double-density chips have existed for a long time, AFAIK.

So why no 16GB 2070 Super and 5700XT cards?
 
I am referring to the past. Regular GDDR5 and GDDR6 double-density chips have existed for a long time, AFAIK.

So why no 16GB 2070 Super and 5700XT cards?
Well, to start, 16GB would mainly be useful for 4K and up, and none of those cards would have the power to drive games that would require that much memory.
 
I had no idea so many of you were doing creative work, deep learning, or extreme graphics high resolution gaming.

As for AMD's response, they need to shore up their software. I saw a problem with the Radeon recording software being divorced from the driver a month or so back. It freaked out the kid I built a box for, because content creation is just as important. If he can't share a clip, it didn't happen, I guess.

Nvidia's audio and background features, as applied to work recordings or streaming, are valuable enough for me for work. I hate playing around with OBS compressor and noise gate settings; I'd like an easy mode because audio isn't my thing. The background feature would be nice because I don't feel like rapidly cleaning my home office. AMD should have something to counter these gaming-adjacent functions. Running cams eats into a one-PC rig.

It'd be nice if AMD open-sourced a similar latency suite. They'll have to attack the halo concept of latency reduction.

We will see what happens.
 
Not really, though I imagine the pricing announcement had a few red team senior managers exclaim “ah fuck”. Both sides have probably been playing espionage chicken on that for as long as possible; it looks better for Nvidia to get out in front with a low price than to take the PR hit of dropping prices in a couple of months.


I’m seeing it as nvidia coming straight out of their corner in the sixth and landing a clean power jab to the face, but the cross going into AMD’s guard. Good hit, gets the crowd going a bit, but it’s not putting anyone on the canvas.

Now we see if AMD keep their hands up and get fighty, or just keep getting jabbed out. Better than the fourth-round liver punch they took with the 1000 series. Luckily, the smelling salts of a soaring stock price and a ton more resources have geed AMD up.

Labored boxing metaphor aside, the most interesting thing for me has been hearing from people about the announcements, people I’ve never heard from before on tech like this. Don’t know if this is a covid thing where a lot of us more ‘seasoned’ people are getting more play time than we’ve had for years, and FS2020 being a beast, but it has been notable.
 
I had no idea so many of you were doing creative work, deep learning, or extreme graphics high resolution gaming.

I know, right. Demographically I'm sure they're over-represented here, but given the generally wince-inducing views on enterprise tech and cloud, there's a disconnect.

There are a few people that obviously understand ML and the technical parts of graphics and vision, but many more that, er... don't.

One of the easy tests is price, anyone going “that many ‘cores’, nvlink and 24gb for $1500... sign me the fuck up” they know the score.
 
The real price, the one you can actually pay online or at retail, is what matters to me. If Nvidia and/or AMD have no cards, or so few that buying one is an utter chore on top of the natural price hike from sellers, then the advertised prices mean squat. How many could buy a good-quality 2080Ti at the advertised starting price of $999 during the whole generation? That price was meaningless; some bought the EVGA 2080Ti Black Edition with a lot of regrets following, and it was rarely in stock and usually over that price elsewhere. Before we conclude this is excellent pricing, let's see whether it actually reflects reality later.

As for AMD, it would be hard for me to think AMD was shooting for 2080Ti-level performance; maybe that is all they could get, but I expect something much more than that from AMD - we will see.
 
As for AMD, it would be hard for me to think AMD was shooting for 2080Ti-level performance; maybe that is all they could get, but I expect something much more than that from AMD - we will see.

I don't think anyone is arguing that Big Navi tops out at 2080 Ti level.

The argument was:

That they wouldn't reach the 3090 - so they miss the performance crown.
That the 3070 was cheaper than expected and messes with AMD's bang/buck plans.

There's been no real discussion of the 3080 matchup, which is the interesting one.

I think Big Navi, will mix it up with 3080 in most ways. Similar performance, similar power usage and similar price.

After all, they have time to tweak clock/memory speeds and pricing before launch to get there, or close enough. There is a surprising amount of room to squeeze out more performance if you are willing to push more power.

AMD is no doubt assessing 3080 performance and digging into their silicon headroom and cooler efficiency. If they tweak it enough, they can get what they need, even if it's a bit noisy. AIBs will bring quieter coolers, just like they always do.

Once the card is delivering the performance it needs, they just have to set the appropriate price. Maybe with 16GB of HBM they match the 10GB 3080 and highlight the extra VRAM, and sell an 8GB version that undercuts the 3080 pricing.
 
I think AMD is better off going HBM2e; they already have experience with it, and traditionally AMD cards are more memory-bandwidth hungry.

16GB of GDDR is only practically possible on a 256-bit bus. While a 512-bit bus is possible, it's quite complex/costly to implement, more so at high memory speeds, so HBM2e would be a better choice IMO.
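The bus-width constraint comes down to simple arithmetic: each GDDR6 chip sits on a 32-bit slice of the bus, so chip count, and therefore total capacity, is fixed by bus width and per-chip density. A quick sketch, assuming the standard one-chip-per-channel (non-clamshell) layout:

```python
# GDDR capacity from bus width and chip density: each GDDR6 chip
# occupies a 32-bit channel, so a 256-bit bus holds 8 chips.

def gddr_capacity_gb(bus_width_bits: int, chip_density_gbit: int) -> int:
    chips = bus_width_bits // 32           # one chip per 32-bit channel
    return chips * chip_density_gbit // 8  # gigabits -> gigabytes

print(gddr_capacity_gb(256, 8))    # 8 GB  (e.g. a 256-bit card on 8Gb chips)
print(gddr_capacity_gb(256, 16))   # 16 GB with double-density 16Gb chips
print(gddr_capacity_gb(320, 8))    # 10 GB (a 3080-style 320-bit bus)
print(gddr_capacity_gb(512, 16))   # 32 GB, but 512-bit buses are costly
```

In practice clamshell mode (two chips sharing one 32-bit channel) can double these figures, which is how AIB double-memory cards have historically worked; but without it, getting to 16GB on GDDR means either 16Gb chips on 256-bit or a much wider, more expensive bus.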
 
I have a feeling Big Navi will land somewhere between a 3080 and 3090 in normal raster performance. Ray tracing remains a wild card, but I couldn't care less about that.
 