Rumor: 3080 31% faster than 2080 Ti

The second sentence in that post is a non sequitur from the first.

Nvidia doesn't sell GPUs for what they cost to produce. Nvidia sells them at a markup in order to make a profit; the difference between the cost to manufacture and distribute a card and what it sells for is the profit margin.
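To put numbers on those terms, here's a toy Python sketch; the dollar figures are invented for illustration, not real Nvidia costs:

```python
# Toy illustration of the price/cost/margin relationship.
# Both numbers are made up for the example, not real Nvidia figures.
unit_cost = 400.0  # hypothetical cost to manufacture and distribute one card
price = 700.0      # hypothetical retail price

profit = price - unit_cost
gross_margin = profit / price  # margin as a fraction of the selling price

print(f"profit per card: ${profit:.2f}")       # $300.00
print(f"gross margin:    {gross_margin:.1%}")  # 42.9%
```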

How much people are willing to pay for a graphics card guides how much Nvidia choose to mark their cards up. If Nvidia think they can charge more and enough people will still pay it, then they'll increase the price. If they think they'll make more money from a lot more people being willing to buy their GPUs at a better performance-per-dollar point, then they'll lower the prices of their GPUs.

The RTX 2000 series is surely not as exorbitantly and prohibitively expensive as it is because Nvidia kept the same profit margin as Pascal while development and manufacturing simply cost that much more. It was surely because Nvidia are greedy, thought they could get more for them, and so pushed the profit margin way up - but the Steam hardware stats don't appear to validate Nvidia's gambit. If part costs increase and the market's disposable income simultaneously decreases, that might mean Nvidia have to reduce their prices, along with their profit margins, in order to maintain their volume of sales. The new consoles are also a competitor to 3000-series sales, and that too surely factors into what prices they'll be set at.

Since the 2000 series already wasn't within most people's budgets, and since the consumer market overall has less disposable income now due to covid-19, it makes sense that Nvidia would take that into account when pricing their new products. That's not me saying that Nvidia will price the 3000 series cheaper. But there are a lot of reasons for them to do so.
I'm not sure nVidia has much wiggle room on the pricing without cutting R&D; the bulk of the 2000 series costs were because of TSMC. I don't expect much to change on the 3070 or up. I could see the actual card manufacturers doing all sorts of mail-in rebates or cutting $50 off MSRP, but nVidia certainly isn't going to sell the parts at a loss. I do expect them to get a lot more competitive in the sub-$300 range, which is where the bulk of the sales are anyway, but that will have more to do with AMD being very aggressive in this segment. I am expecting both nVidia and AMD to post low numbers this year, with any unsold chips getting rebadged next gen as reduced SKUs at a discount (e.g. a 3060 rebranded as a 4055).
 
Arguments about prices dropping because of CV-19 are nonsense. Most parts are increasing in price.

Costs and price are unrelated.

No business that knows what they are doing uses a "cost plus" model to set their pricing. That's a great way to drive yourself out of business.

Pricing is set entirely based on what the market will bear. If over time, the market won't sustain the same high prices, that comes out of the business profit margins in the short term, and they try to reduce cost in design long term. If the market won't sustain a high enough price to provide an acceptable profit margin over cost, then the product just doesn't get made.

"Cost plus" pricing is something that died outside of unsophisticated mom and pop shops back in the 60's. It's just not how economics works.
 
My problem isn't just the high price and low performance jump, but the time frame. After 2+ years I'd hope for more than 30%; probably closer to 40-45%. If the price shoots up I'd expect a bigger performance jump, or a shorter time between generations.

Though I realize this is largely because AMD isn't competing outside the $200 range. I know their recent cards can match RTX 2070 levels, but the market share clearly isn't strong enough. Seems like they had supply issues and came too late this generation to compete properly there.
 
My problem isn't just the high price and low performance jump, but the time frame. After 2+ years I'd hope for more than 30%; probably closer to 40-45%. If the price shoots up I'd expect a bigger performance jump, or a shorter time between generations.

Though I realize this is largely because AMD isn't competing outside the $200 range. I know their recent cards can match RTX 2070 levels, but the market share clearly isn't strong enough. Seems like they had supply issues and came too late this generation to compete properly there.
Until something big happens with rendering pipelines, I doubt we are going to see any huge increases in performance without radically breaking titles from a few years ago. We are at one of those tipping points: graphics cards need to move to MCM for the performance increases and cost decreases the architecture can provide, but the software engines need to be developed for the architecture first. Otherwise we are going to have cases where, yeah, on a new title you see a 200% performance increase, but you see a 75% decrease on old titles. I know that Intel, AMD, and nVidia are working on bringing MCM graphics packages to the consumer market, but until they get here, we are where we are.
 
Until something big happens with rendering pipelines, I doubt we are going to see any huge increases in performance without radically breaking titles from a few years ago. We are at one of those tipping points: graphics cards need to move to MCM for the performance increases and cost decreases the architecture can provide, but the software engines need to be developed for the architecture first. Otherwise we are going to have cases where, yeah, on a new title you see a 200% performance increase, but you see a 75% decrease on old titles. I know that Intel, AMD, and nVidia are working on bringing MCM graphics packages to the consumer market, but until they get here, we are where we are.

Bring on Hopper.
 
My problem isn't just the high price and low performance jump, but the time frame. After 2+ years I'd hope for more than 30%; probably closer to 40-45%. If the price shoots up I'd expect a bigger performance jump, or a shorter time between generations.

Though I realize this is largely because AMD isn't competing outside the $200 range. I know their recent cards can match RTX 2070 levels, but the market share clearly isn't strong enough. Seems like they had supply issues and came too late this generation to compete properly there.

I have no facts or leaks to base this on, but I expect them to try to push the hell out of improved raytracing features and use that to drive up pricing, while overall raster improvements over the current gen end up disappointing.
 
Costs and price are unrelated.

The hell they aren't. The price of goods is proportional to production costs. If production costs go up, end user prices go up.

We are much more likely to get CV-19 price increases than decreases.

There will still be plenty of buyers for $1000+ GPUs in a CV-19 world.

There are lesser models for those that want to spend less money.
 
That's half the problem: a price war with who???? At this point we only have one vendor fabbing GPUs, and that's TSMC.

I am pretty sure someone can use Samsung as well if they were unhappy with the pricing at TSMC.
 
I don't disagree. If prices are set high to start with, they can sell what they can, then lower the price, and so on. But the limit on production capacity means they don't have the incentive.
 
The hell they aren't. The price of goods is proportional to production costs. If production costs go up, end user prices go up.

We are much more likely to get CV-19 price increases than decreases.

There will still be plenty of buyers for $1000+ GPUs in a CV-19 world.

There are lesser models for those that want to spend less money.

You should take a few economics classes.

Start with this:
https://en.wikipedia.org/wiki/Value-based_pricing
 
I am pretty sure someone can use Samsung as well if they were unhappy with the pricing at TSMC.
Not if they want to compete; it would require a ~20% die size increase, a 10% clock decrease, and a slight bump in energy use. Samsung's new 7nm process is supposed to be on par with TSMC's current one, but they don't have that fab up and running.
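To make the tradeoff concrete, here's a rough sketch using those claimed deltas; the +20% area and -10% clock figures come from the post above, and the linear cost-per-area scaling is a simplifying assumption:

```python
# Rough cut at porting a fixed GPU design to Samsung's current process,
# using the post's claimed deltas. Assumes cost per die scales linearly
# with die area (ignores yield and edge effects).
AREA_SCALE = 1.20   # ~20% die size increase
CLOCK_SCALE = 0.90  # ~10% clock decrease

cost_per_die_scale = AREA_SCALE   # fewer dies per wafer -> ~1.2x die cost
perf_scale = CLOCK_SCALE          # performance roughly tracks clock here

print(f"performance:     {perf_scale:.2f}x")                        # 0.90x
print(f"cost per die:    {cost_per_die_scale:.2f}x")                # 1.20x
print(f"perf per dollar: {perf_scale / cost_per_die_scale:.2f}x")   # 0.75x
```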
 
You should take a few economics classes.

Start with this:
https://en.wikipedia.org/wiki/Value-based_pricing

That's about selling things at obscene markups, like fashion items that are completely divorced from reality. Like $3000 Gucci bags that cost $10 to manufacture.

GPUs, with tens of millions of dollars in R&D and large, expensive silicon chips, are not in this kind of category.

There have been cost tear-downs and estimations on GPUs, and their margins are quite grounded in reality.
 
Not if they want to compete; it would require a ~20% die size increase, a 10% clock decrease, and a slight bump in energy use. Samsung's new 7nm process is supposed to be on par with TSMC's current one, but they don't have that fab up and running.

They were doing just fine on an older process than AMD, or they could delay and wait if TSMC is gouging, which I doubt. Looking forward to prices and reviews tho.
 
They were doing just fine on an older process than AMD, or they could delay and wait if TSMC is gouging, which I doubt. Looking forward to prices and reviews tho.
It's an optics/marketing thing, but even then Samsung is also heavily booked and can only do some 500 more wafers per day than TSMC, so while there would be a slight savings it would not be a huge amount. 7nm is expensive, end of story. TSMC isn't gouging by a long shot, but the fabs cost something like $30B a pop and they need to make their ROI.
 
That's about selling things at obscene markups, like fashion items that are completely divorced from reality. Like $3000 Gucci bags that cost $10 to manufacture.

GPUs, with tens of millions of dollars in R&D and large, expensive silicon chips, are not in this kind of category.

There have been cost tear-downs and estimations on GPUs, and their margins are quite grounded in reality.


Yep, and if you read it, you learn that these days it's pretty much reserved for payment for custom work or projects, like defense contracting.

Essentially, if it isn't a custom job, or a government contract, pricing was probably set through some sort of value based pricing model.

Just about all consumer goods, everything from, yes, expensive Gucci Bags to cheap fast fashion, cars, and computer parts, utilize value based pricing models these days, and have for 50+ years, because that's how you both maximize revenues when you have a market leading product and minimize losses when you don't.
 
They're going to be even more expensive. $1000 for the 3080, $1500 for the 3080 Ti, $2k+ for the 3090. I don't see AMD releasing something more powerful than a 2080 Ti.
I don't know; by all counts the GPU in the Xbox X should keep pace with the 2080... I would hope they can put out a discrete card that performs at least at 2080 Ti levels, considering they are much less constrained (TDP and $). If they can make a CPU + GPU and sell it for $500, I can't see why a $700-$800 GPU (or $1,000+) shouldn't be a good bit faster. Maybe not twice as fast, but 50% without the limited TDP and the need to keep costs as low as possible. My wild a$$ guess is the top end part will be slightly faster than a 2080 Ti (depending on the game) and fall short of the top tier 30x0 part(s). I have nothing but assumptions to base this on and could easily be off the mark.
 
They're going to be even more expensive. $1000 for the 3080, $1500 for the 3080 Ti, $2k+ for the 3090. I don't see AMD releasing something more powerful than a 2080 Ti.

Well.

Based on the little we know thus far we can make some predictions.

AMD is on record stating that RDNA2 has "up to 50% better performance per watt".

That "up to" dilutes the usefulness of the information a little, but if we give them the benefit of the doubt, here's what that looks like.

Typically stock GPUs tend to top out at about 250W; if that's where they go, expect something up to 60% faster than a current 5700 XT, which means it would be trading blows with a 2080 Ti.

AMD have not been shy about upping the TDP a whole lot more in the past with fancy AIO coolers, though. If they are willing to go up to 350W again, like they did with one of the liquid cooled Vega 64 cards (Frontier Edition or something?), we could be talking 125% faster than the 5700 XT, which would make it the fastest consumer GPU on the market, at least until Ampere hits. And if the fastest 3000 series card is indeed 31% faster than a 2080 Ti, a 350W TDP RDNA2 could wind up being faster than that. That liquid cooled Frontier Edition Vega card was targeted at workstation users though, and had a $1,499 price tag... And who knows if they'd be willing to go that extreme power wise.
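If you want to check that math, here's the back-of-envelope version; the 225W baseline for the 5700 XT and the assumption that performance scales linearly with board power are mine, which is why the numbers land slightly above the 60%/125% figures:

```python
# Back-of-envelope projection from AMD's "up to 50% better perf/watt" claim.
# Assumes a 225 W 5700 XT baseline and linear perf-with-power scaling,
# both simplifications; real scaling is sublinear at high power.
BASELINE_WATTS = 225.0    # Radeon RX 5700 XT total board power
PERF_PER_WATT_GAIN = 1.5  # "up to 50% better performance per watt"

def projected_speedup(target_watts: float) -> float:
    """Relative performance vs the 5700 XT at a given board power."""
    return (target_watts / BASELINE_WATTS) * PERF_PER_WATT_GAIN

for watts in (225, 250, 350):
    print(f"{watts} W part: ~{projected_speedup(watts) - 1:.0%} faster than 5700 XT")
# 225 W part: ~50% faster
# 250 W part: ~67% faster
# 350 W part: ~133% faster
```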

It's going to be interesting to see where this one lands for sure. In reality, we likely won't see the extreme 350W scenario above, but that is the upper limit based on what they have told us about perf/watt compared to RDNA.

The performance of AMD's hardware raytracing is a big unknown, too, as is whether or not it will really take off the way Nvidia hopes it will.
 
Yep, and if you read it, you learn that these days its pretty much reserved for payment for custom work, or projects, like defense contracting.

Essentially, if it isn't a custom job, or a government contract, pricing was probably set through some sort of value based pricing model.

Just about all consumer goods,m everything from yes, expensive Gucci Bags, but even cheap fast fashion, or cars or computer parts utilize value based pricing models these days, and have for 50+ years, because that's how you both maximize revenues when you have a market leading product, and minimize losses when you don't.

It's not purely one model. You are acting too much like GPUs are a fashion accessory business, when they aren't.
https://www.investopedia.com/terms/v/valuebasedpricing.asp
Examples of Value-Based Markets
The fashion industry is one of the most heavily influenced by value-based pricing, where value price determination is standard practice. Typically, popular name-brand designers command higher prices based on consumers' perceptions of how the brand affects their image. Also, if a designer can persuade an A-list celebrity to wear his or her look to a red-carpet event, the perceived value of the associated brand can suddenly skyrocket. On the other hand, when a brand's image diminishes for any reason, the pricing strategy tends to re-conform to a cost-based pricing principle.

Other industries subject to value-based pricing models include name-brand pharmaceuticals, cosmetics, and personal care.

All of these examples are products where production costs are an insignificant part of the final price, and you are acting like GPU production costs are an insignificant part of the final price as well.

But that is completely opposite from reality.

GPUs are VERY expensive to produce and engineer. From the looks of things, GPU companies need >40% gross margins for their business to be maintainable. NVidia has been running near 60% gross margin.

There are no fantastic margins to evaporate here.

And regardless, even if your fantasy did exist, CV-19 would not lower them.

If we go by your theory, Gucci Handbags will soon be $75 next week because with CV-19 people can't afford as many $3000 handbags. Right?

But that's not going to happen. Because even if you want to pretend that NVidia has designer-handbag inflated prices, the last thing those excess-priced designer brands want to do is expose the illusion and deflate their pricing bubble, so Gucci Handbags and NVidia GPUs will be priced as usual, crisis or no crisis, even if your fantasy had merit.

Back in practical reality, Covid-19 only seems to be driving computer component prices up, from good old fashioned supply and demand, and production cost increases.
 
Well.

Based on the little we know thus far we can make some predictions.

AMD is on record stating that RDNA2 has "up to 50% better performance per watt".

That "up to" dilutes the usefulness of the information a little, but if we give them the benefit of the doubt, here's what that looks like.

Typically stock GPUs tend to top out at about 250W; if that's where they go, expect something up to 60% faster than a current 5700 XT, which means it would be trading blows with a 2080 Ti.

AMD have not been shy about upping the TDP a whole lot more in the past with fancy AIO coolers, though. If they are willing to go up to 350W again, like they did with one of the liquid cooled Vega 64 cards (Frontier Edition or something?), we could be talking 125% faster than the 5700 XT, which would make it the fastest consumer GPU on the market, at least until Ampere hits. And if the fastest 3000 series card is indeed 31% faster than a 2080 Ti, a 350W TDP RDNA2 could wind up being faster than that. That liquid cooled Frontier Edition Vega card was targeted at workstation users though, and had a $1,499 price tag... And who knows if they'd be willing to go that extreme power wise.

It's going to be interesting to see where this one lands for sure. In reality, we likely won't see the extreme 350W scenario above, but that is the upper limit based on what they have told us about perf/watt compared to RDNA.

The performance of AMD's hardware raytracing is a big unknown, too, as is whether or not it will really take off the way Nvidia hopes it will.
Additionally, remember we have benches from January showing an unknown AMD card/config beating the 2080 Ti by a sizeable percentage in a VR benchmark.
https://videocardz.com/newz/mysterious-amd-radeon-gpu-appears-in-openvr-benchmark-leaderboard
Too early to tell performance for either the RTX 3XXX or RDNA2 cards. I'm just hoping for competition to get these high end prices lowered a bit.
 
Pascal was out for 2 years and 4 months before Turing released. Turing will have been out for 2 years when Ampere releases. Not too much of a difference there.

That's not completely accurate. If you look at the high volume segment, Pascal (1060) was out for 29 months before the Turing replacement arrived (2060). That replacement has only been on the market for 17 months. There's no doubt Pascal was extremely popular and Turing isn't selling as well, but Pascal was on the market for much longer.

I think the other factor is we're at the end of a console generation and games simply aren't that demanding. For the vast majority of people who aren't trying to game at 4K Pascal cards are just fine. This new console generation will usher in a wave of upgrades in a year or two when games actually need more horsepower.
 
It's not purely one model. You are acting too much like GPUs are a fashion accessory business, when they aren't.
https://www.investopedia.com/terms/v/valuebasedpricing.asp


All of these examples are products where production costs are an insignificant part of the final price, and you are acting like GPU production costs are an insignificant part of the final price as well.

But that is completely opposite from reality.

GPUs are VERY expensive to produce and engineer. From the looks of things, GPU companies need >40% gross margins for their business to be maintainable. NVidia has been running near 60% gross margin.

There are no fantastic margins to evaporate here.

And regardless, even if your fantasy did exist, CV-19 would not lower them.

If we go by your theory, Gucci Handbags will soon be $75 next week because with CV-19 people can't afford as many $3000 handbags. Right?

But that's not going to happen. Because even if you want to pretend that NVidia has designer-handbag inflated prices, the last thing those excess-priced designer brands want to do is expose the illusion and deflate their pricing bubble, so Gucci Handbags and NVidia GPUs will be priced as usual, crisis or no crisis, even if your fantasy had merit.

Back in practical reality, Covid-19 only seems to be driving computer component prices up, from good old fashioned supply and demand, and production cost increases.

I didn't say they were necessarily going to drive down prices. I did say that the current depressed consumer market would impact the price-setting demand curve model in some way. It's not clear how.

And yes, Nvidia is raking in huge margins on some consumer GPUs. Let's not forget that they are selling top level GPUs for $1,200. This pricing is insane, especially considering most performance increases in the last 15 years can be explained almost entirely by process node improvements, not by architecture engineering.

My first top end GeForce product was the GeForce3 Ti 500. Brand new it cost $350.

Then there was the GeForce GTX 285, which was $400 brand new.

Now these are two examples of top end cards which were among the cheapest top end cards Nvidia has ever made, and a lot of the reason they were cheap was price pressure from Ati/AMD at the time they launched, but I highly doubt they were selling them at a loss.

And yes, cost of goods sold has gone up since then in this market, in large part due to the difficulty of working with shrinking process nodes, but it has not gone up THAT much.

Mid range and low end cards are probably selling with more reasonable margins due to price pressure from AMD, but there is no reason a founders edition 2080ti should be $1200 other than the fact that Nvidia can, because they don't have competition and thus can uniquely provide value that no one else can. At a more reasonable markup, if there were competition at that performance level, a more reasonable price for a founders edition 2080ti is probably just below where the 980ti was, at about $600. My completely unfounded guess is that they would have a completely reasonable 30-35% margin if they sold a founders edition 2080ti at $600. That would make a founders edition 2080ti today a 65% to 67.5% profit margin, which is pretty grotesque, and not something they could do if they had competition.
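For transparency, the arithmetic behind that guess looks like this; every dollar figure is the post's hypothetical, not a known cost:

```python
# Working through the margin guess above. If a hypothetical $600 price
# carries a 30-35% gross margin, the implied unit cost and the margin
# at the actual $1200 price are:
for margin in (0.30, 0.35):
    cost = 600 * (1 - margin)              # $420 at 30%, $390 at 35%
    margin_at_1200 = (1200 - cost) / 1200  # margin at the real price
    print(f"assumed margin {margin:.0%}: implied cost ${cost:.0f}, "
          f"margin at $1200 = {margin_at_1200:.1%}")
# assumed margin 30%: implied cost $420, margin at $1200 = 65.0%
# assumed margin 35%: implied cost $390, margin at $1200 = 67.5%
```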

Trust me, they have plenty of margin to drop out of the high end cards. Not so much out of the mid and low end though.

GeForce GPUs are very much in a "value pricing" market. Nvidia prices them high right now because they provide halo performance that no one else can provide, and that has a lot of value. It is a total value pricing market.
 
I didn't say they were necessarily going to drive down prices. I did say that the current depressed consumer market would impact the price-setting demand curve model in some way. It's not clear how.

And yes, Nvidia is raking in huge margins on some consumer GPUs. Let's not forget that they are selling top level GPUs for $1,200. This pricing is insane, especially considering most performance increases in the last 15 years can be explained almost entirely by process node improvements, not by architecture engineering.

My first top end GeForce product was the GeForce3 Ti 500. Brand new it cost $350.

Then there was the GeForce GTX 285, which was $400 brand new.

Now these are two examples of top end cards which were among the cheapest top end cards Nvidia has ever made, and a lot of the reason they were cheap was price pressure from Ati/AMD at the time they launched, but I highly doubt they were selling them at a loss.

And yes, cost of goods sold has gone up since then in this market, in large part due to the difficulty of working with shrinking process nodes, but it has not gone up THAT much.

Mid range and low end cards are probably selling with more reasonable margins due to price pressure from AMD, but there is no reason a founders edition 2080ti should be $1200 other than the fact that Nvidia can, because they don't have competition and thus can uniquely provide value that no one else can. At a more reasonable markup, if there were competition at that performance level, a more reasonable price for a founders edition 2080ti is probably just below where the 980ti was, at about $600. My completely unfounded guess is that they would have a completely reasonable 30-35% margin if they sold a founders edition 2080ti at $600. That would make a founders edition 2080ti today a 65% to 67.5% profit margin, which is pretty grotesque, and not something they could do if they had competition.

Trust me, they have plenty of margin to drop out of the high end cards. Not so much out of the mid and low end though.

GeForce GPUs are very much in a "value pricing" market. Nvidia prices them high right now because they provide halo performance that no one else can provide, and that has a lot of value. It is a total value pricing market.
Maybe $1200 is insane as you say, but it is rumoured that it costs nVidia just shy of $300 to make the chip for the 2080 Ti, where the costs for the 1080 Ti were closer to $125. There are some pretty clear breakdowns out there covering the component costs of the cards, roughly confirmed against the financial reports. The 2080 Ti is just an expensive card; no conspiracy, no price gouging, it's just straight up expensive. Here's hoping that 7nm allows them to do more with less. We'll know in another few months, but until then fretting over it won't do you any good, it'll just raise your blood pressure.
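For the curious, per-chip estimates like that are usually built from wafer price, die size, and an assumed yield. Here's a minimal sketch; the die size is TU102's published 754 mm2, but the wafer cost and yield are illustrative guesses, not leaked figures:

```python
import math

# Rough sketch of how a per-die cost estimate is assembled.
# WAFER_COST and ASSUMED_YIELD are illustrative guesses only.
WAFER_COST = 6000.0    # assumed price of one 12 nm 300 mm wafer, USD
WAFER_DIAMETER = 300   # mm
DIE_AREA = 754.0       # mm^2, published TU102 (2080 Ti) die size

# Classic gross dies-per-wafer approximation: area term minus edge loss.
dies_per_wafer = (math.pi * (WAFER_DIAMETER / 2) ** 2 / DIE_AREA
                  - math.pi * WAFER_DIAMETER / math.sqrt(2 * DIE_AREA))

ASSUMED_YIELD = 0.30   # guess; huge dies yield poorly, salvage SKUs help

good_dies = dies_per_wafer * ASSUMED_YIELD
print(f"gross dies per wafer: {dies_per_wafer:.0f}")           # ~69
print(f"cost per good die:    ${WAFER_COST / good_dies:.0f}")  # ~$288
```

With those guesses the estimate lands in the same ballpark as the rumoured ~$300 figure, which is the point: the number is sensitive to the wafer-price and yield assumptions, not conjured from nowhere.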
 
Maybe $1200 is insane as you say, but it is rumoured that it costs nVidia just shy of $300 to make the chip for the 2080 Ti, where the costs for the 1080 Ti were closer to $125. There are some pretty clear breakdowns out there covering the component costs of the cards, roughly confirmed against the financial reports. The 2080 Ti is just an expensive card; no conspiracy, no price gouging, it's just straight up expensive. Here's hoping that 7nm allows them to do more with less. We'll know in another few months, but until then fretting over it won't do you any good, it'll just raise your blood pressure.

How do graphics card breakdowns estimate the component costs - do they look at bulk prices for 10 million of some of the components, and several million for others?
 
Remember that video cards for gaming are luxury goods in and of themselves. Their pricing reflects that too.

Also, Nvidia has mostly been competing with themselves for the last decade, and absolutely so for the last three years.
 
How do graphics card breakdowns estimate the component costs - do they look at bulk prices for 10 million of some of the components, and several million for others?
Yeah. Places that do board level repair get full BPs that contain trace information, voltage expectations, components, readings, the works. You can parse that information to get full component lists, and build a cost breakdown from there. That gives you retail pricing, but most of the parts have known bulk pricing, much like Intel and AMD list their prices by the batch of 1K. So while not precise, it does give a pretty good estimate.
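A toy version of that roll-up might look like the following; every part name, quantity, and price is an invented placeholder, not a real teardown figure:

```python
# Toy BOM cost roll-up of the kind described above. All entries are
# invented placeholders, not real teardown numbers.
bom = {
    # part:            (quantity, assumed bulk unit price in USD)
    "GPU die":          (1, 290.00),
    "GDDR6 module":     (11, 11.50),
    "VRM power stage":  (13, 1.80),
    "PCB":              (1, 18.00),
    "cooler assembly":  (1, 22.00),
}

for part, (qty, unit) in bom.items():
    print(f"{part:<16} x{qty:<3} ${qty * unit:>8.2f}")

total = sum(qty * unit for qty, unit in bom.values())
print(f"estimated BOM:        ${total:>8.2f}")
```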
 
My completely unfounded guess is that they would have a completely reasonable 30-35% margin if they sold a founders edition 2080ti at $600. That would make a founders edition 2080ti today a 65% to 67.5% profit margin, which is pretty grotesque, and not something they could do if they had competition.

You are right, that is completely unfounded. Go look at gross margin vs profit histories of both companies. Both start losing money, as a whole company, when gross margins fall below 40%. This isn't Gucci paying starving children $10 to make a handbag they turn around and sell for $3000.

Production prices have increased dramatically as each fab generation gets significantly more expensive than the previous.

These bleeding edge technology companies need fat margins to sustain their heavy investment in R&D. If you are waiting for AMD to save you, you are going to be waiting forever; since Lisa Su took over and started making a profit again, AMD GPUs have largely sold for near identical pricing to NVidia's. The higher up the scale, the closer to NVidia pricing.

Vega 56 and 64 sold for the same prices as the GTX 1070/1080, and the Radeon VII sold for the same price as the RTX 2080.

It's really only with the 5700 series where you get a slight discount, but that comes with a small performance and feature deficit.

Finally, yes, there is excess margin in the 2080Ti. But no matter how competitive the GPU landscape gets, a modern GPU on an expensive process with a massive 754 mm2 die would never debut at less than $999. If it had to start at significantly less than that, then it simply wouldn't be built, because there would be no business case.

If this crazy high end GPU market were just all margin and thus super lucrative, AMD would have been chasing it. For years AMD simply bowed out of the top end market, because the business case for these giant GPUs is a dicey one: you sink maybe a hundred million dollars into designing one, it has crazy production costs due to the insanely large die, and if the market gets more competitive, you could soon have to price them too low to ever recoup your up front costs.
 
What the hell is wrong with some of you people? 31% would be absolutely pathetic. Even Turing did that, and that was pretty much a joke. I guess some of you are just absolutely forgetting the jump the 980ti had over the 780ti, and then the 1080ti's jump over the 980ti. Both of those were 75% or better.

If it's 2080Ti to 3080Ti, I'll be a bit disappointed, but I'm not expecting huge shader gains in the 30xx series, because to get raytracing from a tech demo where flagship cards can do it at 1080p60Hz, to 4K60Hz or 1440p120Hz, they need to spend most of the process gains on RT cores. As I estimated earlier this month, at the same die size 12 to 7nm has enough transistors to do +50% shading and +300% raytracing. Only 30% shading suggests NVidia might be using a slightly smaller die size this time around; with enormous die sizes being a big part of why the 20xx series was so expensive, this might mean that prices will be returning to more normal levels.

All of that said, I've seen enough conflicting rumors about which GA10x cores will end up in which GPU models that I'm not convinced NVidia has finalized what goes where, which makes me more cautious than normal about any leaks that don't claim to be comparing the biggest Ampere model to the biggest Turing one.
 
However, I still don't see an indication that the RTX 2000 series sold well.

In June 2020, RTX cards account for 8.39% of Steam user GPUs. The 1000 series accounts for 40.31% of Steam user GPUs.

Turing GPUs account for 15.56%, while Pascal GPUs account for 33.14%.

https://store.steampowered.com/hwsurvey

The 2000 series certainly hasn't had nearly the same draw as the 1000 series, and likewise for Turing overall compared to Pascal. So, whatever constitutes selling well appears to be measured by a different barometer than it was for Pascal. In which case, I would ask: how are those sales figures determined to be good when they aren't measured relative to the previous generation of GPUs?

Ultimately, it would be up to nvidia to decide if RTX 2000 sold well or not.

But we can come up with some reasonable conclusions. So, looking at your percentages, and knowing that there are 95 million Steam users:
RTX cards (2060/2070/2080, including Supers and Ti): 8.39%, so 95,000,000 * 0.0839 = 7,970,500 units.
1000 series GPUs, which you would expect to be a big segment as they've been out for 4 years (stock should be 100% depleted), a portion of that time during a crypto boom, so high production counts: 40.31% = 38,294,500 units.

Looks like they all sold well to me. 38 million 10xx cards in what, a 2 1/3 year period? That's 16 million a year, with a crypto boom influence.
8 million RTX cards in the roughly 21 months since their September 2018 launch, or about 4.5 million sold per year. Less than their previous generation, but a number that still looks pretty good.

A comparison to competitors' products sold over the same period gives another perspective:
Vega was released June 2017, so it doesn't line up with RTX's September 2018 launch for 15 months, and its successor, the 5000 series, was released July 2019. So 10 of Vega's 25 months overlap RTX, or 40%. We can add all of the 5000 cards in.
Vega: 1.93%, which includes both the Radeon VII and the Vega mobile numbers in the Steam survey. x0.4 = 0.772% = 733,400 units. Hell, let's just include the WHOLE Vega run from June 2017: 1,833,500 units to date.
5000 series: 0.87% = 826,500 units.
So the AMD total over the same period is 1,559,900 units, compared to 7,970,500 RTX units sold. Or compare ALL of the Vega sales from June 2017 to present, plus the 5000 series: 2.66 million units, versus 7.97 million RTX units sold in a shorter timeframe.
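If you want to sanity check that arithmetic, here it is as a script; the ~95 million user count and the survey percentages are the post's figures:

```python
# Sanity check of the unit estimates above, using the post's figures:
# ~95 million Steam users and the June 2020 hardware survey shares.
STEAM_USERS = 95_000_000

shares = {
    "RTX 20xx (incl. Super/Ti)": 0.0839,
    "GTX 10xx series":           0.4031,
    "Vega (incl. VII/mobile)":   0.0193,
    "RX 5000 series":            0.0087,
}

units = {name: STEAM_USERS * share for name, share in shares.items()}
for name, n in units.items():
    print(f"{name:<27} ~{n:,.0f} units")

# AMD total competing with RTX: the whole Vega run plus the 5000 series.
amd_total = units["Vega (incl. VII/mobile)"] + units["RX 5000 series"]
print(f"AMD (Vega + 5000) total:    ~{amd_total:,.0f} units")  # ~2,660,000
```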

From the above, taking it all into context, I think the statement "RTX sold well" was accurate. Even taking into account that RTX's main competitor was nvidia's own 10xx GPUs, they sold well.
 
If they're going to charge $1000 for the new high end cards, then a lot of people might just decide to get a PS5 or Xbox X instead... interesting release timing for the new GPUs, as it lines up with the new console generation...
 
When it launches, if it does, it's going to be a battle between the corporate 3rd-party scalpers and anyone who just wants one.
 
Production prices have increased dramatically as each fab generation gets significantly more expensive than the previous.

Samsung’s desperation to challenge TSMC in the fab game could result in lower fab costs. Of course there’s no reason nvidia would pass along those savings to us.
 
Samsung’s desperation to challenge TSMC in the fab game could result in lower fab costs. Of course there’s no reason nvidia would pass along those savings to us.

Despite earlier rumors, AFAIK, Ampere is on TSMC.
 
Despite earlier rumors, AFAIK, Ampere is on TSMC.
Samsung pricing competitively should still affect pricing at TSMC indirectly.
Of course there’s no reason nvidia would pass along those savings to us.
That would require demand to drop catastrophically, one way or another. And at that point Nvidia would simply make smaller dies instead of making faster ones with the gains from process and architectural advances.
 
I don't know; by all counts the GPU in the Xbox X should keep pace with the 2080... I would hope they can put out a discrete card that performs at least at 2080 Ti levels, considering they are much less constrained (TDP and $).
I'm skeptical of the marketing fluff and fanboy magical thinking; I doubt the console's APU is actually going to meet the assumed 2080 levels of performance.

Otherwise, why would AMD give them away at cost to cheapskate Sony/MS - who are trying to shave every last dollar from their BOMs - when they could instead fast track the chip into the discrete/retail market and completely disrupt the Nvidia party? It doesn't add up.
 
Despite earlier rumors, AFAIK, Ampere is on TSMC.

You're not putting any stock in this somewhat official report from a year ago? Of course, we also have word directly from the horse's mouth that most of the business is still going to TSMC.

http://www.koreaherald.com/view.php?ud=20190702000692

“It is meaningful that Samsung Electronics’ 7-nanometer process would be used in manufacturing our next-generation GPU,” said Yoo. “Until recently, Samsung has been working really hard to secure (partners) for foundry cooperation.”
While Yoo did not reveal the exact amount of foundry production by Samsung, he acknowledged that production would be “substantial.” He declined to comment on whether Nvidia is planning to make further requests to Samsung for foundry production.


https://finance.technews.tw/2019/12/20/nvidia-7nm-for-tsmc/

"The founder of NVIDIA, clarified in an interview with the media that most orders for 7nm process products will still be handed over to TSMC in the future, and Samsung will only get a small number of orders."
 