RTX 4xxx / RX 7xxx speculation

Seeing Nvidia's margins at 70%+ already, and watching them raise card prices in a depressed market, I have lost all respect for them, much like Apple.
70%+ margins are pure greed, and as a consumer I will not support that.
Does the 70% profit margin include the cost of R&D and parts that need to be rejected in production due to faults, things like that?
 
Why wouldn't you turn on DLSS? It looks inconsequentially different (in some cases better) in Cyberpunk and gives an enormous frame-rate advantage. Even at launch I played Cyberpunk with DLSS and was impressed. I'm sure it's only gotten better since.

Prices moving up aren't surprising. CEO Jensen Huang said something like 5 years ago that he intended to move Nvidia prices higher successively. I bet these Ada (40x0) cards show a dip in Nvidia's sales figures relative to Ampere (30x0), much like the Turing cards (20x0) did relative to Pascal (10x0). The crypto boom-and-bust waves seem to assure that.
In my opinion, DLSS is eye candy. I'm not a heavy user and I haven't really tried it (except in RDR2, where it's nothing special at all, though it obviously doesn't work well for those on a 1080p monitor), but I don't believe in DLSS; it's a scam and eye candy.
It can't look visually better in any way, in my opinion.

Secondly, here in my local forums they also cried about the RTX 3000 series and how expensive it was, and then most of them bought the most expensive RTX 3090 and RTX 3080 for both gaming and mining; it's impossible to know what anyone has because many have started to hide what they own.
Now it's the same situation: everyone seems to be crying, and most will buy the RTX 4000 anyway.
And thirdly, some people make money from these cards in ways unrelated to gaming, and for them the prices are not too expensive.
Fourth, who says that mining will not pick up again? The RTX 4000 series will certainly be good for mining; I see no reason why it wouldn't be.
Fifth, as I already mentioned, the only question is whether Jensen will be able to print enough of them to satisfy the huge demand.
 
Does the 70% profit margin include the cost of R&D and parts that need to be rejected in production due to faults, things like that?
At previous places I worked, yes. The R&D (and other) costs were rolled into the unit cost from an accounting perspective. Nvidia could be different, as I never worked for them.
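For illustration, here's a minimal sketch of how R&D and yield losses can be folded into a per-unit cost. Every number is invented and the formula is a textbook simplification, not Nvidia's actual accounting:

```python
# Hypothetical example: folding R&D and yield losses into per-unit cost.
# All figures are made up for illustration; this is not Nvidia's accounting.
bom_cost = 300.0            # parts cost per card built (USD)
yield_rate = 0.85           # fraction of built cards that pass QA
rnd_total = 2_000_000_000   # R&D spend amortized over the generation
units_expected = 20_000_000 # expected sellable units

# Faulty units raise the effective parts cost per sellable card,
# and R&D is spread across expected volume.
unit_cost = bom_cost / yield_rate + rnd_total / units_expected
price = 900.0
gross_margin = (price - unit_cost) / price
print(f"unit cost ~ ${unit_cost:.0f}, margin ~ {gross_margin:.0%}")
```

With these made-up inputs the margin comes out around 50%, so whether rejects and R&D are inside or outside the quoted 70% changes the picture a lot.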
 
Let's be clear, so it doesn't appear that I favor Nvidia: both DLSS and AMD FSR are scams, both by those who make games and by those who make graphics cards.
And I don't know if that information exists, but I think Nvidia makes the most sales in Europe and that Europe brings them the most profit. I believe it will be the same with the RTX 4000 series; most RTX 4090 and RTX 4080 cards will be sold in Europe.
 
The size, weight, and power draw of the 4090 are just ridiculous.

The freaking card will sag and fall out of motherboards, LOL.

And probably sound like a hair dryer and heat up the room.

The 90-series cards are just stupid, even the 3090. Just get the smaller, more efficient 80 version instead.

Like yeah, the 3090 was the best video card for 2 years, but as soon as a next-gen card comes out it's immediately outdated, and then who wants that huge, heavy nuclear power plant in their PC anymore?

Plus the pricing. F off, Nvidia, especially with the 4080 for $1200. I bought my RTX 3080 brand new in Nov. 2020 for $799.
 
Yeah, the 4080 pricing is ridiculous. An argument can be made for the 4090 MSRP since the 3090 came out at $1499, but even then it wasn't worth the price over the 3080.
 
On the 12th, more than likely. Nvidia generally doesn't send review cards until a day or two before release.
I know Digital Foundry has a 4090 because they released a preview of the performance testing they have done thus far. As for the NDA lift, I haven't seen that mentioned anywhere.
 
Why wouldn't you turn on DLSS? It looks inconsequentially different (in some cases better) in Cyberpunk and gives an enormous frame-rate advantage. Even at launch I played Cyberpunk with DLSS and was impressed. I'm sure it's only gotten better since.

Prices moving up aren't surprising. CEO Jensen Huang said something like 5 years ago that he intended to move Nvidia prices higher successively. I bet these Ada (40x0) cards show a dip in Nvidia's sales figures relative to Ampere (30x0), much like the Turing cards (20x0) did relative to Pascal (10x0). The crypto boom-and-bust waves seem to assure that.
DLSS in Cyberpunk only looks good using the Quality setting. Everything else looks like ass and lowers image quality.

DLSS is fantastic... but only with one setting.
 
Hope Nvidia is making the most of this new node, because AMD is leapfrogging.
AMD has typically had the advantage in raw clock speed. That doesn't mean it's faster than the competition when the architectures are completely different. AMD still uses a general compute core design; asynchronous compute came about because AMD's architecture had its cores sitting idle a lot of the time.
I mean, sure, but they said that about ray tracing back in the day too, yet we had people doing it with GTX 980 Tis (though it wasn't great, but neither were the gen-1 ray tracing cards).
I don't think anybody ever said old video cards couldn't do ray tracing; it just runs like ass. Turing added cores dedicated to performing ray tracing operations, and that is what makes the difference.
 
Yeah, the 4080 pricing is ridiculous. An argument can be made for the 4090 MSRP since the 3090 came out at $1499, but even then it wasn't worth the price over the 3080.

Exactly. Wasn't the original RTX 3090 only like 10% to 15% at best a better performer than the RTX 3080? But its price was nearly 90% higher ($1499 vs. the $799 I paid). Made zero sense.
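Quick sanity check on the value math, using the $799 street price from this post and taking the generous +15% performance estimate at face value:

```python
# Perf-per-dollar check using this post's figures. The performance delta
# is the poster's rough estimate, not a benchmark result.
price_3080, price_3090 = 799.0, 1499.0
perf_3080, perf_3090 = 1.00, 1.15  # take the generous +15% case

value_3080 = perf_3080 / price_3080
value_3090 = perf_3090 / price_3090
print(f"3090 perf per dollar: {value_3090 / value_3080:.0%} of the 3080's")
# ~61%: you pay ~88% more money for ~15% more performance
```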

Like I said, I got lucky getting my RTX 3080 at Microcenter in the first couple of months after release for the MSRP of $799 plus tax. This new RTX 4080 selling for $1200 is stupid.

I don't care about DLSS; I just want to crank my game settings up to Ultra max, use 8x AA, and have the game run @ 120 fps.
 
Nvidia is fucked in raster. The $900 4080 is about 5% faster than the $800 3080 Ti. The 16 GB model is about 20% faster than the 3090 Ti.
Easy home run for AMD.
The other problem is that the "age of ray tracing" still hasn't come yet. Oh sure, a handful of titles have it, but even fewer use it well.

Getting tired of people using the same, what, five titles to justify this premium pricing.

Oh boy, Cyberpunk is a featured title... yet again.
 
The other problem is that the "age of ray tracing" still hasn't come yet. Oh sure, a handful of titles have it, but even fewer use it well.

Getting tired of people using the same, what, five titles to justify this premium pricing.

Oh boy, Cyberpunk is a featured title... yet again.
Well, I can see why. There really isn't a more demanding game out there with good RT and visuals. If you think about it, the games with RT support are mostly SP games. You aren't going to be playing an MP game with RT on when you need a high frame rate; the last thing you worry about while getting frags is how pretty the game looks.
 
Oh boy, Cyberpunk is a featured title... yet again.

I planned to wait a year on Cyberpunk for bugs to be fixed and had kind of forgotten about it. Glad I did, because it gives me a single valid reason to grab a 4090 on launch day :D
 
Raise a hand if you have a 3090 Ti Kingpin. The forum is not that hardcore, it seems.
Can't afford one, and even if I could I probably wouldn't buy it. Well, maybe if I made $300-400k a year I wouldn't care, but below that I don't think I'd spend more than 800 bucks on a GPU. Also, this forum is not representative of PC gaming at large. Prices jumping this much makes me feel like PC gaming is going to price almost everyone out of the market. What's a 4060 going to cost, $600? These prices make the future look pretty dire. I'd rather we have a large user base than be a niche; it'll mean less software coming to PC 10 years down the line when no one can afford it any longer.
 
At previous places I worked, yes. The R&D (and other) costs were rolled into the unit cost from an accounting perspective. Nvidia could be different, as I never worked for them.
I agree. The only thing I question about the graph: is that 70% margin just for consumer graphics cards, or overall? If it is for consumer stuff, then wow. But if it comes from enterprise/supercomputer/AI products, then it isn't as big of a "wow"; those areas are boom-or-bust, so big profits and big losses are more likely.
 
I kinda like that 4090; it's a monster of a card compared to what the 3090 was when it was released.
Smart move by Nvidia. The 4080 16 GB is also a fantastic card, but the 4090 is so much stronger for not THAT much more money. I could live with spending a bit extra. Those 3090s had better sell for around €1000, because 4090s are going to sell at about €2000-€2100 in the EU while offering roughly twice the performance.
I have nothing against the 4080 12 GB except its price. But I understand. It's a bit faster than the 3080 Ti, which is still €1100+ in the EU.

Still, there's a good chance I'll get AMD this year. I'll wait.
 
I'm skeptical of the RTX 40 series' supposed performance improvement over the 30 series. Is that theoretical improvement with DLSS 3, or is it real raw power? Can't wait to see real benchmarks. I could probably do a 4080 16 GB, but I should try to resist.
 
F this guy.

In his defense, he has to say that or face investor backlash. He has to say they're going to sell these parts for a ton of money.

He can't say they're selling them for what they can, while they can, before AMD comes in along with the used-mining-card wave and they have to revisit their pricing scheme.
 
2x to 4x means 100% to 300% better performance than previous-gen cards. Sorry, but I call BS.

I've been buying video cards since the Voodoo 2 days, and historically a next-gen card is maybe 25% to 50% faster than the previous card; maybe once in a great while a huge upgrade comes out at 75% faster.

But never in 25 years have I seen a new card be even 100% faster than the last gen. So if it gets 60 fps in a game, the new card alone will give you 120 fps? Uh, doubt it.

DLSS is meaningless to me; I've never once used it on my 3080. I just play games at Ultra max settings with full AA and want the smoothest consistent high fps, no gimmicks or tricks to get me there.
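For reference, since multipliers and "% faster" get mixed up constantly, here's the plain arithmetic:

```python
# Converting a speedup multiplier into "% faster", with example frame rates.
for multiplier in (2, 3, 4):
    pct_faster = (multiplier - 1) * 100
    print(f"{multiplier}x = {pct_faster}% faster: 60 fps -> {60 * multiplier} fps")
# 2x = 100% faster, 3x = 200% faster, 4x = 300% faster
```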
 
2x to 4x means 100% to 300% better performance than previous-gen cards. Sorry, but I call BS.

I've been buying video cards since the Voodoo 2 days, and historically a next-gen card is maybe 25% to 50% faster than the previous card; maybe once in a great while a huge upgrade comes out at 75% faster.

But never in 25 years have I seen a new card be even 100% faster than the last gen. So if it gets 60 fps in a game, the new card alone will give you 120 fps? Uh, doubt it.

DLSS is meaningless to me; I've never once used it on my 3080. I just play games at Ultra max settings with full AA and want the smoothest consistent high fps, no gimmicks or tricks to get me there.
It's going to be 2x to 4x with the new frame-insertion DLSS mode. They'll use an interpolation algorithm along with motion vector data to add generated frames on top of the native frame rate. Latency will likely remain about the same as at the native frame rate, but the result will look smoother. I'm a little concerned it'll be disorienting having a 60 fps game natively look like 120 fps but still feel like 60 fps.
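To make the idea concrete, here's a toy sketch of motion-compensated frame interpolation. This is not Nvidia's algorithm (the real thing uses a hardware optical-flow accelerator and a learned model); it just illustrates the concept:

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray,
                      next_frame: np.ndarray,
                      motion_vectors: np.ndarray) -> np.ndarray:
    """Toy interpolation of a frame halfway between two rendered frames.
    motion_vectors[y, x] = (dy, dx) pixel motion from prev to next."""
    h, w, _ = prev_frame.shape
    mid = np.empty_like(prev_frame)
    for y in range(h):
        for x in range(w):
            # Step half a motion vector back toward the previous frame.
            dy, dx = (motion_vectors[y, x] / 2).astype(int)
            sy = min(max(y - dy, 0), h - 1)
            sx = min(max(x - dx, 0), w - 1)
            # Blend the warped previous pixel with the next frame's pixel.
            mid[y, x] = prev_frame[sy, sx] // 2 + next_frame[y, x] // 2
    return mid
```

The generated frame is shown between two real frames, so motion looks smoother, but input is still sampled only on the real frames; hence "looks like 120, feels like 60."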
 
Let's be clear, so it doesn't appear that I favor Nvidia: both DLSS and AMD FSR are scams, both by those who make games and by those who make graphics cards.
And I don't know if that information exists, but I think Nvidia makes the most sales in Europe and that Europe brings them the most profit. I believe it will be the same with the RTX 4000 series; most RTX 4090 and RTX 4080 cards will be sold in Europe.

Somehow I highly doubt that. While the 3000 series was almost always out of stock in the US, in Europe they had overpriced stock for over a year, and with prices for the 4000 series being 20+% higher than in the US, I don't see these selling well.
"Prices in Europe are out: RTX 4080 (12GB) 1099€ - RTX 4080 (16GB) 1469€ - RTX 4090 1949€"
 
I’ll be trying for day 1 with the RTX 4090.

I've been trying AMD for years (my last card was the 6900 XT). They're good, but just not what I prefer when it comes to features, performance, and drivers.

Everyone really thinks AMD is gonna be #1 this year?
 
1949 euros. They will get fucked with that price.

€1,099 for an xx70-tier card is a joke.

Everyone really thinks AMD is gonna be #1 this year?

The rumor mill says it's up to AMD and their partners, and AMD is confident they will win with RDNA4. So what they want to do now is take as much market share from Nvidia as they can and improve brand recognition and loyalty. They could punch up the wattage and win outright, or they could make good competitive cards that cost less across the board and run cooler.

They will take mobile because they're more efficient.
 
The quashed dream of a competitively priced lineup in the wake of the crypto selloff and a glut of unsold stock reminds me of the fabled sub-$500 2080 Tis that were supposed to flood the streets back when the 3000 lineup was announced.
 
I saw quite a few 2080 Tis selling for $500-550 on this forum and other places. That was thanks to them being similar enough in price and performance to the incoming 3070. I guess Nvidia learned their lesson...
 
I saw quite a few 2080 Tis selling for $500-550 on this forum and other places. That was thanks to them being similar enough in price and performance to the incoming 3070. I guess Nvidia learned their lesson...
There's a part of me that wonders if they priced the 3000 series so low, at least comparatively, at launch because they anticipated raising it down the line and placing all the blame on the supply chain, scalpers, and mining demand while trying to appear the good guy.
 
As someone stated in one of the many videos about the RTX 4090, the graph Nvidia put out showed only three games without DLSS, and compared to the 3090 Ti, the 4090 seems to perform ~50% better in pure rasterization. This could be at stock clocks, so assuming massive OCs are possible (Jensen alluded to 3000+ MHz being doable), the gap should widen.

I don't give a rat's ass about DLSS since it's a scam: rendering a game at a much lower resolution and upscaling it != native-resolution rasterization performance. Instead of focusing solely on that as the barometer of GPU performance, Nvidia is occupied with gimmicks like DLSS.
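For what it's worth, the pixel arithmetic shows where the frame-rate gain comes from. Assuming the commonly cited internal resolution for DLSS Quality at 4K (1440p):

```python
# Pixel-count arithmetic behind upscaling's frame-rate advantage, assuming
# DLSS Quality at 4K renders internally at 1440p (a widely reported ratio).
native_pixels = 3840 * 2160
internal_pixels = 2560 * 1440
print(f"internal render is {internal_pixels / native_pixels:.0%} of native")
# ~44% of the pixels per frame, which is where most of the speedup comes from
```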

It'll be interesting to see actual benchmarks with just rasterization performance on a variety of games on the 4090 vs 3090 Ti.

Also, "long live SLI." 😭
 
At previous places I worked, yes. The R&D (and other) costs were rolled into the unit cost from an accounting perspective. Nvidia could be different, as I never worked for them.
That would be the proper way of doing it. Where did you get that number btw?
 