NVIDIA GeForce RTX 4070 Priced at $600

Welp, guess I’m waiting for a 5060 then!
I'm waiting for a proper market crash, and I don't mean just the GPU market either. Nobody is going to lower prices until cats and dogs are living together due to mass hysteria, probably because the housing market is so fucked. Until we're at that point, prices won't drop, and this goes double for GPUs because Jensen Huang said AI a bunch of times to keep Nvidia's stock higher than the people who bought a 4090 and thought they got a good deal. AMD is not doing any better because they base their prices entirely on Nvidia's prices. Intel is still barely putting out a GPU that performs like an RTX 3060, with worse drivers than AMD. You won't see lower prices by praying to the leather jacket man. The market needs a proper crash.
 
I'm waiting for a proper market crash, and I don't mean just the GPU market either. Nobody is going to lower prices until cats and dogs are living together due to mass hysteria, probably because the housing market is so fucked. Until we're at that point, prices won't drop, and this goes double for GPUs because Jensen Huang said AI a bunch of times to keep Nvidia's stock higher than the people who bought a 4090 and thought they got a good deal. AMD is not doing any better because they base their prices entirely on Nvidia's prices. Intel is still barely putting out a GPU that performs like an RTX 3060, with worse drivers than AMD. You won't see lower prices by praying to the leather jacket man. The market needs a proper crash.
Based and blackpilled
 
this goes double for GPUs because Jensen Huang said AI a bunch of times to keep Nvidia's stock higher than the people who bought a 4090 and thought they got a good deal.
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!
 
Yeah, but one has to wonder what other concessions have been made for Arizona tea to still be $0.99. Have the workers gotten raises? Is there high turnover because of lower wages? Did they find more efficient ways to brew tea to save on manufacturing costs? Just because a can is still $0.99 doesn't mean it's a good thing.
 
Well AZ is really close to Mexico. The real question is what else is being transported in those ice tea semis.
 
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!

So the lesson here is the price anchoring strategy is working.
 
So the lesson here is the price anchoring strategy is working.
We used to buy two high end cards for sli. Now we buy one card that is essentially that in a single gpu. I don't care how they do it, so long as we keep getting performance jumps like these!
 
We used to buy two high end cards for sli. Now we buy one card that is essentially that in a single gpu.
Right, but generational performance gains are expected. When the pricing increases outpace performance, that's just inflation (which is considerably higher than actual inflation). It's not hard to see, just look at one of the billion $/frame breakdowns out there. Gamer's Nexus did a good one.
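Those $/frame breakdowns are easy to reproduce for yourself. A minimal sketch in Python, assuming placeholder MSRPs and made-up average framerates (not real benchmark data):

```python
# Hypothetical cost-per-frame comparison. The average-FPS figures are
# placeholders for illustration, not real benchmark results.
cards = {
    "RTX 3070": {"msrp": 499, "avg_fps": 100},
    "RTX 4070": {"msrp": 599, "avg_fps": 120},
}

def dollars_per_frame(msrp, avg_fps):
    """Price divided by average framerate: lower means better value."""
    return msrp / avg_fps

for name, c in cards.items():
    print(f"{name}: ${dollars_per_frame(c['msrp'], c['avg_fps']):.2f}/frame")
```

With these made-up numbers both cards land near $5/frame, which is exactly the "pricing scales with performance" pattern being complained about; real reviews plug in measured framerates instead.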
 
We used to buy two high end cards for sli. Now we buy one card that is essentially that in a single gpu. I don't care how they do it, so long as we keep getting performance jumps like these!

Actually we used to have real gains generation over generation in every category for similar or reasonably inflation-adjusted pricing. Now we have linear pricing scaling with how much faster the card is over the previous generation in the same market segment. This isn't a good trend for consumers no matter how you slice it, but it's great if you're an Nvidia shareholder.
 
Right, but generational performance gains are expected. When the pricing increases outpace performance, that's just inflation (which is considerably higher than actual inflation). It's not hard to see, just look at one of the billion $/frame breakdowns out there. Gamer's Nexus did a good one.

Bingo. We saw the same thing with the Turing launch, followed by a return to relative sanity with Ampere. Now we're seeing a repeat with Ada as companies once again hope they can anchor pricing to what a miner is willing to pay, not what a gamer is willing to pay. It's possible we'll see a return to relative sanity for the next generation, pending no rally in the price of magic internet money, but I have no idea at this point.
 
Actually we used to have real gains generation over generation in every category for similar or reasonably inflation-adjusted pricing. Now we have linear pricing scaling with how much faster the card is over the previous generation in the same market segment. This isn't a good trend for consumers no matter how you slice it, but it's great if you're an Nvidia shareholder.
As it turns out, I am an Nvidia shareholder.
 
At these prices for GPUs, it will kill the PC gaming market, as most will just buy a console instead at this point. Hopefully these overpriced cards will rot on the shelves and force prices back down to something closer to reasonable. At current GPU prices, gaming will go back to catering to consoles and everything will just be a port for PC.
 
Actually we used to have real gains generation over generation in every category for similar or reasonably inflation-adjusted pricing. Now we have linear pricing scaling with how much faster the card is over the previous generation in the same market segment. This isn't a good trend for consumers no matter how you slice it, but it's great if you're an Nvidia shareholder.
Uh, no. As I said, it's $100 more than last gen yet provides much more than the traditional generational uplift (30 percent).
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!
Done. That's a heck of a lot more than the traditional generational uplift, as I said.
 
Uh, no. As I said, it's $100 more than last gen yet provides much more than the traditional generational uplift (30 percent).

Yes, and that would matter if last gen pricing for the 3090 card were reasonable. It was not, at least not from a gaming perspective. You can argue it’s more reasonable from a productivity perspective, but these cards are being marketed to gamers.
 
Uh, no. As I said, it's $100 more than last gen yet provides much more than the traditional generational uplift (30 percent).
Done. That's a heck of a lot more than the traditional generational uplift, as I said.
OK, now do the rest of the stack.
 
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!
Nobody cares. Nvidia's Turing was overpriced because they had just come out of the 2017 crypto crash and introduced the ray-tracing gimmick to try to justify their prices. Nobody bought RTX 2000 cards as a result, but thankfully the COVID situation boosted crypto and Nvidia couldn't stop selling overpriced cards to idiots. Crypto semi-crashed and so did the GPU market, so of course Nvidia priced their RTX 4000 series to the moon because of fucking AI. The only reason the RTX 4090s sold is that there's a group of people who will always buy the best. An RTX 4070 at $600 won't sell, just like the 4080 isn't selling well either.
We used to buy two high end cards for sli. Now we buy one card that is essentially that in a single gpu. I don't care how they do it, so long as we keep getting performance jumps like these!
Who's we? Only idiots bought two GPUs, because they didn't read the reviews where few games benefited from SLI, and those that did would be extremely unstable. The same people who bought 4090s are the same people who bought two GPUs for SLI.
 
Nobody bought RTX 2000 cards as a result,
Counting only the 2000-series cards and not the other Turing ones, we seem to be talking about 18-19 million people. What would a significant number of people look like?
 
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!

I doubt the 4070 will be better than the typical performance gain. The 2070 itself was fairly underwhelming and had a large price jump. The 3070 only seems "okay" because the 2070 was underwhelming. It could do ray tracing, but not fast enough to enable it in games in 90% of scenarios.

Frame generation and DLSS are nice add-ons, but not primary features. They have downsides. Sure, the ray-tracing performance increase might be higher, but it matters little when you end up turning it off. To really make ray tracing a good selling point for the 4070, its ray-tracing performance needs to be 80% higher or more than the 3070's. Otherwise it runs into the same issues: you need to turn it off or turn down quality settings, which isn't much of a practical change over the 3070.

For the time being, raster performance is king.
 
He wasn't wrong about the $750 price tag, Nvidia changed it at the last second! Sure...
I've said it multiple times already, how much stuff does he have to get wrong before people ignore him?
 
He wasn't wrong about the $750 price tag, Nvidia changed it at the last second! Sure...
I've said it multiple times already, how much stuff does he have to get wrong before people ignore him?

If you watched his earlier video, what he said was very clearly caveated and equivocal, i.e. this is what Nvidia is currently discussing with AIB partners, it may well change, and I would not be surprised to see it change because $750 is batshit crazy.
 
"But if we compare it to a $2000 card from last gen, here's how it's a steal!"

And the linear progression continues....

RTX 4070 will cost just $100 more than RTX 3070 but have 50% more VRAM, around 25-30% more performance, and have DLSS3 frame gen available. Not a bad deal at all. Plus, a 186W average gaming TDP means you won't sweat while gaming!

"But muh (insert luxury car analogy here)!"
Nvidia cards are. Hence why they've drawn over 85 to 90 percent market share for discrete chips. If you can afford it, it's the stuff to buy and the market knows it. Enough hyperbole.
 
RTX 4070 will cost just $100 more than RTX 3070 but have 50% more VRAM, around 25-30% more performance, and have DLSS3 frame gen available. Not a bad deal at all. Plus, a 186W average gaming TDP means you won't sweat while gaming!

12GB is a bit underwhelming to be honest, especially given the price. But probably will be fine. 3070 should have been 12GB. GTX 570 1.28GB, GTX 670 2GB, 970 4GB, 1070 8GB, 2070 8GB, 3070 8GB. A bit of stagnation there.
 
12GB is a bit underwhelming to be honest, especially given the price. But probably will be fine. 3070 should have been 12GB. GTX 570 1.28GB, GTX 670 2GB, 970 4GB, 1070 8GB, 2070 8GB, 3070 8GB. A bit of stagnation there.
12GB will be fine for a 1440p card like the RTX 4070 for years to come.
 
12GB will be fine for a 1440p card like the RTX 4070 for years to come.
If only the folks over at Reddit and Youtube could have this level of common sense.

Honestly, I believe the hype around VRAM capacity is way overblown. Sure, if you're doing 4K gaming at Ultra settings then you want a card with a nice fat memory bus, but there's a good chance you won't be using anywhere near the amount of VRAM the GPU comes with. And if you do get to that point, it's probably not going to be a good experience anyway, since pushing all the graphical features to the max also means using ray tracing, or a ton of texture mods to push usage that high. At that point the only cards you'd want to be using are the 4080 or 4090.

Ultimately, I think 12GB is a nice spot for 1440p gaming. No game I've tested has had actual usage near 12GB. The highest usages I saw were in Hogwarts Legacy at native 1440p Ultra with RT at Ultra, sitting around 9.7GB, and Cyberpunk 2077 with all high settings and Psycho RT at native 1440p, hitting 10.1GB. The rest of the games I tried hovered around 5-7GB, with Witcher 3 being the only other game that broke the 8GB barrier. In the case of Witcher 3 and CP2077, the 4070 Ti easily beat out the 7900 XT, even with both games utilizing 8 and 10GB of VRAM respectively. Those two scenarios required RT, which is almost necessary for AMD buyers to claim more VRAM as an advantage outside of heavy modding, but that won't happen since Nvidia has the advantage with RT. And anything that comes close to pushing 16-20GB of VRAM would probably struggle to run even on a 4090.
 
RTX 4070 will cost just $100 more than RTX 3070 but have 50% more VRAM, around 25-30% more performance, and have DLSS3 frame gen available. Not a bad deal at all. Plus, a 186W average gaming TDP means you won't sweat while gaming!
That is not progress, that is stagnation and it set in with Turing.

Rather than the actual performance gains per price that we used to get, Nvidia is pushing "features" no gamer asked for. Now we have shiny reflections that tank the framerate, so we have to run a lower resolution and fake it up to full scale. If that's not enough, we get whole fake frames to pad the marketing graphs.
 
RTX 4070 will cost just $100 more than RTX 3070 but have 50% more VRAM, around 25-30% more performance, and have DLSS3 frame gen available. Not a bad deal at all. Plus, a 186W average gaming TDP means you won't sweat while gaming!


Nvidia cards are. Hence why they've drawn over 85 to 90 percent market share for discrete chips. If you can afford it, it's the stuff to buy and the market knows it. Enough hyperbole.
Meh, I'm glad that they got one card at an almost reasonable price I guess?

I would be more excited if we were talking about the 4080 only being $100 more than the 3080 :woot:. At $799 I would have upgraded but $1199 is a tough pill to swallow.

Pricing in general is probably hardest for us aging gamers to accept. We got decades of generational performance increases at fairly fixed prices: GTX 280 $499 (after ATI forced a price adjustment), GTX 480 $499, GTX 580 $499, GTX 680 $499, GTX 780 $499, GTX 980 $549. We didn't really see massive jumps until the 10 and 20 series. It's also weird to remember that you could buy cards well under MSRP if you waited 6 months. (Don't even get me started on low/mid-range cards like the x60s, which have historically been under $200.)

I think one of the biggest factors was Lisa Su changing AMD's pricing strategy of constantly undercutting Nvidia. She had the brilliant idea of pricing their cards at parity with Nvidia, and prices have been jumping ever since. (Scalping proving GPUs were being sold way under market value clearly didn't help either, lol.)
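For a rough idea of what inflation-adjusting those launch prices looks like, the sketch below uses the thread's MSRPs with ballpark CPI multipliers. The multipliers are round-number assumptions for illustration, not official CPI figures:

```python
# Rough inflation adjustment of historical MSRPs into today's dollars.
# The CPI multipliers below are ballpark assumptions for illustration,
# not official CPI data.
launches = [
    ("GTX 480 (2010)", 499, 1.38),
    ("GTX 780 (2013)", 499, 1.29),
    ("GTX 980 (2014)", 549, 1.27),
]

def adjusted(msrp, cpi_multiplier):
    """Scale a launch MSRP by a cumulative-inflation multiplier."""
    return round(msrp * cpi_multiplier)

for name, msrp, mult in launches:
    print(f"{name}: ${msrp} then, about ${adjusted(msrp, mult)} now")
```

Under these assumed multipliers, none of those $499-549 cards adjusts to anywhere near $1199, which is the point being made about the 4080.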
 
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!
I guess that makes my 7700k worth like $63000 because think of how much faster it is compared to that 386 I paid $400 for back in 1990.

The idea that it is faster so it should cost more is BS (for new-generation, same-tier hardware). Nvidia needs people to upgrade since they sell hardware. No hardware sales = no Nvidia as a business. This means they need to give me a reason to upgrade. Linear cost-to-performance doesn't give me a reason to upgrade. $300 next year needs to buy me more than $300 this year, and enough improvement to justify the cost.
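The linear cost-to-performance point can be put in numbers: if price rises in lockstep with performance, performance per dollar never moves, and the value incentive to upgrade disappears. A minimal sketch with hypothetical figures:

```python
def perf_per_dollar(perf, price):
    """Relative performance units per dollar of MSRP."""
    return perf / price

# Hypothetical: new gen is 30% faster but also priced 30% higher.
old_gen = perf_per_dollar(100, 500)
linear_priced = perf_per_dollar(130, 650)
assert linear_priced == old_gen  # value per dollar unchanged

# The historical pattern: the same money buys more performance.
flat_priced = perf_per_dollar(130, 500)
assert flat_priced > old_gen  # this is what actually drives upgrades
```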
 
They did. $100 more than the previous xx90 card, for nearly double the performance and frame generation which is being adopted rapidly (fruit should come within the next year for most releases like with dlss2, already some high profile games with it). That ends up bringing it to a place of 3.5x to 4x performance. The RTX 4090 is one heck of an amazing card by all accounts from everyone who's bought one!
I guess. "90 card" was just a rebrand of what used to be "80 Ti" cards which were themselves a rebrand of what used to be just "80" cards.

Don't give me any of this "3090 is a Titan" bullshit either.
 
Pricing in general is probably hardest for us aging gamers to accept. We got decades of generational performance increases at fairly fixed prices: GTX 280 $499 (after ATI forced a price adjustment), GTX 480 $499, GTX 580 $499, GTX 680 $499, GTX 780 $499, GTX 980 $549. We didn't really see massive jumps until the 10 and 20 series. It's also weird to remember that you could buy cards well under MSRP if you waited 6 months. (Don't even get me started on low/mid-range cards like the x60s, which have historically been under $200.)

A lot of us remember those days, and wish something like that would happen. But it doesn't seem like it will anytime soon. Intel is our best bet but they have a long way to go.
 
Well gamers have to say enough is enough. Until they do nVidia will keep doing what they do because it sells.
Gamers already have, but Nvidia is trying to play out the AI wank game to see if they can stay profitable while waiting to wear out gamers' patience. If Nvidia is profitable then they have no reason to lower prices, but I have a feeling this AI gamble is going to lose big for Nvidia.
RTX 4070 will cost just $100 more than RTX 3070 but have 50% more VRAM, around 25-30% more performance, and have DLSS3 frame gen available. Not a bad deal at all. Plus, a 186W average gaming TDP means you won't sweat while gaming!
This logic is so bad I can't even. You're looking at it with tunnel vision, meaning you aren't looking past Nvidia and their product launches. The RTX 4070 has 50% more RAM than who, themselves? The reason products get improved is that their competitors are improving. If you don't offer a better product for that price, then someone else will. The problem is that Nvidia has AMD and Intel as competitors, and most people here would write them off as a choice. AMD has been including more RAM than Nvidia for nearly a decade, and that has paid off: anyone looking to play The Last of Us on Nvidia is going to have a bad time. Sure, newer, more expensive RTX cards have the VRAM, but this has been a thing for Nvidia forever. The RTX 4070 will have the same VRAM as AMD's 6700 XT.

 
Nah, just write it off as "bad console port". Just write off every single game that Nvidia cards just happen to be worse at as a bad port or "AMD title". The trick is to act like the only games "worth" playing are heavily ray traced ones.

Well, that is a bad PC port. And the recent patch seems to have lowered VRAM usage by 1GB, at least according to someone's pre/post-patch benchmark; not too sure how accurate that is. That doesn't mean other games won't start using more, though.
 
Well that is a bad PC port. And the recent patch seems to have lowered VRAM usage by 1GB, at least according to someone's pre/post patch benchmark. Not too sure how accurate that is. Doesn't mean other games might start using more though.
No, but it's a good assumption that games will continue to use more VRAM. You really should never turn down having more VRAM. Like... it's amazing that Nvidia has people arguing against having more of it. People actually downgraded from 11GB to 8GB (1080 Ti vs 3070) and that was applauded.
 