NVIDIA's RTX 4080 Problem: They're Not Selling & MSRP Doesn't Exist

I've been silent about this for a while now, but it's been bothering me almost since the release. How can people tout the cost savings of chiplets and celebrate a company releasing a $1K GPU? A GPU that has had so many issues and so much drama, no less.

GPU pricing is just such a shit show. We can only pray it comes back down to sane levels at some point. I consider myself lucky to be running a 3070, and I'll hold onto it for dear life until a sanely priced replacement gets released.
I don't think chiplet GPUs will get particularly exciting until AMD manages to bring out a multi-GCD design without any of the jank that plagued their past dual-GPU cards, going all the way back to the Rage Fury MAXX (roughly a competitor to the ol' 3dfx Voodoo5 5500), especially since CrossFire and SLI are super dead right now.

If they can link up multiple GCDs and present them as seamlessly as a single monolithic GPU, then we're in for some real breakthroughs in GPU scaling, without the PCB complexity and VRAM inefficiencies you get with discrete multi-GPU setups. (Case in point: the 3dfx Voodoo5 6000. Four VSA-100s with 32 MB of RAM each, which meant you only had 32 MB of effective VRAM rather than 128 MB, plus insanely elaborate multi-layer PCBs to handle it all that drove production costs up, right up until 3dfx's bankruptcy and prompt devouring by NVIDIA, whose much more elegant single-chip GeForce 256 was rapidly iterated upon over the next few years.)
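To put rough numbers on that VRAM point, here's a toy sketch (my own illustration, nothing from 3dfx or AMD) of why mirrored multi-chip memory doesn't pool, while a seamless multi-GCD design could:

```python
# Toy arithmetic, not vendor code: why mirrored multi-GPU VRAM doesn't add up.
def effective_vram(per_chip_mb: int, num_chips: int, unified: bool) -> int:
    """Usable VRAM for a multi-chip card, in MB."""
    if unified:
        # A seamless multi-GCD design with a shared pool could expose the sum.
        return per_chip_mb * num_chips
    # Mirrored designs (Voodoo5-style, or SLI/CrossFire AFR) duplicate the
    # working set on every chip, so capacity stays at the per-chip amount.
    return per_chip_mb

# Voodoo5 6000: four VSA-100s, 32 MB each.
print(effective_vram(32, 4, unified=False))  # 32  -> what software actually saw
print(effective_vram(32, 4, unified=True))   # 128 -> what a pooled design would offer
```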

May your RTX 3070 hold up; it might even make it seven years like my GTX 980 did.

It was a Vista thing that forced Nvidia into last-minute revisions around key parts of the API, which resulted in a nightmare.
Microsoft decided, between the final Vista beta and the retail release, to close off parts of the kernel. That was more secure, yeah, but it meant that things which worked on the very last revision of the beta were completely incompatible with the retail release, with nothing more than a quick "heads up" email to device manufacturers a week before the launch date.
Nvidia, Creative, HP, and just about everybody else were completely blindsided by it, and it forced them to abandon large parts of their driver code. HP decided then and there, in retaliation, that much of their hardware was simply not going to get Vista drivers; Creative struggled; and Nvidia got off relatively lucky, as they were completely rehashing their drivers at the time anyway. But yeah, there were growing pains from Vista. It's no coincidence that CUDA launched around 2007: supposedly CUDA was developed as a result of trying to keep up with all the API changes, so they created a framework that let them adapt more easily. (I can't verify that; I was told it once many years ago and it just sort of stuck with me.)
I recall hearing something like this before, but for Microsoft to pull that kind of dick move literally a week before release? Wow, no wonder Vista had to fail so that Vista SE - I mean, Windows 7 - could succeed! The drivers were more or less stable by the time Win7 was released.

That's the sort of rapid "Yeah, I broke it so that your stuff no longer functions, too bad, adapt or die" mentality I expect from that rival bitten fruit company.
 
Well, Microsoft did it because drivers accessing the kernel directly and ignoring the APIs were the cause of something like 90% of the BSODs reported during the beta. The big developers were trying to just reuse their XP drivers with minor tweaks instead of fixing the problems Microsoft had flagged.

Microsoft at the time was facing lawsuits and government inquiries over the security vulnerabilities in Windows, and they had planned a gradual phase-out of kernel access. But seeing that the hardware guys hadn't made any significant progress, Ballmer kind of lost his shit and basically said screw them, then, and cut it off. Major dick move, but he kind of was one.

AMD does have a multi-GCD card in the MI250X, and reports there are a mixed bag, from not great to pretty OK. AMD currently can't present the multiple GCDs as a single one; Apple is the only company that has figured that out, and it only works there because the GPU is so limited. At the scale of data the top AMD and Nvidia cards move, it's very hard to coordinate the resources. Nvidia talked about this a few years back when they were hinting at Lovelace being multi-chip, and at the time they said the chip they designed to coordinate the GPU cores was more costly than the GPU cores themselves.
That was back when Pascal was new, and data requirements have only gone up since then. I don't want to think what that coordination chip would cost today, if it's even technically possible to build.

The reality is that graphics engines are going to need a redesign for multi-GCD, just as software needed a redesign for multi-core.
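A loose analogy in code (my own toy sketch, not real engine code): a serial render loop gains nothing from extra workers until the engine itself is restructured to partition the frame, which is the same shift multi-core forced on CPU software.

```python
# Toy analogy: a frame split into tiles. The serial version mirrors a
# monolithic-GPU engine; the partitioned one is the structural change a
# multi-GCD (or multi-core) design demands of the engine.
from concurrent.futures import ProcessPoolExecutor

def shade_tile(tile_id: int) -> int:
    # Stand-in for per-tile shading work.
    return sum(i * i for i in range(50_000 + tile_id))

def render_serial(tiles: int) -> list[int]:
    # One queue, one worker: extra GCDs/cores sit idle.
    return [shade_tile(t) for t in range(tiles)]

def render_partitioned(tiles: int, workers: int) -> list[int]:
    # The engine must split, schedule, and gather: a redesign, not a flag.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade_tile, range(tiles)))

if __name__ == "__main__":
    assert render_serial(8) == render_partitioned(8, workers=2)
    print("identical output, different structure")
```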
 
Steve calling out Nvidia's 4070 Ti marketing BS



Hilarious that Nvidia admitted what we all suspected, i.e. positioning the 30-series as the mainstream option (presumably because they have a shitload of inventory they still need to clear out).
 
FEs are still available as of this post if you want one. You have your pick of any 4080 model at Best Buy right now, so they're still "not" selling. 4090s sell out in seconds whenever stock shows up.

https://www.bestbuy.com/site/nvidia...rd-titanium-and-black/6521431.p?skuId=6521431

Nope. The 3070 Ti was $599, and the 2070 FE was $499.
https://www.anandtech.com/show/13395/geforce-rtx-2070-gets-a-release-date-october-17th


https://www.techpowerup.com/review/nvidia-geforce-rtx-2070-founders-edition/
 
I don't know if "60-class card" is quite right; from the performance leaks so far, the 4070 Ti is on par with the 3090 Ti, plus or minus 7%. I can't really think of any case where the next generation's 60-class card was on par with the previous generation's top tier (90/Titan/80 Ti).
The 3070 Ti, though, did match up about evenly with the 2080 Ti, just as the 2070 Super did against the 1080 Ti.
Fair. It's doing better than I thought it would. Still, those are quite cut-down specs for $800. That said, I guess the architecture is making up for it.

But I still feel the $850+ this will cost in the real world is a joke, as that's literally the same price you can find a used 3090 for (give or take). So put aside new vs. used for a sec, and DLSS 3 on the 4070 Ti, and functionally you've got price/performance stagnation.
 
EVGA, ASUS, and FE 3090s were still going for $840-$850 on eBay:
https://www.ebay.com/sch/i.html?_from=R40&_nkw=3090+rtx&_sacat=0&rt=nc&LH_Sold=1&LH_Complete=1

Strix-level cards are well over $1,000.

Outside of Zotac, the 3090 Ti does not seem to go under $1,000, with $1,100-$1,200+ being common:
https://www.ebay.com/sch/i.html?_fr...kw=3090+rtx&_osacat=0&LH_Complete=1&LH_Sold=1

So at least it could put some pressure on that market to push prices down; maybe not that much, though, because of the mindshare around the new cards since their announcement.
 
eBay 30-series prices have been slowly creeping up since the 40-series announcement.
3080s are almost back to MSRP.
 
Yeah, it's still $300 too much, but how much of that is Nvidia, how much is TSMC, and how much is retailers demanding better margins from the channel, and blah blah blah. It sucks, and I don't know where to point my finger, but I do know that outside of not buying them, nothing is going to change.
If anything, I feel this is going to push people back down to 1080p; there are lots of affordable options there. At 1440p you start looking at big numbers, and 4K is too rich for me.
 
Those factors and excuses only cover Nvidia so much. If their cards rot on shelves like the 4080 has a lot here locally, then at some point reality sets in about supply and demand, and something has to change for them to start moving inventory. That also includes TSMC and anyone else in that chain. The best thing that can happen right now is for gamers to just not buy these cards, simple as that; and considering the market saturation of 30-series cards still out there, if you aren't a top-of-the-line 4090 shopper, that's not difficult given the price/performance stagnation across the rest of the stack.
 
If they sit and rot, then the mail-in rebates will come around eventually :D
 
Thanks, guess I misremembered.
 
Used isn't the same thing as new.... Not comparable.
Just talking real market conditions here. If anything, used 3090/3090 Tis should be like $400-$500 now if we're going off past gens. Are they? No.

And if you look hard enough and find one new that isn't price-gouged, it should be around the $1K mark, give or take, which is also close to the price of a higher-end new 4070 Ti AIB card. So the point still stands.

This is flat-out gen-over-gen price/performance stagnation.
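For what it's worth, here's the arithmetic behind that claim in sketch form; the prices and the rough performance parity are this thread's ballpark figures, not benchmark data:

```python
# Back-of-the-envelope perf-per-dollar check. Numbers are the rough figures
# from this thread (used 3090 roughly equals a 4070 Ti in performance, both
# around $850 on the street), not measurements.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

used_3090  = perf_per_dollar(1.0, 850.0)  # used 3090, ~$850 on eBay
new_4070ti = perf_per_dollar(1.0, 850.0)  # 4070 Ti AIB street price, ~$850

print(f"used 3090 : {used_3090:.5f} perf/$")
print(f"4070 Ti   : {new_4070ti:.5f} perf/$")
# Equal ratios a generation apart = price/performance stagnation.
```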
 
Well, looks like Nvidia figured out a solution for the dust-collecting 4080s on shelves: just allocate them to GeForce Now, with monthly subscriptions going straight to Nvidia without having to share the loot with anyone else.
New Ultimate membership
Go beyond fast with RTX 4080 power in the cloud. Get ready for 240 fps streaming, ultrawide monitor support, and 4K gaming at 120 fps.

https://play.geforcenow.com/mall/#/layout/games
Hmmm, wonder if that was Nvidia's plan or design in the first place... Maybe Jensen will have yet another motto, like "You won't own any game or GPU, but you will love it." ;)
 
I don't understand why the 4070 Ti is getting all the hate. If tech reviewers had any integrity and honesty, they would have hated on the 4080 and the 7900 cards as well. They are all terrible for the prices they're asking. These cards are all so badly priced that the 4090 looks like good value.

All the 4070 Ti does is highlight how bad things are right now. Every tier you go under the 4090 becomes increasingly bad value.

Every reviewer should have lambasted Nvidia and AMD. Every YouTuber should have been complaining from the start.

No card release this generation should have received a good review. The only card that comes close to deserving a good review is the 4090.

The most amazing thing is people are still buying all these cards.
 
1) It feels a lot like a mindshare storyline.
2) No FE, so reviewers weren't paid via a card received directly from Nvidia in advance; maybe that makes them more "free"?

Because:
[chart: performance per dollar, 2560×1440]

There is an argument, if one can find an Asus at $800, that it seems to be the best 1440p choice among the new cards; perfectly in line with the terrible all-around market.

3) And as you go down the stack, the "who cares, it's just rich gamers spending ridiculous money" narrative disappears.
4) The strange "it's bad value to help sell off Ampere" narrative must be gone by now, and the idea that these are temporary launch price tags gets harder to sell when there are almost no 3090s left on Newegg.
 
"No bad GPUs, just bad prices"
- Anand Lal Shimpi
A quote that sums up this generation in a nutshell, really.

Except for Intel, because their Arc Alchemist drivers sure make AMD look good; albeit not for lack of trying, as they've apparently made a lot of headway since launch.

With that said, I struggle to see the value proposition of a 4070 Ti over a used 3090. The 4080 at least has the convenient excuse of outclassing the prior flagship, if not to the staggering extent of the even pricier 4090.
 
Regarding Intel, I'd be very inclined to suffer through their driver development hell if the GPUs were cheap enough.
 
Given what NVIDIA and AMD are both doing right now, the A770 sure looks like an appealing 1080p card just for being $350 tops - barely more than I paid for my GTX 980 seven years ago.

That said, drivers are indeed a deal-breaker for me.

One potential use (a cheap upgrade for older systems) is shot down by the fact that it practically needs Resizable BAR for decent performance. VR performance is janky, with inconsistent frame delivery relative to an RTX 3060 12 GB that costs $10 more, and it doesn't work with SteamVR-native HMDs like the Index or Vive either (though VR is something for which you really want a 3090-tier card at the very least, ideally a 4090). And most damning of all, support for older Direct3D games is a travesty without DXVK on Windows (yes, Windows, not Linux).

I'm hoping someone does a re-review within the next few months so we can see how much the drivers have matured, and how much workarounds like DXVK improve the experience. If Intel can fine-wine to an extent that makes AMD users envious, they might see a surprising uptick in sales before Battlemage hits the market in 2024.
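For anyone wondering what "with DXVK on Windows" actually involves: the DXVK releases ship replacement D3D DLLs that you drop next to a game's executable. A hypothetical helper sketch (the paths here are placeholders I made up):

```python
# Hypothetical convenience script: copies DXVK's replacement DLLs into a game
# folder so its D3D9/D3D11 calls get translated to Vulkan. Paths are examples.
import shutil
from pathlib import Path

DXVK_X64 = Path(r"C:\tools\dxvk\x64")         # assumption: extracted DXVK release
GAME_DIR = Path(r"C:\Games\SomeOldD3D9Game")  # assumption: the game's install dir

for dll in ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"):
    src = DXVK_X64 / dll
    if src.exists():
        shutil.copy2(src, GAME_DIR / dll)
        print(f"installed {dll}")
```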
 
It's actually fascinating. This is the first time I can think of where the top of the product stack is where the most value exists.

Historically, regardless of product, the higher you go, the less value you get - diminishing returns. Nvidia has accomplished the precise opposite: the 4090 is the best value per dollar, and the further down you go, the worse the value becomes.

It's actually a remarkable achievement.
 
Maybe I'm wrong, but I feel like hardware power simply far exceeds user needs. Even at 4K, last-gen cards are serviceable for the vast majority of use cases. More than ever, these cards appeal to people chasing the bleeding edge just for the sake of it. As someone who worked at a computer shop, I observed that a lot of people tend to "skip a gen," myself included. This seems like a particularly good generation to skip, at least at this moment.
 