NVIDIA's RTX 4080 Problem: They're Not Selling & MSRP Doesn't Exist

But the 3090 was terrible in performance per dollar, so beating it means nothing. Why does performing like a 3090 paint a picture of its value? Look at the whole array of options.
See TaintedSquirrel's post if you want to compare prices :). Personally, I decide how much performance I need and buy it unless I can't afford it.
 
Sounds like both are good deals to me!
The 3080 was like 50% faster than the card it replaced at the same price.
Typical gen improvements are 30%+ with no price increase. Nvidia is only giving us 20% this time with an extra $100.

It's a bad deal compared to the 4090 and it's a bad deal compared to past generations except maybe the 2080 vs 1080 Ti.
 
The 3080 was like 50% faster than the card it replaced at the same price.
Typical gen improvements are 30%+ with no price increase. Nvidia is only giving us 20% this time with an extra $100.

It's a bad deal compared to the 4090 and it's a bad deal compared to past generations except maybe the 2080 vs 1080 Ti.
Worse doesn't mean bad. It's still a marked improvement. EDIT: It also has DLSS 3, which is a performance multiplier that is already starting to show up in demanding games.
 
Stop getting so hung up on the name and focus on the performance/pricing ;).

That is why I posted the prices. The same level of performance went from around $500 to $800-1000. Inflation isn't near that high.

From Nvidia's own cherry-picked benchmarks, the 4080 12GB was more on par with a regular **70-level card. For a **70 Ti, this is probably going to be an underperformer. So we're likely going to get mid-range performance for twice the price.

We're getting a 2560x1440 card at 4K-card prices, unless Nvidia shocks us with a decent price or much higher performance, and I doubt either will happen.
 
That is why I posted the prices. The same level of performance went from around $500 to $800-1000. Inflation isn't near that high.
It did? The 4070ti is going to be far faster than the 3070 was. They aren't anywhere near the same level of performance.
 
Not sure the 3070 was really $500 either; even to this day they would be hard to find at that price point
 
It did? The 4070ti is going to be far faster than the 3070 was. They aren't anywhere near the same level of performance.

You're comparing the wrong metric. You compare price/performance. The price has increased greatly for the same performance bracket. An upper-mid-range card now costs a lot more than it did previously. If you want the same level of performance (2560x1440, 60 fps in modern games), you're going to have to spend a lot more. That level of performance used to be in the $300-400 range. Of course there is inflation and rising costs, so $500 isn't too unreasonable, and we can see that performance range gradually increase in price over the years. But now we're looking at a good 60-100% price hike in a single generation.

Not sure the 3070 was really $500 either; even to this day they would be hard to find at that price point

Due to mining. But at launch there were models for around $520; they were just impossible to find (again, due to mining). I got mine for $570, but I had to get a higher-end model because that is all that was in stock. I wanted the $500 basic EVGA version, but EVGA obviously prioritized the more expensive versions. A $1 back plate plus increased clocks for $70 more = more profit margin. When you're selling every video card produced and there is a shortage, it makes sense why they essentially only built the higher-end versions that had better margins.

EVGA bumped up the price of the most basic (Black) version from $500 to $530 eventually, not that they ever made any of them.
 
Exactly, and I have no issue judging value by real prices (even if we only look at non-scalped ones) versus theoretical MSRPs that barely existed, but even on those terms I don't think it bodes that well.
 
The new $1600 card is 60% faster than the old $1500 card.
The new $800 card is 20% faster than the old $700 card.

Hello. You're the one who keeps saying 3090, 3090, 3090.
I understand what you're saying, but the 3000 series received new MSRPs.

3090/Tis were going for $1000-1100 until Nvidia and AIBs sold through the stock. That shows Nvidia overcharged us on the 3090.
So against the 3090's new $999 MSRP, the 4090 is 60% more expensive. We paid for the performance increase.

None of us are getting a great deal on anything from Nvidia.
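To make that math concrete, here's a minimal back-of-the-envelope sketch (Python) that works out the change in performance per dollar from the figures quoted above; the prices and uplift percentages are just the rough numbers thrown around in this thread, not measured benchmarks:

```python
# Rough perf-per-dollar comparison using the approximate figures quoted in this thread.
# All prices and uplift percentages are assumptions from the discussion, not measurements.

def perf_per_dollar_change(old_price, new_price, perf_uplift):
    """Percent change in performance per dollar, where perf_uplift is the
    relative speedup of the new card (0.60 = 60% faster)."""
    old_ppd = 1.0 / old_price                  # old card's performance normalized to 1.0
    new_ppd = (1.0 + perf_uplift) / new_price
    return (new_ppd / old_ppd - 1.0) * 100

cases = {
    "new $1600 card vs old $1500 card, 60% faster": (1500, 1600, 0.60),
    "new $1600 card vs old card at $999 street, 60% faster": (999, 1600, 0.60),
    "new $800 card vs old $700 card, 20% faster": (700, 800, 0.20),
}

for label, (old_p, new_p, uplift) in cases.items():
    print(f"{label}: {perf_per_dollar_change(old_p, new_p, uplift):+.1f}% perf/$")

# Prints roughly +50%, ~0%, and +5%: the flagship improves value against its old MSRP
# but not against late street prices, and the $800 card barely moves the needle.
```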
 
I understand what you're saying, but the 3000 series received new MSRPs.

3090/Tis were going for $1000-1100 until Nvidia and AIBs sold through the stock. That shows Nvidia overcharged us on the 3090.
So against the 3090's new $999 MSRP, the 4090 is 60% more expensive. We paid for the performance increase.

None of us are getting a great deal on anything from Nvidia.
If I could buy a 4090 FE, I'd call that a good deal even at $1,600 - the performance is just spectacular.

Sadly, they are largely vaporware.
 
The 3070 Ti was 64% faster than the 2070 FE for 0% more money.

I think the 3070 Ti MSRP was $600. The 2070 was $500.

We'll really have to see how the 4070 non-Ti does. Seeing how big the performance difference is between the 4080 and 4070 Ti, there may be a big performance gap between those as well. But I'm not seeing a $300-500 price drop from the 4070 Ti's $800-1000 either. If you're into 2560x1440 gaming, things are going to become a lot more expensive.
 
I think the 3070 Ti MSRP was $600. The 2070 was $500.

We'll really have to see how the 4070 non-Ti does. Seeing how big the performance difference is between the 4080 and 4070 Ti, there may be a big performance gap between those as well. But I'm not seeing a $300-500 price drop from the 4070 Ti's $800-1000 either. If you're into 2560x1440 gaming, things are going to become a lot more expensive.
$500 was the AIB base price, $600 was the FE. Did any AIB model actually target the $500 price?
 
$500 was the AIB base price, $600 was the FE. Did any AIB model actually target the $500 price?

Not sure for the Tis. For the regular 3070s, yes, but only for another month or so before they got bumped up to $530 from EVGA and Galaxy (?). I was looking at two models at $500 and then they bumped up the price. A number of gamers did get 3070s at normal prices; I paid $570 for mine, which was EVGA's second-highest-end model. Those typically cost $40 or so more for slightly higher clocks. I bought it because it was in stock.
 
Well, here it is: nVidia made the $799 4070 Ti official today at CES: https://www.tomshardware.com/news/live/ces-2023-live-blog

Here is hoping it has slow sales but I am doubtful.

I do wonder how the regular 4070 will perform. The 4080 to 4070ti performance gap is fairly large, at least from what Nvidia showed us. I assume we might see a $600 4070 (more like $650-700 street prices), and the performance might be a good 20% or so slower.
 
Stop getting so hung up on the name and focus on the performance/pricing ;).
Forget performance/pricing. Look at the die used and the rest of the spec, and the 4070 Ti starts looking more like a 60-class card. One that they originally tried to brand as a 4080, and are now trying to get away with calling a 70-class. A 192-bit card for $800. Joke. The 3090 is a 4K card. This 40"7"0 Ti is probably at best a 1440p card.
 
Forget performance/pricing. Look at the die used and the rest of the spec, and the 4070 Ti starts looking more like a 60-class card. One that they originally tried to brand as a 4080, and are now trying to get away with calling a 70-class. A 192-bit card for $800. Joke. The 3090 is a 4K card. This 40"7"0 Ti is probably at best a 1440p card.
I don't know if "60-class card" is quite right; from the performance leaks so far, the 4070 Ti is on par with the 3090 Ti, plus or minus 7%. I can't really think of any case where the next generation's 60-class was on par with the previous generation's top tier (90/Titan/80 Ti).
The 3070 Ti did match up about evenly with the 2080 Ti, though, just as the 2070 Super did against the 1080 Ti.
 
Forget performance/pricing. Look at the die used and the rest of the spec, and the 4070 Ti starts looking more like a 60-class card. One that they originally tried to brand as a 4080, and are now trying to get away with calling a 70-class. A 192-bit card for $800. Joke. The 3090 is a 4K card. This 40"7"0 Ti is probably at best a 1440p card.
It performs better than the 3090 Ti at 4K. Who cares about the bus width between architectures? Bandwidth and architecture matter.
 
The 3070 Ti was 64% faster than the 2070 FE for 0% more money.
The 3070 Ti is $100 more, plus MSRP didn't matter for the past 2 years, and still doesn't. A quick look on Amazon shows they're selling for $750 or more. On eBay you can find them for less than $500, but that's not much of a discount for a used mining GPU. I can see why the RTX 3070s are popular and not the Tis.

The more you spend the more you save!
800%+ value!


Here is hoping it has slow sales but I am doubtful.
For $700, Nvidia won't sell many. RTX 3070s are going for sub-$400 used on eBay.
 
The 3070 Ti is $100 more, plus MSRP didn't matter for the past 2 years, and still doesn't. A quick look on Amazon shows they're selling for $750 or more. On eBay you can find them for less than $500, but that's not much of a discount for a used mining GPU. I can see why the RTX 3070s are popular and not the Tis.


800%+ value!


For $700, Nvidia won't sell many. RTX 3070s are going for sub-$400 used on eBay.
Maybe... but I'd probably take a $700 4070TI over a $900 7900xt
 
Maybe... but I'd probably take a $700 4070TI over a $900 7900xt
Leaving all opinions regarding pricing aside, this right here is a problem for AMD. Once the reviews are out, the street price for the performance of these two cards is something to consider.
 
Leaving all opinions regarding pricing aside, this right here is a problem for AMD. Once the reviews are out, the street price for the performance of these two cards is something to consider.
Leaks show the 4070 Ti pairing up against the 3090 Ti plus or minus 7%, and the 7900 XT does about the same, so chances are that in most use cases they are going to be about equal and it's going to come down to street pricing. The 7900 XT has managed to stay around that $900 mark; as long as the 4070 Ti manages to stay below there, it should be a clear winner.

Note:
A winner between the two, that is. Let's not kid ourselves: the price of both should be considered criminal, but Nvidia bent us over, and when we looked to AMD for help they simply handed us a bottle of lube.
 
Leaks show the 4070 Ti pairing up against the 3090 Ti plus or minus 7%, and the 7900 XT does about the same, so chances are that in most use cases they are going to be about equal and it's going to come down to street pricing. The 7900 XT has managed to stay around that $900 mark; as long as the 4070 Ti manages to stay below there, it should be a clear winner.
Not to mention Nvidia's advantages in ray tracing, DLSS, NVENC, CUDA, etc. The 4070 Ti is the clear winner.
 
Not to mention Nvidia's advantages in ray tracing, DLSS, NVENC, CUDA, etc. The 4070 Ti is the clear winner.
And driver stability and frequent patches. I was trying not to sound too team green, but assuming the performance leaks are accurate, it really is going to force AMD to respond. Which they can; they have been bragging about how much money their new chiplet design saves them, so they can afford to bring prices down in line.
 
And driver stability and frequent patches. I was trying not to sound too team green, but assuming the performance leaks are accurate, it really is going to force AMD to respond. Which they can; they have been bragging about how much money their new chiplet design saves them, so they can afford to bring prices down in line.
Yep... I have only ever had one major problem with Nvidia, and that was the 3.5GB stuff 8+ years ago. AMD has burnt me every time I try them, even with a handful of issues on the 9700 Pro way back when.

I have little shame in my preference for Nvidia, just as some have even less shame for worshipping AMD while decrying everything Nvidia ever does. I've noticed that AMD fans tend to be a lot more negative on Nvidia than vice versa. My attitude is live and let live unless it's a lie (or lying by omission) about Nvidia stuff :)
 
Which they can; they have been bragging about how much money their new chiplet design saves them, so they can afford to bring prices down in line.
I've been silent about this for a while now, but it's been bothering me almost since the release. How can people tout the cost savings of chiplets and celebrate a company releasing a $1K GPU? A GPU that has had so many issues and so much drama, no less.

GPU pricing is just such a shit show. We can only pray it comes back to sane levels at some point. I consider myself lucky to be running a 3070, and I'll hold on for dear life until a sanely priced replacement gets released.
 
I've been silent about this for a while now, but it's been bothering me almost since the release. How can people tout the cost savings of chiplets and celebrate a company releasing a $1K GPU? A GPU that has had so many issues and so much drama, no less.

GPU pricing is just such a shit show. We can only pray it comes back to sane levels at some point. I consider myself lucky to be running a 3070, and I'll hold on for dear life until a sanely priced replacement gets released.
As sad as it is, Nvidia forced that price on AMD. Had they sold it for less, say $500, the bulk of us would never see the card at that price; bots would gobble them up faster than we could refresh a page and flip them on Amazon at price parity with the 3090 Tis. Had it been priced lower, that price would only hold for as long as AMD could supply the bot demand, which they can't, and when they fail and the price gets put to $800 at third-party retailers, AMD has fans complaining that they can't get a card while investors threaten lawsuits because AMD "undervalued" its product. It is a lose-lose scenario.
 
Yep... I have only ever had one major problem with Nvidia, and that was the 3.5GB stuff 8+ years ago. AMD has burnt me every time I try them, even with a handful of issues on the 9700 Pro way back when.

I have little shame in my preference for Nvidia, just as some have even less shame for worshipping AMD while decrying everything Nvidia ever does. I've noticed that AMD fans tend to be a lot more negative on Nvidia than vice versa. My attitude is live and let live unless it's a lie (or lying by omission) about Nvidia stuff :)
I just look at what is available at the time and what it does, and I couldn't care less about the name or even who made it. I agree with you on buying performance, but I'm not limited to one brand. Then again, having a variety can give you the benefits of both AMD and Nvidia.

Energy costs are up, worker pay is up, shipping costs are up, smaller nodes are increasingly way more expensive, yet some are comparing this to old pricing? Sounds like some may have more of an issue with the world than with Nvidia or AMD.

From my limited perspective, Ada and RDNA 3 cards have faults that make this generation rather disappointing. I have not skipped a generation of graphics cards in over 2 decades. I'll probably get at least one this generation, be it Nvidia or AMD.
 
Yep... I have only ever had one major problem with Nvidia, and that was the 3.5GB stuff 8+ years ago. AMD has burnt me every time I try them, even with a handful of issues on the 9700 Pro way back when.

I have little shame in my preference for Nvidia, just as some have even less shame for worshipping AMD while decrying everything Nvidia ever does. I've noticed that AMD fans tend to be a lot more negative on Nvidia than vice versa. My attitude is live and let live unless it's a lie (or lying by omission) about Nvidia stuff :)

Nvidia drivers from the 2007-2009 era were trash, though. May have been a Vista thing. Once they sorted them out, it's been great ever since. I've had a few problems here and there, but generally it's been very stable.

ATI/AMD always had a few more shortcomings and lagged on the software side. The last ATI/AMD card I owned was an X700 Pro 256MB, though. It seems like driver quality has increased, but clearly Nvidia comes out with better technology and software and AMD is always playing catch-up. G-Sync, DLSS, ray tracing: AMD is always a generation behind in implementing those.
 
Nvidia drivers from the 2007-2009 era were trash, though. May have been a Vista thing. Once they sorted them out, it's been great ever since. I've had a few problems here and there, but generally it's been very stable.

ATI/AMD always had a few more shortcomings and lagged on the software side. The last ATI/AMD card I owned was an X700 Pro 256MB, though. It seems like driver quality has increased, but clearly Nvidia comes out with better technology and software and AMD is always playing catch-up. G-Sync, DLSS, ray tracing: AMD is always a generation behind in implementing those.
I had a 7900 GT and then 8800 GT SLI on Vista, but I had no issues, thankfully.

My last AMD card was an R9 290 in late 2014, reference cooler, but it ran hot as heck and loud while still throttling (850MHz clocks, in ESO, plus some glitchiness to boot) in the same setup where my 970 SLI ran quiet and cool, so I returned it.

I got $30 back from Nvidia for each card (so $60 total) plus 20% of my purchase price from Newegg for the 3.5GB issue. Overall, I felt satisfied enough once that happened.
 
Nvidia drivers from the 2007-2009 era were trash, though. May have been a Vista thing. Once they sorted them out, it's been great ever since. I've had a few problems here and there, but generally it's been very stable.

ATI/AMD always had a few more shortcomings and lagged on the software side. The last ATI/AMD card I owned was an X700 Pro 256MB, though. It seems like driver quality has increased, but clearly Nvidia comes out with better technology and software and AMD is always playing catch-up. G-Sync, DLSS, ray tracing: AMD is always a generation behind in implementing those.
It was a Vista thing: last-minute revisions to key parts of the API forced Nvidia to rework their drivers, which resulted in a nightmare.
Microsoft decided, between the final Vista beta release and the retail release, to close off parts of the kernel. That was more secure, yes, but it meant things that worked on the very last revision of the beta were completely incompatible with the retail release, with nothing more than a quick "heads up" email to device manufacturers a week before the launch date.
Nvidia, Creative, HP, and just about everybody else were completely blindsided by it, and it required them to completely abandon large parts of their driver code. HP decided then and there, in retaliation, that much of their hardware was simply not going to get Vista drivers; Creative struggled; and Nvidia got off relatively lucky, as they were completely rehashing their drivers at the time anyway. But yeah, there were growing pains from Vista. It's no coincidence that CUDA launched around 2007: supposedly CUDA was developed as a result of trying to keep up with all the API changes, so they just sort of created a framework so they could adapt better. (I can't verify that; I was told it once many years ago and it just sort of stuck with me.)
 