NVIDIA rumored to be preparing GeForce RTX 4080/4070 SUPER cards

Yes and no. The 90/90 Ti is clearly a rebrand of the old 80 Ti tier with a price hike, but it's also sort of a hybrid between that and what used to be the Titan, with the 24GB. That said, historically Titans had functionality that was gimped on GeForce cards, plus their own drivers to enable those workloads. That might not be the case anymore, I'm not sure, but at release there were definitely things the RTX Titan still outperformed the 3090 on because of that. To just blanket say the 90s are Titans... I don't know.

What I do know is that historically, the 80 Ti cards were the top-end GeForce cards of a given gen, and that simply hasn't been true since the 30-series; the 40-series has yet to even get one over a year later.
The Titans ran consumer hardware but professional firmware. That, when paired with the creator drivers (which don't get the game optimizations), led to a weird hybrid state that left the cards sitting halfway between both worlds. You could still use the Game Ready drivers to get the same or slightly better performance than the 80-series card of that generation, but at a price that made it not worth it for anyone but the FPS chasers who wanted the fastest, price be damned.
 
No one bought the 4080 because the 4090 was a significant upgrade... people who can afford to pay $1200 for a GPU can also afford to pay $1600... if the 4080 had been priced at $800 it would have sold much better (probably better than the 4090)... the problem is that Nvidia wanted everyone to get the 4090 and purposely made every other 40-series card irrelevant.
The 4090 was made from silicon that wasn't viable for the RTX 5000 or 6000 series workstation cards but was still too good to gimp.
The 4000 series launched at a price designed not to move. Nvidia wanted to cut back on the consumer 4000-series silicon they were shipping so the enterprise and business parts could be prioritized, while making the huge quantities of 3000-series parts look much better in comparison to get them off the shelves.
No company likes competing against its own products, especially when those products are being discounted to move. So by pricing the 4000 series high, they limited the discounts needed on the 3000 series and threw their AIBs a bone, if you will, letting them clear their huge overstock at something other than a loss. The AIBs had way overproduced feeding the COVID demand and mining boom, which led them all to overpay for critical components, which inflated all the other prices; enter the cost-creep snowball of fuckery.

But make no mistake, the shitty launch MSRP was 100% the result of the AIBs being greedy. Nvidia may be assholes, but even they know their limits.
 
I do wonder if the 50-series will be a return to saner pricing and branding/tiering of chips/cards. Sort of like how we went from 20-series to 30-series.
 
The other side of that could very well be that the limited GPU capabilities of the existing 7000 series see performance similarly fall off in future titles, even if the cards are really good now because of the extra memory. Trying to future-proof anything GPU-related right now is a hard ask because things are changing too fast.

I personally think it's far more likely that VRAM becomes a bottleneck that forces replacing a card like the 4070 Ti than that RT performance, or whatever else the 7000 series is limited at, does the same for AMD's cards. That's already the case, and I don't see how the 4070 Ti will somehow pass a 7900 XT at 4K, considering things like RT also tend to demand more VRAM, and in both cases you're not going to be using RT unless you like a good slideshow.

Granted, this also is wholly dependent on how long you intend to keep the card. Historically I've kept mine for 5 years or so, but I also would have upgraded my 1070 Ti to a 3080 had crypto not broken the market. In any case, I wasn't going under 16GB VRAM for this upgrade cycle since 12GB is already showing limitations. If you're the kind of person who upgrades every cycle anyway, then you probably don't care nearly as much.
 
At 4K, for sure, but everything below that will balance out; at 4K even 16GB is a bit of a crunch.
 
No one bought the 4080 because the 4090 was a significant upgrade... people who can afford to pay $1200 for a GPU can also afford to pay $1600... if the 4080 had been priced at $800 it would have sold much better (probably better than the 4090)... the problem is that Nvidia wanted everyone to get the 4090 and purposely made every other 40-series card irrelevant.

No doubt that was a big part of their strategy.
 
I do wonder if the 50-series will be a return to saner pricing and branding/tiering of chips/cards. Sort of like how we went from 20-series to 30-series.
Saner in relative terms perhaps, but TSMC is upping its rates, silicon wafer costs are going up, and prices on components like capacitors are holding steady at their COVID levels, so I doubt we will see the price reductions we want.
I expect instead that Nvidia will split their consumer and business parts onto different silicon on different nodes; enterprise will pay to be the best of the best, and they can afford the big chips.
Putting consumers a year behind so they aren't pushing the envelope will help cut costs while keeping margins intact, and they can work with smaller chips: none of those 600 mm² dies, instead keeping things sub-400 mm² or preferably in the 200s to keep prices in check.
Better packaging will help out a lot. Whether Intel or TSMC does it doesn't matter, but it would let Nvidia potentially separate off the IO and a few other things, much like AMD has done, and the newer methods and hardware do a far better job of removing the latency and bottlenecks that AMD unfortunately couldn't avoid with the 7000 series. But those don't open up for consumer devices until H1 2025 at best.
 
The reason the 4080 didn't sell well had nothing to do with it being a bad card... the reason is that it was a bad value (relative to the 4090)... the 4080 Super is replacing the 4080, and watch it sell much better... the problem now is that a lot of people might just be waiting for the 5000 series.
 
"I'm waiting for the next generation" has always been and will always be a solid reason not to buy now, but the 5000 series is still a full calendar year away. Between now and then, if you can't wait, it would seem AMD has some price adjustments to make to counter these.
 
Nothing AMD can do in terms of pricing will make their cards perform as well in RT as Nvidia's... if you care about ray tracing and path tracing, there is only one option.
 
The RTX 4070 should be around 50 € cheaper than the RX 7800 XT to really be appealing, considering how vast the difference truly is in the most recent benchmarks. If you would still pick the RTX 4070 over the RX 7800 XT, I can very much understand that if power consumption is a concern for you, or something driver- or software-related, but I don't feel the better ray tracing performance counts for much until we reach RTX 4080 levels of performance.
The 7800 XT is a great GPU. But... from what I have seen in benchmarks, its raster performance is more similar to the 4070's than it is different. At least at stock speeds.

Indeed, the 7800 XT usually overclocks well, and that can get it really close to a 4070 Ti, especially at 4K. But you are also pulling around 300 watts for such an overclock. At out-of-box performance, most 7800 XTs only beat the 4070 in raster by less than 10 frames.

The 4070 does have better RT performance, and it's enough to be more usable overall. For example, you can run Elden Ring at 1440p/60 with RT on a 4070; you can't do that with a 7800 XT. Over 20 million people own Elden Ring. Wins like that are important.

The 4070 is appealing for its low power usage, better RT, better-quality video encoding, Nvidia Broadcast (ML noise canceling for voice and incoming audio, plus camera effects that run on the tensor cores and rival or surpass the best CPU-powered options, for free), and, if you like framegen, Nvidia has it in a LOT more games than AMD. I think Nvidia is probably near 50 games with framegen now, whereas AMD has... 4?


All that said, I think the 7800 XT is a great GPU, AMD's best release this generation. The 4070 is also a great GPU. And if we see street pricing on some 4070 Supers at less than $600, that will be very good indeed.
 
Nothing AMD can do in terms of pricing will make their cards perform as well in RT as Nvidia's... if you care about ray tracing and path tracing, there is only one option.
Sort of true?
It really depends on what sort of display people have hooked up; the bulk of PC gamers out there are still on 1080p.
The Nvidia advantage there sort of disappears unless you are really just chasing numbers; Nvidia Reflex plays a bigger part in making things feel better at that resolution than the RT does.

But that is me being pedantic... if you are willing to pay for the features and convenience, then Nvidia is hard to go wrong with, but like anything, you need to do your research to make sure it's a fit for you.
 
It really depends on what sort of display people have hooked up; the bulk of PC gamers out there are still on 1080p.
The Nvidia advantage there sort of disappears unless you are really just chasing numbers

True. Since the 6800/6700 XT days, AMD has been delivering better value at $500 or below.
 
This comment, likely unintentionally, exactly proves my point. The memory thing is demonstrably NOT overblown. You can check the benchmarks yourself. The 4070 Ti drops off more than it should at 4K in many titles vs the 7900 XT, which is its nearest price competitor and remains playable at that resolution. If you're gaming at 4K, unless you're using a 4090, you're likely not taking advantage of a lot of what Nvidia has to offer, because things like RT will tank the frame rate; at least, that's the call I made when I decided to buy AMD this round. I didn't need 24GB of VRAM, but I needed more than 12GB, so the 4070 Ti, which was the card I would initially have been shopping for, was immediately ruled out. I can tell you my personal experience with the drivers has been more or less the same as far as gaming is concerned, and going from G-Sync to FreeSync has provided an identical experience. CUDA is essentially irrelevant to me because I'm exclusively gaming with this card.

There is an easy case to be made for buying Nvidia over AMD because they have better technology and a better feature set. That case can, and does, fall apart at certain resolutions due to the lack of VRAM, particularly because features like RT are more VRAM-intensive. At 1440p, the 4070 Ti is generally OK as of now, but it won't have legs as long as it deserves going forward. VRAM will matter more than RT in a few years, and that can make the difference depending on how long you want to keep your card.

I personally think it's far more likely that VRAM becomes a bottleneck that forces replacing a card like the 4070 Ti than that RT performance, or whatever else the 7000 series is limited at, does the same for AMD's cards. That's already the case, and I don't see how the 4070 Ti will somehow pass a 7900 XT at 4K, considering things like RT also tend to demand more VRAM, and in both cases you're not going to be using RT unless you like a good slideshow.

Granted, this also is wholly dependent on how long you intend to keep the card. Historically I've kept mine for 5 years or so, but I also would have upgraded my 1070 Ti to a 3080 had crypto not broken the market. In any case, I wasn't going under 16GB VRAM for this upgrade cycle since 12GB is already showing limitations. If you're the kind of person who upgrades every cycle anyway, then you probably don't care nearly as much.
Anywhere a GPU might struggle at 4K but not at 1440p, you can use DLSS Quality or FSR Quality. The image quality is usually pretty close, it performs similarly to 1440p, and it uses VRAM similarly to 1440p. The same can be said for 1440p itself, really.
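To put rough numbers on that, here's a minimal sketch (Python, assuming the published ~2/3-per-axis scale factor that both DLSS Quality and FSR 2 Quality use; actual VRAM use still varies per game):

```python
# "Quality" upscaling renders internally at ~2/3 of output resolution per axis,
# so GPU load and VRAM use land near the lower resolution's ballpark.
def internal_resolution(out_w: int, out_h: int, scale: float = 2 / 3) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution."""
    return round(out_w * scale), round(out_h * scale)

for name, (w, h) in {"4K": (3840, 2160), "1440p": (2560, 1440)}.items():
    iw, ih = internal_resolution(w, h)
    print(f"{name} output renders internally at about {iw}x{ih}")
# 4K output -> 2560x1440 (1440p cost); 1440p output -> 1707x960 (roughly 960p cost)
```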

I don't think we will see many games for the rest of this console generation where great-looking graphics settings overflow 12GB of VRAM at 1440p. More pressing, I think, is that new rendering techniques will simply max out the GPU's performance in general. See: Unreal Engine 5 games. I don't doubt some optimization will be figured out there (Lords of the Fallen did a bunch with post-release patches, and it seems like a few games this past year or two have really poor usage of the CPU; combine that with Nvidia's driver overhead and it has created some performance issues). But I still expect UE5 games, and other new engines, to be relatively tough to run on current cards.

AMD's generic framegen for 'any' game at the driver level is interesting. But it's currently isolated in its own separate driver branch, and that branch is otherwise behind on the latest game updates and other features. I dunno what AMD is doing with their driver development, but they really need a main stable branch with game-ready updates, and then an experimental branch that has the latest stable updates plus the experimental features. Alan Wake 2 pretty much needs framegen, but you still cannot run an AMD driver that has both the latest optimizations for Alan Wake 2 and driver-level Fluid Motion Frames.

I think the open source FSR framegen mods are temporarily saving AMD's ass, while they lag on development of their own framegen features!
 
Saner in relative terms perhaps, but TSMC is upping its rates, silicon wafer costs are going up, and prices on components like capacitors are holding steady at their COVID levels, so I doubt we will see the price reductions we want.
Yeah, I'm not saying I expect the 90 cards to really come down in price, but the AD103 chip coming down to an $800 product tells me the 4080 absolutely could have been an $800 card at launch, which is only $100 more than the 3080's MSRP and would have been seen as a perfectly reasonable increase considering those factors. There is definitely a greed factor with Nvidia here (and gamers buying massively overpriced cards during COVID/mining probably didn't help perceptions), and not all of it can be pinned on wafer rates. AD103, for example, is pretty comparable in size to the xx104-based 80-class cards of prior gens and certainly wouldn't suffer the same yield/cost issues a large monolithic xx100- or xx102-type die would. $1200 was just greedy; $1000 is better but still too much for what it is. And let's not forget that 60-class cards have completely stagnated in any meaningful generational uplift, and they have tiny dies.
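The die-size point is easy to put rough numbers on with the standard gross-dies-per-wafer estimate. A back-of-the-envelope sketch (Python; the ~608 mm² AD102 and ~379 mm² AD103 areas are commonly cited figures, so treat them and the formula as approximations, and note this ignores defect yield, which punishes big dies even harder):

```python
import math

# Gross (candidate) dies per wafer: wafer area / die area, minus an edge-loss
# correction term; a standard first-order estimate, not a foundry yield model.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area in [("AD102 (4090 class)", 608.5), ("AD103 (4080 class)", 378.6)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300 mm wafer")
# AD102: ~89 dies; AD103: ~152 dies, before yield differences are even counted
```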
 
I've said this before and I'll say it again: the underlying price driver is that NVIDIA needs to do everything it can to ensure GeForce GPUs cannot be used to train AI models. Since 2016, they've barred the use of GeForce drivers in datacenters, barred OEMs from making cards under 3 slots thick, reduced bandwidth and capacity to embarrassing values, gimped NVLink, removed NVLink altogether, and retroactively removed P2P communication from the consumer drivers. Every action they've taken is to protect the H100; when a single rack of H100s costs more than a house, that's just something you have to do.

Now, the real reason here is that there's only one fab left, and it's running at 100% capacity. Datacenter parts have always commanded a price premium (compare a $4K K20 to a $700 780 Ti, for example). In the past, you'd print as many datacenter parts as you could sell, then use the leftover capacity to print high-volume consumer parts, collect a bit of extra profit, grow your ecosystem, and beef up your revenues. Nowadays there's no leftover capacity, so every AD102 that winds up in a 4090 is a $10K L40S that doesn't make it into an enterprise customer's hands.
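On the P2P removal mentioned above: you can query what the installed driver actually exposes. A minimal sketch (Python, assuming a box with PyTorch, CUDA, and at least two GPUs; this only reports what the driver advertises, it isn't proof of policy):

```python
import torch

# Ask the CUDA driver whether each GPU pair is allowed peer-to-peer access.
def report_p2p() -> None:
    n = torch.cuda.device_count()
    if n < 2:
        print(f"Found {n} CUDA device(s); P2P needs at least two.")
        return
    for i in range(n):
        for j in range(n):
            if i != j:
                ok = torch.cuda.can_device_access_peer(i, j)
                print(f"GPU {i} -> GPU {j}: {'P2P available' if ok else 'no P2P exposed'}")

report_p2p()
```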
 
Don't forget, a year ago people were saying it was priced that way to keep people buying 3000-series cards instead, to deplete *that* stock first.
I don't think that has necessarily been disproven. I remember even the big tech tubers were doing FUD videos (surely paid for by NVIDIA) to push those cards.
 
Kind of wild how well Nvidia pulled this off; if you go on Reddit, everyone is completely psyched about the $200 price cut on the 4080 Super. They couldn't have done a better job of training their customers to accept higher prices. The 3080 was $699, the 4080 was $1199, and a year later the 4080 Super is $999. So guess what, folks? That's the new accepted price for 80-series cards; they still got a $300 MSRP hike in that span. Everyone will be outraged again when the 5080 is $1399 and then excited when the 5080 Super is $1199. Looking forward to it! Nothing like training your lemming customers to accept a raw deal each gen. At this point I might be done with higher-end hardware and just use whatever the bare minimum is to do my job. I really loathe the road we're headed down.
 
Hahahaha, good joke. We all know the actual price of the 3080 was at least $1200, because you had to pay a scalper to get one or spend $600 worth of your time trying to get lucky. The crypto market set the price, and AI will continue to set the price going forward. We won't see $699 80-series cards ever again, because they proved to be worth much more than $699 to anyone who can use them to make money.
 
I got a 3060 Ti at MSRP when it launched, $399. That was a unique circumstance; we had massive supply-chain shutdowns and everyone at home at once.
 
Kind of wild how well Nvidia pulled this off; if you go on Reddit, everyone is completely psyched about the $200 price cut on the 4080 Super. They couldn't have done a better job of training their customers to accept higher prices. The 3080 was $699, the 4080 was $1199, and a year later the 4080 Super is $999. So guess what, folks? That's the new accepted price for 80-series cards; they still got a $300 MSRP hike in that span. Everyone will be outraged again when the 5080 is $1399 and then excited when the 5080 Super is $1199. Looking forward to it! Nothing like training your lemming customers to accept a raw deal each gen. At this point I might be done with higher-end hardware and just use whatever the bare minimum is to do my job. I really loathe the road we're headed down.
The 4070 Ti Super is purportedly more or less equivalent in performance to a 4080. So that would mean you get roughly 4080 performance, now for $800.

The 4080 Super exists only for Nvidia to keep the performance stack drawn out as much as possible, rather than simply dropping the 4080 to $800, the 4090 to... something better, and not having an in-between card.
AMD has nothing to compete with the 4090, so it doesn't move on price. The Supers are here to hopefully shift sales away from AMD's very successful 7800 XT and pretty successful (after price drops) 7900 XTX. A $900 7900 XTX is a compelling option against a $1200-$1300 4080; now you get almost a 4080 for $800. And the 4080 Super is just there to keep the 4090 properly extended.
 
I don't think it makes sense to get the 4090 anymore... if you bought it early in its life cycle it was a great buy... but now, with the 5000 series not too far away, it doesn't make sense to get the uber-high-end 4090 (especially with the price staying at $1600+)... the 4080 Super or 4070 Ti Super seem like the best buys.
 
~A year away isn't too far? 10+ months is a lot of game time...
 
I don't think it makes sense to get the 4090 anymore... if you bought it early in its life cycle it was a great buy... but now, with the 5000 series not too far away, it doesn't make sense to get the uber-high-end 4090 (especially with the price staying at $1600+)... the 4080 Super or 4070 Ti Super seem like the best buys.
The 5000 series is a full year out, not exactly close.
But no, unless you are playing at 4K and need it now, the 4090 is not a good buy.
 
~A year away isn't too far? 10+ months is a lot of game time...

it's still better to wait... yes, you'll have 10 months with the 4090, but you'll have 24+ months on the low end with the 5090... more like 36+ months if you don't plan on upgrading for a while

plus within the next 10 months there are bound to be tons of leaks and rumors which will make the wait seem shorter
 
The 4070 Ti Super is purportedly more or less equivalent in performance to a 4080. So that would mean you get roughly 4080 performance, now for $800.

The 4080 Super exists only for Nvidia to keep the performance stack drawn out as much as possible, rather than simply dropping the 4080 to $800, the 4090 to... something better, and not having an in-between card.
AMD has nothing to compete with the 4090, so it doesn't move on price. The Supers are here to hopefully shift sales away from AMD's very successful 7800 XT and pretty successful (after price drops) 7900 XTX. A $900 7900 XTX is a compelling option against a $1200-$1300 4080; now you get almost a 4080 for $800. And the 4080 Super is just there to keep the 4090 properly extended.
We'll see about it matching a 4080; we'll also see how many cards will actually cost $799.
 
it's still better to wait... yes, you'll have 10 months with the 4090, but you'll have 24+ months on the low end with the 5090... more like 36+ months if you don't plan on upgrading for a while

plus within the next 10 months there are bound to be tons of leaks and rumors which will make the wait seem shorter

If a person is looking to drop $2k (or more) on a GPU right now, I don't think value is at the forefront of their mind, at least if their primary purpose is gaming. And if a person is in the market for a 4090 right now for more professional purposes, it makes sense to buy now as well, since waiting could mean losing more money to lost productivity than you would save by waiting 10-12+ months for a better-performing card.
 
I already commented in my last post on how I feel about the RTX 4070 Super (I think it is a much-needed buff), but I think the other two new products deserve elaboration as well:

RTX 4080 Super: The RTX 4080 was already a very impressive card; I'd say it was the pinnacle of this generation if we forget the price and just focus on the characteristics alone: consumption is not over the top, there is enough VRAM, and the performance is very solid, and from the company's perspective it is a desirable die to sell. The RTX 4080 Super naturally tops it because it is the full die, so it is the new pinnacle (to me at least), much like the RX 6800 (non-XT) was the most balanced product of the previous generation, with the difference that the 6800 was actually occasionally priced desirably for the customer, while the RTX 4080 Super is still too expensive. Even 900 € would be slightly pushing it, but at $800 I would deem it acceptable. The correct perspective, I guess, is to evaluate it against the competition, and it is not too bad compared to the RX 7900 XTX, but I very much wish it were at least $900. Let me remind you that the RT features become viable options on the RTX 4080 due to its sheer performance; combine those effects on a 4K screen with DLSS and I'd say the end result is not bad at all, with frame gen not even being a necessity if you care to tweak settings a bit.

RTX 4070 Ti Super: I really like seeing them offer the AD103 die for what it should cost, though of course it could be somewhat faster. Despite that, it is a more welcome product than I initially expected, especially considering the VRAM amount; I certainly feel saying goodbye to the lacking RTX 4070 Ti is the right way to go. This is a well-balanced product like the RTX 4080, hence the same die, and since it represents the bottom cut of the AD103 die, we can pretty safely assume it will one day cost way less, though perhaps not before the very end of this generation. Depending on how many fitting chips get produced, this model could become rather popular, since we also see that people are in fact willing to spend somewhat more on a GPU than before, likely simply because they didn't spend money on the last generation, with some still clinging to the GTX 1000 series.

It is also a no-brainer to keep producing and offering the RTX 4070 in the future, since that too is a well-balanced product, spoiled, like many others, only by the price, though I assume that will soon change. Such models are needed for small cases and for those who do not want to draw too much power, for whatever reason.

Additional divisive note:
As discussed above, 12 GB for the RTX 4070 Super remains a bummer given the price, but don't make too big a deal of it; you can play just fine in the future as well if you drop some settings, and at 1440p you should rarely have any issues. I personally opted in the end for the RX 7800 XT, for both the performance and the VRAM. :)
 
If a person is looking to drop $2k (or more) on a GPU right now, I don't think value is at the forefront of their mind, at least if their primary purpose is gaming. And if a person is in the market for a 4090 right now for more professional purposes, it makes sense to buy now as well, since waiting could mean losing more money to lost productivity than you would save by waiting 10-12+ months for a better-performing card.
Agreed. Ten months is almost a year; that's no small chunk of anyone's life, to say the least. Anyone who wants the best GPU regardless of value probably doesn't have the time preference to wait.
 
I think I'll just wait for next gen and, hopefully, viable 4K gaming that isn't over $1000.
 
Yeah, I'm not saying I expect the 90 cards to really come down in price, but the AD103 chip coming down to an $800 product tells me the 4080 absolutely could have been an $800 card at launch, which is only $100 more than the 3080's MSRP and would have been seen as a perfectly reasonable increase considering those factors. There is definitely a greed factor with Nvidia here (and gamers buying massively overpriced cards during COVID/mining probably didn't help perceptions), and not all of it can be pinned on wafer rates. AD103, for example, is pretty comparable in size to the xx104-based 80-class cards of prior gens and certainly wouldn't suffer the same yield/cost issues a large monolithic xx100- or xx102-type die would. $1200 was just greedy; $1000 is better but still too much for what it is. And let's not forget that 60-class cards have completely stagnated in any meaningful generational uplift, and they have tiny dies.
Sure, if you ignore that:
  1. TSMC 4nm was triple the cost of Samsung 8nm at the launch of the RTX 4080
  2. The value of the US dollar fell 14.5% between the release of the RTX 3080 and RTX 4080
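Funnily enough, taking that second number at face value (it's the post's figure, not verified here) lands almost exactly on the $800 the quoted post argued for:

```python
# Adjust the 3080's $699 MSRP by the claimed 14.5% decline in the dollar's value.
msrp_3080 = 699
claimed_decline = 0.145
print(f"Adjusted 3080 MSRP: ${msrp_3080 * (1 + claimed_decline):.0f}")  # -> $800
```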
 
I think I'll just wait for next gen and, hopefully, viable 4K gaming that isn't over $1000.
The 7900 XTX is viable 4K gaming that isn't over a grand. There have already been sales putting the card around 700-800 bucks. I have one; it's pretty damn good. Everything is fast and fluid, and that's without running FSR.
 
I wonder if it will get cheaper over the next month or so. The $1000 price on the 4080S might cause AMD and/or retailers to react with discounts on the top three 7000-series cards. Dropping the price on the XTX permanently to $750-$800 would make it one heck of a card.
 
In the short run, AMD should drop the price.

In the long run, they could release overclocked cards with faster memory.
 