RTX 4xxx / RX 7xxx speculation

If the 4090 is $3,000+, all I can say is that there will definitely be good availability, because few will want to pay that much.
I could see it coming in at the price they wanted to set the 3090 Ti at (before the delay) - $2,499.
 
MSRP for these is going to skyrocket for 3 reasons:

3. Assuming the perf leaks are true, it'll be the biggest gen leap in many years. Big leap = Big price.
While I don't disagree that this will happen, it's time consumers stopped using point #3 as an excuse for price jumps.

What happened to gen-on-gen price-point replacement? So what if it's 90% faster than last gen; that doesn't mean it should cost 90% more, all else being equal.

Of course in this case there are plenty of other factors, and I wholeheartedly agree that prices are going up.
 
The 3090 Ti's original MSRP was set during the tail end of the mining boom. I don't think they can get away with charging more than $2,000 for a card unless it has the Titan name.
 
The 3090 Ti's original MSRP was set during the tail end of the mining boom. I don't think they can get away with charging more than $2,000 for a card unless it has the Titan name.
Good point, but I think that combined with the rumored huge performance boost and the fact that only the 4090 is coming in 2022, it could be made more "exclusive" to help sell 3xxx stock.
 
If Nvidia can keep the 3090 at the new MSRP of around $1,299 and the Ti version at $1,499 for a while and just release the 4090 before the others, I'd say they could sell it at a premium, $1,999+, until the higher-end Ampere cards are sold out. I just don't see that as too viable myself. At $999 for the 3090 and $1,199 for the Ti, the larger amount of memory will still allow the higher price even if the 4080 is at $999 and beats both of them. I think the price point is in flux for any new GPU depending on way too many unpredictable factors: the used card market, how fast the current supply of GPUs sells, economic conditions, manufacturing inflation, AMD pricing and performance, and so on.
 
The pricing they can run with will depend a lot on which games are released. Currently there are very few titles that will challenge a 3080 at 1440p, so it will be mostly people who run 4K or need to have the newest stuff who go from a 3080 to a 4080 or higher. This may of course change if several top-notch titles push ray tracing and general graphics beyond what a 3080 can handle at 1440p. I would expect the mid-range cards to fly off the shelves if they are reasonably priced relative to their performance. The 3xxx and 6xxx mid-range cards were either lacking in performance or priced close to what one would expect to pay for a high-end card, so there are probably lots of price-conscious gamers looking to upgrade.
 
The second-hand market greatly affects retail prices on new GPUs. On top of that, inflation is cutting into non-essential spending. On top of that, Nvidia ordered more wafers than they needed because of the mining craze. I highly doubt we will see the 4090 priced at $1,500 or more. If they do, I would wait until after the holidays.
 
JayzTwoCents is back to reiterate that now is the time to buy cards. Says he's heard a rumor the 4000 series may not release at all till next year, with the mid-range 4060 not showing up till the summer.

 
If the 40 series is coming this year, we will know by the end of September. The Merge will be completed or delayed by that point as well.
Now is actually the worst time to buy; the market could be quite different in a matter of weeks.

This video would have been useful a couple months ago. At this point people who wanted a 30 series have gotten one, and everyone else is already committed to waiting. Anyone else still on the fence should just wait for the rumors to resolve themselves shortly.
 
If the 40 series is coming this year, we will know by the end of September. The Merge will be completed or delayed by that point as well.
Now is actually the worst time to buy; the market could be quite different in a matter of weeks.

This video would have been useful a couple months ago. At this point people who wanted a 30 series have gotten one, and everyone else is already committed to waiting. Anyone else still on the fence should just wait for the rumors to resolve themselves shortly.
You obviously didn’t watch the video.
 
JayzTwoCents is back to reiterate that now is the time to buy cards. Says he's heard a rumor the 4000 series may not release at all till next year, with the mid-range 4060 not showing up till the summer.


lol on that video. He gets most of his stuff, cards from the AIBs: EVGA, ASUS, MSI and so on. Nvidia appears not to have a big inventory of Ampere GPUs, but the AIBs do. What is Nvidia going to do for the next two quarters? Not sell GPUs? Make more Ampere GPUs? Neither makes any sense; Nvidia's quarterly profits would utterly tank, the stock would tank, and shareholders would be rather upset. Anyway, just more gibberish thinking.

The MSRP for the 3080 FE was $699, the low-end AIB 3080s were not the same level of quality in my opinion, and after about two years are those 3080s under the $699 MSRP? They should be under $650 if not under $600. As for used mining cards, it is easy to gauge their reliability: pay attention to the comments, ratings, etc. on eBay and elsewhere, forums and so on. Plus in mining terms, less than two years on a GPU is rather short, and I would say most of those cards are still in rather good shape once any cards with manufacturing defects are filtered out.

The AMD RDNA2 cards were not great mining cards for the money, so why would one not look at used RDNA2 cards? I looked for a new 6800 XT at the $649 MSRP and could not find any readily available at that price; the cheapest was $689. I still see further price drops coming, and one should get what one needs at the time if the benefits outweigh the cost. If I were shopping for an Ampere or RDNA2 card, it would be used, unless the AIBs really wanted to sell me one instead of trying everything to maximize their profits while worthless talking heads plug them.
 
lol on that video. He gets most of his stuff, cards from the AIBs: EVGA, ASUS, MSI and so on. Nvidia appears not to have a big inventory of Ampere GPUs, but the AIBs do. What is Nvidia going to do for the next two quarters? Not sell GPUs? Make more Ampere GPUs? Neither makes any sense; Nvidia's quarterly profits would utterly tank, the stock would tank, and shareholders would be rather upset. Anyway, just more gibberish thinking.

The MSRP for the 3080 FE was $699, the low-end AIB 3080s were not the same level of quality in my opinion, and after about two years are those 3080s under the $699 MSRP? They should be under $650 if not under $600. As for used mining cards, it is easy to gauge their reliability: pay attention to the comments, ratings, etc. on eBay and elsewhere, forums and so on. Plus in mining terms, less than two years on a GPU is rather short, and I would say most of those cards are still in rather good shape once any cards with manufacturing defects are filtered out.

The AMD RDNA2 cards were not great mining cards for the money, so why would one not look at used RDNA2 cards? I looked for a new 6800 XT at the $649 MSRP and could not find any readily available at that price; the cheapest was $689. I still see further price drops coming, and one should get what one needs at the time if the benefits outweigh the cost. If I were shopping for an Ampere or RDNA2 card, it would be used, unless the AIBs really wanted to sell me one instead of trying everything to maximize their profits while worthless talking heads plug them.
On top of that, they're not going to let AMD beat them out the door. I don't see AMD having the same stock issues that Nvidia currently has (there just never were that many RDNA 2 cards to begin with). AMD putting out a whole lineup with improved ray tracing, price/perf, and perf per watt while Nvidia folds its hands because AIBs have too many Ampere cards is just absurd.
 
On top of that, they're not going to let AMD beat them out the door. I don't see AMD having the same stock issues that Nvidia currently has (there just never were that many RDNA 2 cards to begin with). AMD putting out a whole lineup with improved ray tracing, price/perf, and perf per watt while Nvidia folds its hands because AIBs have too many Ampere cards is just absurd.

Yea, Jay2Cents keeps talking about how Nvidia may delay until next year, but there's no way they'll do that unless AMD is delaying also.
 
Yea, Jay2Cents keeps talking about how Nvidia may delay until next year, but there's no way they'll do that unless AMD is delaying also.
We may see a 4090 launch by EOY with a paper launch in November, but it's becoming clear there will be no volume cards in 2022.
 
Quite subjective obviously, and some of the difference is explainable (like a new console generation launching at around the same time), but the gap in hype between the Ampere/RDNA2 launch and the 4xxx/7xxx one does seem quite stark, despite rumours of a doubling of performance at the high end and the xx70 landing around the previous-gen xx90 halo product. That would be quite a good launch; maybe not Pascal, where a 1070 significantly beat a 980 Ti, but the 980 Ti didn't have the 3090's price tag either.

Outside of wanting to play AAA titles on a 4K TV at 120 fps without DLSS/FSR (which is probably not uncommon among that buyer class), needing more than a 6950 XT / 3090 Ti is not that obvious. There is almost a lack of games and monitors to pair with the future halo hardware, and the otherwise-buyable midrange stuff being pushed back, after the burnout of last launch's impossible availability, does not help.

As for whether it's a good or bad time to buy, I am not sure how authoritative anyone can be here; plenty of people have gotten that wrong many times over the last 5-6 years... it could very well be a very good time to buy (nothing below the halo products being buyable for a long time becoming obvious and making prices go up again, a crypto rebound) or a terrible one.

We may go through a full two-year video card generation cycle during which it was impossible, not for a single day, to simply buy a digital PS5 at MSRP (not some massive bundle) online from a major platform à la Best Buy or Amazon in many parts of the world; the Xbox Series X is possible to buy right now, though.

AMD putting out a whole lineup with improved ray tracing, price/perf, and perf per watt while Nvidia folds its hands because AIBs have too many Ampere cards is just absurd.
I don't know enough to have much of a grasp, but yes, it feels absurd. There is absolutely no difference between NVIDIA and AMD in how launching new stuff affects their relationship with AIBs stuck holding old cards they can't sell at a profit. And AMD has a lot more it can do with whatever fab space it frees up by not using it for new GPUs than Nvidia does, I would presume. If AMD launches cards, Nvidia will (and vice versa); it's almost automatic.
 
Outside of wanting to play AAA titles on a 4K TV at 120 fps without DLSS/FSR (which is probably not uncommon among that buyer class), needing more than a 6950 XT / 3090 Ti is not that obvious. There is almost a lack of games and monitors to pair with the future halo hardware, and the otherwise-buyable midrange stuff being pushed back, after the burnout of last launch's impossible availability, does not help.

I have a 3080 Ti in my living room PC on a 77" LG C1 and there are a lot of games out there where even with Quality DLSS I'm well below 120fps. I would upgrade to the 4090 Ti if rumors are true of it being 2x faster than a 3090 in certain circumstances. At the very least from my experience I still need 30% or so more performance to hit 120fps across the board using some form of DLSS.

Ideally, I would want to hit 120fps at 4k using DLAA in something like Cyberpunk but that's not going to happen even with a 4090 Ti.
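As a rough sanity check on that ~30% figure, here's a minimal sketch assuming frame rate scales linearly with GPU throughput (it often won't once CPU or engine limits kick in); the 92 fps starting point is a made-up example, not a number from the post.

[CODE]
# Back-of-the-envelope: extra GPU performance needed to reach a target frame rate,
# assuming fps scales linearly with GPU throughput (a simplification).
def required_uplift(current_fps: float, target_fps: float) -> float:
    """Return the fractional performance increase needed to hit target_fps."""
    return target_fps / current_fps - 1.0

# Hypothetical example: a game at ~92 fps with Quality DLSS needs roughly
# 30% more performance to reach 120 fps, in line with the estimate above.
print(f"{required_uplift(92, 120):.0%}")  # ~30%
[/CODE]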
 
 
Outside of wanting to play AAA titles on a 4K TV at 120 fps without DLSS/FSR (which is probably not uncommon among that buyer class), needing more than a 6950 XT / 3090 Ti is not that obvious. There is almost a lack of games and monitors to pair with the future halo hardware, and the otherwise-buyable midrange stuff being pushed back, after the burnout of last launch's impossible availability, does not help.
I keep saying this too: outside of a few games like CP77, I can't understand what monitors 4090 owners would be plugging into. All those 240 Hz 4K monitors that hardly exist.
I'm not blaming Nvidia/AMD for this, they need to keep pushing more powerful hardware out to consumers; it just sucks that the power of the flagship 4000/7000 series will go underutilized.
Hopefully the LG C3 or next S95B will be 240 Hz.
 
I am not sure what percentage of the population would have a significantly better experience playing those kinds of games at a frame rate much higher than 120 Hz. Maybe AMD/Nvidia could create some incentive for developers to add extra settings in a patch; it could be naive, but something like casting more rays during the ray-tracing pass, or raising LOD/view distance, is maybe not that hard to change code-wise.

Maybe some new title will make that question moot.
 
If there are already games that can't hit 120 fps at 4K even with DLSS on, then saying it's too much power is silly. If I buy a top-end GPU I would want it to hit 4K 120 not only on all titles today, but also for ~2 years after I buy it. Developers making use of hardware that is already out there is better than making games that can't run maxed out on current hardware, à la Crysis.
 
JayzTwoCents is back to reiterate that now is the time to buy cards. Says he's heard a rumor the 4000 series may not release at all till next year, with the mid-range 4060 not showing up till the summer.


They will only push things out that far if RDNA3 is a flop. And all of the leaks say RDNA3 is going to be quite good...
 
There are plenty of titles that can't do 4K/120 with the details jacked. In fact, there are only a few that can, even with "performance" DLSS. The more extreme DLSS settings nullify many of the visual improvements from higher visual quality settings, so that's usually what you want to avoid. RT is still a performance killer, too. There is absolutely room for something with a 25-50% boost over the best hardware from today. There always is. The interesting thing will be how much of a boost people capped at 60Hz or at 1080p/1440p see.
 
If there are already games that can't hit 120 fps at 4K even with DLSS on, then saying it's too much power is silly. If I buy a top-end GPU I would want it to hit 4K 120 not only on all titles today, but also for ~2 years after I buy it. Developers making use of hardware that is already out there is better than making games that can't run maxed out on current hardware, à la Crysis.
People are not saying a card significantly more powerful than a 3090 Ti would be hard to max out on a nice 120 fps TV (though for the titles that struggle to hit 120 fps now, with modern VRR, does the experience really get significantly better at 120 than at 95 fps?); it's the rumored 2x+ cards that raise the question.

https://www.tomshardware.com/reviews/asus-geforce-rtx-3090-ti-review
After looking it up, my comment was quite exaggerated: at 4K with ultra ray tracing, across the six hard-to-run titles in that review, the average 99th-percentile low of a 3090 Ti with DLSS on is currently 47.3 fps, close to (if not below) the VRR minimum of some systems, so those games could easily use double the power. The issue is more whether those titles are worth a new run at those higher settings than that they don't exist, as well as whether the visual experience at Ultra plus ultra ray tracing is significantly different from Ultra plus average ray tracing, or Very High/Very High.



Below 60 fps average, with DLSS on

You need a "killer app", or maybe not even that at this level of performance; for many people it's a lot about the performance itself more than about actually playing games, I would imagine.
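To put rough numbers on that, here's a minimal sketch (my own arithmetic, not from the review) of whether a rumored ~2x card would lift those 99th-percentile lows clear of a VRR window; the 48 Hz floor is an assumed figure for a typical display, not something cited above, and linear scaling is a simplification.

[CODE]
# Naive linear scaling of the 47.3 fps 99th-percentile low cited above.
def scaled_low(current_low_fps: float, uplift: float) -> float:
    """Scale a 99th-percentile low linearly with a GPU performance multiplier."""
    return current_low_fps * uplift

low_3090ti = 47.3   # average 99th-percentile low from the Tom's Hardware figures above
vrr_floor = 48.0    # assumed lower bound of a typical VRR window (not from the post)

for uplift in (1.0, 1.5, 2.0):
    low = scaled_low(low_3090ti, uplift)
    status = "above" if low >= vrr_floor else "below"
    print(f"{uplift:.1f}x -> {low:.0f} fps ({status} the {vrr_floor:.0f} Hz floor)")
[/CODE]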


There is absolutely room for something with a 25-50% boost over the best hardware from today.
That was never in doubt IMO, even more so at the more relevant price points. The question was about the halo product rumored to double performance, which is maybe not relevant to a conversation about 4xxx/7xxx given its niche market share, but this is HardForum after all, and it's the only product rumored to be released soon.
 
The TDP rating from NVIDIA has always been "TBP." They don't separate their power numbers like AMD does. At the end of the day, total board power is the only thing that matters to the user. If you need to monitor the other components on the board, most modern cards offer that capability.
 
I doubt that either the vanilla 4090 or 4080 will be 800 W. I doubt a vanilla 4090 Ti (if one exists) would be either. Maybe a server/DC, pro, or Titan card would hit that number, but those are for people who want the big numbers.
 
Yea, Jay2Cents keeps talking about how Nvidia may delay until next year, but there's no way they'll do that unless AMD is delaying also.
JayzNoSense has become entirely too insufferable for me in recent years. These videos just keep confirming it.

"Hey guys, please buy now, btw here are my affiliate links."
 
The vanilla 4090 will be around 450 W according to leaks; maybe AIB boards will push 500 W. I think that's near the limit of 3x 8-pin power plus the slot.
 
Maybe the 4090 Ti Kingpin will be 800 W lol
Edit: or at least have an 800 W BIOS option
 
I seriously doubt they will release an 800 W monster to consumers.
48 GB of RAM smells like an A100-class workstation/datacenter/AI/server card. Those systems currently have cooling for 4x A100s; die size matters here obviously, but current cooling solutions could handle two 800 W GPUs, I would imagine.
 
1 x 16-pin PCI-E 5.0 would be enough. They support up to 600W each.
Good point. I was thinking more along the lines of traditional plugs, since I doubt many people will have power supplies with the new connectors.
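For reference, a minimal sketch of the nominal power budgets behind these numbers, using PCIe spec ratings only (75 W slot, 150 W per 8-pin, up to 600 W per 16-pin); actual cards and AIB designs can be tuned differently, so treat these as rough ceilings.

[CODE]
# Nominal PCIe power budgets (spec ratings, not measured card behavior).
CONNECTOR_WATTS = {
    "slot": 75,      # PCIe x16 slot
    "8-pin": 150,    # PCIe 8-pin (6+2)
    "16-pin": 600,   # 12VHPWR / PCIe 5.0 connector at its maximum rating
}

def board_power_budget(connectors: list[str]) -> int:
    """Sum the nominal power from the slot plus the listed auxiliary connectors."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in connectors)

print(board_power_budget(["8-pin"] * 3))  # 525 W: why ~500 W is near the 3x 8-pin limit
print(board_power_budget(["16-pin"]))     # 675 W: a single 16-pin comfortably covers a 600 W card
[/CODE]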
 