RTX 4xxx / RX 7xxx speculation

Did you even watch the presentation that outlines the special hardware unique to 40 series making DLSS 3 possible?
I mean sure, but they said that about raytracing back in the day too, yet we had people doing it on GTX 980 Tis (it wasn't great, but neither were the gen-1 raytracing cards)
 
It's a marketing presentation, brother; I doubt their new tensor cores are what makes interpolation possible. It's possible that's the case, but I'm skeptical of anything directly from that presentation.
Seriously, they are as bad as apple now with their marketing tactics and BS
 
Was there a new leak/release? This isn't quite 1:1 with the consumer cards, but I just saw this for the new RTX 6000
(https://tinyurl.com/y7569fya)
 
Motherboard is still there. EVGA should've claimed it was slaughtered by Nvidia as well.
 
The 3070 is 256-bit and the bottom-tier 4080 is 192-bit.
Something is askew there.
The only thing I can think of is that the 4070 uses GDDR6 instead of 6X, with possibly 8 and 16 GB options. The 4060 Ti could then be 128-bit GDDR6X (8 GB) or 192-bit GDDR6 with 6 and 12 GB options. That leaves the 4060 with 128-bit GDDR6 at 8 GB.

All of the above will be relatively low in bandwidth for their class, but perhaps nVidia is using some sort of infinity cache to compensate.
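Back-of-the-envelope, the bandwidth those bus widths imply can be sketched like this (the SKU pairings and per-pin data rates here are my assumptions for illustration, not confirmed specs):

```python
# Rough peak-bandwidth sketch for the bus widths being guessed at above.
# SKU pairings and data rates (Gbps per pin) are assumptions, not specs.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s: pins * Gbps per pin / 8 bits."""
    return bus_width_bits * data_rate_gbps / 8

configs = [
    ("3070, 256-bit GDDR6 @ 14 Gbps", 256, 14.0),
    ("4080 12GB, 192-bit GDDR6X @ 21 Gbps", 192, 21.0),
    ("hypothetical 4070, 160-bit GDDR6X @ 21 Gbps", 160, 21.0),
]
for name, width, rate in configs:
    print(f"{name}: {bandwidth_gbs(width, rate):.0f} GB/s")
```

So even with the faster GDDR6X, a narrower bus only lands in the same ballpark as last gen, which is why a big cache would have to do the heavy lifting.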
 
All of the above will be relatively low in bandwidth for their class, but perhaps nVidia is using some sort of infinity cache to compensate.
If I am not mistaken:
        Shader Cores   L2 Cache
AD102   18,432         96 MB
AD103   10,752         64 MB
AD104   7,680          48 MB
AD106   4,608          32 MB
AD107   3,072          32 MB

The RTX 3090 had 6 MB of L2 cache and the 3070 had 4 MB, so the rumour of roughly 16 times more cache was true. I could imagine a giant cache making up for memory bandwidth a la RDNA (RDNA 2 had similar L2 cache to Ampere but added 128 MB of Infinity Cache at the high end, 96 MB on the lower end).

That would be less cache on Ada, but probably faster.

Maybe AMD proved to them that it worked and that it made a lot of sense power-cost wise.
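The 16x rumour checks out arithmetically against the sizes quoted above; a throwaway sanity check:

```python
# Sanity-check the "~16x more cache" rumour using the L2 sizes quoted above (MB).
ampere_l2 = {"GA102 (3090)": 6, "GA104 (3070)": 4}
ada_l2 = {"AD102": 96, "AD103": 64, "AD104": 48, "AD106": 32, "AD107": 32}

ratio = ada_l2["AD102"] / ampere_l2["GA102 (3090)"]
print(f"AD102 vs GA102 L2: {ratio:.0f}x")  # 96 / 6 = 16x
```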
 
Oh look, the 4080 proper costs more or equal to the 3090ti (current price). Still laughing? :LOL:
Nice try! A simple lol smiley wasn't meant as a thesis on pricing. Also, msrp ($1999 vs 1199) to msrp it doesn't.
 
Really disappointed in these. Overall raster performance is disappointing in the 5 games they listed. It's all about DLSS 3, it seems; that's the only thing they seem to be really focusing on. Plus the pricing on the 4080 16GB, $1199? Lmao. Wow.
 
Name one game that justifies the existence of this series of cards. Is there even one?

Edit: Disgruntled Battlefield fan here.
NVIDIA and AMD will create incentives for those games to "exist", and I imagine some will, with Cyberpunk being the flagship here.

I would not be surprised if upcoming Unreal 5 games will be able to justify those cards in the eyes of a lot of buyers.

Stalker 2, for example, when it releases.

Maybe the UE 5 tech demos released until now are just badly optimized, and studios spending time on their games will make the upgrade from a 3080/6800 XT irrelevant (yes, there will be settings that run under 20 FPS on those cards, like Cyberpunk does right now, but the difference in quality and experience versus settings that run fast enough for VRR is not that game-changing). Still, those UE 5 tech demos usually ran terribly on my 3070.

But what you describe does tend to be a problem in general (as said above, what didn't run well enough on a 2080 Ti in fall 2020, or even now?). A giant proportion of players have older cards and game consoles, so there is less incentive to make games that are significantly more fun to play on a video card stronger than a 2080 Ti/PS5; and since those games are rare, there is not much need to buy said card, and a chicken-and-egg cycle can start.
 
I'm really glad I only do 1600x1200 with my 3080 or I'd have to care about this release.
Um... Is that a typo? Not even widescreen? I used a higher resolution than that almost two decades ago... A 3080 for that is comically overpowered :)!
 
I have the feeling today that NVIDIA is fucked. Inventory issues, product stack issues, pricing issues, market issues. Not a pretty sight for Team Green today.

Then this:


[attached screenshot]


I think a lot of us are going to be waiting to see what AMD does next.
 
I have the feeling today that NVIDIA is fucked. Inventory issues, product stack issues, pricing issues, market issues. Not a pretty sight for Team Green today.

I think a lot of us are going to be waiting to see what AMD does next.
As the “computer guy” in my circle I’m telling everyone to hold out to see what RDNA3 is all about and if you must buy Nvidia then grab a used 3080 for sooo much less $
 
Wow, what an overpriced pile of GPUs. Hoping AMD doesn't follow, but I have serious doubts they will challenge the pricing stupidity.
 
Wow, what an overpriced pile of GPUs. Hoping AMD doesn't follow, but I have serious doubts they will challenge the pricing stupidity.
They're no better; maybe they'll be 100 bucks cheaper across the board, but then you get an inferior software stack. At least Nvidia and AMD are nice enough to leave some runway open for Intel to release some mid-range products and have them be competitive.
 
They're no better; maybe they'll be 100 bucks cheaper across the board, but then you get an inferior software stack. At least Nvidia and AMD are nice enough to leave some runway open for Intel to release some mid-range products and have them be competitive.

Lol at Intel being competitive. They will just never put in the energy needed to be a serious consideration in the gaming market. It would be nice to have a real third player again, but I think that ship has sailed.
 
The truth is Nvidia has done some estimation of what they think they can sell the 4000 series cards for in the short term, given the quantities they have on hand and will produce in the next few months, and how these cards compare to the 3000 series' current street pricing. After that, AMD will have launched and they will know how things are selling at that pricing. They will then release "in between" SKUs ("Ti", "Super", whatever) and can adjust: either maintaining price/performance, or making these new SKUs more of a value prop while making the older SKUs more scarce, or continuing to sell both.
If things sell well over the next few months (god I hope not) they will keep the status quo for longer; if they take a beating, they will create bundles, or make lower-priced Ti cards that are more attractive, or both, after they've sold out their 3000 inventory.
Almost certainly they are using these cards' higher pricing to clear out their massive backlog of 3000 series cards.
Personally I think it sucks that these cards are so expensive and that once again, the bar for MSRP has been raised significantly at each tier. I have zero confidence that AMD will help bring prices down at this point.
 
Almost certainly they are using these cards' higher pricing to clear out their massive backlog of 3000 series cards.
They're forcing the issue. By pricing the Ada cards ~$200 too high (at the least), they force people right toward the older, less expensive product stack that they're desperate to clear without dropping prices further. It's a smart move, but that doesn't mean I'll be tolerating it. I can't give a shit about this line of cards until significant price drops happen.

I'll be very curious to see how the used market responds to this, and then the launch of the 4090 next month.
 
I mean, with 10,000,000 mining cards doubtless about to flood the used market, more so than before, used cards are a viable option that can keep PC gaming alive. But very few people are going to be able to justify buying these 4000 cards at the prices announced, so there will be many fewer used cards on the market in 1-2 years' time, when everyone who has hung tight with 3000 series or RX 5000/6000 cards starts feeling they need to upgrade to keep up with games...

If at that time the only options are a 4070 (cough, 4080) for $900 and/or used cards at inflated prices because supply is low, people are going to start noping out of PC gaming en masse. This is a hobby, and as it becomes more and more expensive, more and more people are going to be priced out and will stop gaming or switch to consoles (which are also kind of overpriced now)...

But fewer people playing == fewer games / worse games / more expensive games, as the numbers aren't there in sales anymore.

Pray that AMD can deliver something more compelling at a lower price point (meh, not hopeful), or that Intel manages to sort their **** out and do something viable with Arc to put downward price pressure on video cards.

Oh, and the power requirements are going to make anyone willing (most people) to shell out for these new cards have to buy a new power supply too, which won't be cheap either, since you are going to need moar power...

Of course, if the pricing doesn't change, it is going to drive a high number of people out of video gaming and potentially kill gaming on the PC as we've known it, at least. I'm sure it will be good fun to game when the only people who can afford it are the "rich kids of Instagram" and the dudes at the country club...
 
According to MLID: "Lovelace was designed for peak performance at the expense of everything else, and that includes cost to produce."

"Lovelace costs at least 50% more to make, per tier, compared to Ampere."

 
1600 for 4090. Sounds like a plan. Probably not day one until games that challenge my 3090 Ti come out.
 
Seems like the 4090 and 3090 are the best cards... both 4080 variants are a disappointment... the 3080 10GB is a better card than the 4080 12GB variant, and the 3090 seems like the better value over the 4080 16GB variant... the 4080 12GB is pretty much a 4070.
 
Looks like the 4070 will end up being the 10 GB, 160-bit card, about 10% slower than the 4080 12 GB. More or less on par with the 3080 Ti w/o DLSS 3.0.
Guessing $700 MSRP.
 
The price increase is not surprising at all when you think about how much money you need to spend on R&D to get something better than the last gen these days. Performance-wise, all the low-hanging fruit must have been picked a long time ago. And they need to recoup R&D as well as production costs and driver development for years to come, plus a nice profit on top of that. Is $1200+ unreasonable for a graphics card, compared to what you pay for the latest Apple or Android top model? Probably not, but of course people must decide if they think it is worth paying.
 
Wow what a overpriced pile of gpu, hoping AMD doesn't follow but I have serious doubts they will challenge the pricing stupidity.
AMD might challenge the prices as they did at the high end last gen, but the most effective thing is for consumers to challenge the pricing by not buying. Nothing tells Nvidia "overpriced" better than tons of inventory they are unable to ship.

Personally, I find the 4000 series underwhelming so far in performance per dollar; only the 4080 16GB and the 4090 look like a decent performance upgrade over a 3080 10GB. So it is either team red, a 30% price slash on the 4080 16GB, or a generation skip on my part. Thankfully there aren't many games that challenge my 3080 maxed out at 1440p, and not many games coming out in the next 24 months that have me hyped, so I don't need the upgrade atm.
 
I would like an RTX 4080 12GB (if someone gave it to me for free); it would be just right for my Ryzen 2700X and 1080p resolution. I could run RDR2 at 100 fps on ultra settings without any problems.
And my old Corsair TX 650W would be loaded just right, neither too much nor too little.

Otherwise, yes, the elitism is aimed at the richest as well.
Those who have already bought an RTX 3090 or RTX 3080 will have no problem buying an RTX 4090 or 4080 16GB, because those are basically their only upgrades. With the RTX 4080 12GB and weaker there is no progress at all.
Now Jensen has figured out that there are enough elites in Europe and that he can make extra money, so he increased the prices for Europe even more.
I have no doubt at all that the RTX 4090 and RTX 4080 16GB will sell like hot cakes in Europe; the only question is whether they will be in stock.
https://www.tomshardware.com/news/geforce-rtx-40-series-gpus-are-22-more-expensive-in-europe
Obviously the demand is there, so the prices can go sky high.
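To be fair, part of any EU gap is just VAT plus the exchange rate; a quick sketch (the VAT rate, FX rate, and $1599 MSRP used here are assumptions for illustration):

```python
# US MSRPs exclude sales tax; EU shelf prices include VAT (19-25% by country).
# The VAT rate and EUR/USD rate below are assumptions for illustration only.

def eu_price(usd_msrp: float, eur_per_usd: float = 1.0, vat: float = 0.19) -> float:
    """Convert a US MSRP to an approximate EU shelf price."""
    return usd_msrp * eur_per_usd * (1 + vat)

# With the euro near parity in late 2022, 19% VAT alone accounts for
# most of a "22% more expensive" headline figure.
print(f"$1599 -> ~{eu_price(1599):.0f} EUR at parity with 19% VAT")
```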
 
1600 for 4090. Sounds like a plan. Probably not day one until games that challenge my 3090 Ti come out.
Such games have already come out: Cyberpunk, etc., which bring the RTX 3090 Ti to its knees.
It only copes if you want to cheat and turn on DLSS.
Without DLSS and with raytracing at 4K, the RTX 3090 Ti falls apart, and not only in Cyberpunk.
 
Yeah, this is going to make 4K gaming much more playable now with all settings cranked. Don't think I'm gonna wait another 2 years for the 5xxx series.
Honestly, only the 4090 is worth upgrading to. I am perfectly satisfied keeping my 3080 Ti for another gen. I am also too lazy to mess with my WC loop, and I don't really want to upgrade my PSU, which I recently bought.
 
None yet, but the same could have been said last generation. A 2080 Ti will still play just about everything at 4K/60 fps.
I don't think the 3090 and 40 series will really be pushed until more next-gen games come out in a few years. PS4/Xbox is still holding things back as the target for game development. For now, only 4K gamers and high-refresh 1440p gamers might benefit. I think I'll stay with my trusty 3070 and skip this gen. Maybe wait for a cheap 3090 Ti.
 
Seeing Nvidia's margins at 70+% already, and then them increasing prices in a depressed market, I have lost all respect for them, much like Apple.
70%+ margins is pure greed, and as a consumer I will not support that.
 
Such games have already come out: Cyberpunk, etc., which bring the RTX 3090 Ti to its knees.
It only copes if you want to cheat and turn on DLSS.
Without DLSS and with raytracing at 4K, the RTX 3090 Ti falls apart, and not only in Cyberpunk.
Why wouldn’t you turn on DLSS? It looks inconsequentially different (in some cases better) on Cyberpunk and gives an enormous frame rate advantage. Even at launch I played Cyberpunk with DLSS and was impressed. I’m sure it’s only gotten better since.

Prices moving up aren't surprising. CEO Jensen Huang said like 5 years ago that he intended to move Nvidia prices higher successively. I bet these Ada (40x0) cards have a dip in sales figures for Nvidia relative to Ampere (30x0), much like the Turing cards (20x0) did in relation to Pascal (10x0). The crypto boom and bust waves seem to assure that.
 