RTX 4xxx / RX 7xxx speculation

I’ll be trying for day 1 with the RTX 4090.

I’ve been trying AMD for years (last card was the 6900 XT). They’re good, but just not what I prefer when it comes to features, performance, and drivers.

Everyone really thinks AMD is gonna be #1 this year?
No, because, as before, it will be arranged so that AMD is a little cheaper and a little weaker, so that both of them make money. All of this is agreed in advance, even though it looks like some kind of competition.
They could win this time, but somehow I don't believe it; they all cooperate with each other.
If by chance the RTX 4090 is threatened, then the 4090 Ti comes out.
I also believe that the weaker RTX 4070 and the others won't come out for a very long time.
The same is true for AMD: they will release two or three stronger cards and that's it.
And the old RTX 3000 and RX 6000 series will remain in production.
That's their best tactic; it's how I would do it to earn as much as possible.
 
[GIF: Sesame Street, "one of these things is not like the other"]
 
Everything about these cards just seems fifty types of ridiculous. Feels like they are just pushing the bar to see how gullible or desperate their customers are.

Can't wait for their stock to plummet amid "underwhelming sales" so I can buy up more of it ahead of the inevitable jump once they get their shit together.
 
So basically Nvidia has another first-gen Fermi on their hands. When the reviews come out we will see how they're only ~20% faster in raster perf/watt and can only make performance gains with raw power.

And I expect to see 600w AIB variants sooner rather than later, and definitely by the time AMD shows up in full force.
 
There was no doubt there. It's the same thing as ray tracing on GTX cards. It's a software limitation they set up themselves.
It's not really the same, as playing RTX games without RT hardware wouldn't be worthwhile. A better example would have been RTX Voice.
 
Man, ASUS really screwed up the Strix this time.
The Asus RTX 4090 looks pretty decent IMO. RGB can be turned off, it has a zero-fan mode on both BIOS options, and the cooler looks like it's capable of running the 4090 while keeping it pretty quiet without sacrificing too much performance. I was very pleased with the Asus 2080 Ti ROG Strix. Huge-ass cooler, but in my case it will fit pretty well. Looking forward to the reviews and also to the comparisons with the new AMD series in November. :)
 
It's not really the same, as playing RTX games without RT hardware wouldn't be worthwhile. A better example would have been RTX Voice.
I'm more referring to how they backtracked and later allowed GTX cards to turn on ray tracing for games and such. Sure, it wasn't great, but it was possible.
 
Everything about these cards just seems fifty types of ridiculous. Feels like they are just pushing the bar to see how gullible or desperate their customers are.

Because there are hundreds of thousands of "gullible" sheep just like this one I discovered this morning:

"As the title states, I am Nvidia's target customer. I have more money than sense and I have upgraded every gen since the 500 series. I used to SLI 560's, 780's, 780ti's (I know, I know,) 980ti's, before settling on a single 1080ti, 2080ti, and currently have a 3090. Have a few other random cards I've acquired over the years 770, 980, 2080S. All paperweights. I generally pass on my previous gen to a friend or family member to keep it in my circle and out of miner's hands. As (somewhat) selfless as that may sound, once I upgrade to the new and shiny, I have little regard for my old cards. Having the hardware lust I have developed over the years has me needing to have the best so I can overclock, benchmark, and buy new games that I marvel at for 20 minutes max before moving on to the next "AAA" title I see. I collect more than enjoy I suppose. In my defense, I did finish Elden Ring this year."
 
The 4090 seems worth it to me, but I’m in no rush and will wait and see what AMD offers.

Also, other than ray-traced Cyberpunk, I haven't seen anything that really needs a new card running at 4K; FreeSync/G-Sync and a 3080 seem to handle everything I play well. Although this year it was mainly just Elden Ring and Destiny 2.
 
I will LOL even harder at a 192-bit 4080 that's really just a rebranded 4070, but now more expensive.

Nvidia wants people to keep buying their 30xx cards; that's why they're launching such unremarkable 4080 cards!?

We've run the numbers and Nvidia's RTX 4080 cards don't add up

$900 for a 192-bit graphics card? Seriously?

https://www.pcgamer.com/nvidia-rtx-40-series-let-down/?utm_campaign=socialflow

In terms of its relationship with the RTX 4090, the new RTX 4080 12GB is more akin to the RTX 3060 Ti with its 4,864 shaders. Except the RTX 3060 Ti at least had a 256-bit memory bus. The RTX 4080 12GB only has a 192-bit bus. Oh, and the RTX 4080 is $900.
 
Ya, I'm surprised people are saying "I am fine with the pricing. It's worth it to me." This is downright ignorance, and Nvidia is really milking these people hard.

 
That's a bold statement to make when AMD hasn't even shown their hand yet.

I don't make the videos, I just share 'em. I think he's trying to influence AMD more than his audience.
 
I find the omission of DisplayPort 2.0 interesting. It implies we won't be getting very high refresh rate 4K displays any time soon. :(
 
I find the omission of DisplayPort 2.0 interesting. It implies we won't be getting very high refresh rate 4K displays any time soon. :(
And it's surprising: Intel's Arc cards have DP 2.0, and Zen 4 motherboards seem to have a DP 2.0 port, so it would be odd for RDNA 3 not to have it as well.

Maybe it is a sign of how little VR matters.
 
Will the 4080 16GB be enough of an upgrade if you already have a 3090 / 3080 Ti / 3080? I'm not so sure... In a sense the 4090 is the most interesting, but it's obviously very expensive.
 
Will the 4080 16GB be enough of an upgrade if you already have a 3090 / 3080 Ti / 3080? I'm not so sure... In a sense the 4090 is the most interesting, but it's obviously very expensive.
We'll have to wait for the reviews, and it depends on the kind of games you play, I feel. Coming from a 3090 Ti it could be quite a horizontal move when we look at this:
[NVIDIA slide: relative performance of the RTX 40 series vs. the RTX 3090 Ti]

* with DLSS frame generation enabled where applicable. Whether DLSS was enabled on the 3090 Ti isn't super clear, but Nvidia's own slide shows scenarios with only a modest gap between a 3090 Ti and a 4080 16GB. The 3090 Ti was a $2,000 card just three months or so ago, you can see massive gains in some contexts (we have yet to see what the experienced value is), and you get path tracing.

If you can sell that 3090 Ti to someone who prefers the 24 GB of VRAM, the price of your 4080 16GB, as a gamer, could be well worth it.

This generation may need a killer app (a game for which path tracing really changes the experience), or for that frame generation to be quite nice, to be worth it for 3090/3090 Ti owners looking below the 4090.
 
So basically Nvidia has another first-gen Fermi on their hands. When the reviews come out we will see how they're only ~20% faster in raster perf/watt and can only make performance gains with raw power.

And I expect to see 600w AIB variants sooner rather than later, and definitely by the time AMD shows up in full force.

In what world is it only 20%? AMD is going to get slaughtered just like always.
 
In what world is the 6000 series "slaughtered" outside of cherry-picked examples?

So AMD wasn't a choice for most people last gen, but this GPU, with 2.5x the transistors and higher clock speeds, ignoring the architectural changes AND DLSS, etc., is going to have trouble against AMD? I mean, come on…
 
So AMD wasn't a choice for most people last gen, but this GPU, with 2.5x the transistors and higher clock speeds, ignoring the architectural changes AND DLSS, etc., is going to have trouble against AMD? I mean, come on…
That's the 4090, which seems like a good card (if they actually sell at that price) relative to the previous generation.

The 4080s seem to use that higher transistor density to get a much smaller die, and thus a much smaller boost in transistor count than that (we'll have to see, but on TechPowerUp right now:
https://www.techpowerup.com/gpu-specs/geforce-rtx-4080-16-gb.c3888
Die size: 380 mm² (the 3080 was 628 mm²)

At a 2.7x density gain going from Samsung 8 to TSMC 4, that would be about a 1.6x transistor count, I think (if TechPowerUp is not completely wrong on that one).
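A quick back-of-the-envelope check of that estimate (a minimal sketch in Python; the ~2.7x density figure and TechPowerUp's die sizes are the only inputs, both taken from this post):

```python
# Rough transistor-count estimate for the 4080 16GB relative to the 3080.
# Inputs from the post above: ~2.7x transistor density going from
# Samsung 8 to TSMC 4, and TechPowerUp's listed die sizes.
density_gain = 2.7   # TSMC 4 vs Samsung 8, rough figure
die_4080 = 380.0     # mm², AD103 per TechPowerUp
die_3080 = 628.0     # mm², GA102

relative_transistors = density_gain * (die_4080 / die_3080)
print(f"~{relative_transistors:.2f}x the 3080's transistor count")  # ~1.63x
```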

It's hard for me to imagine AMD having the smallest chance of actually being slaughtered on performance per dollar. They will set prices, clocks, and power accordingly to be competitive, and their products will be more than good enough, and produced at a good enough cost, for that to be almost certain. They are talking about an above-50% gain in performance per watt, which certainly puts them up there.

Nothing indicates that either will clobber the other; margins are high enough on the Nvidia side to adjust to AMD if they start to eat into their sales.
 
My apologies, I thought the 6xxx series was next gen, not current gen. I was still ranting about next gen. You're right, the 6900 XT was competitive in rasterization.
Fair enough. I just wonder what prompts people to assume Nvidia will "slaughter" AMD in performance when more often than not that doesn't happen.

Just about the entire lineup of Navi has been competitive.
 
My apologies, I thought the 6xxx series was next gen, not current gen. I was still ranting about next gen. You're right, the 6900 XT was competitive in rasterization.
Do you doubt these claims from AMD?
[AMD slide: RDNA 3 features, including a >50% performance-per-watt uplift]

Given their recent track record, they get the full benefit of the doubt on not exaggerating, especially with the confidence of quoting ">50%" instead of "~50%".

Or do you think that a 50% boost (with some room to increase power if they need it) will not be enough?

If by "going to get slaughtered" you mean that they will again sell less than 30% of discrete video cards this fall and next year, then maybe yes. But in terms of competitive video cards for the price? What exactly are you basing that on?
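For what it's worth, here's roughly what that claim would imply for raw performance (a sketch with my own assumptions: the ~335 W figure is the 6950 XT's board power, while the ~375 W target and the flat 1.5x multiplier are hypothetical, since AMD only says ">50%"):

```python
# What a >50% perf/watt uplift could mean for raw performance.
# Assumptions (mine, not AMD's): RDNA 2 flagship at ~335 W, a hypothetical
# RDNA 3 flagship at ~375 W, and exactly 1.5x perf/watt.
perf_per_watt_gain = 1.5
power_old = 335.0  # W, ~6950 XT board power
power_new = 375.0  # W, hypothetical RDNA 3 flagship

raw_perf_gain = perf_per_watt_gain * (power_new / power_old)
print(f"~{raw_perf_gain:.2f}x raw performance")  # ~1.68x
```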
 
I find the omission of DisplayPort 2.0 interesting. It implies we won't be getting very high refresh rate 4K displays any time soon. :(
Yeah, I thought the same. I'm waiting for 16:9 4K 240 Hz screens to hit the market before getting a new rig, but without DP 2.0 I guess it's not going to happen any time soon...
 
So are we to expect a $100 increase every fucking year? Every generation goes up 200 bucks? That's been the trend since the 1000 series and it's getting ridiculous.

In 4 years we're going to be paying over $2k for a 6080 GPU at this rate. I'm surprised there's not more outrage about this here lol.
 
I'm not mad at the price, but I will likely skip this gen. $1,600 USD is $2,160 CAD; add 12% sales tax and tariffs and it would be almost $2,500 CAD. I skipped the 2000 series for this very reason. To me the 4080 16GB is way too cut down for the price to make sense. That's the nice part about getting the flagship (I have a 3090): it makes it easier to skip a gen when needed.
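Quick sanity check on that CAD math (a sketch; the ~1.35 exchange rate is implied by my own numbers above, and tariffs are left unquantified):

```python
# Checking the USD -> CAD figures above.
# Assumption: ~1.35 USD->CAD (implied by $1,600 -> $2,160); tariffs aren't
# quantified in the post, so only the 12% sales tax is applied here.
usd_price = 1600.0
exchange_rate = 1.35
sales_tax = 0.12

cad_before_tax = usd_price * exchange_rate        # 2160.0
cad_after_tax = cad_before_tax * (1 + sales_tax)  # ~2419.2
print(f"${cad_after_tax:,.0f} CAD before tariffs")  # tariffs push it toward $2,500
```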
 