RTX 3xxx performance speculation

Yeah, it sure is. Not looking forward to the swamp-ass and back-sweat I'll end up with while I squish around on my vinyl gamer chair when my office heats up to 90 degrees.
The main issue with upward ejection of hot air is for Samsung B-die RAM users, as those kits (when overclocked) have almost zero tolerance for extra heat.
 
If Nvidia is recommending the 3990X for 3090 SLI, then that tells you something when it comes to PCIe 4.0. AMD systems are really top notch now; do not hesitate to go AMD, especially with Zen 3 right around the corner.

Where does it show Nvidia recommending the 3990X for RTX 3090 SLI? Sorry, I just can't find it anywhere.

From a gaming perspective, I'm pretty sure a much cheaper 3960X Threadripper would perform the same as, if not better than, the 3990X (higher base clock, higher boost clock, less heat overall, and I'm pretty sure too many cores can sometimes negatively affect games, from what I saw in the benches). They both have 64 lanes of PCIe 4.0.

I was thinking of settling for x8/x8 PCIe 4.0 by going with the $480 3900XT (12 cores, and better clock speeds than any Threadripper). It's a big pill to swallow, though, going to a $1,350 3960X for x16/x16 PCIe 4.0 plus a more expensive motherboard. I will gladly drop $1,500-$3,000 on GPUs, but I can't say the same for motherboards and processors.
 
The main issue with upward ejection of hot air is for Samsung B-die RAM users, as those kits (when overclocked) have almost zero tolerance for extra heat.
Guess it's good these aren't built with Samsung RAM, then. Samsung does not make GDDR6X.
 
I would caution 2080 Ti owners against selling their cards too cheap right now because of the frenzy. Look past the Eurogamer NVIDIA-dictated benchmarks and take a look at NVIDIA's own slide:
[Attached image: ampere.png — NVIDIA's performance comparison slide]


Edit: I had confused the x-axis for percent rather than frames, my mistake.

Look at Borderlands 3: you're seeing about a 30% increase between the 2080 and the 3080, not the massive 60-70% touted in other games. The other two have RTX on, which just shows Ampere is much better with RTX but doesn't really speak to raster performance. Also, wait on AMD; I think everyone will be pleasantly surprised. I'm one of the people who always used to joke about "wait on AMD", but I honestly think this time around they have something worthwhile. Also worth noting: the CUDA core counts for these cards were much different in some leaked marketing material before it was pulled. There are just a lot of unknowns with Ampere right now, so it's better to wait and hold.
 
I would caution 2080 Ti owners against selling their cards too cheap right now because of the frenzy. Look past the Eurogamer NVIDIA-dictated benchmarks and take a look at NVIDIA's own slide:
[Attached image: ampere.png]

Look at Borderlands 3: you're seeing about a 30% increase between the 2080 and the 3080, not the massive 60-70% touted in other games. The other two have RTX on, which just shows Ampere is much better with RTX but doesn't really speak to raster performance. Also, wait on AMD; I think everyone will be pleasantly surprised. I'm one of the people who always used to joke about "wait on AMD", but I honestly think this time around they have something worthwhile. Also worth noting: the CUDA core counts for these cards were much different in some leaked marketing material before it was pulled. There are just a lot of unknowns with Ampere right now, so it's better to wait and hold.

Hmmmm, looks like B3 went from about 35 FPS to 62 FPS. That's more than a 30% increase in framerate. Do the math again.
 
Hmmmm, looks like B3 went from about 35 FPS to 62 FPS. That's more than a 30% increase in framerate. Do the math again.

OK, you're right, I was thinking % rather than FPS; a friend just pointed that out to me in Discord before you replied. That's still not as massive as the gains we saw in other titles, so I'd still be wary of the massive performance claims. Also, one thing I'm concerned about is the claimed CUDA core counts. There was AIB marketing material for the 3090 that showed around 5,700 CUDA cores, and now we're seeing over 10,000. I also noticed NVIDIA claimed a 1.9x performance-per-watt increase between Turing and Ampere, so did the CUDA cores really increase to 10k, or are they using fuzzy math?
 
OK, you're right, I was thinking % rather than FPS; a friend just pointed that out to me in Discord before you replied. That's still not as massive as the gains we saw in other titles, so I'd still be wary of the massive performance claims. Also, one thing I'm concerned about is the claimed CUDA core counts. There was AIB marketing material for the 3090 that showed around 5,700 CUDA cores, and now we're seeing over 10,000. I also noticed NVIDIA claimed a 1.9x performance-per-watt increase between Turing and Ampere, so did the CUDA cores really increase to 10k, or are they using fuzzy math?

77% isn't massive?

The AIB marketing material is incorrect. There are twice the number of FP32 ALUs in each SM.
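
For what it's worth, the headline number is just both FP32 datapaths per SM being counted. A rough sanity check (illustrative figures only: the commonly reported 82 SMs for the 3090, 64 FP32 ALUs per Turing-style SM vs. 128 per Ampere-style SM):

```python
# Rough sanity check of the "doubled" CUDA core count.
# Assumed/illustrative figures: 82 SMs on the 3090, 64 FP32 ALUs per
# Turing-style SM vs. 128 per Ampere-style SM.
sms = 82
old_style_count = sms * 64   # counting one FP32 datapath per SM  -> 5,248
new_style_count = sms * 128  # counting both FP32 datapaths       -> 10,496
print(old_style_count, new_style_count)
```

That roughly explains the gap between the early AIB material and the 10,496 NVIDIA is quoting now; whether both datapaths can actually stay busy in games is a separate question.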
 
Hmmmm, looks like B3 went from about 35 FPS to 62 FPS. That's more than a 30% increase in framerate. Do the math again.

Yeah, I measured the pixels of each bar: 304/170 comes out to about a 79% improvement. So a wee bit more than 30%. ;)
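
Same math in one line, for anyone who wants to plug in their own measurements (the pixel heights are just what I eyeballed off the slide, so treat them as approximate):

```python
# Percentage-gain check from eyeballed bar heights (approximate pixels
# measured off the slide) and from the FPS figures quoted earlier.
def pct_gain(new: float, old: float) -> float:
    return (new / old - 1.0) * 100.0

print(pct_gain(304, 170))  # pixel heights        -> ~78.8%
print(pct_gain(62, 35))    # quoted FPS estimates -> ~77.1%
```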

Not coincidentally, that is about the same as what Digital Foundry measured in their testing.

The real problem, as with any first-party testing, is that these are the games NVIDIA wants us to see. NVIDIA only doubled up on FP32 where INT32 is not being used, so FP32-heavy games will benefit more, while games that use more INT32 won't benefit as much. We need a wider mix of games, not a cherry-picked set.
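
To make that concrete, here's a toy per-SM throughput model (my own back-of-envelope sketch, assuming 64 FP32 + 64 INT32 lanes per Turing SM and 64 FP32 + 64 FP32-or-INT32 lanes per Ampere SM; real warp scheduling is messier than this):

```python
# Toy per-SM throughput model for the FP32/INT32 point above.
# Assumptions (sketch only): Turing SM = 64 FP32 lanes + 64 INT32 lanes;
# Ampere SM = 64 dedicated FP32 lanes + 64 lanes that do FP32 or INT32.
def sm_throughput(fp_fraction: float, shared_lanes: bool) -> float:
    """Max instructions/clock for a stream that is fp_fraction FP32."""
    int_fraction = 1.0 - fp_fraction
    if shared_lanes:  # Ampere-style: INT32 has to share the second datapath
        if int_fraction == 0.0:
            return 128.0
        return min(128.0, 64.0 / int_fraction)
    return 64.0 / max(fp_fraction, int_fraction)  # Turing-style: fixed split

for fp in (1.0, 0.8, 0.74, 0.5):
    gain = sm_throughput(fp, True) / sm_throughput(fp, False)
    print(f"{fp:.0%} FP32 -> ~{gain:.2f}x per-SM gain")
```

Under those assumptions a pure-FP32 stream gets the full 2x per SM, while a 50/50 FP32/INT32 mix gets essentially nothing, which is exactly why the choice of benchmark titles matters so much.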
 
77% isn't massive?

The AIB marketing material is incorrect. There are twice the number of FP32 ALUs in each SM.

Well, I only took a cursory look at the bars, since there aren't fixed values, and figured around 55%, versus the RTX ones, which look like close to 70% or more.
 
It's been 2 months since I played a video game; I'm beginning to have withdrawal symptoms.
I'm going to end up replaying BotW on WiiU before this is over.
 
My god, it's running a good 4K 120 FPS... with the 3090 being what, 20-25% faster, that is a good sign that 4K 120 Hz gaming can become a real thing.
 
Is there any info on when I can preorder and pay Nvidia for a card using PayPal or something? ;D
 
Free is free, but of all the upcoming RTX games, I was least looking forward to that.

As someone who's generally been very hard on Ubisoft lately, I'm actually pleased by this choice. The other two upcoming games they featured are CP2077 and BLOPS. CP2077 I've had pre-ordered since the day pre-orders went up, so I don't need another copy of that. The BLOPS games suck; I've hated every one of them. Zero interest in owning that. Legion looks OK. I'll probably be disappointed, because Ubisoft, but it still looks OK. Free is the perfect price for it.
 
As someone who's generally been very hard on Ubisoft lately, I'm actually pleased by this choice. The other two upcoming games they featured are CP2077 and BLOPS. CP2077 I've had pre-ordered since the day pre-orders went up, so I don't need another copy of that. The BLOPS games suck; I've hated every one of them. Zero interest in owning that. Legion looks OK. I'll probably be disappointed, because Ubisoft, but it still looks OK. Free is the perfect price for it.

Atomic Heart is looking pretty freaking good too
 