RTX 3xxx performance speculation

Do you guys know if there are any physical differences between the TUF and TUF OC other than GPU speeds? Also, correct me if I'm wrong: the BIOS switch is to change the fan profiles? It doesn't change GPU clocks, right? As usual I will use Afterburner to control voltages and fan profiles. I will probably try to undervolt it if possible to reduce heat/power consumption.

The regular TUF looks physically the same to me as what I saw in reviews of the OC.

Card is too big though... But hey, who needs fans :)

 
NVIDIA RTX 3090 Gaming Benchmarks Leaked: Just 10% Faster than the RTX 3080
https://www.hardwaretimes.com/nvidi...arks-leaked-just-10-faster-than-the-rtx-3080/

Bunch of benchmarks in the article.
Wonder if this is a new data set or the same data set that Videocardz reported off of. If it's a separate set then it adds a little validity. Best to still wait for official benchmarks, but the leaks so far have painted a worse picture than I imagined for the 3090's price-to-performance over the 3080, gaming-wise.
 
Wonder if this is a new data set or the same data set that Videocardz reported off of. If it's a separate set then it adds a little validity. Best to still wait for official benchmarks, but the leaks so far have painted a worse picture than I imagined for the 3090's price-to-performance over the 3080, gaming-wise.
Definitely waiting for more benches, especially from places like here and Hardwareunboxed. I've seen places that show differences of up to 19%.

If 10% is the best case scenario then I'll probably just wait on the 20GB 3080s. Somewhat glad I held on to my 2080Ti.
 
450-650W SFX vs 3080



Am I the only one who thinks it's wrong for him to say a 600-watt PSU is OK? It doesn't seem like he accounts for anything extra like fans, RGB lighting, hard drives and such for power draw. Even in the video, he says that the power draw from the wall hit over 630 watts in some cases, yet he says it is OK? Yes, it's true that not everyone will be using an OC'd 10900K, but still, I would be super leery.
 
Am I the only one who thinks it's wrong for him to say a 600-watt PSU is OK? It doesn't seem like he accounts for anything extra like fans, RGB lighting, hard drives and such for power draw. Even in the video, he says that the power draw from the wall hit over 630 watts in some cases, yet he says it is OK? Yes, it's true that not everyone will be using an OC'd 10900K, but still, I would be super leery.
Indeed they are operating over spec. Units like the SF600 are capable of over 600W but I wonder how sustainable that is...
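To put rough numbers on the PSU worry, here's a quick budget sketch. Every component wattage below is an illustrative assumption (stock 3080 power limit, a heavily OC'd 10900K peak, a guess for peripherals), not a measurement from the video:

```python
# Rough DC-side power budget sketch. All component figures are
# illustrative assumptions, not measurements.
psu_rating_w = 600          # e.g. an SF600-class SFX unit
efficiency = 0.90           # ~90% at this load for a Gold unit (assumed)

components_w = {
    "RTX 3080": 320,                 # stock power limit
    "10900K (OC, peak)": 250,        # assumed worst-case CPU draw
    "fans/RGB/drives/board": 60,     # assumed
}

dc_load = sum(components_w.values())      # what the PSU must deliver
wall_draw = dc_load / efficiency          # what a wall meter would show
print(f"DC load: {dc_load} W of {psu_rating_w} W rated")
print(f"Estimated wall draw: {wall_draw:.0f} W")
```

One thing this makes clear: a PSU's rating applies to DC output, so "630 W at the wall" at ~90% efficiency is roughly 567 W of actual PSU load, under the 600 W rating. That gap is probably why opinions differ on whether it's fine.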
 
I'm waiting on the 3090 benchmarks to decide if I want one or a 20GB 3080.

5-10% over 3080 for twice the price lmfao

from vidcardz leaked 3090 review:

Benchmark | 3090 | 3080 | % gain over 3080
3DMark Time Spy Extreme | 9948 | 9000 | +10.5%
3DMark Port Royal | 12827 | 11981 | +7.1%
Metro Exodus RTX/DLSS OFF | 54.4 | 48.8 | +10.2%
Metro Exodus RTX/DLSS ON | 74.5 | 67.6 | +10.2%
Rainbow Six Siege | 275 | 260 | +5.8%
Horizon Zero Dawn | 84 | 76 | +10.5%
Forza Horizon | 156 | 149 | +4.7%
Far Cry | 107 | 99 | +8.1%
Assassin's Creed Odyssey | 71 | 65 | +9.2%
Shadow of the Tomb Raider RTX/DLSS Off | 91 | 83 | +9.6%
Shadow of the Tomb Raider RTX/DLSS On | 111 | 102 | +8.8%
Borderlands 3 | 67.6 | 61.3 | +10.3%
Death Stranding DLSS/RTX ON | 175 | 164 | +6.7%
Death Stranding DLSS/RTX OFF | 116 | 104 | +11.5%
Control DLSS/RTX ON | 71 | 65 | +9.2%
Control DLSS/RTX OFF | 62 | 57 | +8.8%
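If anyone wants to sanity-check the leaked percentage column, it's just the ratio of the two scores. A quick sketch using a few rows from the table (scores copied from the leak):

```python
# Recompute the "% gain over 3080" column from the leaked scores.
# (3090 score, 3080 score) pairs taken from the table above.
leaked = {
    "3DMark Time Spy Extreme": (9948, 9000),
    "Rainbow Six Siege": (275, 260),
    "Borderlands 3": (67.6, 61.3),
}

for name, (score_3090, score_3080) in leaked.items():
    gain = (score_3090 / score_3080 - 1) * 100
    print(f"{name}: +{gain:.1f}%")
```

These rows reproduce the leak's +10.5%, +5.8% and +10.3%; any row that doesn't line up exactly is likely just rounding in the leak.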
 
5-10% over 3080 for twice the price lmfao

from vidcardz leaked 3090 review:

Benchmark | 3090 | 3080 | % gain over 3080
3DMark Time Spy Extreme | 9948 | 9000 | +10.5%
3DMark Port Royal | 12827 | 11981 | +7.1%
Metro Exodus RTX/DLSS OFF | 54.4 | 48.8 | +10.2%
Metro Exodus RTX/DLSS ON | 74.5 | 67.6 | +10.2%
Rainbow Six Siege | 275 | 260 | +5.8%
Horizon Zero Dawn | 84 | 76 | +10.5%
Forza Horizon | 156 | 149 | +4.7%
Far Cry | 107 | 99 | +8.1%
Assassin's Creed Odyssey | 71 | 65 | +9.2%
Shadow of the Tomb Raider RTX/DLSS Off | 91 | 83 | +9.6%
Shadow of the Tomb Raider RTX/DLSS On | 111 | 102 | +8.8%
Borderlands 3 | 67.6 | 61.3 | +10.3%
Death Stranding DLSS/RTX ON | 175 | 164 | +6.7%
Death Stranding DLSS/RTX OFF | 116 | 104 | +11.5%
Control DLSS/RTX ON | 71 | 65 | +9.2%
Control DLSS/RTX OFF | 62 | 57 | +8.8%
I wonder what the GPU boost clock is on the 3090 - if it's as aggressive as the 3080 appears to be (hence the low OC headroom), or if there's a lot more to be gained from OCs on them.
 
Weird, wonder if it is hitting a power limit. Based on specs the 3090 should be around 20% faster.
 
Weird, wonder if it is hitting a power limit. Based on specs the 3090 should be around 20% faster.
Differences between specs don't always translate 1:1 into real-world performance; it will vary with every application. I imagine a ~10% average increase in gaming performance and closer to a ~20% average increase in productivity performance (where not specifically impacted by VRAM) might be likely. But we'll find out soon.
 
So I mean you are telling me you are paying $800 for 10% performance and 12gb more memory?

Must be some expensive ass memory
 
So I mean you are telling me you are paying $800 for 10% performance and 12gb more memory?

Must be some expensive ass memory
Hell no. In my opinion, if you’re only gaming then the 3090 (based on current leaks) is a terrible purchase over the 3080 from a price to performance perspective.
 
Hell no. In my opinion, if you’re only gaming then the 3090 (based on current leaks) is a terrible purchase over the 3080 from a price to performance perspective.

On the bright side, if you're a gamer and you did end up buying the 3090, a gaming card, only to game on, at least you can brag about being able to do science stuff
 
I really wonder if the 3090 is 50% faster than the RTX 2080 Ti
If it's 10% faster than a 3080, then at 4K it should be >50% faster than a 2080 Ti across a large dataset of games. At lower resolutions it is harder to utilize Ampere's extra FP32 throughput; it pulls away at higher resolutions.
 
Hell no. In my opinion, if you’re only gaming then the 3090 (based on current leaks) is a terrible purchase over the 3080 from a price to performance perspective.

This is true, but that's always been the case for halo/titan-class cards. I doubt anyone buying a card like the 3090 to use primarily for gaming is overly concerned with price-performance numbers. The leaked benchmarks put me a bit on the fence. I don't really care how it does at 4K, I'm more interested in how Ampere compares to the 2080 ti at 1440p ultrawide.
 
I wonder what the GPU boost clock is on the 3090 - if it's as aggressive as the 3080 appears to be (hence the low OC headroom), or if there's a lot more to be gained from OCs on them.

Honestly, Nvidia is already pushing these past the chips' power-efficiency sweet spot; I doubt you can get much, if anything, past what they've already pushed them to. Unless you think a 2-5% overclock is substantial.
 
So I mean you are telling me you are paying $800 for 10% performance and 12gb more memory?

Must be some expensive ass memory
It’s 14GB more memory, not 12GB.

Also, those numbers from that one leaked set of benchmarks don't seem to match the other leaked benchmarks.

I’m wondering if there were some driver issues in that leak, if genuine, or some other environmental variable (temp, power, etc.).

Also, if I recall correctly, the boost clock on stock 3090 was around 10-15MHz lower than the 3080, but we have seen the 3080 boost much higher than its stated boost clock. I am curious how much that 3090 was boosting at and if it was much lower than the 3080.
 
If it's 10% faster than a 3080, then at 4K it should be >50% faster than a 2080 Ti across a large dataset of games. At lower resolutions it is harder to utilize Ampere's extra FP32 throughput; it pulls away at higher resolutions.
You mean the 3080 is 40% faster than the 2080 Ti??
 
You mean the 3080 is 40% faster than the 2080 Ti??
No, it is around 38% at 4K. A 10% gain over 3080 performance is much greater when compared against a 2080 Ti, since 1% of the 3080's performance is a bigger absolute step than 1% of a 2080 Ti's.
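The compounding is easy to see with a back-of-the-envelope calculation; the ~38% and ~10% figures are the ones quoted in the posts above:

```python
# Chain the relative speedups: (3080 vs 2080 Ti) x (3090 vs 3080).
speedup_3080_vs_2080ti = 1.38   # ~38% faster at 4K, per the post above
speedup_3090_vs_3080 = 1.10     # ~10% faster, per the leaks

speedup_3090_vs_2080ti = speedup_3080_vs_2080ti * speedup_3090_vs_3080
print(f"3090 vs 2080 Ti: +{(speedup_3090_vs_2080ti - 1) * 100:.1f}%")
```

So 1.38 × 1.10 ≈ 1.52, i.e. a bit over 50% faster than a 2080 Ti, which is why "10% over a 3080" is worth more than 10% of a 2080 Ti.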
 
So if swapping from a 2080 Ti to a 3080 doesn't make sense, and from a 3080 to a 3090 doesn't make sense, then swapping from a 2080 Ti to a 3090 doesn't make sense?
 
Every buyer has their own performance per dollar threshold which needs to be met.

Personally, I balked at a 2080ti due to cost. But if i had one - I'd probably sit tight right now. Everything runs pretty well, eh? YMMV.
 
So if swapping from a 2080 Ti to a 3080 doesn't make sense, and from a 3080 to a 3090 doesn't make sense, then swapping from a 2080 Ti to a 3090 doesn't make sense?

It makes sense if you want the best and you have the money. Otherwise wait for the 3080 20GB, or if 25% more perf doesn't matter to you, wait for the 7nm Hopper cards.
 
These GPUs are so fast we need faster CPUs.
We used to be almost entirely GPU bound, with tiny FPS improvements after overclocking CPUs. But these modern GPUs gain FPS with CPU overclocks, not just in the silly low-res conditions. And this is before the new consoles start making their big impact on PC gaming requirements.

It's really going to be interesting once those future games of the next gen consoles become the norm.

But ya know, it sure would be nice if AMD manages meaningful CPU gains in these workloads. I would be ecstatic if they finally top Intel in gaming performance with their soon-to-be-launched next-gen Zen.
 
These GPUs are so fast we need faster CPUs.
We used to be almost entirely GPU bound, with tiny FPS improvements after overclocking CPUs. But these modern GPUs gain FPS with CPU overclocks, not just in the silly low-res conditions. And this is before the new consoles start making their big impact on PC gaming requirements.

It's really going to be interesting once those future games of the next gen consoles become the norm.

But ya know, it sure would be nice if AMD manages meaningful CPU gains in these workloads. I would be ecstatic if they finally top Intel in gaming performance with their soon-to-be-launched next-gen Zen.
Without a target framerate, we don't know whether you might become CPU bound.
Just because a GPU isn't maxed out at a particular res doesn't automatically mean it matters.
 
10%? Someone is trying to lower expectations. It's going to be much higher than 10%. You don't create an entirely new power solution for a 10% increase.
 
Every buyer has their own performance per dollar threshold which needs to be met.

Personally, I balked at a 2080ti due to cost. But if i had one - I'd probably sit tight right now. Everything runs pretty well, eh? YMMV.
I have a 2080 Ti with a 5820K @ 4.3GHz and game at 4K/60Hz. More than a few titles can't run maxed out at that resolution/refresh rate. I'm waiting on the next generation of CPUs before I upgrade there, but I was considering a 3090 as a way to play some of the newer games maxed out at 4K. If there isn't much benefit to be had with a 3090, I'll probably upgrade my CPU in the meantime and wait on a 20GB 3080.
 
Without a target framerate, we don't know whether you might become CPU bound.
Just because a GPU isn't maxed out at a particular res doesn't automatically mean it matters.

If you're speaking strictly about the personal impact in particular cases, I agree. The user's experience while playing a game is a different matter, albeit a mostly insignificant one.

I was talking about how it relates in other terms: specifically benchmarks and FPS of cards that are fast enough that they end up wasting cycles waiting on the CPU between frames. Most people picture being bottlenecked as performance hitting a wall, but that is the extreme end, being totally and completely bottlenecked. There is a lot of space between no CPU bottleneck and a total CPU bottleneck.

At the intended resolutions of the high-end Ampere cards, the CPU is not a total brick-wall bottleneck. I just find it interesting that the 3080 is already being held back by the CPU in some games. Even if it is ever so slightly, any card more powerful will be even more impacted if run on the same CPU. This will of course show up in scaling: higher specs will not scale linearly. Diminishing gains.

I don't think many people are going to be thinking about mild to moderate CPU bottleneck as being a factor when the 3090 launch reviews come out. Yet there will be results impacted by such.

Then I started thinking about how the new wave of next gen console games may be even harder on CPUs. I am just hopeful AMD makes awesome gains in gaming performance with their next gen Ryzen. We need CPUs to advance and stay as far ahead as possible so that GPUs don't get stuck or held back.
 
As I said I will wait for the 3090 benchmarks from a wide range of sources before I decide.

That said, even if I wanted one I doubt I'll be able to get one during the first wave anyways.
 