TaintedSquirrel ([H]F Junkie; joined Aug 5, 2013; 11,837 messages)
XC3 teardown
Do you guys know if there are any physical differences between the TUF and the TUF OC other than GPU speeds? Also, correct me if I'm wrong: the BIOS switch only changes the fan profiles, right? It doesn't change GPU clocks? As usual I will use Afterburner to control voltages and fan profiles. I will probably try to undervolt it if possible to reduce heat and power consumption.
> NVIDIA RTX 3090 Gaming Benchmarks Leaked: Just 10% Faster than the RTX 3080
> https://www.hardwaretimes.com/nvidi...arks-leaked-just-10-faster-than-the-rtx-3080/
> Bunch of benchmarks in the article.

Wonder if this is a new data set or the same data set that Videocardz reported off of. If it's a separate set, then there is a little more validity to it. Best to still wait for official benchmarks, but the leaks so far have painted a worse picture than I imagined for the 3090 over the 3080, price-to-performance-wise in gaming.
> Wonder if this is a new data set or the same data set that Videocardz reported off of. If it's a separate set, then there is a little more validity to it. Best to still wait for official benchmarks, but the leaks so far have painted a worse picture than I imagined for the 3090 over the 3080, price-to-performance-wise in gaming.

Definitely waiting for more benches, especially from places like here and Hardware Unboxed. I've seen places that show differences of up to 19%.
450-650W SFX vs 3080
> Am I the only one who thinks it's wrong for him to say a 600 W PSU is OK? It doesn't seem like he accounts for anything extra like fans, RGB lighting, hard drives and such in the power draw. Even in the video he says the draw from the wall hit over 630 W in some cases, yet he says it's OK? True, not everyone will be running an overclocked 10900K, but I'd still be super leery.

Indeed, they are operating over spec. Units like the SF600 are capable of over 600 W, but I wonder how sustainable that is...
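One wrinkle in the wall-draw argument above: the PSU's rating is for DC output, while a wall meter reads AC input, so the two aren't directly comparable. A rough sketch of the conversion, assuming an illustrative (not measured) efficiency figure:

```python
# Estimate the DC-side load implied by a wall-power reading.
# The 630 W wall figure is from the video; the 90% efficiency is an
# assumed ballpark for a Gold-rated unit near full load, not a spec.
def dc_load_from_wall(wall_watts: float, efficiency: float = 0.90) -> float:
    """DC power delivered to components = AC wall draw * PSU efficiency."""
    return wall_watts * efficiency

print(round(dc_load_from_wall(630, 0.90)))  # ~567 W on the DC side
```

So a 630 W wall reading is closer to ~567 W of actual load on a 600 W unit, which is still uncomfortably tight but not quite over spec.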
I'm waiting on the 3090 benchmarks to decide if I want one or a 20GB 3080.
| Benchmark | RTX 3090 | RTX 3080 | Gain |
|---|---|---|---|
| 3DMark Time Spy Extreme | 9948 | 9000 | +10.5% |
| 3DMark Port Royal | 12827 | 11981 | +7.1% |
| Metro Exodus RTX/DLSS OFF | 54.4 | 48.8 | +10.2% |
| Metro Exodus RTX/DLSS ON | 74.5 | 67.6 | +10.2% |
| Rainbow Six Siege | 275 | 260 | +5.8% |
| Horizon Zero Dawn | 84 | 76 | +10.5% |
| Forza Horizon | 156 | 149 | +4.7% |
| Far Cry | 107 | 99 | +8.1% |
| Assassin's Creed Odyssey | 71 | 65 | +9.2% |
| Shadow of the Tomb Raider RTX/DLSS Off | 91 | 83 | +9.6% |
| Shadow of the Tomb Raider RTX/DLSS On | 111 | 102 | +8.8% |
| Borderlands 3 | 67.6 | 61.3 | +10.3% |
| Death Stranding DLSS/RTX ON | 175 | 164 | +6.7% |
| Death Stranding DLSS/RTX OFF | 116 | 104 | +11.5% |
| Control DLSS/RTX ON | 71 | 65 | +9.2% |
| Control DLSS/RTX OFF | 62 | 57 | +8.8% |
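The "Gain" column in the leaked table is just the ratio of the two scores. A quick sketch for spot-checking a few rows (values copied from the table above):

```python
# Percentage gain of one score over another: (new / old - 1) * 100.
def pct_gain(new: float, old: float) -> float:
    return (new / old - 1) * 100

# A few rows from the leaked table (3090 score, 3080 score).
rows = {
    "3DMark Time Spy Extreme": (9948, 9000),
    "3DMark Port Royal": (12827, 11981),
    "Borderlands 3": (67.6, 61.3),
}
for name, (r3090, r3080) in rows.items():
    print(f"{name}: +{pct_gain(r3090, r3080):.1f}%")
```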
> 5-10% over 3080 for twice the price lmfao

I wonder what the GPU boost clock is on the 3090 - if it's as aggressive as the 3080 appears to be (hence the low OC headroom), or if there's a lot more to be gained from OCs on them.
From the Videocardz leaked 3090 review (the table above).
> Weird, wonder if it is hitting a power limit. Based on specs the 3090 should be around 20% faster.

Differences between specs don't always translate 1:1 into real-world performance; it will vary in every application. I imagine a ~10% average increase in gaming performance, and closer to a ~20% average increase in productivity performance (where it's not specifically VRAM-bound), might be likely. But we'll find out soon.
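The "~20% faster on specs" figure comes from a naive cores-times-clock model. A sketch using the published CUDA core counts and Founders Edition boost clocks (the model deliberately ignores memory bandwidth and power limits, which is exactly why real gains land lower):

```python
# Naive shader-throughput model: relative FP32 rate ~ cores * boost clock.
cores_3090, cores_3080 = 10496, 8704   # published CUDA core counts
boost_3090, boost_3080 = 1695, 1710    # published FE boost clocks (MHz)

tflops_ratio = (cores_3090 * boost_3090) / (cores_3080 * boost_3080)
print(f"theoretical uplift: +{(tflops_ratio - 1) * 100:.1f}%")  # ~+19.5%
```

The leaked ~10% gaming numbers versus this ~20% theoretical figure are consistent with the 3090 being held back by something other than shader count, e.g. its power limit.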
> So I mean you are telling me you are paying $800 for 10% performance and 12gb more memory? Must be some expensive ass memory

Hell no. In my opinion, if you're only gaming, then the 3090 (based on current leaks) is a terrible purchase over the 3080 from a price-to-performance perspective.
> Hell no. In my opinion, if you're only gaming, then the 3090 (based on current leaks) is a terrible purchase over the 3080 from a price-to-performance perspective.

On the bright side, if you're a gamer and you did end up buying a gaming card, the 3090, only to game on, at least you can brag about being able to do science stuff.
> I really wonder if the 3090 is 50% faster than the RTX 2080 Ti.

If it's 10% faster than a 3080, then at 4K it should be >50% faster than a 2080 Ti in general, across a large dataset of games. At lower resolutions it is harder to utilize Ampere's extra FP32 capability; it pulls away at higher resolutions.
> So I mean you are telling me you are paying $800 for 10% performance and 12gb more memory? Must be some expensive ass memory

It's 14GB more memory, not 12GB.
> If it's 10% faster than a 3080, then at 4K it should be >50% faster than a 2080 Ti in general, across a large dataset of games. At lower resolutions it is harder to utilize Ampere's extra FP32 capability; it pulls away at higher resolutions.

You mean the 3080 is 40% faster than the 2080 Ti??
> Is pron viewing science stuff? Asking for a friend.

You can upscale and interpolate faster on a 3090 vs. a 3080.
> You mean the 3080 is 40% faster than the 2080 Ti??

No, it is around 38% at 4K. A 10% gain over the 3080 is a bigger step when compared to a 2080 Ti, since 1% of the 3080's performance is greater than 1% of a 2080 Ti's.
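The point about percentages is just that relative gains compound multiplicatively. Worked out with the thread's own numbers (~38% for 3080 over 2080 Ti at 4K, ~10% for 3090 over 3080 per the leaks):

```python
# Relative gains compound: (1 + a) * (1 + b) - 1, not a + b.
gain_3080_vs_2080ti = 0.38  # ~38% at 4K, per the post above
gain_3090_vs_3080 = 0.10    # ~10%, per the leaks

combined = (1 + gain_3080_vs_2080ti) * (1 + gain_3090_vs_3080) - 1
print(f"3090 vs 2080 Ti: +{combined * 100:.0f}%")  # ~+52%
```

Which is how a "mere" 10% over the 3080 still lands the 3090 a bit past 50% over the 2080 Ti.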
So if swapping from a 2080 Ti to a 3080 doesn't make sense, and from a 3080 to a 3090 doesn't make sense, then swapping from a 2080 Ti to a 3090 doesn't make sense?
> I really wonder if the 3090 is 50% faster than the RTX 2080 Ti.

*probably in ray tracing titles
> These GPUs are so fast, we need faster CPUs. We used to be almost entirely GPU bound, with tiny FPS improvements after overclocking CPUs. But these modern GPUs gain FPS with CPU overclocks, not just in the silly low-res conditions. And this is before the new consoles start making their big impact on PC gaming requirements.
> It's really going to be interesting once those future games of the next-gen consoles become the norm.
> But you know, it sure would be nice if AMD manages meaningful CPU gains in these workloads. I would be ecstatic if they finally top Intel in gaming performance with their soon-to-be-launched next-gen Zen.

Without a target framerate, we don't know whether you might become CPU bound.
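The CPU-vs-GPU back-and-forth above boils down to a simple bottleneck model: the frame rate you actually see is capped by whichever side is slower. A crude sketch (all FPS numbers invented for illustration):

```python
# Crude bottleneck model: delivered FPS = min(CPU-side cap, GPU-side cap).
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

# With a GPU this fast, the CPU cap starts to matter even at high resolutions.
print(effective_fps(cpu_fps_cap=120, gpu_fps_cap=160))  # CPU-bound: 120
print(effective_fps(cpu_fps_cap=120, gpu_fps_cap=90))   # GPU-bound: 90
```

This is why a faster GPU can show zero gain in a CPU-bound scenario, and why CPU overclocks now move the needle on these cards.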
> Every buyer has their own performance-per-dollar threshold which needs to be met. Personally, I balked at a 2080 Ti due to cost. But if I had one, I'd probably sit tight right now. Everything runs pretty well, eh? YMMV.

I have a 2080 Ti with a 5820K @ 4.3 GHz and game at 4K/60 Hz. More than a few titles can't be maxed out at that resolution/refresh rate. I'm waiting on the next generation of CPUs before I upgrade there, but I was considering a 3090 as a way to play some of the newer games maxed out at 4K. If there isn't much benefit to be had with a 3090, I'll probably upgrade my CPU in the meantime and wait on a 20GB 3080.
Without a target framerate, we don't know whether you might become CPU bound. Just because a GPU isn't maxed out at a particular resolution doesn't automatically mean it matters.
> I didn't see if this got posted yet? Asus ROG Strix clocking at 1935 MHz in OC mode:
> https://www.notebookcheck.net/The-A...igher-than-the-Founders-Edition.494790.0.html
> Probably gonna be an $850+ card though.

I ordered one on launch day, but none have shipped as far as I know.
Is pron viewing science stuff? Asking for a friend.
> Weird, wonder if it is hitting a power limit. Based on specs the 3090 should be around 20% faster.

I think it would be in a title that was 100% optimized for the architecture. That may come someday.