RTX 3xxx performance speculation

After seeing the reviews, I honestly do not see the 3090 being more than maybe 15-20% faster than the 3080. More memory is not going to make 4K gaming faster.

Not sure $1500 is worth it then, IMO.
 
After watching the Gamers Nexus review, what the heck were they smoking at the reveal? Looks like a great time to buy used 2080s.
 
After seeing the reviews, I honestly do not see the 3090 being more than maybe 15-20% faster than the 3080. More memory is not going to make 4K gaming faster.

Not sure $1500 is worth it then, IMO.

After reading through the reviews, I am personally more inclined to go for the 3080 now instead of the 3090. With no 3090 reviews in sight, I risk missing out on a 3080 if I wait for 3090 reviews. And if the 3090 reviews are lackluster, who knows how long I will have to wait to see 3080s in stock again...
 
After reading through the reviews, I am personally more inclined to go for the 3080 now instead of the 3090. With no 3090 reviews in sight, I risk missing out on a 3080 if I wait for 3090 reviews. And if the 3090 reviews are lackluster, who knows how long I will have to wait to see 3080s in stock again...

I think 3090 reviews are on the 24th? But I'll be honest: for $800 more, for maybe 20% more performance tops... do you think that is worth it?
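To put rough numbers on that, here's a quick value sketch. The prices are the announced Founders Edition MSRPs rounded to whole hundreds, and the 20% uplift is this thread's guess, not a benchmark:

```python
# Marginal cost of the 3090's extra performance over the 3080.
# Inputs are assumptions: rounded FE launch prices and the ~20%
# uplift speculated in this thread, not measured results.
msrp_3080 = 700    # USD
msrp_3090 = 1500   # USD
perf_gain = 0.20   # assumed 3090-over-3080 uplift, "tops"

extra_cost = msrp_3090 - msrp_3080
print(f"${extra_cost / (perf_gain * 100):.0f} per extra percent of performance")

# Relative performance per dollar (3080 = 1.0x):
print(f"3090 perf/$ vs 3080: {(1 + perf_gain) / (msrp_3090 / msrp_3080):.2f}x")
```

By that math the 3090 would deliver a bit over half the performance per dollar of the 3080, which is why the "worth it" question keeps coming up.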
 
Hey guys, will my 9700K temps rise if I buy an RTX 3080 or 3090? I currently have a Waterforce 2080 Ti Aorus.
 
Hey guys, will my 9700K temps rise if I buy an RTX 3080 or 3090? I currently have a Waterforce 2080 Ti Aorus.

Of course they will rise if you start dumping heat into the case. They won't rise a lot if you have a good CPU HSF, but you'll notice a difference.
 
Of course they will rise if you start dumping heat into the case. They won't rise a lot if you have a good CPU HSF, but you'll notice a difference.
I doubt CPU temps will increase unless you are comparing against a 2000-series blower card.
 
After watching the Gamers Nexus review, what the heck were they smoking at the reveal? Looks like a great time to buy used 2080s.

The time for that was probably before the reviews came out. Basically, from the GN and HUB reviews it looks like it's decent at 4K but disappointing at 1440p:
[4K relative performance chart]

The interesting thing is that the results were closer to expectation at 4K. People were expecting a ~75% increase over the 2080 at 4K, given the Digital Foundry video, so getting 68% isn't that far off. The real disappointment is at 1440p, where it's only 47% faster than the 2080, which works out to less than 25% faster than the 2080 Ti.

Interesting thought: NVIDIA was saying the 3070 equals the 2080 Ti or slightly better. What if this is only true at 4K but not at 1440p? The people who fire-saled their 2080 Tis for around $500 might be a bit disappointed unless they are 4K gamers. There is some speculation that the poor lower-resolution scaling is due to the architecture, whose shader partitions issue either FP32+INT32 or FP32+FP32. Since the INT load stays roughly the same between 1440p and 4K, the higher resolution has a more favorable FP-to-INT ratio, so more of the shaders can run in the double-FP mode. That may explain why the 3080 does better at 4K, relative to 1440p, than CPU bottlenecking alone would suggest.
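To put rough numbers on that FP-to-INT argument, here is a minimal sketch of the issue-slot math. The instruction mixes are invented for illustration; the only structural assumption is the one described above, that each issue slot can pair FP32+FP32 or FP32+INT32:

```python
# Toy model of Ampere's dual datapaths: each issue slot runs either
# FP32+FP32 or FP32+INT32, so INT work displaces one FP32 lane.

def relative_fp32_throughput(int_fraction: float) -> float:
    """FP32 throughput relative to the all-FP32 peak (1.0).

    int_fraction: share of issue slots carrying INT32 work.
    Throughput falls linearly from 1.0 (no INT) to 0.5 (all INT).
    """
    return 1.0 - 0.5 * int_fraction

# Hypothetical mixes: absolute INT work (addressing, control flow)
# stays similar across resolutions, so at 4K it is a smaller
# fraction of a larger total workload.
for resolution, int_frac in [("1440p", 0.35), ("4K", 0.25)]:
    print(f"{resolution}: ~{relative_fp32_throughput(int_frac):.0%} of peak FP32")
```

Under those made-up mixes, 4K keeps a noticeably larger share of peak FP32 throughput, which points the same way as the benchmarks above without needing a CPU bottleneck to explain the whole gap.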
 
After watching the Gamers Nexus review, what the heck were they smoking at the reveal? Looks like a great time to buy used 2080s.
It's not all that bad this time... at least performance at a given price point actually increased, whereas in the previous 1000 => 2000 series transition the $800 price point didn't offer any real performance improvement (1080 Ti to 2080). It's progress...
Of course, I see little reason for 2080 Ti owners to upgrade unless they have very specific needs (namely HDMI 2.1, for those who need to drive a 4K 120Hz display).
 
I think 3090 reviews are on the 24th? But I'll be honest: for $800 more, for maybe 20% more performance tops... do you think that is worth it?

For purely a 20% increase, definitely not. Memory is the issue for me: I will be going from 11GB (1080 Ti) to 10GB. Not sure how future games will utilize VRAM. MS Flight Simulator can eat through it, and Watch Dogs' just-released ultra specs call for 11GB recommended (though that could just be because it is targeting the 2080 Ti). In any case, DLSS should offset the need for memory at 4K, but then how many games will really use DLSS? I remember being memory-limited on my 970 SLI setup, and it's not something I want to experience again.
 
For purely a 20% increase, definitely not. Memory is the issue for me: I will be going from 11GB (1080 Ti) to 10GB. Not sure how future games will utilize VRAM. MS Flight Simulator can eat through it, and Watch Dogs' just-released ultra specs call for 11GB recommended (though that could just be because it is targeting the 2080 Ti). In any case, DLSS should offset the need for memory at 4K, but then how many games will really use DLSS? I remember being memory-limited on my 970 SLI setup, and it's not something I want to experience again.

Well, Nvidia screwed up the memory on the 970: they got sued and had to settle. If you watched or read most of the reviews, memory does not seem to be an issue for 4K gaming.
 
Man, lots of games are still sub-60fps maxed out @ 4K with the 3080. We still aren't there yet (Control, Metro, Red Dead, Gears Tactics, etc.).
 
For purely a 20% increase, definitely not. Memory is the issue for me: I will be going from 11GB (1080 Ti) to 10GB. Not sure how future games will utilize VRAM. MS Flight Simulator can eat through it, and Watch Dogs' just-released ultra specs call for 11GB recommended (though that could just be because it is targeting the 2080 Ti). In any case, DLSS should offset the need for memory at 4K, but then how many games will really use DLSS? I remember being memory-limited on my 970 SLI setup, and it's not something I want to experience again.

There is a difference between memory allocation and actual usage.
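A quick way to see that difference for yourself, using ordinary system RAM as a stand-in (the mechanics differ from VRAM, but the allocation-vs-usage distinction is the same): a process can reserve far more memory than it ever touches, and tools often report the reservation.

```python
import numpy as np
import resource  # Unix-only; this is an illustrative sketch

# Reserve a 4 GiB buffer. On Linux the zeroed pages are mapped
# lazily, so the allocation costs almost nothing until touched.
buf = np.zeros(4 * 1024**3, dtype=np.uint8)

# Actually use only the first 512 MiB of it.
buf[: 512 * 1024**2] = 1

# ru_maxrss is reported in KiB on Linux.
resident_mib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024
print(f"allocated 4096 MiB, resident only ~{resident_mib:.0f} MiB")
```

Game VRAM counters behave analogously: an engine may grab a large pool up front, so the number a monitoring overlay shows is often the allocation, not what the frame actually needs.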
 
There is a difference between memory allocation and actual usage.
Yes, I think the 10GB is fine.

The 3070, however, is not enough for 4K, which is why the people dumping their 2080 Tis for a 3070 are going to regret it, looking at the 3080 benchmarks.
 
Man, lots of games are still sub-60fps maxed out @ 4K with the 3080. We still aren't there yet (Control, Metro, Red Dead, Gears Tactics, etc.).

We don't ever get there. It's never happened, and never will.

TheFPSReview at least shows that the RTX 3080 is capable of fully utilizing PCIe 4.0, with double the throughput of what the RTX 20 series can do. But as the TechPowerUp article shows, there isn't anything out there YET that benefits from it.

https://www.thefpsreview.com/2020/09/16/nvidia-geforce-rtx-3080-founders-edition-review/4/

I think we will see a massive difference when we finally get some games with the DirectStorage thing going.
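The doubled throughput lines up with the raw link math, since Gen4 simply doubles the per-lane signaling rate. A quick sketch of the theoretical x16 numbers (128b/130b encoding included, protocol overhead ignored):

```python
# Theoretical PCIe x16 bandwidth. Gen3 signals at 8 GT/s per lane,
# Gen4 at 16 GT/s; both use 128b/130b line encoding.

def pcie_bandwidth_gbs(gt_per_sec: float, lanes: int = 16) -> float:
    """One-direction bandwidth in GB/s, before protocol overhead."""
    useful_bits_per_sec = gt_per_sec * 1e9 * lanes * (128 / 130)
    return useful_bits_per_sec / 8 / 1e9

print(f"PCIe 3.0 x16: {pcie_bandwidth_gbs(8):.1f} GB/s")   # ~15.8
print(f"PCIe 4.0 x16: {pcie_bandwidth_gbs(16):.1f} GB/s")  # ~31.5
```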
 
So if you were building a new system, and not upgrading (so no consideration of a change from some current performance level), what's the best performance/$ now that we've seen what the 3080 is capable of?

I have my new rig built, and I was really hoping to get a 3080, but given the likely limited availability, if I can't get one tomorrow, I don't really want to wait 3 months to finish putting my rig together.

What would you look at buying, based on the benchmarks, if a 3080 wasn't in the cards and you were trying to get the best bang for your buck?

For me, the focus is 1440p at a high refresh rate.

Appreciate any thoughts (though buying used is not likely; I've been burned before).
 
So if you were building a new system, and not upgrading (so no consideration of a change from some current performance level), what's the best performance/$ now that we've seen what the 3080 is capable of?

I have my new rig built, and I was really hoping to get a 3080, but given the likely limited availability, if I can't get one tomorrow, I don't really want to wait 3 months to finish putting my rig together.

What would you look at buying, based on the benchmarks, if a 3080 wasn't in the cards and you were trying to get the best bang for your buck?

For me, the focus is 1440p at a high refresh rate.

Appreciate any thoughts (though buying used is not likely; I've been burned before).

I think the obvious answer to that would be a 2080 Ti.
 
So if you were building a new system, and not upgrading (so no consideration of a change from some current performance level), what's the best performance/$ now that we've seen what the 3080 is capable of?

I have my new rig built, and I was really hoping to get a 3080, but given the likely limited availability, if I can't get one tomorrow, I don't really want to wait 3 months to finish putting my rig together.

What would you look at buying, based on the benchmarks, if a 3080 wasn't in the cards and you were trying to get the best bang for your buck?

For me, the focus is 1440p at a high refresh rate.

Appreciate any thoughts (though buying used is not likely; I've been burned before).

The best 1440p card for high refresh rates will be the 3080. The issue is being CPU-bottlenecked in some situations. But is the 3080 worth the price for 1440p? That is something you will need to answer for yourself.

A 2080 Ti would be more than enough for 1440p gaming as well.
 
The reviews coming in seem to share the same conclusion (I have read through around 4 so far): a noticeable upgrade if gaming at 4K. Lucky for me, I game at 4K. Also, going from a GTX 1080 to an RTX 3080 should be quite a large jump in performance at 4K.

While I was expecting some of the benchmarks to be better than what is being posted, they are still impressive. I wonder how much of an improvement we will see in the coming months in those same games via driver optimizations. I was really hoping to see RDR2 maintain 60fps or higher at 4K at or near max settings (though I have seen that just dropping a few settings makes this achievable).

Now the bigger question for me, which I haven't found an answer to yet: what kind of performance improvement can we expect in VR?

I would really like to be able to play my VR games at 120/144Hz with high/max settings and decent scaling and maintain frame rate, especially Half-Life: Alyx.
 
Perf per watt is up 25% over the 2080 Ti? That is on both a new architecture and a new node. Does anyone have the numbers from previous double jumps like this?
 
Not impressed with the 3080's BF5 numbers: 4K minimum frames are about 60fps. That doesn't bode well for BF6.

4K 120fps is still a generation away, it seems.

 
Not impressed with the 3080's BF5 numbers: 4K minimum frames are about 60fps. That doesn't bode well for BF6.

4K 120fps is still a generation away, it seems.
If BF6 uses the same engine (optimized), then it could be good. But 4K 120fps is still not here; I agree with that.
 
I think I will buy an RTX 3080, and when the RTX 3090 launches I will sell the 3080 and buy the 3090.
 
Not impressed with the 3080's BF5 numbers: 4K minimum frames are about 60fps. That doesn't bode well for BF6.

4K 120fps is still a generation away, it seems.

Wait... someone is still playing BF5? I heard the player numbers are lower than BF1 and BF4 (by a lot!).
 
I think 3090 reviews are on the 24th? But I'll be honest: for $800 more, for maybe 20% more performance tops... do you think that is worth it?

Unlikely. But here's your example of having competition vs. not having competition: at the 3080 class and below, Nvidia is expecting competition from RDNA2 cards. At the 3090 class, not so much...
 
Perf per watt is up 25% over the 2080 Ti? That is on both a new architecture and a new node. Does anyone have the numbers from previous double jumps like this?

TPU said 17% over the 2080 Ti at 4K. I really don't know what power usage means anymore, thanks to Nvidia. Hahaha. I mean, we had the 2080 with similar performance to the 1080 Ti at like 220W. Now the 3080, 25% faster than the 2080 Ti, is "more efficient" at 320W. I guess it makes some sense with resolution, but when I see power efficiency listed as a pro on TPU I shake my head. If this were AMD besting the 2080 Ti by 25% at 4K with 320W, they would be getting destroyed with power-hog sentiment. I am likely picking up a 3080 before miners scalp it. I guess I can always sell it if RDNA2 is competitive and uses less power. I'll give the red team a shot this time, after all my bitching at them about power before, lol.
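For anyone wanting to sanity-check those efficiency claims, perf-per-watt is just relative performance divided by relative power. Here is a sketch using the figures quoted in this thread; the 2080 Ti's 250W FE rating is my assumed baseline, and measured in-game power will differ by review:

```python
# Perf-per-watt ratio between two cards: >1.0x means more efficient.
def perf_per_watt_ratio(perf_gain: float, power_new: float, power_old: float) -> float:
    return (1.0 + perf_gain) / (power_new / power_old)

POWER_3080 = 320.0    # W, rated board power
POWER_2080TI = 250.0  # W, FE rating (assumed baseline)

# TPU's 17% figure and this thread's 25% figure for the 4K uplift:
for gain in (0.17, 0.25):
    ratio = perf_per_watt_ratio(gain, POWER_3080, POWER_2080TI)
    print(f"+{gain:.0%} perf at 320W vs 250W: {ratio:.2f}x perf/W")
```

By that crude math the 3080 is roughly flat on perf-per-watt against the 2080 Ti, which fits the "no miracles" take below; review charts that show a gain are presumably working from measured in-game power rather than rated board power.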
 
The time for that was probably before the reviews came out. Basically, from the GN and HUB reviews it looks like it's decent at 4K but disappointing at 1440p:
[4K relative performance chart]

The interesting thing is that the results were closer to expectation at 4K. People were expecting a ~75% increase over the 2080 at 4K, given the Digital Foundry video, so getting 68% isn't that far off. The real disappointment is at 1440p, where it's only 47% faster than the 2080, which works out to less than 25% faster than the 2080 Ti.

Interesting thought: NVIDIA was saying the 3070 equals the 2080 Ti or slightly better. What if this is only true at 4K but not at 1440p? The people who fire-saled their 2080 Tis for around $500 might be a bit disappointed unless they are 4K gamers. There is some speculation that the poor lower-resolution scaling is due to the architecture, whose shader partitions issue either FP32+INT32 or FP32+FP32. Since the INT load stays roughly the same between 1440p and 4K, the higher resolution has a more favorable FP-to-INT ratio, so more of the shaders can run in the double-FP mode. That may explain why the 3080 does better at 4K, relative to 1440p, than CPU bottlenecking alone would suggest.

Doesn't matter: if it is faster, people will buy it and it will be out of stock for months. The top-end cards should top out at $300-400, but too many people have too much money (or too little common sense; will they ever retire?), so Nvidia can sell lots and lots of $600-800 cards and lots of $1200-2000 cards. I do want HDMI 2.1 for the LG 77C9, but I don't care too much about running 4K or a 20fps bump in a certain game.

If AMD can give me similar raster performance and HDMI 2.1 for less…
 
TPU said 17% over the 2080 Ti at 4K. I really don't know what power usage means anymore, thanks to Nvidia. Hahaha. I mean, we had the 2080 with similar performance to the 1080 Ti at like 220W. Now the 3080, 25% faster than the 2080 Ti, is "more efficient" at 320W. I guess it makes some sense with resolution, but when I see power efficiency listed as a pro on TPU I shake my head. If this were AMD besting the 2080 Ti by 25% at 4K with 320W, they would be getting destroyed with power-hog sentiment. I am likely picking up a 3080 before miners scalp it. I guess I can always sell it if RDNA2 is competitive and uses less power. I'll give the red team a shot this time, after all my bitching at them about power before, lol.

I think it's clear at least Ampere (on 8nm) is performing no miracles with power efficiency.

Exactly what I intend to do. I'll pick up a 3080 and play with it for a month or two; then, if Navi 21 (a) performs in the realm of the 3080, (b) is a bit cheaper, and (c) has less power draw, I'll sell and move over to AMD. Or who knows: if the retailers are hiking prices tomorrow, I may just sit back, have a pint, and wait for this all to blow over.
 