RTX 3xxx performance speculation

the 5700XT was never intended to compete in the high end...Big Navi is supposed to be the first AMD GPU to do that since the ATI days...

Never said it was. It's about extrapolating what the technology in the 5700XT, and more recently RDNA2 in the XBSX, can do, and leaving AMD little room to breathe.
 
God damn. I'm the first to say when I'm wrong, and I was sure Ampere's performance was going to be decent, but not outstanding. It looks like I may be pleasantly wrong. If 3080 performance is in the ballpark of what nV is advertising, we're looking at a 680-level performance increase.

If Big Navi competes with 3080 and is priced right, it's going to be a good generation for PC gamers.
 
Sorry, I wasn't clear enough. I did not mean people complaining on forums; there are many ways for people to complain about a product. I was thinking of the vast number of people with Pascal cards who saw the new RTX range and thought: meh, barely any DXR games at this point, no rush. There's a lot of us in that category. Now that DXR is actually picking up steam, the value offer is much more interesting, as long as they don't raise prices like crazy. Nvidia needs consumers using DXR after all, as it becomes a mainstream thing, so if it's only available on the expensive range of GPUs, it's counterproductive for them to raise prices this time around.

Pascal was a performance "anomaly", NVIDIA's best.
The performance jump from Maxwell -> Pascal was not the norm.
The norm was actually more in line with Pascal -> Turing.
But people have short memories (even shorter on forums), so I predict now that people will whine over Ampere -> "Hopper"...because it won't be as big as Turing -> Ampere (their biggest generational performance jump, according to NVIDIA).
Just wait and see ;)
 
God damn. I'm the first to say when I'm wrong, and I was sure Ampere's performance was going to be decent, but not outstanding. It looks like I may be pleasantly wrong. If 3080 performance is in the ballpark of what nV is advertising, we're looking at a 680-level performance increase.

If Big Navi competes with 3080 and is priced right, it's going to be a good generation for PC gamers.
Indeed, feels like a great year for GPU.
 
Pascal was a performance "anomaly", NVIDIA's best.

Oh absolutely. What they did on the same node was insane. That doesn’t change the fact that a sizable part of the market felt Turing wasn’t necessarily worth it. Jensen basically acknowledged as much when he explained the Pascal gains, and then literally told those of us in that group “it’s now safe to upgrade”. That is the most humble I’ve ever seen him, showing his hope that this time we will find value in their RTX cards, despite the higher-than-Pascal prices. Personally, this time I’m ready, but I’ll get a 3060 at the most expensive.
 
It's cool that a lot of people are getting the 3090, but I just want to make sure that everyone knows there will be a 3080 Ti in six months with 95% of the performance and 60% of the price.

We always seem to have some surprised and even angry reactions when this happens. Just making sure everyone understands this.
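To put numbers on it (using the hypothetical 95%-performance / 60%-price figures from the post above, not real prices), the implied value jump is easy to compute:

```python
# Hypothetical figures from the post above: a future 3080 Ti with
# 95% of the 3090's performance at 60% of its price.
perf_ratio = 0.95
price_ratio = 0.60

# Performance per dollar relative to the 3090.
value_gain = perf_ratio / price_ratio
print(f"{value_gain:.2f}x performance per dollar")  # -> 1.58x
```

In other words, a card like that would deliver roughly 58% more performance per dollar than the 3090, which is why the reactions tend to be heated when it happens.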
 
I think the 3090 is a nonstarter, because by the time all that raw power can be utilized, there will be a cheaper card with even more RTX/ray-tracing power.

The improvements over last generation are impressive, but the actual RTX performance still is not, in my opinion. They sort of lost me when they showed their marble demo at 1080p 60 and not 4K.

Also, they are on 8nm after all.

I’m waiting for a 3080 Ti and better 4K OLED displays.
 
If Kopite is right, 2020 might be a god-awful year overall, but it could just turn into an amazing one for GPU nerds.

I'm already impressed with the DF footage of the 3080, can't wait to see real world performance on all fronts.
 
I think the 3090 is a nonstarter, because by the time all that raw power can be utilized, there will be a cheaper card with even more RTX/ray-tracing power.

The improvements over last generation are impressive, but the actual RTX performance still is not, in my opinion. They sort of lost me when they showed their marble demo at 1080p 60 and not 4K.

Also, they are on 8nm after all.

I’m waiting for a 3080 Ti and better 4K OLED displays.

I will be the first to say I think all that power will be utilized with Cyberpunk at 4K 120 Hz gaming, imo.
 
It's cool that a lot of people are getting the 3090, but I just want to make sure that everyone knows there will be a 3080 Ti in six months with 95% of the performance and 60% of the price.

We always seem to have some surprised and even angry reactions when this happens. Just making sure everyone understands this.

Agree. The 3090 has buyer’s remorse written all over it; that price is just irresponsible for a product that will be obsolete within a year.
 
I think the 3090 is a nonstarter, because by the time all that raw power can be utilized, there will be a cheaper card with even more RTX/ray-tracing power.

The improvements over last generation are impressive, but the actual RTX performance still is not, in my opinion. They sort of lost me when they showed their marble demo at 1080p 60 and not 4K.

Also, they are on 8nm after all.

I’m waiting for a 3080 Ti and better 4K OLED displays.
That marble demo was at 1440p and locked to 30 FPS, so I'm not sure where you got 1080p 60 from.
 
Well he is obviously going to be right this time.

Since "Very Strong" could mean anything.

Did you read the reply?

Here "GA104 cannot beat Big Navi."

I mean, I'm excited, how can you not be? If true, wow, and if not, still wow.
 
It's cool that a lot of people are getting the 3090, but I just want to make sure that everyone knows there will be a 3080 Ti in six months with 95% of the performance and 60% of the price.

We always seem to have some surprised and even angry reactions when this happens. Just making sure everyone understands this.

That's exactly what's making me hesitant to jump on a 3080 on day one...the inevitable release of a 3080 Ti, especially if it's around the $800-850 mark.
 
I will be the first to say I think all that power will be utilized with Cyberpunk at 4K 120 Hz gaming, imo.

It took what, two generations of Nvidia cards before 4K60 was viable with everything turned on in Witcher 3 with HairWorks? The Titan V was the first card to do it. The RTX 3090 is not going to be relevant at 4K with Cyberpunk, especially given that in the early build the 2080 Ti wasn't even able to hold 60 fps while doing DLSS-upsampled 1080p (internal res must have been 800p or 900p). The 3090 might be a viable 1440p60 Cyberpunk card if we're lucky.
 
Did you read the reply?

Here "GA104 cannot beat Big Navi."

I mean, I'm excited, how can you not be? If true, wow, and if not, still wow.

which translates to Big Navi being, at the very least, better than the 3070...
 
It took what, two generations of Nvidia cards before 4K60 was viable with everything turned on in Witcher 3 with HairWorks? The Titan V was the first card to do it. The RTX 3090 is not going to be relevant at 4K with Cyberpunk, especially given that in the early build the 2080 Ti wasn't even able to hold 60 fps while doing DLSS-upsampled 1080p (internal res must have been 800p or 900p). The 3090 might be a viable 1440p60 Cyberpunk card if we're lucky.

To be honest, this is all just guessing, because (1) we don't even know the performance of the 3090, and (2) we don't know the performance of the game: a very early sample couldn't run at 4K on a 2080 Ti, with an unoptimized game and unoptimized drivers, and that card is now being replaced by a $500 card.

Too many ifs to come to any conclusion, if you ask me.
 
Agree. The 3090 has buyer’s remorse written all over but that price is just irresponsible for a product that will be obsolete within a year.

What do you think is going to make it obsolete? The 3080 is already the cut-down version of the 3090, and looks only about 20% slower.

The only thing to wait for, if you must have more memory, is the AIB 20GB 3080 cards, but they are going to perform like the 3080, and you will be paying more money to get more VRAM, not more performance.
 
It took what, two generations of Nvidia cards before 4K60 was viable with everything turned on in Witcher 3 with HairWorks? The Titan V was the first card to do it. The RTX 3090 is not going to be relevant at 4K with Cyberpunk, especially given that in the early build the 2080 Ti wasn't even able to hold 60 fps while doing DLSS-upsampled 1080p (internal res must have been 800p or 900p). The 3090 might be a viable 1440p60 Cyberpunk card.
Are you telling me $1,500 for 1440p 60 fps is viable? That is not viable. For that money we should be getting 4K 120 fps for a few years.

I am thinking I will build a new system around the next Battlefield and GTA titles and stick with my 1080 Ti for as long as I can, until the cards are blowing the doors off of AAA games at 4K.
Lots of gimmicks here. Minecraft, Quake and Fortnite are our killer apps for this gen.

I think the 3080 Ti will be the sweet spot. I always get burned when I jump on a new architecture. It is never quite enough.
 
Do you all think what Jensen said about the RTX 3090 being the new TITAN is a good thing, or another money gouge?

-Good: You now no longer need to pay $2,500~$3,000 for the top-end "gaming card" (yes, technically the TITANs are the best gaming GPUs even if they aren't marketed as such, as they still use GeForce drivers and slightly outperformed the 2080 Ti/1080 Ti).
-Bad: The MSRP is now $300 higher on the top-end "gaming card" vs last gen, which very well may be the cost of getting the full-fat chip (aka the TITAN die).
 
I do wonder why everyone is talking about performance while I've barely heard a peep about RTX IO. Maybe I'm the only one, but this was the most groundbreaking feature in today's presentation for me. This is literally what will bring PS5/XBSX-like SSD performance to the PC. So far, I have heard absolutely no one else talking about how to bring this to PC, and it'll be quite important if we want this pretty big improvement to translate from consoles to PC (it is THE one area where consoles would otherwise behave drastically better than PC).

 
I do wonder why everyone is talking about performance while I've barely heard a peep about RTX IO. Maybe I'm the only one, but this was the most groundbreaking feature in today's presentation for me. This is literally what will bring PS5/XBSX-like SSD performance to the PC. So far, I have heard absolutely no one else talking about how to bring this to PC, and it'll be quite important if we want this pretty big improvement to translate from consoles to PC (it is THE one area where consoles would otherwise behave drastically better than PC).

For anything like that, never bank on the presentation; see if it actually materializes in games. Until then it's just smoke and mirrors.
 
-Bad: The MSRP is now $300 higher on the top-end "gaming card" vs last gen, which very well may be the cost of getting the full-fat chip (aka the TITAN die).

The people that buy the highest-end, top-tier TITAN type of cards usually don't care about the cost...so $1,200 vs $1,500 is pretty much the same to them. They're willing to spend whatever it takes, which is what makes them a niche market.
 
Are you telling me $1,500 for 1440p 60 fps is viable? That is not viable. For that money we should be getting 4K 120 fps for a few years.

I am thinking I will build a new system around the next Battlefield and GTA titles and stick with my 1080 Ti for as long as I can, until the cards are blowing the doors off of AAA games at 4K.
Lots of gimmicks here. Minecraft, Quake and Fortnite are our killer apps for this gen.

I think the 3080 Ti will be the sweet spot. I always get burned when I jump on a new architecture. It is never quite enough.

1. Your "entitlement" is just that...entitlement.
2. The MARKET dictates prices, as it always has. NVIDIA got 20% of the total market onto Turing (adoption) and shipped 80% of all GPUs last quarter, even with people whining over Turing's cost, so I predict Ampere will sell even better and your "entitlement" will do nothing.
 
Are you telling me $1,500 for 1440p 60 fps is viable? That is not viable. For that money we should be getting 4K 120 fps for a few years.

I am thinking I will build a new system around the next Battlefield and GTA titles and stick with my 1080 Ti for as long as I can, until the cards are blowing the doors off of AAA games at 4K.
Lots of gimmicks here. Minecraft, Quake and Fortnite are our killer apps for this gen.

I think the 3080 Ti will be the sweet spot. I always get burned when I jump on a new architecture. It is never quite enough.
A good counter to your new hardware never quite being fast enough is to buy games on sale a year or two after they're released.

Or you could get a badass G-Sync monitor and not GAF about your FPS.
 
For anything like that, never bank on the presentation, see if it actually materializes in games, until then its just smoke and mirrors.
Uhm... it's based on Microsoft DirectStorage, which is what all XBSX games will use to move data around the system. So it's not an "if"; it is happening the moment the XBSX is released. Nvidia has so far been the first to announce that they're accelerating it through their GPUs, so the same functionality will be usable on PCs. This suggests that AMD will probably also support it, but so far we don't know that.
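To see why offloading decompression to the GPU matters, here's a quick back-of-the-envelope model in Python. All the numbers are my own assumptions for illustration, not figures from the presentation or from any vendor:

```python
# Rough model of SSD -> GPU asset streaming throughput.
# All numbers below are illustrative assumptions, not vendor figures.

def effective_bandwidth(raw_gbps: float, compression_ratio: float) -> float:
    """Uncompressed data delivered per second when reading a
    compressed stream at the drive's raw read speed."""
    return raw_gbps * compression_ratio

def cpu_cores_needed(effective_gbps: float, decompress_gbps_per_core: float) -> float:
    """Cores a CPU-side decompressor would need to keep pace."""
    return effective_gbps / decompress_gbps_per_core

raw = 7.0       # assumed PCIe 4.0 NVMe raw read, GB/s
ratio = 2.0     # assumed average 2:1 asset compression
per_core = 1.0  # assumed CPU decompression throughput per core, GB/s

eff = effective_bandwidth(raw, ratio)
cores = cpu_cores_needed(eff, per_core)
print(f"effective: {eff:.1f} GB/s, CPU cores to keep up: {cores:.0f}")
```

Under those assumptions the drive effectively delivers 14 GB/s of assets, and decompressing that on the CPU would saturate around 14 cores; doing it on the GPU (or on dedicated hardware, as the consoles do) is what frees the CPU from that work.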
 
DLSS getting an update too:
https://news.developer.nvidia.com/new-features-to-dlss-coming-to-nvidia-rtx-unreal-engine-4-branch/

  • New ultra performance mode for 8K gaming. Deliver 8K gaming on GeForce RTX 3090 with DLSS.
  • Improved VR support. Maintaining the 2880×1600 resolution of top-end VR head mounted displays while delivering the recommended 90 FPS has been made easier with DLSS.
  • Dynamic resolution support. The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution.
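That dynamic-resolution bullet boils down to a simple contract, which this toy Python sketch illustrates (it has nothing to do with the real DLSS API; it's just the idea): the render resolution is nudged per frame to hit a frame-time budget, while the upscaler always emits the same fixed output size.

```python
# Toy dynamic-resolution loop: input size varies per frame,
# output size stays fixed. Illustrative only; the real DLSS API differs.

OUTPUT = (3840, 2160)   # fixed display resolution
BUDGET_MS = 16.7        # target frame time (~60 fps)

def pick_render_scale(last_frame_ms: float, scale: float) -> float:
    """Nudge the render scale down when over budget, up when under."""
    if last_frame_ms > BUDGET_MS:
        scale *= 0.95
    else:
        scale *= 1.02
    return min(1.0, max(0.5, scale))  # clamp to 50%-100%

def upscale(frame, size):
    """Stand-in for the upscale call: always returns `size`."""
    return {"pixels": frame, "size": size}

scale = 1.0
for frame_ms in [22.0, 19.0, 15.0, 14.0]:   # fake frame timings
    scale = pick_render_scale(frame_ms, scale)
    render = (int(OUTPUT[0] * scale), int(OUTPUT[1] * scale))
    out = upscale({"size": render}, OUTPUT)
    assert out["size"] == OUTPUT            # output dimensions never change
```

The point of the new support is exactly that last assertion: the input buffer can change dimensions every frame, but the display always receives a fixed-size output.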
 
It took what, two generations of Nvidia cards before 4K60 was viable with everything turned on in Witcher 3 with HairWorks? The Titan V was the first card to do it. The RTX 3090 is not going to be relevant at 4K with Cyberpunk, especially given that in the early build the 2080 Ti wasn't even able to hold 60 fps while doing DLSS-upsampled 1080p (internal res must have been 800p or 900p). The 3090 might be a viable 1440p60 Cyberpunk card if we're lucky.

If you are referring to the press conference, the 2080 Ti was capped to 60 fps, so we are not really sure what was required. Also, the new cards are far more efficient at ray tracing.
 