RTX 3xxx performance speculation

I received some flak recently for merely linking a table made in a topic here which looked at VRAM 'usage' in modern games. The counter-argument was that true usage (not merely what is allocated/reserved) is lower, and thus 10 GB of VRAM is more than enough even for the most demanding titles.

That said, no one I've seen making this counterpoint has explained how true usage can be measured, apart from one game that supposedly distinguishes it: Flight Sim 2020. Even Steve from Gamers Nexus mentioned that GPU-Z's reported VRAM 'usage' isn't representative of the real usage, which makes it confusing tbh.
It will be "10 GB is plenty" and "Allocated doesn't mean used" until the exact moment the 3080 starts stuttering in games. Then it will be "16 GB was more future-proof, we warned you!"

Sometime in the next 2 years I will end up linking back to this post. Hello, future self.
 
I don't know if an app exists for VRAM usage, but for RAM usage we have Windows tools (Resource Monitor):
[screenshot: Resource Monitor's memory tab]

(this machine is mostly idle now but this tab shows some interesting things when you start firing up a bunch of apps)
 
It will be "10 GB is plenty" and "Allocated doesn't mean used" until the exact moment the 3080 starts stuttering in games. Then it will be "16 GB was more future-proof, we warned you!"

Sometime in the next 2 years I will end up linking back to this post. Hello, future self.

This is exactly what was said with prior gens, e.g. the 7800 256 MB, and not coincidentally it was shortly after new console releases that an increased amount of VRAM started becoming standard in games.
 
yeah the 10gb of VRAM is def a bit of a worry... I'm not good at waiting though, I will re-sell and buy a 16gb or w/e comes along in 6 months time
 
yeah the 10gb of VRAM is def a bit of a worry... I'm not good at waiting though, I will re-sell and buy a 16gb or w/e comes along in 6 months time
Sounds reasonable, and if they do the 8 GB to 16 GB bump later on for the 3070 too, then that's a plan as well.
You'll lose a couple hundred, and you got to game for a year or so. Consider it cheap rent.
 
It will be "10 GB is plenty" and "Allocated doesn't mean used" until the exact moment the 3080 starts stuttering in games. Then it will be "16 GB was more future-proof, we warned you!"

Sometime in the next 2 years I will end up linking back to this post. Hello, future self.
A game stuttering due to running out of VRAM is unfortunately the only real test though. There's plenty of theory that can be tossed at the problem, but none of that covers what game developers have yet to actually do.
 
You use one game and then conclude that is the performance difference? This is 23 games at 1440p. The 980 Ti was at 57% of the 1080 Ti's performance (with the 1080 Ti as the 100% baseline), so the 1080 Ti was 1/0.57 ≈ 1.75, a factor of 1.75, or 75% faster overall, using 23 games as data points. Ampere's performance is still not fully known, but the generational increase does not appear to be even close.

Maybe take your time and breathe, and look more closely at what you are responding to.

A lot of people were upset about a performance leak that showed the 3080 (note the absence of Ti) was only 33% faster than the 2080 Ti.

There was only one meaningful game in that leak, and since the leak was not about a 3080 Ti, or even a 3090 that might work as a stand-in, I made the same generational comparison as in the leak: previous series x80 Ti to new series x80.

To compare to the leak, I looked at the AnandTech review of the GTX 1080 and noted the first handful of games. I chose the one that made Pascal look the best, and it was still completely in line with the leaked comparison.

The point being that in the x80 vs. previous-generation x80 Ti comparison, Ampere is displaying a similar performance uplift to what Pascal did at launch.

Then you lumbered in with a Ti-to-Ti comparison against the Pascal x80 Ti, which does NOT exist in Ampere, so your comparison is the irrelevant one.
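
For reference, the relative-uplift arithmetic both posts lean on is just the inverse of the baseline percentage. A minimal sketch of the math (the 57% figure is the 23-game average quoted above, the 33% figure is from the leak):

[CODE]
# Relative-uplift arithmetic used in the posts above.
# If the old card reaches a fraction f of the new card's frame rate,
# the new card is (1/f - 1) faster.

def uplift(old_fraction_of_new: float) -> float:
    return 1.0 / old_fraction_of_new - 1.0

# 980 Ti at 57% of the 1080 Ti over 23 games at 1440p:
print(f"1080 Ti over 980 Ti: {uplift(0.57):.0%} faster")      # ~75%

# The leaked ~33% uplift of the 3080 over the 2080 Ti means the
# 2080 Ti reaches roughly 1/1.33 of the 3080's frame rate:
print(f"2080 Ti as a fraction of the 3080: {1 / 1.33:.0%}")   # ~75%
[/CODE]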
 
For 1440p @ 165 Hz G-Sync gaming, 3070 or 3080, based on what we know today? I'm planning to build a new system with a 4070x (Zen 3). I'm leaning toward the 3080 due to the extra VRAM; however, a 3070 Ti with 16 GB would be ideal. I guess I will wait a little longer.
 
I don't know if an app exists for VRAM usage, but for RAM usage we have Windows tools (Resource Monitor):
[screenshot: Resource Monitor's memory tab]

(this machine is mostly idle now but this tab shows some interesting things when you start firing up a bunch of apps)
If I'm not mistaken, MSI Afterburner can tell you that. I used it to determine how much VRAM madVR uses when playing 4K content on a GTX 1060 (6 GB). It turned out it uses over 4 GB, so a 1060 (3 GB) would not work for my HTPC.
 
For 1440p @ 165 Hz G-Sync gaming, 3070 or 3080, based on what we know today? I'm planning to build a new system with a 4070x (Zen 3). I'm leaning toward the 3080 due to the extra VRAM; however, a 3070 Ti with 16 GB would be ideal. I guess I will wait a little longer.

If you want to push the full 165 Hz of your monitor, I'd go 3080. I wouldn't think 8 GB of VRAM will limit you at 1440p, at least not for the foreseeable future, but having 10 GB is a bonus.

That said, I'm going from educated guesses and rumors, like all of us are at this point.
 
If I'm not mistaken, MSI Afterburner can tell you that. I used it to determine how much VRAM madVR uses when playing 4K content on a GTX 1060 (6 GB). It turned out it uses over 4 GB, so a 1060 (3 GB) would not work for my HTPC.

Lots of apps can tell you vram allocation. No apps can tell you actual vram usage.
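
To put a concrete face on the allocation-vs-usage distinction, here is a minimal sketch of how most overlays get their number, via NVIDIA's NVML (using the pynvml Python bindings, assuming they're installed). Note that the value returned is memory allocated on the card, not how much a game is actively touching each frame:

[CODE]
# Minimal sketch: polling VRAM allocation through NVML (pip install pynvml).
# This is the same kind of number GPU-Z/Afterburner report: allocated memory,
# NOT the true working set a game actually needs.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)       # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)       # .total / .used / .free, in bytes

gib = 1024 ** 3
print(f"VRAM allocated: {info.used / gib:.2f} GiB of {info.total / gib:.2f} GiB")

pynvml.nvmlShutdown()
[/CODE]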
 
I received some flak recently for merely linking a table made in a topic here which looked at VRAM 'usage' in modern games. The counter-argument was that true usage (not merely what is allocated/reserved) is lower, and thus 10 GB of VRAM is more than enough even for the most demanding titles.

That said, no one I've seen making this counterpoint has explained how true usage can be measured, apart from one game that supposedly distinguishes it: Flight Sim 2020. Even Steve from Gamers Nexus mentioned that GPU-Z's reported VRAM 'usage' isn't representative of the real usage, which makes it confusing tbh.
Basically, large jumps in frame times, or uneven frame times, are a good indication. Flight Simulator on my 5700 XT in San Francisco shows an allocation of about 8 GB, but the frame times clearly show the stutter and pauses; the Vega FE, while slower, gives a superior, more consistent experience. Reducing the settings to medium in Flight Simulator 2020 for San Francisco is the only way to get a smooth frame rate with the 5700 XT. The 1080 Ti is very smooth, has the best frame rates, and indicates around 10 GB+, while the Vega FE had VRAM usage up to 15 GB. So the game may allocate and actually use the memory, since overall that will be better than constantly shuffling memory around to fit in less VRAM than it wants. I doubt very much the 5700 XT with 8 GB has enough VRAM for use at 4K.

RT, which uses a BVH to streamline ray hits on objects and make them more efficient, plus the other requirements of ray tracing, takes significantly more memory, as in 1-2 GB more. MS Flight Simulator may get an RT update, which could push that 10 GB 3080 to its limits or limit what you can do with RT. As a side note, I've found HDR adds about 0.5 GB more memory needed.

https://developer.nvidia.com/blog/rtx-best-practices/
Q. How much extra VRAM does a typical ray-tracing implementation consume?
A. Today, games implementing ray-tracing are typically using around 1 to 2 GB extra memory. The main contributing factors are acceleration structure resources, ray tracing specific screen-sized buffers (extended g-buffer data), and driver-internal allocations (mainly the shader stack).
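
A hedged back-of-envelope using the figures in this post (the base-game number is just the ~8-10 GB allocation observed above, not a measured working set):

[CODE]
# Back-of-envelope VRAM budget from the figures quoted above (allocations, not true usage).
base_game_gb = 9.0      # FS2020-style allocation, observed between ~8 and 10+ GB
rt_overhead_gb = 1.5    # NVIDIA: typical RT implementations add roughly 1-2 GB
hdr_overhead_gb = 0.5   # ~0.5 GB more observed with HDR enabled

total_gb = base_game_gb + rt_overhead_gb + hdr_overhead_gb
print(f"Estimated budget: {total_gb:.1f} GB vs the 3080's 10 GB")   # 11.0 GB
[/CODE]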
 
Framerate isn't exclusively tied to VRAM usage.

I get 165 Hz @ 1440p in esports titles with low and medium settings on a 1080 Ti.
CoD BO4 and MW 2019 average 150 fps with everything on low.
 
Framerate isn't exclusively tied to VRAM usage.

I get 165 Hz @ 1440p in esports titles with low and medium settings on a 1080 Ti.
CoD BO4 and MW 2019 average 150 fps with everything on low.
Dramatic changes in frame times, as in how many ms from one frame to the next. I should have specified 1440p for the 5700 XT with Flight Simulator 2020. Anyway, in San Francisco at 1440p with Ultra settings, frame-to-frame time variations were up to 150 ms -> very inconsistent frame times. The fps, a different measurement, was around 30 and did not show this hugely noticeable visual hitching. Is that due to the VRAM size? It could be a game/driver issue, but by reducing the VRAM requirements I could get the issue to go away, while my other cards with more VRAM did not exhibit it.
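
For anyone who wants to check this on their own card, here is a minimal sketch of that kind of frame-time analysis, assuming a plain-text log with one frame time in milliseconds per line ("frametimes.csv" and its single-column layout are hypothetical; adapt it to whatever your capture tool exports):

[CODE]
# Sketch: spotting stutter in a frame-time log (one frame time in ms per line).
# "frametimes.csv" and its layout are assumptions for illustration only.
import statistics

with open("frametimes.csv") as f:
    frame_ms = [float(line) for line in f if line.strip()]

avg = statistics.mean(frame_ms)
worst = max(frame_ms)
p99 = sorted(frame_ms)[int(len(frame_ms) * 0.99)]                 # rough 99th percentile
biggest_jump = max(abs(b - a) for a, b in zip(frame_ms, frame_ms[1:]))

print(f"avg {avg:.1f} ms ({1000 / avg:.0f} fps), 99th pct {p99:.1f} ms, worst {worst:.1f} ms")
print(f"largest frame-to-frame jump: {biggest_jump:.1f} ms")      # ~150 ms jumps = visible hitching
[/CODE]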
 
Dramatic changes in frame times, as in how many ms from one frame to the next. I should have specified 1440p for the 5700 XT with Flight Simulator 2020. Anyway, in San Francisco at 1440p with Ultra settings, frame-to-frame time variations were up to 150 ms -> very inconsistent frame times. The fps, a different measurement, was around 30 and did not show this hugely noticeable visual hitching. Is that due to the VRAM size? It could be a game/driver issue, but by reducing the VRAM requirements I could get the issue to go away, while my other cards with more VRAM did not exhibit it.

DF has a video up on optimizing FS2020. It's a game where just using max settings doesn't make sense for any HW.
 
https://videocardz.com/newz/overclocking-nvidia-geforce-rtx-3080-memory-to-20-gbps-is-easy

The GPU was only overclocked by 70 MHz and the memory by 850 MHz. This was the highest clock speed for this particular sample (for both the GPU and memory overclocks).

I've been worried about these cards being maxed out at the factory, and the 70 MHz max core OC seems to confirm it somewhat. We'll see how the custom AIB cards do in a few months, but so far this isn't a great result for a reference design. Too bad they didn't OC the FE card.
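
For context on what that memory overclock is actually worth, a quick bandwidth calculation (the 320-bit bus width is the 3080's published spec, not something from the linked article):

[CODE]
# Memory bandwidth = effective data rate (Gbps per pin) x bus width (bits) / 8
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(19, 320))   # stock 3080 GDDR6X:      760.0 GB/s
print(bandwidth_gb_s(20, 320))   # overclocked to 20 Gbps: 800.0 GB/s (~5% more)
[/CODE]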
 
https://videocardz.com/newz/overclocking-nvidia-geforce-rtx-3080-memory-to-20-gbps-is-easy



I've been worried about these cards being maxed out at the factory, and the 70 MHz max core OC seems to confirm it somewhat. We'll see how the custom AIB cards do in a few months, but so far this isn't a great result for a reference design. Too bad they didn't OC the FE card.
I think it had more to do with the power limit NVIDIA has in place. We have no idea how much overclocking room NVIDIA is going to allow. We'll need more info on that, tbh.
 
I've been checking multiple forums to see what the pulse is on demand for the 3080/3090. Overall, people are flipping out about them. Certain forums (which shall not be named) have members who are legitimately asking about the performance difference between different models of 3080.

I mean... has anyone heard of anything so crazy before? The cards aren't out. The reviews aren't out. People can't even pre-order cards. I'm losing brain cells by the minute reading threads like that. I can already feel the crotchety old man in me coming out wanting to yell at the damn kids to get off my lawn.

Seriously.... what the hell?!?!?!?
 
I've been checking multiple forums to see what the pulse is on demand for the 3080/3090. Overall, people are flipping out about them. Certain forums (which shall not be named) have members who are legitimately asking about the performance difference between different models of 3080.

I mean... has anyone heard of anything so crazy before? The cards aren't out. The reviews aren't out. People can't even pre-order cards. I'm losing brain cells by the minute reading threads like that. I can already feel the crotchety old man in me coming out wanting to yell at the damn kids to get off my lawn.

Seriously.... what the hell?!?!?!?
I think there is a lot of added hype because of how lackluster the 20-series was. It's pretty remarkable that the 2080 Ti launched at $1200, and stayed there for nearly two years. People are well overdue for a monster performer at a hot price.
 
Dramatic changes in frame times, as in how many ms from one frame to the next. I should have specified 1440p for the 5700 XT with Flight Simulator 2020. Anyway, in San Francisco at 1440p with Ultra settings, frame-to-frame time variations were up to 150 ms -> very inconsistent frame times. The fps, a different measurement, was around 30 and did not show this hugely noticeable visual hitching. Is that due to the VRAM size? It could be a game/driver issue, but by reducing the VRAM requirements I could get the issue to go away, while my other cards with more VRAM did not exhibit it.

That's a GPU crusher of a game. When I saw a steady 30 fps on overclocked 9900K + 2080 Ti builds at Ultra settings at 1440p, I was like, oh, it's the new Crysis.
 
Simulators are usually hopelessly CPU/RAM bound, though. But I don't have the new FS and have no interest in it (low FPS, no VR, civilian aircraft), so I can't speak for this one; maybe it can hit 100% GPU usage for a change.
 
DF has a video up on optimizing FS2020. It's a game where just using max settings doesn't make sense for any HW.

It seems optimizing is a must for current-gen hardware. DF gained 70% performance with little impact on visuals.
[screenshot from the DF video]

Even then, the 2080 Ti was often in the mid-40s during dual cloud areas and cities at night.
[screenshots from the DF video]

With no optimization, even the 2080 Ti would likely dip below 30 fps.
 
Lots of apps can tell you vram allocation. No apps can tell you actual vram usage.

Yep, most game engines will use otherwise-unused VRAM to cache things, whether or not it is needed.

Just because 12 GB of VRAM is in use does not mean that performance would drop at all if only 6 GB were available.

You'd need to test the same card with different amounts of VRAM side by side to determine that.
 
Yep, most game engines will use otherwise-unused VRAM to cache things, whether or not it is needed.

Just because 12 GB of VRAM is in use does not mean that performance would drop at all if only 6 GB were available.

You'd need to test the same card with different amounts of VRAM side by side to determine that.

There are also issues with devs skimming over concepts and then trying to implement workflows improperly.
CoD has used a couple of NVIDIA APIs improperly in the last two games I'm aware of.
I know that one of the studios I worked at had a dev misunderstand AMD's dual-GPU rendering APIs, back when the 295X2 was a hush-hush engineering sample, in a manner that got him laughed out of a job. The dude literally stammered and used the "worked on my laptop in test" excuse.

So, just because it got committed and somehow pushed into the latest game's build doesn't mean it's a good idea or even logical.

Also, it super sucks that we have to wait for reviews, but the back-to-back FE-then-partner-GPU launch makes for a fun 24-hour news thrash.
 
Just to throw this out to my [H] brethren: Best Buy gives a one-time 10%-off coupon for your birthday month. I am going to use it. The question is whether to put it toward a 3080 or a 3090 ;).

$800 for 18% perf increase is rouughhhhh.

Hey hey, where is this coupon? The 3080 launches on my bday ;P
 
I've been checking multiple forums to see what the pulse is on demand for the 3080/3090. Overall, people are flipping out about them. Certain forums (which shall not be named) have members who are legitimately asking about the performance difference between different models of 3080.

I mean... has anyone heard of anything so crazy before? The cards aren't out. The reviews aren't out. People can't even pre-order cards. I'm losing brain cells by the minute reading threads like that. I can already feel the crotchety old man in me coming out wanting to yell at the damn kids to get off my lawn.

Seriously.... what the hell?!?!?!?

And all these pathetic YouTube shills and hardware websites are just giving NVIDIA a pass. What journalist with any integrity would accept this bullcrap? NDAs that don't expire until after the cards are available to purchase? It's so obvious that the FE cards are worse than the AIB cards; they wouldn't be afraid of reviews if they weren't.

Early benchmarks have already confirmed NVIDIA was flat-out lying about performance. 100% faster than the 2080 Ti, my ass.
 
People are so worried about the cards being out of stock that they're going to buy them as soon as they become available... I'd rather wait and find out which AIB cards are the best performing, with the best cooling...
 