RTX 3xxx performance speculation

What PS4 evidence are you talking about exactly? And your 2nd point is completely false. People buy these video cards to run games at 120fps. My point was that you would have had to make very few sacrifices with a 6GB 1080 Ti card and still get a similar experience.

Just look at the last few games launched. All of them have the 2080 Ti struggling to hit 60 fps at 4K while even 6 GB cards keep relative pace (again, id Tech 7 notwithstanding).

Lack of performance on the 3080 Ti will still be an issue LONG before 11 GB is.

Wrong. Most people don't even have 120fps displays.
 
What PS4 evidence are you talking about exactly? And your 2nd point is completely false. People buy these video cards to run games at 120fps. My point was that you would have had to make very few sacrifices with a 6GB 1080 Ti card and still get a similar experience.

Just look at the last few games launched. All of them have the 2080 Ti struggling to hit 60 fps at 4K while even 6 GB cards keep relative pace (again, id Tech 7 notwithstanding).

Lack of performance on the 3080 Ti will still be an issue LONG before 11 GB is.
No. It won't. This next gen of consoles is going to usher in the mainstreaming of a very VRAM intensive feature: Ray Tracing. If the new Ampere GPUs have much improved RT and Tensor performance as is rumored, we could see games start to utilize Ray Tracing a lot more.

We might not need 24GB of VRAM, but I could very easily see 16-18GB start to be utilized in the coming years.
 
No. It won't. This next gen of consoles is going to usher in the mainstreaming of a very VRAM intensive feature: Ray Tracing. If the new Ampere GPUs have much improved RT and Tensor performance as is rumored, we could see games start to utilize Ray Tracing a lot more.

We might not need 24GB of VRAM, but I could very easily see 16-18GB start to be utilized in the coming years.

In games like Metro Exodus, BFV, and Control, RT has had a fairly consistent VRAM cost of just over 1 GB, and in most cases the performance hit forces you to lower the resolution anyway, which largely cancels that cost out.

DLSS, in a sense, reduces the VRAM requirement, since you render at 1440p while getting 4K visuals. Its own cost has been fairly consistent as well, at just 300 MB or so, which is much smaller than the VRAM difference between resolutions.
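To put rough numbers on that (a back-of-the-envelope sketch, not measured data; the 4-bytes-per-pixel RGBA8 format is an assumption, and real engines keep many buffers of varying formats):

```python
# Rough per-render-target memory at two resolutions (assumes RGBA8, 4 bytes/pixel).
def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

native_4k  = buffer_mb(3840, 2160)  # ~31.6 MB per buffer
dlss_1440p = buffer_mb(2560, 1440)  # ~14.1 MB per buffer
print(f"Saved per buffer: {native_4k - dlss_1440p:.1f} MB")  # ~17.6 MB
```

Multiplied across the dozens of render targets a modern engine keeps resident, that per-buffer gap is how the resolution savings can outweigh DLSS's own ~300 MB overhead.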

Along with all of this, games have been able to leverage DDR4 and PCIe 4.0 storage to great effect for distant textures that do not need ultra-quick swapping of data.
 
Meanwhile, I've had 12GB for 4 years... Yeah, it cost a lot at the time, but it's just average priced now.
 
Why do you think Microsoft went to the effort of a hybrid 320-bit memory system with 16 GB of RAM? (About 12 GB of it available for video.)

Because they knew the extra bandwidth would give them an edge over the PS5, and simply having a straight 20 GB pool (or 16 GB for video) on a 320-bit system would be a waste of money.
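For reference, the bandwidth arithmetic behind that hybrid layout (a sketch using Microsoft's published Series X figures: 10 GB addressed across the full 320-bit bus, the remaining 6 GB across a 192-bit slice, both on 14 Gbps GDDR6):

```python
# GDDR6 bandwidth (GB/s) = bus width in bits * data rate in Gbps / 8 bits per byte.
def bandwidth_gbs(bus_bits, gbps=14):
    return bus_bits * gbps / 8

print(bandwidth_gbs(320))  # 560.0 GB/s for the 10 GB "GPU-optimal" pool
print(bandwidth_gbs(192))  # 336.0 GB/s for the remaining 6 GB pool
```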
 
I've slowly been upgrading VRAM with each new GPU: 7950 3GB -> 980 Ti 6GB -> 1080 8GB. Hopefully the next one will be at least 10GB, but we'll see. I've personally never run into a VRAM-constrained situation; there are some games that let you fill up the VRAM, but that doesn't actually cause it to be a bottleneck.
 
I've slowly been upgrading VRAM with each new GPU: 7950 3GB -> 980 Ti 6GB -> 1080 8GB. Hopefully the next one will be at least 10GB, but we'll see. I've personally never run into a VRAM-constrained situation; there are some games that let you fill up the VRAM, but that doesn't actually cause it to be a bottleneck.
That's the key point. Sure, games can and will allocate a ton. The time interval is crucial: if the engine has time to shuffle data on demand for the next frame/scene/area, we see little difference. And of course, devs work very hard on exactly that.
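As a loose illustration of that "shuffle on demand" idea (a hypothetical toy sketch, not any particular engine's code): a streamer ranks assets by importance and keeps only the top of the list resident until the VRAM budget runs out, leaving the rest in system RAM or on disk until they are needed.

```python
# Toy texture-streaming budget: keep the highest-priority textures resident
# until the VRAM budget is spent; the rest stay at low mips / in system RAM.
def pick_resident(textures, budget_mb):
    """textures: list of (name, size_mb, priority); higher priority = more important."""
    resident, used = [], 0.0
    for name, size_mb, _prio in sorted(textures, key=lambda t: -t[2]):
        if used + size_mb <= budget_mb:
            resident.append(name)
            used += size_mb
    return resident, used

scene = [("hero_character", 512, 10), ("skybox", 128, 9),
         ("nearby_wall", 256, 8), ("distant_mountain", 1024, 2)]
print(pick_resident(scene, budget_mb=1000))
# (['hero_character', 'skybox', 'nearby_wall'], 896.0)
```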
 
Wrong. People held onto 1080tis forever.

Your argument was 7 or 8 years ("People that buy $1,500 video cards use them for literally an entire console generation. That's like 7-8 years.").

1080 Ti only came out in 2017. That's not even 4 years ago.
 
No. It won't. This next gen of consoles is going to usher in the mainstreaming of a very VRAM intensive feature: Ray Tracing. If the new Ampere GPUs have much improved RT and Tensor performance as is rumored, we could see games start to utilize Ray Tracing a lot more.

We might not need 24GB of VRAM, but I could very easily see 16-18GB start to be utilized in the coming years.

I'm going to ask this again in its own post so it's not missed.

Why do you think you know more than Nvidia's engineers? What insider info do you have that makes you so certain that 12GB is not a sufficient match for the capabilities of this card? Do you really think that Nvidia is purposefully crippling their flagship product? Or do you think they just didn't have a conversation about how much VRAM their GPU can make use of? Perhaps it slipped their minds?

You might be right that in several years we will see games that exceed 12GB of VRAM. Even if that's true, that doesn't mean the cards we're talking about today are capable of playable settings that would require it. Nvidia could have put 12GB on your 780... would you have felt better knowing you paid more for an absurd amount of VRAM that your GPU would never be powerful enough to utilize? Maybe, just maybe, the smart folks at Nvidia discussed if 16GB or 20GB had a practical application for their flagship gaming product, and decided (because they know everything there is to know about their card, and you do not) that there was no need.

If you think you know more than Nvidia and need more memory than they have allotted, feel free to waste your money on whatever the next Titan is. Surely it will have the unnecessary amounts of VRAM you crave. I think the rest of us will be happy with this card not being any more expensive than it needs to be.
 
I'm going to ask this again in its own post so it's not missed.

Why do you think you know more than Nvidia's engineers? What insider info do you have that makes you so certain that 12GB is not a sufficient match for the capabilities of this card? Do you really think that Nvidia is purposefully crippling their flagship product? Or do you think they just didn't have a conversation about how much VRAM their GPU can make use of? Perhaps it slipped their minds?

You might be right that in several years we will see games that exceed 12GB of VRAM. Even if that's true, that doesn't mean the cards we're talking about today are capable of playable settings that would require it. Nvidia could have put 12GB on your 780... would you have felt better knowing you paid more for an absurd amount of VRAM that your GPU would never be powerful enough to utilize? Maybe, just maybe, the smart folks at Nvidia discussed if 16GB or 20GB had a practical application for their flagship gaming product, and decided (because they know everything there is to know about their card, and you do not) that there was no need.

If you think you know more than Nvidia and need more memory than they have allotted, feel free to waste your money on whatever the next Titan is. Surely it will have the unnecessary amounts of VRAM you crave. I think the rest of us will be happy with this card not being any more expensive than it needs to be.
It's not about Nvidia's engineers. It's about profit margins. They have to build these items to a price in order to satisfy their stockholders. These are probably going to be giant GPU dies that cost a ton to manufacture. In order to keep cost down, you nix the RAM amount. I was hoping that would be implied, but apparently not.

You can guarantee that there are engineers at Nvidia who made the pitch to Jen-Hsun Huang about these next-generation GPUs needing 24GB of VRAM, but they were shot down by the CFO because the cost of production would have pushed the price too high to remain profitable. Also, just so we're clear, the 1080 Ti and RTX 2080 Ti were both meant to be 384-bit cards, but either due to yields or cost, they were cut down to 352-bit and 11 VRAM ICs. How do we know this? Because their equivalent Titans had a 384-bit bus.
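The bus-width point is straightforward arithmetic (a sketch assuming the usual one 32-bit channel per GDDR memory IC, with 1 GB chips on the 1080 Ti / 2080 Ti and 2 GB chips on the Titan RTX):

```python
# Bus width fixes the memory-IC count (one 32-bit channel per chip),
# and chip count times per-chip density fixes the VRAM total.
def vram_config(bus_bits, gb_per_chip=1):
    chips = bus_bits // 32
    return chips, chips * gb_per_chip

print(vram_config(352))                 # (11, 11) -> 1080 Ti / 2080 Ti: 11 GB
print(vram_config(384))                 # (12, 12) -> Titan Xp: 12 GB
print(vram_config(384, gb_per_chip=2))  # (12, 24) -> Titan RTX: 24 GB
```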

People forget that Nvidia is a business. Profit and shareholder happiness are their top priority, and as long as their competition isn't knocking down the door with a better product (that has more VRAM), why should Nvidia incur additional cost to themselves?
 
You can guarantee that there are engineers at Nvidia who made the pitch to Jen-Hsun Huang about these next-generation GPUs needing 24GB of VRAM, but they were shot down by the CFO because the cost of production would have pushed the price too high to remain profitable.

Can you?

More than likely the engineers said, "12GB of VRAM will be enough for the next 2-3 years. But you can always put 24GB on some high-end cards and charge way more for those idiots who think they need that for gaming."
 
Can you?

More than likely the engineers said, "12GB of VRAM will be enough for the next 2-3 years. But you can always put 24GB on some high-end cards and charge way more for those idiots who think they need that for gaming."
You don't know engineers. Engineers want it all, because if they have it all, they can make the best product possible. Unfortunately, that's not how business works.

If 12GB is "enough" why does the TITAN RTX have 24GB VRAM?
 
Can you?

More than likely the engineers said, "12GB of VRAM will be enough for the next 2-3 years. But you can always put 24GB on some high-end cards and charge way more for those idiots who think they need that for gaming."

It's more like: "If we put an insufficient amount of RAM on them we'll force people with money to upgrade earlier than they would have had to otherwise."

It's more like: "Let's only put a decent amount of RAM on the most expensive model to force people to buy it even though they would have skipped it if a slower card had just had a decent amount of RAM."
 
It's more like: "If we put an insufficient amount of RAM on them we'll force people with money to upgrade earlier than they would have had to otherwise."

It's more like: "Let's only put a decent amount of RAM on the most expensive model to force people to buy it even though they would have skipped it if a slower card had just had a decent amount of RAM."
In other words, it's a business decision, not an engineering decision. Glad somebody else gets it. :)
 
People forget that Nvidia is a business. Profit and shareholder happiness are their top priority, and as long as their competition isn't knocking down the door with a better product (that has more VRAM), why should Nvidia incur additional cost to themselves?

So what are you arguing exactly? Nvidia should have stuffed this card with 22 GB for 'engineer reasons' instead of keeping consumer cost down and protecting margins? Until we hear game developers cry about how these cards have only 11 GB of VRAM, your 'insider knowledge' is completely worthless.
 
So what are you arguing exactly? Nvidia should have stuffed this card with 22 GB for 'engineer reasons' instead of keeping consumer cost down and protecting margins? Until we hear game developers cry about how these cards have only 11 GB of VRAM, your 'insider knowledge' is completely worthless.
I don't appreciate the condescending attitude. You want to have a civil discussion or continue being a jerk?

22GB would last longer on the consumer market, yes. Just because video cards don't use 8 or 11 GB VRAM right now does not mean they won't use it a year from now.
 
It's more like: "If we put an insufficient amount of RAM on them we'll force people with money to upgrade earlier than they would have had to otherwise."

It's more like: "Let's only put a decent amount of RAM on the most expensive model to force people to buy it even though they would have skipped it if a slower card had just had a decent amount of RAM."

Yeah, that's another load of manure. As if Nvidia is going to dictate how much VRAM games require in the future. None of what you said makes any sense from a business standpoint. Also, we have been here before. Nothing is stopping AMD from pushing out 16 GB cards, just like when they mostly had 8 GB cards against 6 GB Nvidia cards. Guess what? Nvidia cards had zero issues. For one, they generally have a more efficient memory controller. Second, game developers are not going to build something that runs like crap on 75% of systems.
 
Yeah, that's another load of manure. As if Nvidia is going to dictate how much VRAM games require in the future. None of what you said makes any sense from a business standpoint. Also, we have been here before. Nothing is stopping AMD from pushing out 16 GB cards, just like when they mostly had 8 GB cards against 6 GB Nvidia cards. Guess what? Nvidia cards had zero issues. For one, they generally have a more efficient memory controller. Second, game developers are not going to build something that runs like crap on 75% of systems.

*cough* Crysis *cough*
 
Never thought I would see an enthusiast PC forum be ok and accepting of 11-12GB VRAM on a flagship product, especially since it has been this way for 2 generations now. Quite sad, actually.


Grow up. Vram was the least of your worries back then when trying to deal with Crysis.
https://www.techpowerup.com/review/his-hd-4870-iceq4/9.html

You said " Second, game developers are not going to build something that runs like crap on 75% of systems."

Just correcting you there, bud. No hard feelings :)
 
Never thought I would see an enthusiast PC forum be ok and accepting of 11-12GB VRAM on a flagship product, especially since it has been this way for 2 generations now. Quite sad, actually.

Nope, the problem is the installed base of GPUs. Until enough of them have a significant amount of VRAM (over 8GB), developers aren't going to code for massive VRAM usage.

Also, those buying the very high end tend to do so almost every generation. So people like me that spend the money will buy the next gen that has more Vram when it's needed.

Anyway, games aren't designed for the very high end. They are designed for the masses.
 
Never thought I would see an enthusiast PC forum be ok and accepting of 11-12GB VRAM on a flagship product, especially since it has been this way for 2 generations now. Quite sad, actually.

You said " Second, game developers are not going to build something that runs like crap on 75% of systems."

Just correcting you there, bud. No hard feelings :)

You were being disingenuous. We were talking about VRAM over the last several posts, so I didn't feel the need to repeat that point in every post. Crysis was built to be very scalable for the future. It's possible that Cyberpunk runs sub-60 fps with RTX on an RTX 3090, but it will not be due to a lack of VRAM.
 
Never thought I would see an enthusiast PC forum be ok and accepting of 11-12GB VRAM on a flagship product, especially since it has been this way for 2 generations now. Quite sad, actually.

Enthusiast doesn't mean doing the equivalent of piling your money up and lighting it on fire.

I have no problem with optional extreme memory for those who want to pay for it.

Though I expect when people see that it does absolutely nothing in gaming benchmarks, the vast majority would opt to keep the extra hundreds of dollars in their pockets and get the regular memory models.
 
How do you know this? Are you an engineer? Do you have insider information? How do you know this game wasn't developed exclusively on QUADRO RTX cards with 48GB VRAM?

Just sayin man, if you're gonna call me out, you'd better believe I'm gonna call you out on misinformation.

It's very clear that YOU are spreading the misinformation with all this drivel about engineers, stockholders, secret nvidia meetings, etc. I am making GUIDED assumptions using DATA from games developed over the past decade.
 
It's very clear that YOU are spreading the misinformation with all this drivel about engineers, stockholders, secret nvidia meetings, etc. I am making GUIDED assumptions using DATA from games developed over the past decade.
And I'm making educated guesses based on observations of what the industry is currently going through and trends we are seeing from a consumer standpoint. So congrats, we are both speculating in the "RTX 3xxx performance speculation" thread. The difference is that I work as an engineer in my current field and have been a part of several upgrade planning processes, so even though I don't have a bead on the GPU industry, I can most definitely speculate based on my personal experience what "might" be going on. When your job is to predict and analyze things for future expansion, you tend to become pretty good at making educated guesses.

Stop being a jerk. You don't have any more information than I do.
 
How do you know this? Are you an engineer? Do you have insider information? How do you know this game wasn't developed exclusively on QUADRO RTX cards with 48GB VRAM?

It should be beyond obvious that they aren't going to release a game that needs more VRAM than anyone has in their gaming cards. It doesn't matter what they developed it on. They are going to test it extensively on common gaming cards. AFAIK, all the demos of the game, including play testing, were on RTX 2000 series cards that max out at 11GB of VRAM.
 
It should be beyond obvious that they aren't going to release a game that needs more VRAM than anyone has in their gaming cards. It doesn't matter what they developed it on. They are going to test it extensively on common gaming cards. AFAIK, all the demos of the game, including play testing, were on RTX 2000 series cards that max out at 11GB of VRAM.
Well, I have a theory on that.

I speculate that Cyberpunk 2077 on PC has been exclusively optimized for Ampere, and that part of the reason it has been delayed to November is because Nvidia wants Cyberpunk 2077 to be a showpiece for RTX. Part of the reason "might" be because the current crop of Turing GPUs do not have enough VRAM to adequately display the game @ max settings and 4k. Remember, this game is supposed to be a killer app for Ray Tracing, and based on the videos we've seen, the visuals are next-gen for sure. If there is a game out there that may possibly push current GPUs to their VRAM max, this is it.

Again, pure speculation. We shall see in a few months, I suppose.
 
Well, I have a theory on that.

I speculate that Cyberpunk 2077 on PC has been exclusively optimized for Ampere, and that part of the reason it has been delayed to November is because Nvidia wants Cyberpunk 2077 to be a showpiece for RTX. Part of the reason "might" be because the current crop of Turing GPUs do not have enough VRAM to adequately display the game @ max settings and 4k. Remember, this game is supposed to be a killer app for Ray Tracing, and based on the videos we've seen, the visuals are next-gen for sure. If there is a game out there that may possibly push current GPUs to their VRAM max, this is it.

Again, pure speculation. We shall see in a few months, I suppose.

Nope. They aren't going to make a game unplayable on current cards. That would not be smart, to put it mildly.

Besides the game has been widely demoed and play tested. It doesn't push any Ray Tracing boundaries.

So if there are 12GB and 24GB options for high-end cards, are you fine paying $400 more to get more memory that has no impact on today's games at all? Do you think many people would make that choice?
 
Nope. They aren't going to make a game unplayable on current cards. That would not be smart, to put it mildly.

Besides the game has been widely demoed and play tested. It doesn't push any Ray Tracing boundaries.

So if there are 12GB and 24GB options for high-end cards, are you fine paying $400 more to get more memory that has no impact on today's games at all? Do you think many people would make that choice?
I never said it would be unplayable on current cards, it just may not be able to be maxed out.

And demos should never be taken as gospel. Pre-release demos very rarely show off a game utilizing release code.
 
It should be beyond obvious that they aren't going to release a game that needs more VRAM than anyone has in their gaming cards. It doesn't matter what they developed it on. They are going to test it extensively on common gaming cards. AFAIK, all the demos of the game, including play testing, were on RTX 2000 series cards that max out at 11GB of VRAM.

WTF are you talking about? You couldn't even run Doom 3 with maxed out settings on top of the line hardware when it came out.

Plenty of games have been released that required more hardware than was available on the market to max out their capabilities.
 
it just may not be able to be maxed out.
That's really more on the developers than it is on the technology. Put enough mods on Skyrim and you can max out current hardware.

WTF are you talking about? You couldn't even run Doom 3 with maxed out settings on top of the line hardware when it came out.

Doom 3 is an exception, not a rule. That should really go for anything that comes out of id software, and anyone that's been watching the gaming industry for the last few decades should know that.


If the upcoming consoles have 16GB total, 12GB of VRAM is well within game developers' targets. If it's a problem, you can always run something slower from AMD.
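For what it's worth, a very rough split behind that "well within targets" claim (assumed numbers: the reported ~2.5 GB Series X OS reservation is public, but how the remainder divides between CPU-side data and GPU-style allocations here is purely a guess):

```python
# Back-of-the-envelope console memory split (assumed figures, not a spec).
total_gb   = 16.0
os_reserve = 2.5   # reported Series X OS reservation
cpu_side   = 3.5   # assumption: game code, logic, audio, streaming buffers
gpu_side   = total_gb - os_reserve - cpu_side
print(gpu_side)    # ~10 GB left for GPU-style allocations
```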
 
If Nvidia ends up charging $1500-$2000 for the high-end non-Titan card I will sell my 4k/120hz screen and my computer, buy a fast slim laptop, and Playstation 5.
$1500 would end up being around €1800 in Sweden. I can afford that but I don't see the point of spending that much money for gaming with PS5 just around the corner.
 
If Nvidia ends up charging $1500-$2000 for the high-end non-Titan card I will sell my 4k/120hz screen and my computer, buy a fast slim laptop, and Playstation 5.
$1500 would end up being around €1800 in Sweden. I can afford that but I don't see the point of spending that much money for gaming with PS5 just around the corner.

You will give up PC gaming just because you don't like the price of one card? Seems quite childish. Why don't you just buy a cheaper card?
 
If Nvidia ends up charging $1500-$2000 for the high-end non-Titan card I will sell my 4k/120hz screen and my computer, buy a fast slim laptop, and Playstation 5.
$1500 would end up being around €1800 in Sweden. I can afford that but I don't see the point of spending that much money for gaming with PS5 just around the corner.

I'm in the same boat - money isn't an issue (I have $1600 in Amazon rewards waiting to be used), but it is hard to justify, especially given how little gaming I'm doing now. I do plan on building a new PC late this year or early next, and another thought I'm having is that maybe I should just use the rewards on a 3090, as that will almost assuredly last the life of my next PC and maybe into the PC after that. My 1080 Ti is still strong though, so I may give it a few months or a year to see if prices drop a little first.
 
If Nvidia ends up charging $1500-$2000 for the high-end non-Titan card I will sell my 4k/120hz screen and my computer, buy a fast slim laptop, and Playstation 5.
$1500 would end up being around €1800 in Sweden. I can afford that but I don't see the point of spending that much money for gaming with PS5 just around the corner.
For a while, I thought about giving up PC gaming, especially since there was hope at one time that the Xbox One X would get PC-like features such as Mouse/KB support. This didn't happen in any meaningful way.

On top of that, I found a new passion in PC gaming trying to extract the FASTEST load times without resorting to a giant SSD. Primocache gave me that push, and I'm glad it did.

We will see about the Series X.
 