Ducked a Bullet with 3090?



I think for the 3090 to make sense you have to have the balls to shunt mod it, a use for the 24GB, or just tons of cash.

I am sure GN will shunt mod it and put it through its paces. Not sure if any AIB cards have a high power limit, but the FE is power constrained for sure.
 
You are confusing means with the enraging supply situation, my friend. 95% of folks who wanted one on [H] have not gotten one, and most, I am sure, have the means. More power to you, but most of us don't have the time to wait 18 hours in line, or even have a MC nearby (the Bay Area one has been closed since 2012).
I posted this vid in another thread - but I think it worthy of its own discussion.



The first part talks about the folly of buying a 3090 for 8K gaming (which, frankly, common sense should have told us...). However, the second half has benches vs. an overclocked 3080 and a regular 3080 FE... and I'm not seeing the value proposition for anyone other than media creation folks.

For those of us who are 'average to power' consumers of media... 3080 seems plenty fine. Presuming they quit dribbling them out.


What I would say to this is: then why buy a 3950X... or a 10900K... or anything that has the highest performance? Faster cars that go more MPH than the model below, larger homes with more bedrooms than the house across the street, longer vacations that give you more nights, perks and stops than the other, cheaper cruise, etc.

You could apply the same logic to the people that bought the 2080 ti ...

Value is going to mean something different to all of us; it's subjective in nature. I really dislike these types of posts because they serve no one; they're just finger pointing and trying to force one's opinion onto others. "For those of us who are 'average to power' consumers of media... 3080 seems plenty fine." ..... Honestly, worry about yourself and not the guy down the street that wants to buy a 2021 Corvette and not a Prius or whatever they are called.

Subjective value theory is the idea that an object's value (the RTX 3090) is NOT inherent, and that it is instead worth more to different people (myself) based on how much they desire or need the object.

The argument that "casual players" make against not seeing the value in higher end products, not just video cards, is an..... old... tired..... subject that sadly rears its ugly head anytime these new video cards come out. People just repackage the argument and try to serve it all over again. Again, tired and old.

BTW, I called Steve over at Gamers Nexus out as an idiot. Not only does he contradict himself a few times, he purposely glosses over the fact of what this card really is. It's a gaming card with a lot of gaming features. He knows this, but he needs subs and wants attention, and he wants to do his little emo weird body language thing, stop mid-sentence, stutter and roll his eyes, etc. etc. His entire argument hinges on the fact that nVidia called this a "titan level" video card.

This video card is honestly just a 3080 Ti with double the GDDR6X.... nothing more. Everyone understands this. It's entirely possible we will see a 3090 Ti in the future, with this model being discounted; who knows.

You know what the difference between the 2080 and the 2080 Ti is? ..... It's 10 or 15%, lmao. I swear, you guys.

To put this into context, "The 2080 ti is really just for content creators" ..... "For those of us who are 'average to power' consumers of media... 2080 seems plenty fine."
 
Linux mentioned that only one game in his testing exceeded 11GB vRAM usage.
Wolfenstein at 8K.

That should settle the speculation...except for those not interested in facts.

So don't buy a 3080 for 8K gaming 🤷‍♂️
 
On balance I’m happy if it’s not something that people are lusting over. I kinda need one and would prefer a shorter queue :)

My main takeaway has just been fascination over the impact of brand. If it had been called Titan it’d have been pretty much the same and the drama llamas would have been much quieter.
 
What I would say to this is, then why buy a 3950x ... or a 10900K ... or anything that has the highest performance.

...
This is a lot of words to simply say, “I’m upset that people think I wasted my money on this purchase.”

Let me quote something important that you said that actually applies:

Subjective value / theory is the idea that an objects value ( RTX 3090 ) is NOT inherent and is instead worth more to different people ( myself ) based on how much they desire or need the object.

This I agree with. And unfortunately for you, the vast majority of people have expressed that they think the card is a terrible value for gaming based on their own 'subjective value' of the card. But that shouldn't matter to you, since it's what you think that counts towards the enjoyment of your purchase, and it truly should be that way.

You could’ve argued this in the first place instead of leading off trying to say that everyone who thought the card was a terrible value just simply couldn’t afford it.
 
...

You know that the difference between the 2080 and the 2080 ti is? ..... it's 10 or 15% lmao. I swear, you guys.

To put this into context, "The 2080 ti is really just for content creators" ..... "For those of us who are 'average to power' consumers of media... 2080 seems plenty fine."

EDIT: What exlink wrote above, and this:

Okay - but did you buy a 2080, and then buy the 2080ti as an upgrade?

I will say that that choice does not make sense to me.
Even looking at the value prospect back then of choosing between both cards, given simultaneous availability ("The Founders Edition RTX 2080 Ti costs $1,199, while the RTX 2080 goes for $799"*), spending the extra money for the Ti doesn't get you a lot of bang for the buck. For example, in Witcher 3**, the 2080 gets
  • Full HD: 154 FPS
  • 4k: 61 FPS
While the 2080ti gets
  • Full HD: 186 FPS
  • 4k: 78 FPS
As others have pointed out, the $400 between the two could have been spent on other equipment, games, etc., and you'd not likely miss out on anything given how they both perform in the real world. I mean, what real difference will you see between the two on a 60Hz 4K, or 144Hz 1080p monitor? Did a purchaser of a 2080 Ti really get $400 more value / real-world performance over a 2080 owner?
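
For what it's worth, the arithmetic behind that point only takes a few lines. This is just a back-of-envelope sketch using the prices and Witcher 3 numbers quoted in this post; nothing else is assumed:

```python
# Rough price/performance check for the 2080 vs. 2080 Ti.
# Prices are the Founders Edition launch MSRPs cited above; FPS figures
# are the Witcher 3 numbers from the linked guru3d review.
cards = {
    "RTX 2080":    {"price": 799,  "fhd": 154, "4k": 61},
    "RTX 2080 Ti": {"price": 1199, "fhd": 186, "4k": 78},
}

def fps_per_dollar(card, res):
    """Frames per second bought per dollar of MSRP."""
    return cards[card][res] / cards[card]["price"]

price_premium = cards["RTX 2080 Ti"]["price"] / cards["RTX 2080"]["price"] - 1
fps_gain_4k = cards["RTX 2080 Ti"]["4k"] / cards["RTX 2080"]["4k"] - 1

print(f"Ti price premium: {price_premium:.0%}")  # ~50%
print(f"Ti 4K FPS gain:   {fps_gain_4k:.0%}")    # ~28%
```

So roughly 50% more money for roughly 28% more 4K frames, which is the mismatch the post is pointing at.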

I'm not throwing shade on anyone's choice of how to spend their own money. If you want the biggest, baddest, latest greatest for bragging rights, good on ya. I am, however, saying that the dollars-performance metric does not suggest a good value for buying the 3090 over the 3080 as a gaming card, and thus my friend and I ducked a bullet on the 3090. A bit of perspective:

In Witcher 3, the 3080 gets
  • Full HD: 186 FPS
  • 4k: 92 FPS
While the 3090 gets
  • Full HD: 208 FPS
  • 4k: 102 FPS
In MSFS***, the 3080 gets
  • Full HD: 50 FPS
  • 4k: 42 FPS
While the 3090 gets
  • Full HD: 50 FPS
  • 4k: 45 FPS
So when you pair those cards with whatever monitor you have... how many of the extra frames is the average, or even pro gaemerz, really gonna see? What is the extra scratch really buying you (besides bragging rights, or pride of ownership)?

Thus, for us - there is no realistic value in buying the 3090 over the 3080... and thus, whooosh!


* https://www.tomsguide.com/us/nvidia-rtx-2080-release-date-price,news-27805.html

** Witcher 3 FPS: https://www.guru3d.com/articles_pages/geforce_rtx_3090_founder_review,24.html

*** Flight Sim FPS: https://www.guru3d.com/articles-pages/geforce-rtx-3090-founder-review,21.html
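
The same exercise works for the 3080 vs. 3090 numbers above. One caveat: the $699 / $1,499 launch MSRPs are my own addition (the post doesn't list prices); the FPS figures are the guru3d numbers quoted above:

```python
# Back-of-envelope: what does the 3090's premium buy over the 3080?
# MSRPs ($699 / $1,499 FE launch prices) are added by me, not from the post.
msrp = {"RTX 3080": 699, "RTX 3090": 1499}
witcher3_4k = {"RTX 3080": 92, "RTX 3090": 102}   # guru3d, Witcher 3, 4K
msfs_4k = {"RTX 3080": 42, "RTX 3090": 45}        # guru3d, MSFS, 4K

price_premium = msrp["RTX 3090"] / msrp["RTX 3080"] - 1               # ~114%
witcher_gain = witcher3_4k["RTX 3090"] / witcher3_4k["RTX 3080"] - 1  # ~11%
msfs_gain = msfs_4k["RTX 3090"] / msfs_4k["RTX 3080"] - 1             # ~7%

print(f"{price_premium:.0%} more money for "
      f"{witcher_gain:.0%} / {msfs_gain:.0%} more 4K frames")
```

Assuming those MSRPs, that's more than double the money for a roughly 7-11% frame rate bump at 4K.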
 
They did overclock it (see the power section), not sure why no gaming benchmarks with the OC though. Maybe it wasn't fully stable?

He says early on it was only a 2-3% difference in performance, same as the 3080 stock vs. overclocked.
 
The 2080 ti is a good 30-35% faster than the 2080 and made perfect sense if you caught it for < $900 last year.
 
"But 16GB shared memory in teh consoles" --- no, consoles are irrelevant.

PC games are often ported from consoles or designed around the console hardware platform as a reference.

It was a big deal in the PS3 / 360 gen, because those had some higher end GPUs at launch, so demands on PC hardware definitely increased after those launched.

Not a big deal last gen, because PS4/Xbone were just low-mid end PC hardware.

This time even developers are saying PC could be the lowest common denominator (most PC gamers don't have 5700XT class GPUs and even the new Samsung 980 Pro is slower than PS5 storage) and therefore get worse ports than consoles. So if they do that, you may be right. VRAM may not matter if the devs expect your GPU to be 8GB or less. If they use PS5 as a guideline for PC games next year, consoles won't be irrelevant at that point.
 
His entire argument hinges on the fact nVidia called this a "titan level" video card.

This video card is honestly just a 3080 ti with double the DDR6 .... nothing more.

Honest question: how is that different from the prior "Titan" distinction? A few more cores active, more memory. That's basically what Titans are, right, or have been for the last two rounds at least? They've been within a few percent and are just a halo for some, or a cheap professional card for others who don't need FP64 or the driver magic (that's really nVidia bribes to software vendors).

I know the V was based on Volta, which was datacentre, but that didn't seem to mean much in terms of differentiation. If they were talking about GPU instancing and the like on Ampere, then ya, I'd get it.

I’m super tired atm and my brain doesn’t work but in the context of choosing it I’m struggling to know why everyone is getting so riled about what seems to me to just be a naming distinction and so is fundamentally arbitrary.

what am I missing?

If I think through my own purchase logic, I need video memory. I can get a Titan RTX, I can get a 6000 RTX (which is basically the same but with driver bullshit) or I can get a 3090. I don’t need the driver BS and Titan rtx seems dumb at this point.

If I think back there never seemed to be this *rage* over the fact that for many a “Titan” wasn’t worth what has always been a massive price difference. Put a different number on the end though and oh boy.
 
honest question, how is that different from the prior “Titan” distinction. A few more cores active, more memory.

...

The Titan-class cards were created to be cheaper "Prosumer" versions of nVidia's Quadro line. They have certain features enabled in drivers that work with/enhance professional software (think CAD and similar). Titan-class cards have also traditionally drawn ~250-300W at full load, being somewhat power efficient. It just so happened that the Titans were also very good at gaming, but it was not their design focus.

The 3090 has neither the "Prosumer" driver optimizations, nor the reasonable power draw of the older Titan cards. This puts it in something of a weird position... on the one hand, it is priced like a Titan, games like a Titan, and has the extra RAM usually associated with a Titan - but on the other hand, it is missing the driver features required by professional software that would come with a Titan and it hoovers power to the point that it requires a high-quality 1000W power supply in order to reliably operate. This is not to say that the 3090 can't do any professional workloads, or do them well - a lot of gamer cards can. Just that when you look at Prosumer benchmarks for apps that rely on Titan features, the older Titan cards come very close to the performance of the 3090 (if not flat-out leave it in the dust) despite the 3090 being significantly faster in games.

On the question of "value proposition" all I can say is this: The Titans are marketed to "Prosumers". There is a lot of value in Titan cards for creative professionals that don't want to spend Quadro money and that happen to ALSO want to get good gaming performance. They typically cost anywhere from $1,000 to $3,000 (with the Titan V being the outlier at $3,000). The 3090, however, is marketed specifically to gamers as the ultimate gamer card (it does not have Titan features). While it is in Titan territory $$$-wise, it is not a Titan.

tldr: There is a reason nVidia did not call this card a Titan. Its gaming focus is key to that designation. I suspect there will be an actual Ampere Titan at some point later.
 
I'm not sure why people are trying to justify their 3090 purchase to random people on the internet or why some people are upset that others are calling it a terrible value proposition compared to the 3080. If you want it then just buy it and enjoy it. It's your money; spend it how you deem fit.
 
What I would say to this is, then why buy a 3950x ... or a 10900K ... or anything that has the highest performance.

...

I'm not going to go with the point by point stuff I normally do. I'll put it this way:

I want the best graphics card for playing games that money can buy. I am used to buying Ti cards or even Titans in pairs to achieve this goal. I'm used to running resolutions well beyond the norm. I skipped 1920x1080 and went to 2560x1600 back when that might as well have been 3840x2160 (4K). I then went to 7680x1600, dropped down to 4K and then 3440x1440 ultra-wide. I'm looking to go back to 4K. I like refresh rates as high as possible while maintaining high frame rates. I do not "turn settings down" if I can possibly avoid it.

I'm far from wealthy, but I have enough disposable income to buy whatever video card I want. For me, a 3090 is better than a 3080. Yes, it is more expensive. I understand that. I am well aware that from a price / performance perspective the 3090 is shit for playing games. But, I'm the kind of guy that would buy a Ferrari if I could afford it. I understand I can't really drive much faster on open roads than I can in my Mustang or Camaro. I understand that the price/performance ratio of such cars is absolute shit. I don't care. I like nice things. I like having the best money can buy when and if I can afford it.

So, for me a 3090 is where it's at. The most I ever spent on graphics cards was $2,800 or something like that for my Maxwell Titan X's in SLI. I got a lot of life out of those. For me, spending half that isn't a bad deal. I spent $1,299 for a GIGABYTE RTX 2080 Ti Aorus Xtreme and then almost another $250 or so to put a water block on it. So again, the price of a 3090 is par for the course. Now, if I knew when a potential 3080 Ti or Super was coming with 20GB of RAM and it was soon enough, I'd wait. But right now the 3090 is the fastest gaming card on the planet, even if they will primarily be purchased for their compute performance.
 
I'm not going to go with the point by point stuff I normally do. I'll put it this way:

...
Yep, folks like me bought 2x 1080 Ti for SLI => $1,500, 2x Vega FEs => $1,400, 2x 1070s for about $800. That is with SLI/CFX being hit and miss throughout its history. If one has the budget for one and it will not really affect them negatively, then the 3090 makes good sense. The 3090 really makes sense if you do stuff that could use the extra vRAM and processing power. So it is not so cut and dry. I really don't think the 3080 is a long-term card due to the 10GB of memory; if one does not mind turning down settings as time goes on, then it will be a good card. I spent more money that year than buying 2x 3090s would cost. The problem with the 3090 is that the 3-slot cooler would block a PCIe slot, to which water cooling would be the best answer.

If the next generation consoles have 10GB direct to the GPU plus 3.5GB additionally available, and games use that vRAM capacity, would not a high-end PC gamer want better graphics than the consoles? If you have 10GB and the console game uses it, congrats, you have console-quality gaming on your PC, so to speak. For better textures, increased draw distances, better raytracing, more shaders, more dense objects than the console, you will need more vRAM than the consoles have. If you have the same or less, don't expect a better experience than the next-gen consoles in general.
 
Yep, folks like me bought 2x 1080Ti for SLI => $1500, 2x Vega FE's => $1400, 2x 1070's for about $800.

...

I'm definitely water cooling a 3090. One of the reasons why I'm opting for that is the 24GB of RAM. In my experience, cards like the 3090 and the 2080 Ti or Titan X's before that were all cards you could use long term. Now, I tend to upgrade fairly often, but those cards get moved down to other machines. In fact, my Titan X's are in my girlfriend's PC in SLI to this day. They've been exemplary cards. I've gotten a crap ton of life out of them. I think I will do the same out of the 3090. Also, if and when a 20GB 3080 drops, it will probably perform about the same and cost less, but I'll have had the 3090 ahead of time and I'm good with paying that early adopter tax.
 
BTW, I called Steve out over at Gamers Nexus an idiot. Not only does he contradict himself a few times, he purposely glosses over the fact of what this card really is. It's a gaming card with a lot of gaming features.
Looking at what it can do in Blender/Maya, I am not so sure. There is now a world that exists between high-end gaming and the professional workloads that want only signed drivers and CAD features: artist/3D workloads, 4K+ video editing, where the PCI Express 4.0 bandwidth, or, in the 3090 vs. 3080 comparison, the 24 gigs of RAM, can show its use. Even just saving a bit of render time will pay for the difference over the next 3 years, no problem, if your workload is 10 gigs and under anyway.

It is more than a gaming card (but not a Quadro either).

I can only imagine what the last 4-5 years mean for a little studio or a self-employed 3D artist or researcher: you can have better than the dual Titans of not so long ago for $1,500, with a 16 or 32 core CPU in a regular $250 motherboard.

Maybe one day it will be a significantly better gaming card than a 3080 or the upcoming 20GB 3080 version, but for now it does not seem to be a good choice for gaming at that price tag.
 
The Titan-class cards were created to be cheaper "Prosumer" versions of nVidia's Quadro line.

...

Other than driver differences, GeForce, Titan and Quadro are segmented by FP64 performance.

So you are right. The 3090 is about 20% faster than the 3080 in FP64. A hypothetical Ampere Titan would be slower than the upcoming Ampere RTX Quadro in FP64, but a much bigger boost than 20% over the 3080.

But 3090 is still impressive though. Slightly faster than Quadro RTX 8000 in FP64, which was what? $9,999 two years ago?
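
If it helps, here's the back-of-envelope math behind those comparisons. The FP32 peak figures and the FP64:FP32 rate ratios are my own ballpark numbers (Ampere GeForce parts run FP64 at a 1/64 rate, Turing pro parts at 1/32); they are not from this thread, so treat them as approximate:

```python
# Estimate FP64 throughput from FP32 peak and the FP64:FP32 rate ratio.
# All figures below are approximate and supplied by me, not by the post.
def fp64_tflops(fp32_tflops, ratio):
    """Peak FP64 TFLOPS given FP32 peak and the hardware FP64 rate ratio."""
    return fp32_tflops * ratio

rtx_3090 = fp64_tflops(35.6, 1 / 64)          # Ampere GeForce: 1/64 rate
rtx_3080 = fp64_tflops(29.8, 1 / 64)
quadro_rtx_8000 = fp64_tflops(16.3, 1 / 32)   # Turing Quadro: 1/32 rate

print(f"3090 FP64: ~{rtx_3090:.2f} TFLOPS")
print(f"3090 vs 3080: +{rtx_3090 / rtx_3080 - 1:.0%}")  # the ~20% above
print(f"3090 beats RTX 8000: {rtx_3090 > quadro_rtx_8000}")
```

With those assumed figures the numbers line up with the post: roughly +20% over the 3080, and just edging out the old Quadro RTX 8000.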

Too bad the nVidia GeForce Creator and Studio drivers don't seem to be getting updates anymore.
 
Well, what happens to the 10GB of RAM on the 3080 when games start coming out with even higher resolution textures?

I have a feeling that some games will end up being able to use more than 10GB RAM in the not too distant future.

10GB on the 3080 is fine for up to 4k gaming.

You'll only need the 24GB on the 3090 for 8k or professional applications such as Blender.

But yes, if you want to wait, there will probably be a 20GB version of the 3080 down the road; it might even be a 3080 Ti. But Nvidia can't allow it to be too fast or it would make the 3090 look bad, cos the 3090 is only 15% faster on average than the 3080 in gaming FPS. LOL.
 
10GB on the 3080 is fine for up to 4k gaming.
...
Well, and further, doesn't VRAM simply provide a buffer between the GPU and the CPU/RAM, such that once you hit the VRAM capacity you're not stuck? It will continue to work, but now has to tap into the longer pathway to system memory for the brief period of over-utilization.
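
Pretty much, and the catch is the bandwidth cliff when that happens. A tiny comparison, using ballpark bandwidth figures of my own (not from this thread):

```python
# Why spilling past VRAM hurts: overflowed data gets fetched over PCIe,
# which has an order of magnitude less bandwidth than local GDDR6X.
# All bandwidth figures are approximate and supplied by me.
bandwidth_gbps = {
    "RTX 3090 GDDR6X": 936,  # local video memory
    "RTX 3080 GDDR6X": 760,
    "PCIe 4.0 x16":    32,   # the path to system RAM
}

slowdown = bandwidth_gbps["RTX 3090 GDDR6X"] / bandwidth_gbps["PCIe 4.0 x16"]
print(f"~{slowdown:.0f}x less bandwidth for data that spills to system RAM")
```

So yes, the game keeps running past the VRAM limit, but any hot data that lands on the wrong side of that ~29x gap shows up as stutter rather than a gentle slowdown.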
 