Ducked a Bullet with 3090?

DarkSideA8

Gawd
Joined: Apr 13, 2005
Messages: 989
I posted this vid in another thread, but I think it's worthy of its own discussion.



The first part talks about the folly of buying a 3090 for 8K gaming (which, frankly, common sense should have told us...). However, the second half has benches vs. an overclocked 3080 and a regular 3080 FE... and I'm not seeing the value proposition for anyone other than media creation folks.

For those of us who are 'average to power' consumers of media... the 3080 seems plenty fine. Presuming they quit dribbling them out.
 
Wondering why they didn't overclock the 3090 for comparison. Looking forward to seeing what it can do under water.

They did overclock it (see the power section), not sure why no gaming benchmarks with the OC though. Maybe it wasn't fully stable?
 
Well, what happens to the 10GB of RAM on the 3080 when games start coming out with even higher resolution textures?

I have a feeling that some games will end up being able to use more than 10GB RAM in the not too distant future.
 
No real bullet ducked. Everyone following the cards closely knew this would be the result. It is perhaps a little more underwhelming at 4K, and especially 1440p, than might have been expected initially. After we all saw how the cards were mostly built around accelerating high resolutions like 4K, it was pretty apparent that the 3090 wasn't going to be 40-50% faster. Which is a bummer.

But it is still an undisputed monster for certain work applications.
 
Well, what happens to the 10GB of RAM on the 3080 when games start coming out with even higher resolution textures?

I have a feeling that some games will end up being able to use more than 10GB RAM in the not too distant future.
I feel the same way. I like the 3080 a lot, but is it a viable card for 4-5 years? I don't want to upgrade in a year or so just because I'm running out of VRAM; it happened when I owned the GTX 970. I had the card not even a year and I was already hitting the wall with VRAM, so I hardly got much use from the 970 at all before upgrading again.
 
They did overclock it (see the power section), not sure why no gaming benchmarks with the OC though. Maybe it wasn't fully stable?

Maybe it didn't help their "not worth it" argument. I get that the price-to-performance ratio is abysmal, but if they're going to overclock a card and compare it to a stock card, we aren't getting all the information. What if the overclocked 3090 kept its lead and put the fps at just the sweet spot for 4K? 4K is becoming more common with 4K OLEDs being used, and some folks don't want 110fps, they want 120 instead. Or instead of 53fps it does 60fps, so that it's the first "4K capable" card.

They may just be waiting to put it on water so folks can see what it really can do in a separate review.

Looking forward to a Kingpin variant in a few months.
 
Well, what happens to the 10GB of RAM on the 3080 when games start coming out with even higher resolution textures?

I have a feeling that some games will end up being able to use more than 10GB RAM in the not too distant future.
I'd rather turn my textures down from very high to high in those edge-case circumstances than drop an extra $800. Now, if the 3090 offered at least a 30% performance boost over the 3080 for that $800, then I'd be more inclined.

The 3080 with 20GB of VRAM will be the sweet spot if it comes in at or below $1000. You'll have all your bases covered even in the "what if" scenarios with VRAM limitations. And you won't be out $800 for a measly 10% performance increase.
 
Well, what happens to the 10GB of RAM on the 3080 when games start coming out with even higher resolution textures?

I have a feeling that some games will end up being able to use more than 10GB RAM in the not too distant future.

Games aren't going to come out with even higher-res textures just because. There are no gaming awards given for Most-VRAM-Used. Developers are going to cater to 10GB/8GB/6GB for the 3080/3070/3060 respectively, because those are the new de facto standards.

"But I saw a benchmark for this one game that showed 24GB VRAM used on a 3090, so 10GB cards must not be enough" -- no, games that just preallocate the max available VRAM don't mean anything.

"But 16GB shared memory in teh consoles" --- no, consoles are irrelevant.
 
...There are no gaming awards given for Most-VRAM-Used. ...

Made me laugh.

- I'm wondering about the news of Big Navi pushing higher VRAM out of the box, and the 'promise' of Nvidia responding with a 20GB 3080... Like, what's really going to be the benefit, other than impressive, arbitrary bench numbers... and maybe a bit of future-proofing.

The 4K 'revolution' really has not kicked off; you can tell that by the display panels on offer. Most people seem stuck in the 24-27 inch, 1080p-1440p realm right now, and are still talking about refresh rates. But the benches I've seen show that the 3080 does not really come into its own until/unless 4K, due to CPU bottlenecking. And the new 'architecture' of SSD-to-GPU transfers (bypassing the CPU/RAM) won't happen until game designers really adopt it... which they won't (and won't need to) until there's sufficient hardware demand.

IDK. Much as I want a 3080... this launch is convincing me I'm not really losing out on anything by waiting (i.e. the panels I want are not yet even offered for sale: 4K IPS 32" w/ 120+ Hz).
 
Games aren't going to come out with even higher res textures just-because. There are no gaming awards given for Most-VRAM-Used. Developers are going to cater to 10GB/8GB/6GB for 3080/3070/3060 respectively because those are the new defacto's.

"But I saw a benchmark for this one game that showed 24GB VRAM used on a 3090, so 10GB cards must not be enough" -- no, games that just preallocate the max available VRAM don't mean anything.

"But 16GB shared memory in teh consoles" --- no, consoles are irrelevant.

Some games have come out with free high-resolution texture packs after they were released.

Video cards at the time of release would not have been able to handle these higher-res texture packs.

For example - Fallout 4 - regular required system specs:

System Requirements
  • Requires a 64-bit processor and operating system.
  • OS: Windows 7/8/10 (64-bit OS required)
  • Processor: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent.
  • Memory: 8 GB RAM.
  • Graphics: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent.
  • Storage: 30 GB available space.
With the high res texture pack:

Note: To utilize the High-Resolution Texture Pack, make sure you have an additional 58 GB of available space and that your system meets/exceeds the recommended specs below.
Recommended PC Specs
  • Windows 7/8/10 (64-bit OS required)
  • Intel Core i7-5820K or better
  • GTX 1080 8GB
  • 8GB+ RAM


And there are plenty of others.

Just because it isn't needed right at release doesn't mean that there won't be something that can eat up 10GB a little ways down the road.

That being said, I would most likely get the 3080 myself if/when I do buy one.
 
Some games have come out with free high resolution texture packs after they were released.

Video cards at the time of the release would not have been able to handle these higher res texture packs.

For example - Fallout 4 - regular required system specs:

System Requirements
  • Requires a 64-bit processor and operating system.
  • OS: Windows 7/8/10 (64-bit OS required)
  • Processor: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent.
  • Memory: 8 GB RAM.
  • Graphics: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent.
  • Storage: 30 GB available space.
With the high res texture pack:

Note: To utilize the High-Resolution Texture Pack, make sure you have an additional 58 GB of available and that your system meets/exceeds the recommended specs below.
Recommended PC Specs
Windows 7/8/10 (64-bit OS required)
Intel Core i7-5820K or better
GTX 1080 8GB
8GB+ Ram


And there are plenty others.

Just because it isn't needed right at release doesn't mean that there won't be something that can eat up 10GB a little ways down the road.
The question comes down to: do you pony up the extra ~$200 (my guess at the price premium for the 20GB cards) for the 3080, or wait ~2 years down the road and put that ~$200 towards a 4080 that will likely have just as much VRAM and even better performance?

I guess if you're planning on keeping your 3080 for more than 2 years, it is probably worth it. But if you plan to upgrade again in 2 years, it may be better to put that $200 down then.
 
Some games have come out with free high resolution texture packs after they were released.

Video cards at the time of the release would not have been able to handle these higher res texture packs.

For example - Fallout 4 - regular required system specs:

System Requirements
  • Requires a 64-bit processor and operating system.
  • OS: Windows 7/8/10 (64-bit OS required)
  • Processor: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent.
  • Memory: 8 GB RAM.
  • Graphics: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent.
  • Storage: 30 GB available space.
With the high res texture pack:

Note: To utilize the High-Resolution Texture Pack, make sure you have an additional 58 GB of available and that your system meets/exceeds the recommended specs below.
Recommended PC Specs
Windows 7/8/10 (64-bit OS required)
Intel Core i7-5820K or better
GTX 1080 8GB
8GB+ Ram


And there are plenty others.

Just because it isn't needed right at release doesn't mean that there won't be something that can eat up 10GB a little ways down the road.

That being said, I would most likely get the 3080 myself if/when I do buy one.
Skyrim and Fallout being loaded up with 1000 different 8K rug textures and boob mods to eat lots of VRAM for the hell of it doesn't really change anything. Developers are going to ensure a high-profile release works with the new de facto gaming GPUs. Their workflows and development targets will therefore be tuned to these new common denominators, as usual when Nvidia/AMD release new cards. Hence the statement: 10GB is the new 11GB.
 
Skyrim and Fallout being loaded up with 1000 different 8K rug textures and boob mods to eat lots of VRAM for the hell of it doesn't really change anything. Developers are going to ensure a high-profile release works with the new defacto gaming GPU's. Their workflows and development targets will therefore be tuned to these new common denominators, as usual when Nvidia releases new cards. Thus the statement 10GB is the new 11GB.

HAHAHAHA. That's great.

Anyway, the mod I referenced was released by the developers.
 
By the time 10GB is an issue for the average game, the 3080 will no longer be a relevant gaming card. This 10GB concern is unfounded.

Remember Nightmare vs. Ultra settings in Doom? 4GB VRAM cards couldn't use the Nightmare settings; 4GB card owners had to settle for one tier down.

Who cares. You couldn't even tell the visual difference in still screenshots or gameplay. If we come to a point like that with 10GB, it'll be similar: 10GB owners will be missing nothing of consequence.

 
The question comes down to: do you pony up the extra ~$200 (what I guess the price difference of the 20GB cards will cost) into the 3080 or wait ~2 years down the road and put the ~$200 towards a 4080 that will likely have just as much VRAM and even better performance.

I guess if you’re planning on keeping your 3080 for more than 2 years it is probably worth it. But if you plan to upgrade again in 2 years it may be better to put that $200 down then?

The 4080 won't be a 'generational' change / upgrade.

If you currently have a 2080, for sure, wait... But if you have anything older?
 
As a counterpoint to the high res texture concerns, Witcher 3 runs slightly better on my 2080S with the HD Reworked high-res texture packs installed than without.
 
The 4080 won't be a 'generational' change / upgrade.

If you currently have a 2080, for sure, wait... But if you have anything older?
We don't know that for sure. But the argument I'm trying to make is: why put additional cash towards a 3080 for extra VRAM that probably won't be utilized in the majority of games over the next 2 years, when you can put that money towards the 4080 if you normally upgrade every generation?
 
I posted this vid in another thread - but I think it worthy of its own discussion.



The first part talks about the folly of buying a 3090 for 8k gaming (which, frankly, common sense should have told us...). However the second half has benches vs an OverClocked 3080 and a regular 3080 FE... and I'm not seeing the value proposition, for anyone other than media creation folks.

For those of us who are 'average to power' consumers of media... 3080 seems plenty fine. Presuming they quit dribbling them out.


hahaha! Best video I've seen of his.
 
We don’t know that for sure. But the argue I’m trying to make is why put additional cash towards a 3080 for extra VRAM that probably won’t be utilized in the majority of games over the next 2 years when you can put that money towards the 4080 if you normally upgrade every generation.

Within those parameters, I can accept your argument.
  • I fully agree that for the short term, waiting for a 20 GB 3080 is not going to pay dividends vs. getting one now (if you have a chance to), given the industry isn't likely to realistically stress this card for a couple of years.
  • I don't understand why folks would upgrade every iteration/year though. Not at all a good use of gaming money, given that at 1080p 60Hz the 980 still works for most titles, the 1080 works for even the new titles, and the 2080 isn't stressed at all (this is, also, the most common res). The 2080 does great at 1440p and only struggles at 4K.
Way back when, we always suggested skipping a gen for the best bang for the buck (tick = CPU/mobo/RAM, tock = GPU).

I'm guessing that's becoming less common these days?
 
Within those perameters, I can accept your argument.
  • I fully agree that for the short term, waiting for a 20 GB 3080 is not going to pay dividends vs getting one now (if you have a chance to) - given the industry isn't likely to realistically stress this card for a couple of years.
  • I don't understand why folks would upgrade every iteration /year tho. Not at all good use of gaming money - given that at 1080p 60hz gaming the 980 still works for most titles, the 1080 works for even the new titles and the 2080 isn't stressed at all (this is, also, the most common res). 2080 does great at 1440 and only struggling at 4k.
Way back when we always suggested skipping a gen for the best bang for the buck (tick = cpu/mobo/ram, tock =gpu)

I'm guessing that's becoming less common these days?
Video cards used to be almost yearly releases. It's only been the past couple of generations that they've lasted 2+ years between releases. Lots of people upgrade based on the time since their last upgrade, once they've saved enough cash, as opposed to any actual need to.
 
The 3090 is an upgrade for professional work or the ultra uber-gamer.
The 6900 XT is where the value market is, even on immature drivers... Feel free to quote :)
 
Skyrim and Fallout being loaded up with 1000 different 8K rug textures and boob mods to eat lots of VRAM for the hell of it doesn't really change anything. Developers are going to ensure a high-profile release works with the new defacto gaming GPU's. Their workflows and development targets will therefore be tuned to these new common denominators, as usual when Nvidia/AMD release new cards. Thus the statement 10GB is the new 11GB.

Or overzealous settlements...

 
First of all, no one is gaming at 8K ...

If people have to worry about "dodging a bullet" with their wallets then they shouldn't be trying to afford these cards in the first place.

The RTX 3090 is the replacement for the 2080 Ti in my book, simply because the price is about the same if not slightly higher.

So if you have / had a 2080 Ti, then the RTX 3090 is the replacement.

Any type of drama / disdain people are spewing toward the RTX 3090 is, simply put, from people who do not have the means. If these were affordable, then EVERYONE would have one. Same logic applies to exotic cars, huge homes, pricey vacations, etc.

The RTX 3090 is definitely a card for performance-oriented people who want the best gaming performance. I have an LG OLED C9 that I need to power. I also have Cyberpunk 2077 that I want to run at max settings at 4K with HDR / RTX. The tool for that? The RTX 3090.

In COD Warzone at 1440p I get 250 to 300fps; at 4K with all high/ultra settings I get 140-200fps depending on map geometry, and this is WITHOUT G-Sync on since it's currently broken.
 
First of all, no one is gaming at 8K ...

If people have to worry about "dogding a bullet" with their wallets then they shouldn't be trying to afford these cards in the first place.

The RTX 3090 is the replacement for the 2080 ti in my book, simply because the price is about the same if not slightly higher.

So if you have / had a 2080 ti, then the RTX 3090 is the replacement.

Any type of drama / disdain people are spewing toward the RTX 3090 is simply put, people who do not have the means. If these were affordable, then EVERYONE would have one. Same logic applies to exotic cards, huge homes, pricey vacations, etc.

The RTX 3090 is definitely a card for performance oriented people who want the best gaming performance. I have a LG OLED C9 that I need to power. I also have Cyberpunk 2077 that I want to run at max settings at 4K with HDR / RTX. The tool for that? The RTX 3090.

In COD Warzone 1440p I get 250 to 300fps, in 4K with all high settings, ultra, etc, I get 140 - 200fps depending on map geometry and this is WITHOUT Gsync on since it's currently broke..
You are confusing means with the enraging supply situation, my friend. 95% of folks who wanted one here on [H] have not gotten one, and most, I am sure, have the means. All the power to you, but most of us don't have the time to wait 18 hours in line, or even a Micro Center nearby (the Bay Area one has been closed since 2012).
 
Video cards used to be almost yearly releases. It’s only been the past couple of generations that they’ve lasted 2+ years between releases. Lots of people upgrade based on time between upgrades as they’ve saved enough cash opposed to the actual need to.

Given all the available statistics, I would not say that most people upgrade every year or two. The Steam survey (in before "gasp, it's so inaccurate") still shows the majority of people residing on the 1000 series. I'd say the average upgrade cycle these days is much longer (4-ish years) and isn't driven by GPU releases so much as by resolution and frame rate requirements, and most still run at 1080p at something between 60-140Hz.
 
We don’t know that for sure. But the argue I’m trying to make is why put additional cash towards a 3080 for extra VRAM that probably won’t be utilized in the majority of games over the next 2 years when you can put that money towards the 4080 if you normally upgrade every generation.

In that case, any possible performance problems which may arise due to having "only" 10GB are going to affect the resale value of the card.
 
Given all the available statistics, I would not say that most people upgrade ever year or two years. Steam survey (in before gasp, its so inaccurate) still shows the majority of people residing on the 1000 series. I'd say the average upgrade cycle these days is much longer (4ish years) and isn't driven by gpu releases so much as resolution and frame rate requirements, and most still run on 1080p at something between 60-140hz.
I think GPU upgrades are actually more driven by price, since prices have also significantly increased over the past 5 years.
 
In that case, any possible performance problems which may arise due the having "only" 10GB are going to affect the resale value of the card.
Possibly; it would largely depend on the extent of the performance problems. If problems arise only in edge-case scenarios and the vast majority of games still play completely fine, then resale value will be minimally impacted, and likely not enough to offset the cost of opting for the 20GB model. But if there are a decent number of games that have issues with 10GB of VRAM, then sure, the impact will likely be more significant. That's where the cost of the 20GB model could be offset by resale value.

Hard to say anything right now without a crystal ball.
 
I think GPU upgrades are actually more driven by price since they've also significantly increased over the past 5 years.

Doubtful; if that were the case there would be more AMD GPUs than Nvidia ones, which isn't the case.

People have the performance they want at the resolution they use. Most if not all of us are outliers, using a PC at 3440x1440 or 4K with high refresh rates and even RTX features. Most people on PC still game at 1080p and care more about FPS than RTX, and for that a 1000-series card is ample. Why would you upgrade if you're already maxing out your monitor?

I would bet that most GPU sales right now are being driven by the new console launches and the general public expecting that those will put pressure on hardware to perform.
 
First of all, no one is gaming at 8K ...

If people have to worry about "dogding a bullet" with their wallets then they shouldn't be trying to afford these cards in the first place.

The RTX 3090 is the replacement for the 2080 ti in my book, simply because the price is about the same if not slightly higher.

So if you have / had a 2080 ti, then the RTX 3090 is the replacement.

Any type of drama / disdain people are spewing toward the RTX 3090 is simply put, people who do not have the means. If these were affordable, then EVERYONE would have one. Same logic applies to exotic cards, huge homes, pricey vacations, etc.

The RTX 3090 is definitely a card for performance oriented people who want the best gaming performance. I have a LG OLED C9 that I need to power. I also have Cyberpunk 2077 that I want to run at max settings at 4K with HDR / RTX. The tool for that? The RTX 3090.

In COD Warzone 1440p I get 250 to 300fps, in 4K with all high settings, ultra, etc, I get 140 - 200fps depending on map geometry and this is WITHOUT Gsync on since it's currently broke..
Lol the arrogance of this post is hilarious. I'm more than well off enough to purchase a 3090. But that doesn't mean I'm going to spend my money in ways I find don't make financial sense.

The card is a terrible value from a gaming perspective compared to the 3080, no matter how you try to spin it to justify your purchase.
 
Lol the arrogance of this post is hilarious. I'm more than well off enough to purchase a 3090. But that doesn't mean I'm going to spend my money in ways I find don't make financial sense.

The card is terrible value from a gaming perspective compared to the 3080, no matter how you try to spin it to justify your purchase.

A 7% performance gain is pretty damn lame, and I bought a 2080 Ti, which was the previous generation's pretty darn lame jump in performance for the price.
 
First of all, no one is gaming at 8K ...

If people have to worry about "dogding a bullet" with their wallets then they shouldn't be trying to afford these cards in the first place...

Any type of drama / disdain people are spewing toward the RTX 3090 is simply put, people who do not have the means. If these were affordable, then EVERYONE would have one. Same logic applies to exotic cards, huge homes, pricey vacations, etc.

The RTX 3090 is definitely a card for performance oriented people who want the best gaming performance. I have a LG OLED C9 that I need to power. I also have Cyberpunk 2077 that I want to run at max settings at 4K with HDR / RTX. The tool for that? The RTX 3090.

In COD Warzone 1440p I get 250 to 300fps, in 4K with all high settings, ultra, etc, I get 140 - 200fps depending on map geometry and this is WITHOUT Gsync on since it's currently broke..

You missed the point of my post, and what was in the video, entirely. The point isn't that it's expensive, or that people cannot afford it; it's: why pay double for less than 10% improvement? The value proposition is simply not there.

Heck, go to the middle of the vid: the benchmarks show an OC'd 3080 almost on par with the 3090... In RDR2 at 4K, the 3090 gets 8 frames per second more than the STOCK 3080... for twice the price.

Oof.
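
Quick back-of-the-napkin sketch of that price-per-frame point. The $699 / $1499 figures are the FE list prices, and the "+8 fps in RDR2 at 4K" gap is the one quoted above; the ~80 fps stock-3080 baseline is just an assumed round number for illustration, not a figure from the review:

# Rough $/frame math. Prices are the FE MSRPs; the 80 fps baseline is an
# ASSUMED round number, only the "+8 fps" delta comes from the post above.
baseline_fps = 80                                   # assumed stock 3080, RDR2 4K
cards = {
    "RTX 3080 (stock)": (699, baseline_fps),
    "RTX 3090 (stock)": (1499, baseline_fps + 8),   # "+8 fps" per the video
}
for name, (price, fps) in cards.items():
    print(f"{name}: {fps} fps, ${price / fps:.2f} per frame")

uplift = cards["RTX 3090 (stock)"][1] / cards["RTX 3080 (stock)"][1] - 1
ratio = cards["RTX 3090 (stock)"][0] / cards["RTX 3080 (stock)"][0]
print(f"~{uplift:.0%} more frames for {ratio:.1f}x the price")

Shuffle the assumed baseline however you like; the dollars-per-frame roughly doubles either way, which is the whole point.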
 