The Slowing Growth of vRam in Games

Do you think it will be sufficient at 3440 x 1440? Especially considering it can run Ray Tracing, DLSS 3 and Frame Gen for casual single player titles? Ideally I would wait for 50 series and GDDR7 but that won’t be for at least 18-24 months, maybe longer for mid-high tier gaming cards.

That resolution is between 1440p and 4k as far as pixel count goes. Throw in RT and other features and you will likely be out of gpu grunt before 12gb vram is a problem.
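
For what it's worth, here's the quick pixel-count math behind "between 1440p and 4K" (a throwaway Python sketch, nothing GPU-specific):

```
# Pixel counts for the resolutions being compared.
resolutions = {
    "2560x1440 (1440p)": 2560 * 1440,
    "3440x1440 (ultrawide)": 3440 * 1440,
    "3840x2160 (4K)": 3840 * 2160,
}

base = resolutions["2560x1440 (1440p)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1440p)")
# 3440x1440 is ~1.34x the pixels of 1440p and ~0.60x the pixels of 4K,
# so it sits well below 4K in raw rendering load.
```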
 
That resolution is between 1440p and 4k as far as pixel count goes. Throw in RT and other features and you will likely be out of gpu grunt before 12gb vram is a problem.
Yeah, I figured. People recommended the 7900 XT, but that card will have the same issue: it's most likely never going to be able to make effective use of 20GB of VRAM, since it won't have the "horsepower" for any game that actually requires that much.
 
Yeah, I figured. People recommended the 7900 XT, but that card will have the same issue: it's most likely never going to be able to make effective use of 20GB of VRAM, since it won't have the "horsepower" for any game that actually requires that much.
I think that depends.

If you're gaming at 4K, then the 7900 XT; if you're gaming at 1440p, then the 4070 Ti.
 
I think that depends.

If you're gaming at 4K, then the 7900 XT; if you're gaming at 1440p, then the 4070 Ti.
Yes, possibly at 4K, but I think in any game that requires 20GB of VRAM, the 7900 XT is going to run out of GPU power first; that is usually the case. Anyway, I have no desire to game at 4K, so it's not really an issue for me. But if I were gaming at 4K, I agree the 7900 XT is the better card for that resolution.
 
Fast storage should also help alleviate VRAM shortages by streaming assets in quickly when needed.

Personally, 12GB should be enough, since consoles are at their limits even with a few tricks. Going above console quality, 12GB may be more limiting. I still believe Nvidia has better internal data compression, and don't rule out AI-based compression methods Nvidia may incorporate.
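
Rough numbers behind the "consoles are at their limits" point, as a back-of-the-envelope sketch (the OS-reserve and split figures below are approximations, not official numbers):

```
# Rough console-vs-PC memory budget (approximate figures, not official numbers).
# Current consoles use one unified 16GB pool shared between "system RAM" and
# "VRAM" duties; a PC splits those across separate system RAM and GPU VRAM.

console_total_gb = 16.0          # PS5 / Xbox Series X unified memory
os_reserve_gb = 2.5              # rough OS/background reserve (assumption)
available_to_game = console_total_gb - os_reserve_gb

# Assume (hypothetically) about a third of that goes to CPU-side game data,
# leaving the rest for graphics assets.
cpu_side_gb = available_to_game / 3
gpu_side_gb = available_to_game - cpu_side_gb

print(f"Available to the game: ~{available_to_game:.1f} GB")
print(f"Roughly ~{gpu_side_gb:.1f} GB of that doing 'VRAM' work")
# So a 12GB card plus separate system RAM lands near console-level budgets,
# which is why 12GB reads as "enough for console settings, tight above them".
```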

I was running Witcher 3 at 3600x1600 on a 7900 XTX with FSR2 Quality and getting frame rates in the 70s with RT High. It was using 11.9GB of VRAM. A much more capable RT card like the 4090 at 4K would use far more while performing better.

The 4070 Ti, to me, is borderline and likely to become limiting in certain scenarios, mostly due to the VRAM amount.
 
3070 was such a bad buy in hindsight. They were getting pushed way too hard during the crunch. Imagine dumping $600 for 8gb of vram, I swear there were people passing up 6800xts for similar prices to go with the 3070. Same with 6700xt vs 3060.
 
3070 was such a bad buy in hindsight. They were getting pushed way too hard during the crunch. Imagine dumping $600 for 8gb of vram, I swear there were people passing up 6800xts for similar prices to go with the 3070. Same with 6700xt vs 3060.
And it's going to be the same problem with 12GB of VRAM... Nvidia really does know how to fleece customers into upgrading lol
 
*when buying from certain companies with lower vram.

*Edit* The salt over at r/nvidia over this video is pretty amusing.
Yeah well we even had people here defending the 8gb. Glad to see someone did testing to find out it’s more than 3 games.

Again, the 3070 could play those games at max settings at 1080p... IF it had more than 8GB of memory
 
I mean it's definitely a bad look for Nvidia, that's pretty obvious; at the same time it's only 5 games, 2 of which are very unoptimized. But it's definitely a sign of what's to come with this new generation of consoles, and their mid-gen refreshes will only be more capable. So much for grabbing a 70 Ti… it's a large amount of BS having to drop almost $1300 to get a long-term Nvidia card if you're gaming above 1080p.
I think "very unoptimized" is an excuse for people who can't run it the way they want because of inferior VRAM. It doesn't matter how anyone tries to spin it, 8GB is now low end. The same thing will happen with 12GB in 1.5-2 years as well.

Why pay $800 for a video card you expect to play games at max graphics, only to find out soon that you can't because it's lacking VRAM? The GPU could do it, but the company shorted you on VRAM.
 
I think 12GB will be the new 8GB from a few years ago...will get you by in most games but will hold you back in others...it's on the edge of being good yet obsolete at the same time
It's definitely possible. I do think we need a larger sample size of games built on next-gen engines (UE5 etc.) before we can definitively say 12GB of VRAM will become a big issue at 1080p/1440p. At 4K it's a pretty safe bet that it will become insufficient in the next few years (on GDDR6/6X at least), especially on a 192-bit bus. Hopefully as more developers become familiar with the new engines we'll get better-optimized games, though there will always be outliers.
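
On the 192-bit bus point, peak memory bandwidth is just the bus width times the effective data rate; here's a quick sketch, treating the commonly quoted 21 Gbps GDDR6X speed as an assumption:

```
# Peak theoretical memory bandwidth = (bus width in bits / 8) * effective data rate (Gbps).
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 192-bit vs 256-bit at an assumed 21 Gbps effective rate.
print(peak_bandwidth_gb_s(192, 21.0))  # 504.0 GB/s
print(peak_bandwidth_gb_s(256, 21.0))  # 672.0 GB/s
```

The big L2 cache on the 40 series is meant to offset the narrower bus, which is part of why raw bandwidth alone doesn't settle the question.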
 
16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit



Shiiit. I am sad (and furious) to see my 3070 becoming outdated this quickly, and only because of VRAM, not GPU performance itself. Not that there is much help from the 4000 series; Nvidia is still being stingy about VRAM amounts.

I am surprised how much VRAM games use these days. When the 3070 was released you could play Doom Eternal at 4K at almost max settings and it ran really fast. That was without ray tracing, but the game still looks stunning and 8GB was (barely) enough. Now the 3070 struggles at 1440p even with ray tracing disabled. 😭 *Edit* struggling as in barely capable of a stable 60fps.
 
Shiiit. I am sad (and furious) to see my 3070 becoming outdated this quickly, and only because of VRAM, not GPU performance itself. Not that there is much help from the 4000 series; Nvidia is still being stingy about VRAM amounts.

I am surprised how much VRAM games use these days. When the 3070 was released you could play Doom Eternal at 4K at almost max settings and it ran really fast. That was without ray tracing, but the game still looks stunning and 8GB was (barely) enough. Now the 3070 struggles at 1440p even with ray tracing disabled. 😭 *Edit* struggling as in barely capable of a stable 60fps.

The Hardware Unboxed video even mentions that if the 3070 had more VRAM (16GB) it would be a much better performer... it's strictly the VRAM that's holding it back... they even mentioned that the 3060 with 12GB of VRAM is expected to age better.
 
Before and after the Ampere launch there were users saying the low VRAM on Nvidia's cards was fine due to memory bandwidth, and for a time they managed to persuade even me, but it's now apparent VRAM is becoming a bottleneck where various cards could otherwise have stayed performant. I have no problem saying that as an Ampere owner (though the 3080 has a little more headroom).

As much as Hardware Unboxed are known to have an AMD bias the benchmarks from them and others show the same thing. Like, part of the point of buying a higher end card is being able to play at the highest quality (within reason) but two and a half years in with such signs isn't a great trajectory. Not sure what's controversial about admitting that.

I think part of it is for a good chunk of that time it was hard in some places to obtain a card due to the GPU shortages and often inflated prices so there's natural instinct to temper critique.
 
AMD showing off...

https://community.amd.com/t5/gaming/building-an-enthusiast-pc/ba-p/599407

to be fair the vast majority of titles using tons of VRAM are AMD sponsored titles (Last of Us, Resident Evil 4)

 
Before and after the Ampere launch there were users saying the low VRAM on Nvidia's cards was fine due to memory bandwidth, and for a time they managed to persuade even me, but it's now apparent VRAM is becoming a bottleneck where various cards could otherwise have stayed performant. I have no problem saying that as an Ampere owner (though the 3080 has a little more headroom).

As much as Hardware Unboxed are known to have an AMD bias the benchmarks from them and others show the same thing. Like, part of the point of buying a higher end card is being able to play at the highest quality (within reason) but two and a half years in with such signs isn't a great trajectory. Not sure what's controversial about admitting that.

I think part of it is for a good chunk of that time it was hard in some places to obtain a card due to the GPU shortages and often inflated prices so there's natural instinct to temper critique.
Not sure how they have an AMD bias considering how many times they called out AMD on their bullshit. Just like calling out Nvidia on their bullshit.

They are only biased if you favor one company over another. If anything Hardware unboxed is one of the more unbiased reviewers on the net.

Did you miss the last review where they explained how much better DLSS is than FSR, to the point it wasn't even close in some regards? Totally biased toward AMD lol
 
Not sure how they have an AMD bias considering how many times they called out AMD on their bullshit. Just like calling out Nvidia on their bullshit.
To be clear, I don't think it colors the objective side of things like benchmarks but I'm not sure how it can't be seen as a preference, at least from videos and channel pages I've seen. There seems to be a tendency to be harsher with critique of Nvidia with features/issues/opinion pieces, where it's also characterized/worded a certain way, while for AMD there's less overt negativity and the language used is softer. Also their thumbnails are treated differently between the two.

If I compare to say Gamers Nexus I haven't observed such things with their coverage of both brands. I don't care that much either way though as I'm generally only interested in figures for things.
 
I'm kinda in the same boat here, my buddy wants to build a new rig. It's going to be a 7800X3D and we're struggling to decide on the GPU. He doesn't want to spend more than ~$1000 on it, but he has a 34" Alienware Gsync display (real Gsync module), so to take advantage of that he needs an Nvidia GPU. The problem is at $1000 it's 4070ti vs 7900XTX and they aren't even close, and he tends to keep things a long time - his current card is a 2070. 12GB on the 4070ti 3-4 years down the road is likely to be in the same spot as the 8GB cards now.
 
To be clear, I don't think it colors the objective side of things like benchmarks but I'm not sure how it can't be seen as a preference, at least from videos and channel pages I've seen. There seems to be a tendency to be harsher with critique of Nvidia with features/issues/opinion pieces, where it's also characterized/worded a certain way, while for AMD there's less overt negativity and the language used is softer. Also their thumbnails are treated differently between the two.

If I compare to say Gamers Nexus I haven't observed such things with their coverage of both brands. I don't care that much either way though as I'm generally only interested in figures for things.
Funny because 99% of the time hardware unboxed and gamers nexus usually end with the same conclusion. I think it’s more that people are biased when watching the videos then get upset when their preferred company is shown in a bad light.

Sorry but there have been many times they praised Nvidia. Lol
 
I'm kinda in the same boat here, my buddy wants to build a new rig. It's going to be a 7800X3D and we're struggling to decide on the GPU. He doesn't want to spend more than ~$1000 on it, but he has a 34" Alienware Gsync display (real Gsync module), so to take advantage of that he needs an Nvidia GPU. The problem is at $1000 it's 4070ti vs 7900XTX and they aren't even close, and he tends to keep things a long time - his current card is a 2070. 12GB on the 4070ti 3-4 years down the road is likely to be in the same spot as the 8GB cards now.
You could look at buying a used 3090 or 3090 Ti then, both with 24GB of memory. But yeah, he will need to spend more and get a 4080, as IMO 12GB of memory should be entry level at this point.
 
You could look at buying a used 3090 or 3090 Ti then, both with 24GB of memory. But yeah, he will need to spend more and get a 4080, as IMO 12GB of memory should be entry level at this point.

I should specify that only new components are considered otherwise I would just hook him up with my lightly used 6900XT.
 
I should specify that only new components are considered otherwise I would just hook him up with my lightly used 6900XT.
Sounds like he will need to buy a 4070ti then. Since he has a g-sync module monitor....he's stuck in the Nvidia ecosystem.

Otherwise there are plenty of other options at a much lower price compared to the 4080. 6950XT for $600, 7900XT for $800, or 7900XTX for $1000. All with more vram.
 
Sounds like he will need to buy a 4070ti then. Since he has a g-sync module monitor....he's stuck in the Nvidia ecosystem.

Otherwise there are plenty of other options at a much lower price compared to the 4080. 6950XT for $600, 7900XT for $800, or 7900XTX for $1000. All with more vram.

Gsync monitors will work with AMD cards, but as a normal monitor. However, this guy didn't even know it had Gsync to begin with despite having it for almost 5 years now, so I don't think that will be an issue. That's why I'm focusing on the 7900XTX: for $100 more than a 4070ti it's a whole lot more card.
 
Gsync monitors will work with AMD cards, but as a normal monitor. However, this guy didn't even know it had Gsync to begin with despite having it for almost 5 years now, so I don't think that will be an issue. That's why I'm focusing on the 7900XTX: for $100 more than a 4070ti it's a whole lot more card.

Sometimes those monitors are also compatible with Freesync; my monitor, for instance, will do both.
 
Sometimes those monitors are also compatible with Freesync; my monitor, for instance, will do both.

I thought that was always the case? Or am I mistaken? Never used gsync, so I can't say for sure.

Yeah, this monitor has an original Gsync module so unfortunately it doesn't. The very very expensive monitors with the Gsync Pro module will do freesync as well, and ones listed as "gsync compatible" just do adaptive sync with no module.
 
The question now becomes 4070 vs 4070 Ti? For non competitive gaming at 3440 x 1440…I will be using the DLSS 3/Frame Gen features in games that support it. I’m leaning towards the 4070 as it has the same memory configuration. Launches tomorrow for $599 and there will be a FE.

Update: Ended up buying the ASUS TUF 4070 Ti for $799. After reading that DLSS 3 will be available to devs in UE4/UE5 and Unity plus more engines (in-engine), I have no concern about 12GB with the extra L2 cache gaming at 3440 x 1440 (at 4K maybe I'd be more hesitant, but it's even running well there as of now). Can't wait to pick it up next week from Best Buy! VRAM panic anyone?
 
The question now becomes 4070 vs 4070 Ti? For non competitive gaming at 3440 x 1440…I will be using the DLSS 3/Frame Gen features in games that support it. I’m leaning towards the 4070 as it has the same memory configuration. Launches tomorrow for $599 and there will be a FE.

Update: Ended up buying the ASUS TUF 4070 Ti for $799. After reading that DLSS 3 will be available to devs in UE4/UE5 and Unity plus more engines (in-engine), I have no concern about 12GB with the extra L2 cache gaming at 3440 x 1440 (at 4K maybe I'd be more hesitant, but it's even running well there as of now). Can't wait to pick it up next week from Best Buy! VRAM panic anyone?

Probably the best choice; the 4070ti is quite a bit faster than the 4070. It's not really VRAM panic, it's just that the emerging game engines are very VRAM hungry and it's only going to get worse as features get added. People who don't want to buy a GPU every 1-2 years are just trying to think ahead, is all.
 
Yeah well we even had people here defending the 8gb. Glad to see someone did testing to find out it’s more than 3 games.

Again, the 3070 could play those games at max settings at 1080p... IF it had more than 8GB of memory
My GTX 970 could run Middle-Earth: Shadow of Mordor at max settings at 1440p... IF it had more than 3.5GB of memory
 
Gsync monitors will work with AMD cards, but as a normal monitor. However, this guy didn't even know it had Gsync to begin with despite having it for almost 5 years now, so I don't think that will be an issue. That's why I'm focusing on the 7900XTX: for $100 more than a 4070ti it's a whole lot more card.
If he didn't know that going on 5 years I have to wonder if he has a clue. Must have never gone into nvidia CP/exp which is very unusual unless he's borderline computer illiterate.
 
Probably the best choice; the 4070ti is quite a bit faster than the 4070. It's not really VRAM panic, it's just that the emerging game engines are very VRAM hungry and it's only going to get worse as features get added. People who don't want to buy a GPU every 1-2 years are just trying to think ahead, is all.
There have been very few cases where an older GPU could play games that came out years later at higher settings but didn't have enough VRAM. Everyone keeps getting hung up simply on the RAM amount and not on VRAM relative to performance, which is what actually matters.
 
There have been very few cases where an older GPU could play games that came out years later at higher settings but didn't have enough VRAM. Everyone keeps getting hung up simply on the RAM amount and not on VRAM relative to performance, which is what actually matters.

Very few cases? It's a very normal scenario we've seen a lot over the years when moving to major increments of VRAM: 512MB to 1GB, 1GB to 2GB, 2GB to 4GB, 4GB to 8GB, etc. Some of you keep getting hung up on people wanting to make wise decisions about spending money on a product that will last 4-5 years instead of just sucking the 4070ti's short little green pp. How many times do we have to fall for this trick? 970 anyone?
 
There have been very few cases where an older GPU could play games that came out years later at higher settings but didn't have enough VRAM. Everyone keeps getting hung up simply on the RAM amount and not on VRAM relative to performance, which is what actually matters.
That is exactly the case with the 3060, 3060 Ti, 3070 and 3070 Ti: a more powerful GPU than a console that stumbles on console ports due to insufficient VRAM.

I would expect mid-tier, if not lower-end, cards to be able to match the VRAM of the current-generation consoles, and high-end and above to have more. Nvidia's first card in the 40 series with more VRAM than the consoles (PS5, Xbox Series X) costs over 2x more than the consoles, and with barely more VRAM.

Is 12GB enough? I think it is, but not for max settings; it may have to use settings that don't match console quality.
 
That is exactly the case with the 3060, 3060 Ti, 3070 and 3070 Ti: a more powerful GPU than a console that stumbles on console ports due to insufficient VRAM.

I would expect mid-tier, if not lower-end, cards to be able to match the VRAM of the current-generation consoles, and high-end and above to have more. Nvidia's first card in the 40 series with more VRAM than the consoles (PS5, Xbox Series X) costs over 2x more than the consoles, and with barely more VRAM.

Is 12GB enough? I think it is, but not for max settings; it may have to use settings that don't match console quality.
That has yet to be seen. Consoles are not going to have anywhere near the RT ability you can get on PC. And from what people have posted that's inflating the vram requirements.
 