
GeForce RTX 5060 Ti 8GB performance revealed (The review that NVIDIA doesn't want you to see)

8GB of VRAM seemed insane (like, good) even in 2019 on my RX 580, at a time when plenty of people were rocking a 1060 3GB, and my last card was a GTX 680 4GB.
 
I just don't get their reasoning here from any perspective... a lower-tier card than the x070 series either ends up with more VRAM than it or gets a version with obviously inadequate VRAM. Why dafuq did they not just make it 12GB and charge the same or slightly less than the 16GB variant? The 8GB card is almost e-waste with how much it gimps the GPU.
 
We'll see about the price of the upcoming 9060 XT 8GB version, because it is another product that could be hard to price well here.

A 12GB version would make significantly more sense and must be incoming. You can still make a nice 8GB 1080p card, but you need a good price point; the 5060 Ti is just too mid-priced, instead of low, to be one.

Why dafuq did they not just make it 12GB
Must be that 3GB modules were not ready in time or in volume; a 12GB 5060, a 12GB/16GB 5060 Ti, and an 18GB 5070 all sound like a way nicer lineup. That said, the 4060 Ti 16GB / 4070 12GB was a thing, as were the 3060 at 12GB and the 3070-3080 at 8-10GB, so maybe not; maybe it was the original plan all along.
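A rough sketch of the capacity arithmetic behind that kind of lineup (rule of thumb only: each 32 bits of bus drives one GDDR chip, and clamshell mounts two chips per channel; the configs in the comments are just my illustration):

```python
# VRAM capacity = number of 32-bit memory chips * density per chip.
# Clamshell mode puts two chips on each 32-bit channel, doubling capacity.

def vram_gb(bus_width_bits, chip_gb, clamshell=False):
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2
    return chips * chip_gb

print(vram_gb(128, 2))                   # 128-bit, 2GB chips: 8 GB (5060 Ti 8GB today)
print(vram_gb(128, 2, clamshell=True))   # 128-bit clamshell, 2GB chips: 16 GB (5060 Ti 16GB)
print(vram_gb(128, 3))                   # 128-bit with 3GB GDDR7 chips: 12 GB
print(vram_gb(192, 3))                   # 192-bit (5070-class) with 3GB chips: 18 GB
```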

At least it seems that way, according to markets where cards are actually available:

https://www.overclockers.co.uk/?query=5060ti&availability_status[]=available
https://www.canadacomputers.com/en/search?s=5060+ti+


They seem to be making more of the 16GB than the 8GB version...
 


As an owner of a 3070, the 8GB VRAM struggles are real, and this is a perfect video to highlight how and why. Most interestingly, did everybody notice how the 16GB card performed better than the 8GB one even when the 8GB version was not obviously VRAM limited after settings were lowered? Extra VRAM helps performance because the card doesn't have to waste resources shuffling data in and out of VRAM all the time. My 3070's chip could do better, but it runs with a constant VRAM handbrake on, so it cannot offer the best performance it is technically capable of. This 5060 Ti 8GB vs 16GB comparison is perfect proof of why having enough VRAM matters, no matter what resolution you are gaming at.
 
Here comes ZeroBarrier to say that the GeForce RTX 5060 Ti 8GB is fine.

All you have to do is play at 1080p on low quality.
Wait, you agree with testing a 1080p card at 4K and calling it a failure because it fails at a resolution it clearly wasn't designed for?

This forum used to have smart, level-headed members. Oh, how the ignore list grows.
 
As an owner of a 3070, the 8GB VRAM struggles are real, and this is a perfect video to highlight how and why. Most interestingly, did everybody notice how the 16GB card performed better than the 8GB one even when the 8GB version was not obviously VRAM limited after settings were lowered? Extra VRAM helps performance because the card doesn't have to waste resources shuffling data in and out of VRAM all the time. My 3070's chip could do better, but it runs with a constant VRAM handbrake on, so it cannot offer the best performance it is technically capable of. This 5060 Ti 8GB vs 16GB comparison is perfect proof of why having enough VRAM matters, no matter what resolution you are gaming at.
That's honestly surprising; I have a mobile 3070 that only has access to 8GB and it runs games just fine at proper settings and resolutions. Have you tried customizing game settings down some?
 
Wait, you agree with testing a 1080p card at 4K and calling it a failure because it fails at a resolution it clearly wasn't designed for?
DLSS is good enough now that testing upscaled to 4K does make sense (people with mid-range PCs still tend to have 4K TVs). All the examples in that video are upscaled to 4K; they are not trying to run games at native 4K with RT and ultra settings.

Hardware Unboxed tends to be pretty good overall. A 5060 Ti maybe does not sound like a 4K card (even upscaled), but they showed issues in 1080p medium/high settings situations, and it is not even at the highest 1080p settings that issues can pop up now: even the 1440p medium preset for Indiana Jones does not work well at all.

8GB not being a big issue for 1080p gamers was true in 2018-2020 and borderline in 2022; in 2025 you need a low-priced product for it to make sense. The PlayStation 6 will launch during the expected active lifetime of an over-$350 GPU purchased this summer. And not only the PS6: 12GB VRAM PC GPUs could/should become common within that generation's expected useful life as well, from the Intel B580 and its next generation, AMD cards above the 9060 XT 8GB, the popular old 3060 12GB, the upcoming 5060 12GB, and the 5060 Ti 16GB; once 3, 4, and 6 GB VRAM modules come online, the stagnation could stop. 8GB VRAM was safe in 2018-2020 because Nvidia did not deprecate it for so long and gave it a really nice, long life; those cards are the reason game devs had to make every popular game fit into 8GB.

A 5050 at $250 or less could be fine, marketed as a 1080p-medium, not-latest-AAA card. Of course, there is a giant market of low-demand e-sports players who do not need much VRAM (they run simple games at low detail for maximum clarity and latency) but still want high fps, which makes a low-VRAM but still powerful GPU attractive at the right price; it just needs to be marketed as such.
 

Well, that was a disaster, and no doubt nVidia knew this. I really don't get why they are dead set on 128-bit for the 60 series now. Even 160-bit at 10GB would have been worlds better, as we have seen with the 3080's performance, and it seems they want nothing to do with GDDR6. Then use a cut-down 128-bit config for the 5060 with 8GB and 16GB options. That would have carried over nicely to a 15GB 5060 Ti Super and a 12GB 5060 Super next year.

10GB seems like a target for game developers for a while, as it is about the actual available VRAM of the Series X.
 
They would have needed to know in advance that 3GB RAM modules would not be ready, no?

Those Super revisions could happen quite fast...
 
Wait, you agree with testing a 1080p card at 4K and calling it a failure because it fails at a resolution it clearly wasn't designed for?

This forum used to have smart, level-headed members. Oh, how the ignore list grows.
Calling it a "1080p" card is a bit silly, as the 16GB version of the exact same card (same cooler, clocks, power etc) can run DLSS 1440p and 4K very acceptably in many games where the 8GB cannot.

So, if removing half the VRAM actually impacts performance in a noticeable way, taking an acceptable performance level and changing it into an unacceptable performance level, then you can't really say that the RAM bottleneck is only present in situations where the GPU isn't strong enough to utilise it.
 
That's honestly surprising; I have a mobile 3070 that only has access to 8GB and it runs games just fine at proper settings and resolutions. Have you tried customizing game settings down some?

Of course I have, but there is only so much one can do before the image quality goes down the drain so far that the game becomes "unplayable", especially since I use a huge 50" 4K TV as a monitor. That was not the point; the point was that I know the GPU could do better, but the amount of VRAM it has does not allow it.
 
Calling it a "1080p" card is a bit silly, as the 16GB version of the exact same card (same cooler, clocks, power etc) can run DLSS 1440p and 4K very acceptably in many games where the 8GB cannot.

So, if removing half the VRAM actually impacts performance in a noticeable way, taking an acceptable performance level and changing it into an unacceptable performance level, then you can't really say that the RAM bottleneck is only present in situations where the GPU isn't strong enough to utilise it.

Even calling it a 1080p card is a stretch unless one considers buying a new GPU in 2025 to play some games at Medium graphics settings a good idea. 🤦
 
Even calling it a 1080p card is a stretch unless one considers buying a new GPU in 2025 to play some games at Medium graphics settings a good idea. 🤦
As well as 144Hz 1440p screens being incredibly cheap, and many 4K 120+Hz screens being cheaper than the 5060 Ti itself.


If the 8GB version struggles in easy-to-replicate situations where the 16GB version does not: 8GB simply is not enough. There is a point where a RAM upgrade will offer no improvement in any modern use-case, such as decking the 5060 with 128GB of RAM. There are probably very VERY few situations where the hypothetical 128GB card performs noticeably (or even measurably) better. Mostly because 16GB is 'enough' for the modern game workloads that this GPU has the power to be effective in. But 8GB is not enough for the modern game workloads that this GPU has the power to otherwise be effective in.

If the 16GB version can play a game at 4K ~60FPS and the 8GB version cannot: no amount of calling it "a 1080p card" can justify it.
 
I noticed while watching that video that only a handful of games went over 12GB. It would be fun to see a comparison like that with a "$250" 12GB ARC B580 beating up on a "$380" 8GB RTX 5060Ti.

I'm curious how much worse a game would look if they tuned the settings on the 8GB card to run at about the same fps as the 16GB card. Lower textures or...?
Lower textures would often be the starting point, but there are lots of effects that consume quite a lot of memory as well. E.g. you can turn off some shader effects and gain more available memory, but a lot of the time the effects you turn off make the image significantly worse while their GPU cost is often only low to medium. Even a 10GB 3080 is starting to struggle in quite a few games at 1440p, and the difference between 1080p and 1440p is often around 1GB of memory. The chip is capable of good framerates, but the lack of memory forces a choice between textures and shaders in quite a few games.
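As a back-of-the-envelope illustration of where that roughly one-gigabyte jump between 1080p and 1440p can come from (all the buffer and streaming-pool figures below are my own assumptions, not numbers from the video): screen-sized render targets grow with pixel count, and many engines also scale their texture streaming budget with output resolution.

```python
# Rough, illustrative estimate of resolution-dependent VRAM use (made-up inputs).
RES = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

RT_BYTES_PER_PIXEL = 60      # assumed total for G-buffer layers, HDR color, depth, TAA history, post chains
STREAM_POOL_1080P_GB = 1.5   # assumed texture streaming pool at 1080p

def resolution_dependent_gb(pixels):
    scale = pixels / RES["1080p"]
    render_targets = RT_BYTES_PER_PIXEL * pixels / 1024**3
    stream_pool = STREAM_POOL_1080P_GB * scale   # assumed to scale roughly linearly with resolution
    return render_targets + stream_pool

for name, px in RES.items():
    print(f"{name}: ~{resolution_dependent_gb(px):.1f} GB of resolution-dependent usage")
```

With these made-up but plausible inputs, 1440p lands a bit over a gigabyte above 1080p, roughly in line with the 1GB figure above.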

The scary scenario for 8GB cards is that quite a few game developers have already started saying they are tired of the 8GB standard, which has been around since 2016, and have started to use more memory even at medium settings. My impression is that devs are targeting 12GB as lower mid-range and that any 8GB GPU will soon be treated as a low-spec entry-level card. The 8GB cards might only be good for low settings in a few years, and gamers will be wondering why they can run 100fps at low but only a stuttering 30fps mess at medium on their obsolete 8GB cards.
 
This card should really not exist, or cost far far less than the 16GB one at least. And for sure it needs a different name.

Seriously, even people on a budget are buying 1440p 144hz+ monitors nowadays. The LCD variants are dirt cheap. Like 200-300 bucks, sometimes even less.
 
This card should really not exist, or cost far far less than the 16GB one at least. And for sure it needs a different name.

Seriously, even people on a budget are buying 1440p 144hz+ monitors nowadays. The LCD variants are dirt cheap. Like 200-300 bucks, sometimes even less.
I got my 1440p 180Hz for $150, although it was on like a Micro Center super sale.
 
I noticed while watching that video that only a handful of games went over 12GB. It would be fun to see a comparison like that with a "$250" 12GB ARC B580 beating up on a "$380" 8GB RTX 5060Ti.

I'm curious how much worse a game would look if they tuned the settings on the 8GB card to run at about the same fps as the 16GB card. Lower textures or...?
The only ones I saw showing over 10GB used were The Last of Us Part 2 and Spider-Man 2, and even then that's no guarantee a 10GB card would fall on its face, since the 8GB card was on par in certain scenarios where the 16GB card showed 8.5GB of usage.

8 GB is definitely too low, but it's only by like 1 or 2 GB.
 
The only ones I saw showing over 10GB used were The Last of Us Part 2 and Spider-Man 2, and even then that's no guarantee a 10GB card would fall on its face, since the 8GB card was on par in certain scenarios where the 16GB card showed 8.5GB of usage.

8 GB is definitely too low, but it's only by like 1 or 2 GB.
Which makes it all the more questionable for me; so close, yet so far.
 
The only ones I saw showing over 10GB used were The Last of Us Part 2 and Spider-Man 2, and even then that's no guarantee a 10GB card would fall on its face, since the 8GB card was on par in certain scenarios where the 16GB card showed 8.5GB of usage.

8 GB is definitely too low, but it's only by like 1 or 2 GB.
Not by 1 or 2 GB, more like 4GB. Even a 3080 with 10GB has issues nowadays at 1440p and is borderline OK for 1080p in some of the latest titles. People buying a GPU have a longer perspective than just right now; they want to keep their GPU for a few years. The VRAM issue will get a lot worse in the next few years.
 
Not by 1 or 2 GB, more like 4GB. Even a 3080 with 10GB has issues nowadays at 1440p and is borderline OK for 1080p in some of the latest titles. People buying a GPU have a longer perspective than just right now; they want to keep their GPU for a few years. The VRAM issue will get a lot worse in the next few years.
I'm talking about scenarios that are usable for 5060 Ti / 3080 class cards, not future games at 4K with RT. Those cards will likely have to play future games at reduced settings anyhow. Again, I haven't really seen a reputable review benchmark of the 3080 taking a dump because of its 10GB of VRAM.
 
I'm talking about scenarios that are usable for 5060 Ti / 3080 class cards, not future games at 4K with RT. Those cards will likely have to play future games at reduced settings anyhow. Again, I haven't really seen a reputable review benchmark of the 3080 taking a dump because of its 10GB of VRAM.
Both the 5060 Ti and 3080 chips are fast enough for 1440p rendering, but there are already quite a few games where you have to turn down settings on the 3080 at 1440p or get massive stutters every now and then. I retired my 3080 and got a 5070 Ti because I got tired of having to fine-tune settings or run 120fps on medium in several games.

E.g. I played Indiana Jones and the Great Circle on my secondary rig with a 7900 GRE 16GB because the 3080 had loads of texture loading/pop-in issues at 1440p due to lack of VRAM for the texture cache. The 3080 would be faster if it had enough VRAM, since the 7900 GRE is weak at RT. Far Cry 6 was either high-res textures or RT due to VRAM limitations on the 3080 at 1440p, even though the card could do both with no issues at all if it had enough VRAM. Lots of games will downscale textures when they notice your GPU running out of VRAM, so it may look fine in the FPS department while your graphics quality takes a nosedive.

There are other games as well where the 3080 struggles with VRAM at 1440p even if the chip is capable of more, and some of them are a few years old. The maximum amount of VRAM available to games is typically your VRAM minus about 800MB, and out of that you typically want around 400MB left over for loading new textures etc. to avoid stutter.
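A tiny sketch of that budget rule of thumb, using the figures from the post above (the 800MB and 400MB numbers are rough; real overhead varies with the OS and whatever else is running):

```python
OS_OVERHEAD_GB = 0.8       # VRAM typically held by the OS, compositor and other apps (rough figure)
STREAM_HEADROOM_GB = 0.4   # free space you want to keep for streaming in new textures (rough figure)

def game_vram_budget(card_vram_gb):
    usable = card_vram_gb - OS_OVERHEAD_GB        # what a game can realistically allocate
    comfortable = usable - STREAM_HEADROOM_GB     # what it can fill before stutter becomes a risk
    return usable, comfortable

for vram in (8, 10, 12, 16):
    usable, comfortable = game_vram_budget(vram)
    print(f"{vram:>2} GB card: ~{usable:.1f} GB usable, ~{comfortable:.1f} GB before stutter risk")
```

So an 8GB card really only has around 6.8GB to play with before the streaming headroom is gone, which is why it falls over in titles that sit comfortably on the 16GB model.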

The issues with VRAM at 8GB are real right now at 1080p, and moving to 10GB won't help all that much when games are being optimized for 12GB on high and 16GB on very high. Basically you will end up with a mix of low and medium settings at 100fps just to avoid the massive stutters that come with running out of VRAM. Look at the Hardware Unboxed video, where the 16GB model runs great at high or very high settings in some of the titles at 1080p while the 8GB version has massive 1% low drops or crashes at the same settings. You might as well get a 5050 if you want an 8GB card, as you will have massive issues either way, and going to 10GB doesn't help all that much.
 
This thread page is nearly 5 years old, with discussion on how the 3080 would hold up:
https://hardforum.com/threads/the-slowing-growth-of-vram-in-games.1971558/page-3

Five years was once a very long time in the tech world, with massive changes.
The 3080 has done great in that time and 10GB has been a sizeable advantage over 8GB.

All things considered, the 3080 has lived a long life with its VRAM size. New 12GB cards at this performance tier will hold up for a long time as well.
 
the 3080 has lived a long life
It often beats the 6900 XT in new games in 4K ultra-settings minimum fps (not that either card can necessarily run them; it's just an extreme case used to push VRAM), which is not necessarily how people in 2020 would have predicted things would look 4-5 years later:
https://www.techpowerup.com/review/stalker-2-fps-performance-benchmark/5.html
https://www.techpowerup.com/review/the-last-of-us-part-2-performance-benchmark/5.html
https://www.techpowerup.com/review/kingdom-come-deliverance-ii-performance-benchmark/5.html

Could be a bit of a self-fulfilling prophecy, too.

So it is a bit mixed. In a sense it was less of a big issue than many expected, but the second factor in whether 10GB was an issue for the 3080 is how long those cards stay relevant and need to perform. There is a world where the new 9060 XT and 5060 make the 3080 irrelevant, turning it into a used $180 card; if 10GB was a mistake, no big deal, just buy a $300 GPU that is better than a 16GB 3080 would have been anyway (like how the 3060 was 25% faster than the 1080). But that is not how silicon evolved.
 
A bit disingenuous, since I have yet to see a single game that has these issues when playing at 1080p at medium settings. You know, where these GPUs are really meant to compete.

Honestly, performance reviews of lower-spec GPUs make me really miss [H] reviews, where they would conclude with what resolution and settings gamers could expect to use for acceptable performance. All these tech tubers give me the vibes of the meme below so much. It's almost like they intentionally ignore the fact that a Prius-level GPU isn't designed to compete in a quarter-mile drag race.

View attachment 723997
Funny meme, and agreed that resolution expectations need to be set.

At the same time...there's a 16GB version. Why even release an 8GB model of the same card at that point? It's still a $400+ real world cost card. At that price level, yeah I'd say 8GB is dumb in 2025 and Nvidia could have just saved some face by not even making an 8GB version.
 
Funny meme, and agreed that resolution expectations need to be set.

At the same time...there's a 16GB version. Why even release an 8GB model of the same card at that point? It's still a $400+ real world cost card. At that price level, yeah I'd say 8GB is dumb in 2025 and Nvidia could have just saved some face by not even making an 8GB version.
Because there is a portion of consumers that will always look for the most inexpensive product, even if it's technically not worth the trade-off versus the product one or two tiers higher. Nvidia would be a fool to forgo those consumers to AMD when they already know how many of them are out there. We ourselves see it in the Steam hardware charts, where the xx60 class card always has the highest share of users by a substantial margin.

Edit: And again, those same consumers do not run into these made-up issues because they aren't trying to game outside 1080p and medium settings.
 
Because there is a portion of consumers that will always look for the most inexpensive product, even if it's technically not worth the trade-off versus the product one or two tiers higher. Nvidia would be a fool to forgo those consumers to AMD when they already know how many of them are out there. We ourselves see it in the Steam hardware charts, where the xx60 class card always has the highest share of users by a substantial margin.

Edit: And again, those same consumers do not run into these made-up issues because they aren't trying to game outside 1080p and medium settings.

The regular RTX 5060 exists too though. I feel like they could just ignore the 5060 Ti 8GB and just leave the 5060 8GB and 5060 Ti 16GB SKUs around and that should be enough to cover the $300-$450 range.
 
The regular RTX 5060 exists too though. I feel like they could just ignore the 5060 Ti 8GB and just leave the 5060 8GB and 5060 Ti 16GB SKUs around and that should be enough to cover the $300-$450 range.
The 5060 is just going to be a 4060 Ti 8GB at best. Really wish nVidia would make a 96-bit 12GB card. GDDR7 has plenty of bandwidth to compensate.
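Rough numbers behind that thought (the data rates are from memory, so treat them as approximate): bandwidth in GB/s is (bus width / 8) times the per-pin data rate in Gbps, and a 96-bit card would get to 12GB by clamshelling six 2GB chips, while three of the upcoming 3GB chips would only give 9GB.

```python
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    # GB/s = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(128, 18))   # 4060 Ti, 128-bit GDDR6 @ ~18 Gbps -> 288 GB/s
print(bandwidth_gb_s(128, 28))   # 5060 Ti, 128-bit GDDR7 @ ~28 Gbps -> 448 GB/s
print(bandwidth_gb_s(96, 28))    # hypothetical 96-bit GDDR7 card    -> 336 GB/s

print(6 * 2)   # 12 GB via six clamshelled 2GB chips on 96-bit
print(3 * 3)   # 9 GB via three 3GB chips on 96-bit
```

Even cut down to 96-bit, GDDR7 would still clear the 4060 Ti's bandwidth, which is the point about having room to compensate.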

The 6700 XT and 7700 XT are both better options than Nvidia's new 8GB offerings at this point, and we haven't even seen what AMD's new midrange cards can do yet.
 