8GB of VRAM seemed insane (like, in a good way) even in 2019 on my RX 580, at a time when plenty of people were still rocking a 1060 3GB, and my previous card was a GTX 680 4GB.
> Why dafuq did they not just make it 12GB?
Must be that 3GB GDDR7 modules weren't ready in time or in volume. A 12GB 5060, a 12GB/16GB 5060 Ti, and an 18GB 5070 would all sound like a way nicer lineup. That said, the 4060 Ti 16GB next to the 4070 12GB was a thing, as was the 3060 12GB next to the 3070-3080 at 8-10GB, so maybe not; maybe it was the original plan all along.
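Rough math behind that wished-for lineup, as a sketch: each GDDR module hangs off a 32-bit slice of the memory bus, so capacity is (bus width / 32) times module density. The bus widths below are the published specs; the 3GB-module timing is pure speculation on my part.

```python
# Hedged sketch: why 3GB GDDR7 modules would turn this lineup into 12/18GB cards.
# Bus widths are the published specs; module availability is speculation.

def capacity_gb(bus_width_bits: int, module_gb: int) -> int:
    # Each GDDR module occupies one 32-bit channel slice of the bus.
    return (bus_width_bits // 32) * module_gb

for card, bus in [("5060 / 5060 Ti", 128), ("5070", 192)]:
    print(f"{card} ({bus}-bit): {capacity_gb(bus, 2)}GB with 2GB modules, "
          f"{capacity_gb(bus, 3)}GB with 3GB modules")
# 128-bit: 8GB -> 12GB; 192-bit: 12GB -> 18GB, i.e. exactly the
# 12GB 5060 / 18GB 5070 lineup described above.
```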
> Here comes ZeroBarrier to say that the GeForce RTX 5060 Ti 8GB is fine. All you have to do is play at 1080p on low quality.
Wait, you agree with testing a 1080p card at 4K and calling it a failure because it fails at a resolution it clearly wasn't designed for?
> Me as an owner of a 3070, the 8GB VRAM struggles are real, and this is a perfect video to highlight how and why. Most interestingly, did everybody notice how the 16GB version performed better than the 8GB one even when the 8GB version was not obviously VRAM limited after settings were lowered? Extra VRAM helps performance because the card doesn't have to waste resources constantly shuffling data into and out of VRAM. My 3070's chip could do better, but it runs with a constant VRAM handbrake on, so it cannot deliver the performance it is technically capable of. This 5060 Ti 8GB vs 16GB comparison is perfect proof of why having enough VRAM matters, no matter what resolution you game at.
That's honestly surprising. I have a 3070 mobile that only has access to 8GB, and it runs games just fine at proper settings and resolutions. Have you tried customizing game settings down some?
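To put a rough number on that "shuffling" cost, here's a back-of-envelope sketch. All figures are ballpark assumptions (a 3070-class card's ~448 GB/s of GDDR6 and ~32 GB/s of PCIe 4.0 x16), not measurements from the video:

```python
# Hypothetical numbers: what it costs when assets spill out of VRAM and
# must be re-fetched over PCIe instead of read from local GDDR.

VRAM_BW = 448.0  # GB/s, assumed 3070-class GDDR6 bandwidth
PCIE_BW = 32.0   # GB/s, assumed PCIe 4.0 x16, one direction

def fetch_ms(size_mb: float, bw_gb_s: float) -> float:
    """Milliseconds to move size_mb of data at bw_gb_s."""
    return size_mb / 1024.0 / bw_gb_s * 1000.0

spill_mb = 500.0  # hypothetical overflow the driver must stream in a bad frame
print(f"from VRAM: {fetch_ms(spill_mb, VRAM_BW):.1f} ms")  # ~1 ms
print(f"over PCIe: {fetch_ms(spill_mb, PCIE_BW):.1f} ms")  # ~15 ms
# ~15 ms is nearly a full frame budget at 60 fps (16.7 ms), which is why
# an overflowing card stutters even when its average fps looks fine.
```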
> Wait, you agree with testing a 1080p card at 4K and calling it a failure because it fails at a resolution it clearly wasn't designed for?
DLSS is good enough now that testing upscaled to 4K does make sense (people with mid-range PCs still tend to have 4K TVs). All the examples in that video are upscaled to 4K; nobody is trying to run RT + ultra settings at native 4K.
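For context on what "upscaled to 4K" actually asks of the GPU, here's a small sketch using DLSS 2's commonly cited scale factors (treat the exact ratios as approximate; they can vary by game and DLSS version):

```python
# DLSS output vs. internal render resolution, using commonly cited scale
# factors (approximate, not taken from any specific game in the video).

DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"4K DLSS {mode}: renders internally at {w}x{h}")
# Quality: 2560x1440, Performance: 1920x1080 -- so "4K DLSS Performance"
# is 1080p rendering under the hood, squarely in this card's lane.
```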
> Wait, you agree with testing a 1080p card at 4K and calling it a failure because it fails at a resolution it clearly wasn't designed for?
Calling it a "1080p" card is a bit silly, as the 16GB version of the exact same card (same cooler, clocks, power, etc.) can run DLSS 1440p and 4K very acceptably in many games where the 8GB cannot.
This forum used to have smart, level-headed members. Oh, how the ignore list grows.
> That's honestly surprising. I have a 3070 mobile that only has access to 8GB, and it runs games just fine at proper settings and resolutions. Have you tried customizing game settings down some?
> Calling it a "1080p" card is a bit silly, as the 16GB version of the exact same card (same cooler, clocks, power, etc.) can run DLSS 1440p and 4K very acceptably in many games where the 8GB cannot.
So, if removing half the VRAM impacts performance in a noticeable way, turning an acceptable performance level into an unacceptable one, then you can't really say that the VRAM bottleneck is only present in situations where the GPU isn't strong enough to utilise it.
> Even calling it a 1080p card is a stretch unless one considers buying a new GPU in 2025 to play some games at Medium graphics settings a good idea.
As well as 144Hz 1440p screens being incredibly cheap, and many 4K 120+Hz screens being cheaper than the 5060 Ti itself.
> I noticed while watching that video that only a handful of games went over 12GB. It would be fun to see a comparison like that with a "$250" 12GB Arc B580 beating up on a "$380" 8GB RTX 5060 Ti.
> I'm curious how much worse a game would look if they tuned the settings on the 8GB card to run at about the same fps as the 16GB card. Lower textures or...?
Lower textures would often be the starting point, but there are lots of effects that consume quite a lot of memory as well. E.g. you can turn off some shader effects and gain more available memory, but a lot of the time the effects you turn off make the image significantly worse while the GPU cost saved is only low to medium. Even a 10GB 3080 is starting to struggle in quite a few games at 1440p, and the difference between 1080p and 1440p is often around 1GB of memory. The chip is capable of good framerates, but the lack of memory forces a choice between textures and shaders in quite a few games.
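As a rough illustration of the resolution part of that memory cost, here's a sketch with an assumed render-target budget. The 60 bytes/pixel figure is a made-up round number for a deferred renderer (G-buffer, HDR, TAA history, post buffers); real engines vary widely:

```python
# Assumed ~60 bytes per pixel of live render targets. A made-up but
# plausible round number; real engines differ a lot.
BYTES_PER_PIXEL = 60

def render_targets_mb(w: int, h: int) -> float:
    return w * h * BYTES_PER_PIXEL / 2**20

for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)),
                     ("4K", (3840, 2160))]:
    print(f"{name}: ~{render_targets_mb(w, h):.0f} MB of render targets")
# ~119 MB -> ~211 MB -> ~475 MB. Render targets alone only explain a few
# hundred MB; engines also scale texture streaming pools and effect
# buffers with resolution, which is how the real-world 1080p-to-1440p
# gap ends up near the ~1GB mentioned above.
```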
> This card should really not exist, or at least cost far, far less than the 16GB one. And it definitely needs a different name. Seriously, even people on a budget are buying 1440p 144Hz+ monitors nowadays. The LCD variants are dirt cheap, like 200-300 bucks, sometimes even less.
I got my 1440p 180Hz for $150, although it was on a Micro Center super sale.
> I noticed while watching that video that only a handful of games went over 12GB. It would be fun to see a comparison like that with a "$250" 12GB Arc B580 beating up on a "$380" 8GB RTX 5060 Ti.
> I'm curious how much worse a game would look if they tuned the settings on the 8GB card to run at about the same fps as the 16GB card. Lower textures or...?
The only ones I saw showing over 10GB used were The Last of Us 2 and Spider-Man 2, and even then that's no guarantee a 10GB card will fall on its face, since the 8GB card was on par in certain scenarios where the 16GB card showed 8.5GB of usage. 8GB is definitely too low, but it's only by like 1 or 2 GB.
> The only ones I saw showing over 10GB used were The Last of Us 2 and Spider-Man 2, and even then that's no guarantee a 10GB card will fall on its face, since the 8GB card was on par in certain scenarios where the 16GB card showed 8.5GB of usage.
> 8GB is definitely too low, but it's only by like 1 or 2 GB.
Which makes it all the more questionable for me: so close, yet so far.
> The only ones I saw showing over 10GB used were The Last of Us 2 and Spider-Man 2, and even then that's no guarantee a 10GB card will fall on its face, since the 8GB card was on par in certain scenarios where the 16GB card showed 8.5GB of usage.
> 8GB is definitely too low, but it's only by like 1 or 2 GB.
Not by 1 or 2 GB, more like 4GB. Even a 3080 with 10GB has issues nowadays at 1440p and is borderline OK for 1080p in some of the latest titles. People buying a GPU have a longer perspective than just now; they want to keep their GPU for a few years. The VRAM issue will get a lot worse over the next few years.
> Not by 1 or 2 GB, more like 4GB. Even a 3080 with 10GB has issues nowadays at 1440p and is borderline OK for 1080p in some of the latest titles. People buying a GPU have a longer perspective than just now; they want to keep their GPU for a few years. The VRAM issue will get a lot worse over the next few years.
I'm talking about scenarios that are usable for 5060 Ti / 3080 class cards, not future games at 4K with RT. Those cards will likely have to play future games at reduced settings anyhow. Again, I haven't really seen a reputable review benchmark where the 3080 takes a dump because of its 10GB of VRAM.
> I'm talking about scenarios that are usable for 5060 Ti / 3080 class cards, not future games at 4K with RT. Those cards will likely have to play future games at reduced settings anyhow. Again, I haven't really seen a reputable review benchmark where the 3080 takes a dump because of its 10GB of VRAM.
Both the 5060 Ti and 3080 chips are fast enough for 1440p rendering, but there are already quite a few games where you have to turn down settings on the 3080 at 1440p or get massive stutters every now and then. I retired my 3080 and got a 5070 Ti because I got tired of having to fine-tune settings or run at 120 fps on medium in several games.
> the 3080 has lived a long life
It often beat the 6900 XT in new games at 4K ultra settings minimum fps (not that either can necessarily run them well; just an extreme case used to push VRAM), which is not necessarily what people in 2020 would have predicted things would look like 4-5 years later.
> NVIDIA GeForce RTX 5060 Ti 8 GB Review - So Many Compromises
> https://www.techpowerup.com/review/gainward-geforce-rtx-5060-ti-8-gb/33.html
Amazingly enough, it even falls behind the 4060 Ti 8GB in some games.
> A bit disingenuous, since I have yet to see a single game that has these issues when playing at 1080p at medium settings. You know, where these GPUs are really meant to compete.
> Honestly, the lower-spec GPU performance reviews make me really miss [H] reviews, where they would conclude what resolution and settings gamers should expect to use for acceptable performance. All these tech tubers give me the below meme's vibes so much. It's almost like they intentionally ignore the fact that a Prius-level GPU isn't designed to compete in a quarter-mile drag race.
> [attached meme]
Funny meme, and agreed that resolution expectations need to be set. At the same time... there's a 16GB version. Why even release an 8GB model of the same card at that point? It's still a $400+ real-world-cost card. At that price level, yeah, I'd say 8GB is dumb in 2025, and Nvidia could have just saved some face by not even making an 8GB version.
> Funny meme, and agreed that resolution expectations need to be set. At the same time... there's a 16GB version. Why even release an 8GB model of the same card at that point? It's still a $400+ real-world-cost card. At that price level, yeah, I'd say 8GB is dumb in 2025, and Nvidia could have just saved some face by not even making an 8GB version.
Because there is a portion of consumers that will always look for the most inexpensive product, even if it's technically not worth the trade-off versus the product one or two tiers higher. Nvidia would be foolish to forgo those consumers to AMD when they already know how many of them are out there. We see it ourselves in the Steam hardware charts, where the xx60-class card always has the highest share of users by a substantial margin.
Edit: And again, those same consumers do not run into these made-up issues because they aren't trying to game beyond 1080p and medium settings.
> The regular RTX 5060 exists too, though. I feel like they could just ignore the 5060 Ti 8GB and leave the 5060 8GB and 5060 Ti 16GB SKUs around, and that should be enough to cover the $300-$450 range.
The 5060 is just going to be a 4060 Ti 8GB at best. I really wish Nvidia would make a 96-bit 12GB card; GDDR7 has plenty of bandwidth to compensate for the narrower bus.
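A quick check on that bandwidth claim, as a sketch. The data rates below are typical GDDR6/GDDR7 speeds, not any announced SKU, and note that 12GB on a 96-bit bus would also need 4GB modules or a clamshell layout:

```python
# Memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# Data rates are typical for the memory type, not a specific card.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"4060 Ti, 128-bit GDDR6 @ 18 Gbps:      {bandwidth_gb_s(128, 18):.0f} GB/s")
print(f"96-bit GDDR7 @ 28 Gbps (hypothetical): {bandwidth_gb_s(96, 28):.0f} GB/s")
print(f"96-bit GDDR7 @ 32 Gbps (hypothetical): {bandwidth_gb_s(96, 32):.0f} GB/s")
# 288 vs 336-384 GB/s: GDDR7's higher data rate more than covers the
# narrower bus, which is the point being made above.
```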