
GeForce RTX 5060 Ti 8GB performance revealed (The review that NVIDIA doesn't want you to see)

[Benchmark screenshots from the linked review: 1080p, 1440p, and 4K results across the tested games]

Source: https://www.bilibili.com/video/BV1TWoMYUEGW/
 
In games such as Assassin's Creed Shadows and Horizon Zero Dawn Remastered, insufficient VRAM causes significant performance drops.

It's no wonder that NVIDIA doesn't want reviewers reviewing the GeForce RTX 5060 Ti 8GB.
 
What's sad is that Nvidia made the 16GB card specifically for reviews, and it won't be available after the initial launch.

The 8GB card will be all that is available for the rest of the life of the 5000 series.
 
In games such as Assassin's Creed Shadows and Horizon Zero Dawn Remastered, insufficient VRAM causes significant performance drops.

It's no wonder that NVIDIA doesn't want reviewers reviewing the GeForce RTX 5060 Ti 8GB.
A bit disingenuous since I have yet to see a single game that has this issue when playing at 1080p at medium settings. You know, where these GPUs are really meant to compete.

Honestly, the lower-spec GPU performance reviews make me really miss [H] reviews, when they would conclude at what resolution and settings gamers could expect acceptable performance. All these tech tubers give me the vibes of the meme below so much. It's almost like they intentionally ignore the fact that a Prius-level GPU isn't designed to compete in a quarter-mile drag race.

[attached meme image]
 
A bit disingenuous since I have yet to see a single game that has this issue when playing at 1080p at medium settings. You know, where these GPUs are really meant to compete.

Honestly, the lower-spec GPU performance reviews make me really miss [H] reviews, when they would conclude at what resolution and settings gamers could expect acceptable performance. All these tech tubers give me the vibes of the meme below so much. It's almost like they intentionally ignore the fact that a Prius-level GPU isn't designed to compete in a quarter-mile drag race.

View attachment 723997
That is a terrible argument.

Today, 8GB VRAM is already insufficient. Right now, you may be able to tweak some settings to get it to work, but games are becoming more VRAM demanding and insufficient VRAM will become more and more of an issue in the future.
 
Looks like it actually does well. Maybe games are better able to utilize the cache?

That said, 5060ti should be 16GB only with the 5060 getting the 2 vram size options.

I'm really hoping the 5050(ti?) gets 96 bit gddr7 with 12GB. Should make for a great budget card.
 
Looks like it actually does well. Maybe games are better able to utilize the cache?

That said, 5060ti should be 16GB only with the 5060 getting the 2 vram size options.

I'm really hoping the 5050(ti?) gets 96 bit gddr7 with 12GB. Should make for a great budget card.
The cache only helps with memory bandwidth, not the lack of memory.
 
What's sad is that Nvidia made the 16GB card specifically for reviews, and it won't be available after the initial launch.

The 8GB card will be all that is available for the rest of the life of the 5000 series.
I don't think that will happen for the life of the 50 series, but I could see that until supply catches up.

I'm not sure what the bottleneck is making higher end cards as scarce as they are, but some people seem to think GDDR7 availability is part of the reason. If there is a shortage of GDDR7 it'll discourage the production of 16GB 5060Tis. Same chips as a 5070Ti, so why put them on a 16GB 5060Ti when you can make a 5070Ti or two 8GB 5060Tis? Or slap two sets on a 5090 and charge $3k for it?

It didn't happen with the 4060Ti for the life of the 40 series. I didn't pay much attention, but IIRC last summer the price gap between the 8GB and 16GB models had narrowed to about $50 from $100, mostly due to the 16GB version dropping in price below MSRP. They launched with an MSRP of $400 and $500, respectively, and IIRC the 16GB 4060Ti dropped to about $450. MSRP on the 5060Ti is lower than that at $429 for the 16GB and $379 for the 8GB.

Looks like it actually does well. Maybe games are better able to utilize the cache?

That said, 5060ti should be 16GB only with the 5060 getting the 2 vram size options.

I'm really hoping the 5050(ti?) gets 96 bit gddr7 with 12GB. Should make for a great budget card.
Leaks say the 5050 will have 8GB of GDDR6 on a 128-bit bus.

Personally I think it would have been better if NVidia only used GDDR7 on the 5070Ti, 5080 and 5090, then given the 5070 16GB GDDR6 on a 256-bit bus, 5060Ti & 5060 12GB GDDR6 on 192-bit, and the 5050 gets the 8GB/128-bit setup. GDDR7 is apparently a good bit more expensive than GDDR6. It might also help with availability. Out of the 50 series cards that launched before the 5060Ti, the only one that's relatively easy to get without getting looted is the 12GB 5070.
 
That is a terrible argument.

Today, 8GB VRAM is already insufficient. Right now, you may be able to tweak some settings to get it to work, but games are becoming more VRAM demanding and insufficient VRAM will become more and more of an issue in the future.
Absolutely no argument from me. The tests at both 1440p and 4K are indeed important, but let's stop bullshitting here. The tech tubers have all prioritized performance comparisons without a single thought about the type of people who actually buy XX60 class cards. They are like a dog with a bone, continuously making moronic comparisons that have little to nothing to do with actual performance at acceptable resolutions and game settings, and go on and on about, "The XX60 does terribly against the previous generation XX70 at 4K with the highest in-game settings and path tracing turned on, as shown in our graphs". There is not a single game that runs into VRAM issues at 1080p with medium settings on even a mobile XX60 GPU with only 8GB of VRAM. NOT ONE. Every single bullshit video they've shown of the issue is at 1440p/4K with the highest settings possible to FORCE an issue that not a single user of an XX60 class card ever encounters. And by the time it does become an issue, the lifespan of said cards is well past any reasonable person's expectations.

Imagine crying about a VRAM issue you won't encounter on a 5060 until the year 2037 or some other bullshit. Didn't think I'd see the day when [H] members forgot how the most defining part of [H] reviews was the apples-to-apples comparison PLUS the best playable settings.
 
For measuring the impact of differences in memory I'd be more interested in texture pop-in comparisons.

Many a modern game engine adapts its memory usage to the amount available to prevent its performance from tanking.
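To illustrate that point, here's a minimal sketch of the kind of budgeting heuristic a streaming engine might use; the function name and all the numbers below are assumptions for illustration, not taken from any particular engine or from the review:

[CODE]
# Hypothetical sketch: size the texture streaming pool from whatever VRAM is left
# after fixed allocations. An 8GB card silently gets a smaller pool (more pop-in,
# blurrier textures) instead of a lower average framerate, which is why fps charts
# alone can hide the VRAM deficit. All numbers are illustrative assumptions.

def texture_pool_budget_mb(total_vram_mb: int,
                           render_targets_mb: int = 1500,
                           geometry_and_misc_mb: int = 1000,
                           os_and_overlays_mb: int = 500) -> int:
    """How many MB this hypothetical engine would give its texture streamer."""
    reserved = render_targets_mb + geometry_and_misc_mb + os_and_overlays_mb
    return max(512, total_vram_mb - reserved)

for vram_mb in (8192, 12288, 16384):
    print(f"{vram_mb // 1024:>2} GB card -> ~{texture_pool_budget_mb(vram_mb)} MB texture pool")

# 8 GB card  -> ~5192 MB pool
# 12 GB card -> ~9288 MB pool
# 16 GB card -> ~13384 MB pool
[/CODE]

Which is exactly why texture pop-in comparisons would expose the difference better than fps averages.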
 
The cache only helps with memory bandwidth, not the lack of memory.
For the most part yes, but wouldn't you need less than 1 GB of VRAM if you had theoretically infinite bandwidth, i.e., if the VRAM didn't have to store bulk texture data across several frames?

Sorry if that sounds retarded, but I would definitely like to see how a current 8 GB card with fast cache compares to an older 8 GB card of similar performance.

Don't get me wrong, I would still not touch the 8GB Ti, but it might be salvageable for some who play on 4K TVs with medium settings (which typically uses far less VRAM than 1080p/1440p ultra).
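To put rough numbers on the infinite-bandwidth thought experiment (the figures below are back-of-envelope assumptions, not measurements from the review):

[CODE]
# Back-of-envelope sketch: even with a huge on-die cache, data that does not fit
# in VRAM has to come across PCIe every time it is needed, and that link is the
# real ceiling. Figures below are assumptions for illustration.

unique_data_per_frame_gb = 1.5   # assumed unique texture/geometry touched per frame
fps = 60
pcie5_x16_gbs = 63.0             # approx. theoretical PCIe 5.0 x16 bandwidth, GB/s
pcie5_x8_gbs = 31.5              # the 5060 Ti reportedly uses an x8 link

needed_gbs = unique_data_per_frame_gb * fps
print(f"Streaming the working set fresh every frame needs ~{needed_gbs:.0f} GB/s")
print(f"PCIe 5.0 x16 tops out around {pcie5_x16_gbs} GB/s, x8 around {pcie5_x8_gbs} GB/s")

# ~90 GB/s needed vs ~32-63 GB/s available: the data has to stay resident in VRAM,
# so cache can raise effective bandwidth but cannot substitute for capacity.
[/CODE]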
 
I don't think that will happen for the life of the 50 series, but I could see that until supply catches up.

I'm not sure what the bottleneck is making higher end cards as scarce as they are, but some people seem to think GDDR7 availability is part of the reason. If there is a shortage of GDDR7 it'll discourage the production of 16GB 5060Tis. Same chips as a 5070Ti, so why put them on a 16GB 5060Ti when you can make a 5070Ti or two 8GB 5060Tis? Or slap two sets on a 5090 and charge $3k for it?

It didn't happen with the 4060Ti for the life of the 40 series. I didn't pay much attention, but IIRC last summer the price gap between the 8GB and 16GB models had narrowed to about $50 from $100, mostly due to the 16GB version dropping in price below MSRP. They launched with an MSRP of $400 and $500, respectively, and IIRC the 16GB 4060Ti dropped to about $450. MSRP on the 5060Ti is lower than that at $429 for the 16GB and $379 for the 8GB.


Leaks say the 5050 will have 8GB of GDDR6 on a 128-bit bus.

Personally I think it would have been better if NVidia only used GDDR7 on the 5070Ti, 5080 and 5090, then given the 5070 16GB GDDR6 on a 256-bit bus, 5060Ti & 5060 12GB GDDR6 on 192-bit, and the 5050 gets the 8GB/128-bit setup. GDDR7 is apparently a good bit more expensive than GDDR6. It might also help with availability. Out of the 50 series cards that launched before the 5060Ti, the only one that's relatively easy to get without getting looted is the 12GB 5070.
Yeah, the switch to GDDR7 has been a bummer since it never got a capacity boost while carrying over the 128-bit trash for 60 Ti class cards. 12GB would have been perfect for this performance level, and I can't imagine a 192-bit bus of GDDR6 being much more expensive.

But I sort of get why they did it. The chip will then be set up for 3GB gen-2 GDDR7, making the 5060 (Super Ti?) a low-cost, high-performance 12GB card.
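For anyone wondering why the capacity options are so rigid, here's a quick sketch of the bus-width arithmetic; treat the module-size and clamshell assumptions as exactly that, assumptions:

[CODE]
# Each GDDR6/GDDR7 device sits on a 32-bit channel, so capacity options are
# channels * GB-per-module, doubled if modules are mounted clamshell (two per
# channel, as on the 16GB 5060 Ti). A sketch for illustration, not NVIDIA's docs.

def vram_options_gb(bus_width_bits: int, gb_per_module: int) -> tuple[int, int]:
    channels = bus_width_bits // 32
    single_sided = channels * gb_per_module
    clamshell = single_sided * 2
    return single_sided, clamshell

print("128-bit, 2GB modules:", vram_options_gb(128, 2))  # (8, 16)  -> today's 5060 Ti options
print("128-bit, 3GB modules:", vram_options_gb(128, 3))  # (12, 24) -> the rumored refresh
print("192-bit, 2GB modules:", vram_options_gb(192, 2))  # (12, 24) -> the GDDR6 wider-bus route
[/CODE]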
 
A bit disingenuous since I have yet to see a single game that has this issue when playing at 1080p at medium settings. You know, where these GPUs are really meant to compete.

Honestly, the lower-spec GPU performance reviews make me really miss [H] reviews, when they would conclude at what resolution and settings gamers could expect acceptable performance. All these tech tubers give me the vibes of the meme below so much. It's almost like they intentionally ignore the fact that a Prius-level GPU isn't designed to compete in a quarter-mile drag race.

View attachment 723997
My 1080 can play most games at medium settings and 1080p... What's your point?
 
Can't they make up for the performance problems by using DLSS 4 and 4x MFG?
DLSS and Frame Generation increase VRAM usage, so insufficient VRAM becomes an even bigger issue.

I didn't post a screenshot of it, but it's also shown in the video.
 
Absolutely no argument from me. The tests at both 1440p and 4K are indeed important, but let's stop bullshitting here. The tech tubers have all prioritized performance comparisons without a single thought about the type of people who actually buy XX60 class cards. They are like a dog with a bone, continuously making moronic comparisons that have little to nothing to do with actual performance at acceptable resolutions and game settings, and go on and on about, "The XX60 does terribly against the previous generation XX70 at 4K with the highest in-game settings and path tracing turned on, as shown in our graphs". There is not a single game that runs into VRAM issues at 1080p with medium settings on even a mobile XX60 GPU with only 8GB of VRAM. NOT ONE. Every single bullshit video they've shown of the issue is at 1440p/4K with the highest settings possible to FORCE an issue that not a single user of an XX60 class card ever encounters. And by the time it does become an issue, the lifespan of said cards is well past any reasonable person's expectations.

Imagine crying about a VRAM issue you won't encounter on a 5060 until the year 2037 or some other bullshit. Didn't think I'd see the day when [H] members forgot how the most defining part of [H] reviews was the apples-to-apples comparison PLUS the best playable settings.
You inadvertently revealed to everyone that you didn't look at the review.

"1920X1080" in large text doesn't need to be translated from Chinese.
 
It's pretty remarkable how well the 10GB 3080 is holding up though. I haven't seen an example where that card takes a dive from lack of vram outside forced scenarios where the card would be at 14 fps anyhow.
 
DLSS and Frame Generation increase VRAM usage, so insufficient VRAM becomes an even bigger issue.

I didn't post a screenshot of it, but it's also shown in the video.
Frame gen does increase VRAM usage slightly, but from what I've seen, DLSS actually decreases VRAM usage, all things being equal, since it's rendering at a lower resolution.
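Rough arithmetic behind that claim, if it helps; the bytes-per-pixel figure for the combined render targets is purely an assumption for illustration:

[CODE]
# DLSS renders at a lower internal resolution, shrinking the G-buffer and other
# render targets, while frame generation keeps extra full-resolution frames and
# motion data around. The per-pixel footprint below is an assumed round number.

def render_target_mb(width: int, height: int, bytes_per_pixel: int = 48) -> float:
    """Very rough combined G-buffer/depth/post-chain footprint."""
    return width * height * bytes_per_pixel / 1e6

native_1440p = render_target_mb(2560, 1440)
dlss_quality = render_target_mb(1707, 960)   # "Quality" renders at ~67% per axis

print(f"Native 1440p render targets: ~{native_1440p:.0f} MB")
print(f"DLSS Quality render targets: ~{dlss_quality:.0f} MB")

# Render targets shrink by roughly half, but textures and BVH data do not, and
# frame gen adds buffers back, so the net VRAM effect varies by game.
[/CODE]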
 
Looks like it actually does well. Maybe games are better able to utilize the cache?

That said, 5060ti should be 16GB only with the 5060 getting the 2 vram size options.

I'm really hoping the 5050(ti?) gets 96 bit gddr7 with 12GB. Should make for a great budget card.
8GB *60 class cards don't do well.

Texture quality is the cheapest way to increase visual fidelity with the least performance hit. Not having enough VRAM can manifest as the game lowering texture quality on the fly to noticeably low levels, as seen in Hogwarts Legacy; as stutters and a performance hit, as seen in Final Fantasy 16; or as some settings simply not working, as seen in Indiana Jones and the Great Circle. It's not because the *60 GPU is too slow, it's because the VRAM is not sufficient. Take a look at Daniel Owen's video here:


View: https://www.youtube.com/watch?v=opaamKX2gt8

Fellow gamers should warn users against getting an 8GB *60 class card in 2025, regardless of whether it's Nvidia, Intel or AMD. Imagine the headache of always having to make sure your texture quality is low enough, or having to run at lower settings, only because you bought a *60 class card and did not pay attention to the VRAM, or listened to people claiming it's enough. 8GB in 2025 already gives you a noticeably worse gaming experience in many big titles, and probably many more in the future.
 
Yes, agreed, and to echo the sentiment of many here: 8GB is fine for budget GPUs even in 2025, but it has to be at or under $200 brand new; otherwise just keep saving up for something better.

10-12GB is the minimum now for anything decent.
 
You inadvertently revealed to everyone that you didn't look at the review.

"1920X1080" in large text doesn't need to be translated from Chinese.
I watched the review. I saw the slides with 1080p, 1440p and 4K, and I also saw how in the 16-game average test they only show results for 1440p and 4K, and how every single tech tuber has still refused to test the card at best playable settings at optimal resolutions to see how it performs. XX60 class cards with 8GB of memory DO NOT suffer the made-up bullshit issue of running out of VRAM when played at 1080p and medium settings. Period, full stop, and replying to me as if you were some smartass doesn't change that.
 
If the game allows it, you can have better texture settings on one and not the other, and if it's a mandatory baseline, it'll be crippled. There's nothing made up about it. The 8GB version of the card shouldn't exist, as it's mediocre cost reduction for nothing of value.
 
It's pretty remarkable how well the 10GB 3080 is holding up though. I haven't seen an example where that card takes a dive from lack of vram outside forced scenarios where the card would be at 14 fps anyhow.
It is probably fine if you are at 1080p, but at 1440p the 10GB 3080 needed tuning due to running out of VRAM in CP 2077: Phantom Liberty, Far Cry 6, and several other games. It struggled a lot with texture dropouts/lower-quality textures in Indiana Jones, to the point that pretty much all 12GB cards would be superior. Basically, the 3080 has been borderline at 1440p for a few years now, with some manual tuning needed in quite a few games due to lack of VRAM. The framerate dives on the 3080 aren't constant; they're more like insane stutters while clearing VRAM/cache, which would show up quite well in 1% and 0.1% lows.

The original 3080 would be OK for another couple of years at 1440p if it had 12GB of VRAM, as it was still decently fast at rendering, but I retired mine and upgraded to a slightly overpriced 5070 Ti to get rid of the VRAM limitations. Running CP 2077 with path tracing and RR + DLSS Quality (TF model) looks really good and the framerate is mostly in the 60-70 range, but it uses so much VRAM that you need a 16GB card to do it. The 5060 Ti 8GB model is just dumb IMO, as it will need to turn down lots of settings that it would run fine with as a 12GB or 16GB card. The only GPUs using 8GB should be the ones in the lower $200 range, and the higher-priced models should have at least 12GB.
 
The 5060 and 5060 Ti should be 12GB. The 5050 should be 8GB.
IMHO, all the 50 series cards below the 5070 Ti should be using GDDR6 and have a wider bus and more VRAM. GDDR6 is cheaper, so use a wider bus and put more of it on the card. A few more PCB traces and GPU pins don't cost much; the GPU and the RAM are most of the cost of a card.
 
Reviewers really need to review more than popular AAA and eSports games. A game like Atelier Yumia takes 8GB+ VRAM at 1080p. Basically unplayable on my 2070.
 
But the real question is: does it work under Windows Server 2022 or 2025? Because if it doesn't, either the driver is unstable junk or the card is.

That, along with Linux compatibility, is the real test for these cards, since those platforms also reject unstable hardware.
 