
[MLID] AMD wary of launching 8GB 9060 XT

I've been saying this in the 5060 threads, and people scoffed calling me an Nvidia shill. Now people are saying the same fucking thing I was saying, but in an AMD thread and all of a sudden 8GB cards are fine.

I just skimmed the 5060 Ti thread and the responses seem about the same as here, with some complaining, some defending it, and others saying it would be fine if the price was cheaper. A good portion of the negative comments were also more about the fact that they weren't allowing review samples of the 8GB variant.

That was the only front-page thread discussing the 8GB 5060 Ti, and there were fewer comments there than in this thread about a rumor, so I'm not sure why you think Nvidia is being held to a higher standard.
 
If AMD releases a 9060 XT 8GB ..... I will 100% clown them.
The key here isn't that they released an 8GB card... it's that they didn't just release a 60-class 8GB card. They released a 60-class Ti version with 8GB.

Hopefully, if AMD had planned to follow with a 60 XT-class card, they change their mind before it launches. Because yes, even the biggest AMD boosters will call that the scammy BS it is.
That's just goalpost moving. First it was, "No 8GB VRAM GPUs should be released in 2025", now it's, "If it's xx50 class then it's okay".

 
That's just goalpost moving. First it was, "No 8GB VRAM GPUs should be released in 2025", now it's, "If it's xx50 class then it's okay".

Pretty sure in my first post in this thread I said... release 50-class 8GB cards. A 9050 8GB? Who cares.

We all know Nvidia releasing a 60 Ti with 8GB is BS. There are situations where even at 1080p you're going to run into texture popping or handicapped FPS for no reason other than the RAM.
On a 50-class card that is aimed at a low-end experience, or for someone who really wants a new GPU for a media box or something? OK, whatever.

I'll say it again: if AMD is stupid enough to release a 9060 XT 8GB... they deserve to lose every single bit of goodwill they may have gained with their 9070 launch. Release a 9050 8GB, who the hell cares.
 
It is OK to release an 8GB card. Not everyone that wants a discrete GPU is trying to play AAA games (or games at all). It should still be marketed as a 50 series and priced appropriately though.
 
Let's be honest with each other here. Nvidia could have released the 5060 with 8GB of RAM at $199 and people would still say Nvidia is bad, and the moment AMD announces their 8GB VRAM card at a hair under Nvidia's MSRP, people would change their narrative. Just like right now.
Nvidia would be called evil if they were giving them away free, as it would be some part of a greater plot for world domination.

But at $199 the price-performance ratio would at least be pretty fantastic.
 
Nvidia would be called evil if they were giving them away free, as it would be some part of a greater plot for world domination.

But at $199 the price-performance ratio would at least be pretty fantastic.

True that. Honestly, they should be releasing the 5060 at 12GB, the 5060 Ti at 12GB and 16GB, and have a 5050 at 8GB priced around $199. That would make the most sense. I really do hope AMD cancels the 8GB variant of the 9060, or at least switches to 12/16 instead. 8GB just isn't enough anymore to risk the negative attention at near-$400+ costs. At $200, it's justifiable....
 
True that. Honestly, they should be releasing the 5060 at 12GB, the 5060 Ti at 12GB and 16GB, and have a 5050 at 8GB priced around $199. That would make the most sense. I really do hope AMD cancels the 8GB variant of the 9060, or at least switches to 12/16 instead. 8GB just isn't enough anymore to risk the negative attention at near-$400+ costs. At $200, it's justifiable....
The 3GB GDDR7 modules are currently stupid expensive, priced not to move, and their failure rates are obscene. Yes, it should be a 12GB card; it may have originally been intended to be.
But as it currently stands, that would have been a paper-launch card, as the chips needed to make it happen just aren't available.
 
The 3GB GDDR7 modules are currently stupid expensive, priced not to move, and their failure rates are obscene. Yes, it should be a 12GB card; it may have originally been intended to be.
But as it currently stands, that would have been a paper-launch card, as the chips needed to make it happen just aren't available.
The other solution would be to use GDDR6/X... which would have been the smart move for 5060-and-below tier cards. I mean hell, the 9070 XT is on par with a 4080/5070 Ti using good ol' standard GDDR6, not even the X variant. This would STILL allow Nvidia to load up for their new Pro models and focus high-capacity GDDR7 on those cards. Demand is down for GDDR6/X modules now, but instead... Nvidia would prefer the planned-obsolescence route and give their 5060 Ti an 8GB model. Honestly, they have no excuse, and the flak is deserved... I just hope AMD adjusts and makes the smart move here. Nvidia needs more brand damage to fuel more competition and push market share to AMD. The more they compete, the better for the consumer. Now if only Intel could get their act together... lol.
 
Yeah, I think it would be a smart move. 12GB should be entry-level now. 8GB should be for a 9040 level of graphics.
 
An RTX 2080 with 8 GB of VRAM can still play a lot of titles, even up to 4K with heavy DLSS usage (which frankly, at that range, you're probably going to be dependent on anyway). Just because some of the absolute latest titles (or some poorly coded ones) might crash doesn't really mean much. Most of the time things are made to scale all the way back to consoles, which occasionally technically have more VRAM, but only technically. Keep in mind most people on Steam surveys are also rocking 60-level cards and even below. Devs would be dumb to code games that can't scale down far enough to play on majority hardware at all.

So 8 GB in a vacuum, at the right price point, is fine. Like someone else said, the problem is the price. At this point 8 GB should not be pushing above the $300 range at absolute most, IMO. If you're pushing anywhere near half a grand, then it's dumb to get a card that handicapped. Just adding 4 GB more opens up a lot of possibilities, and with DLSS, 12 GB is plenty for a lot of titles. 16 GB is, I think, the sweet spot for power users at the moment, and that's why the 9070/XT and 5070 Ti are in good spots for price-to-perf and longevity (at least at MSRP; at current actual going prices the 9070 XT is a shit deal). Even the most demanding games rarely go above 16 GB at the visuals-vs-framerate settings they can actually push.

The 5060 Ti 16GB is just a dumb card. It probably can't use that much, and it's still priced too high for what it is. They should have instead made a 5070 with 16GB and maybe priced it at $600. Or hell, scooted the 5070 price down to $500 and then made the 16GB one $575-625 or something. That would actually be a reasonable take. Then the 5070 Ti stays at roughly the same price point with 16GB, and give the 5080 24GB.

The 5080 is IMO just overpriced, especially at current going prices. If you're considering spending that much, just try to find a 4090 instead, which has more VRAM, is faster, and has a lot more compute power.

Putting aside that this whole gen is kind of shit anyway; just analyzing the least-worst options.
 
When it's a problem in some games, it's a real pain for non-tech-savvy people.

My sister has an RX 7600 and was trying to play Horizon Forbidden West. That game was a nightmare to fix up. There were like three sliders that would hit the VRAM hard, and THEN I had to figure out that the buggy-ass DirectStorage implementation was killing it too, so I had to delete some files.

These issues don't exist at 12GB of VRAM and up. Each game is a gamble, whether you know what you are doing or not. It's not always just texture size and ray tracing.
 
In some ways there is nothing wrong with an 8GB card IF... it is super cheap and not really a gaming card.

Although I do still feel at least 12GB should be the minimum now.
 
60-class cards are just the new 50-class cards rebranded, but costing as much as a 70-class card... or more.

8GB is more than fine for the vast majority of people wanting to play. Know your limitations, slap yourself if you think you should get ultra at any resolution, and all is fine.
 
An RTX 2080 with 8 GB of VRAM can still play a lot of titles, even up to 4K with heavy DLSS usage (which frankly, at that range, you're probably going to be dependent on anyway). Just because some of the absolute latest titles (or some poorly coded ones) might crash doesn't really mean much. Most of the time things are made to scale all the way back to consoles, which occasionally technically have more VRAM, but only technically. Keep in mind most people on Steam surveys are also rocking 60-level cards and even below. Devs would be dumb to code games that can't scale down far enough to play on majority hardware at all.

So 8 GB in a vacuum, at the right price point, is fine. Like someone else said, the problem is the price. At this point 8 GB should not be pushing above the $300 range at absolute most, IMO. If you're pushing anywhere near half a grand, then it's dumb to get a card that handicapped. Just adding 4 GB more opens up a lot of possibilities, and with DLSS, 12 GB is plenty for a lot of titles. 16 GB is, I think, the sweet spot for power users at the moment, and that's why the 9070/XT and 5070 Ti are in good spots for price-to-perf and longevity (at least at MSRP; at current actual going prices the 9070 XT is a shit deal). Even the most demanding games rarely go above 16 GB at the visuals-vs-framerate settings they can actually push.

The 5060 Ti 16GB is just a dumb card. It probably can't use that much, and it's still priced too high for what it is. They should have instead made a 5070 with 16GB and maybe priced it at $600. Or hell, scooted the 5070 price down to $500 and then made the 16GB one $575-625 or something. That would actually be a reasonable take. Then the 5070 Ti stays at roughly the same price point with 16GB, and give the 5080 24GB.

The 5080 is IMO just overpriced, especially at current going prices. If you're considering spending that much, just try to find a 4090 instead, which has more VRAM, is faster, and has a lot more compute power.

Putting aside that this whole gen is kind of shit anyway; just analyzing the least-worst options.
And yet here we are...

View: https://www.youtube.com/watch?v=AdZoa6Gzl6s

It's not a coding issue... or poor optimization. Sure, some games may choose to just skip textures or implement annoying texture pop-in techniques rather than crash, but the result is the same: unplayable games.
Is anyone spending $500 or so on a GPU really going to be happy having to play at medium settings at 1080p? Even if they are, this is still a marketing issue, big time. I mean, go to Nvidia's 5060 site... does it say "enjoy the best medium-settings 1440p experience $500 can buy"? Or does it say enjoy RTX, ray tracing, DLSS 4, multi frame gen? All the cool things like RT require VRAM. So even if your game's textures neatly fit into 8GB of RAM... using all the extra stuff is a no-go.

Nvidia did do us a favor in a way though... and provided two versions of the 5060 Ti: same exact die, two RAM configs. The difference is very obvious when you're not comparing something like a 2080 to a new GPU; you can just look at the two versions of the 5060 Ti and see the obvious difference.
From the HUB testing:
The Last of Us... 1080p is good up to high settings; higher settings or RT put it over. At 1440p, medium settings max, no RT, without running out of RAM.
Alan Wake II... at 1080p, again, high settings are fine; ray tracing OR frame gen pushes the RAM over 10GB. At 1440p an 8GB card can't fit more than low-quality textures.
Cyberpunk... 1080p high is fine; RT uses 11GB. At 1440p, medium settings.
Forza... 1080p can't handle more than high settings; RT pushes past 10GB. 1440p, same thing.
Ghost of Tsushima... is a bit better; you can play at 1080p with high and RT and even frame gen. At 1440p that is pushing it.
Horizon Forbidden West... 1080p high settings max; 1440p high is also doable. RT and frame gen will push past 10GB.
Hogwarts... same thing, 1080p high; forget RT or frame gen unless you want texture pop-in.
Ratchet & Clank... 1080p medium settings are over 8GB.

[And to be clear, in all these examples the 16GB version was happily gaming away at 80-120 FPS and had headroom to flip on extra VRAM-intensive features like frame gen and RT, while the 8GB version was producing a slide show, crashing, or presenting games with badly missing textures.]

The 5060 Ti is more than capable of playing all those games at pretty decent frame rates with max settings and some reasonable RT settings. The die is capable; the 8GB version can't handle any of it due to the lack of frame buffer.

The only argument people make is esports titles... or just not playing unoptimized games. But those are both bad-faith arguments, aren't they? I mean, esports players may want to sell their card at some point. Why pay 90% of the price of the real product for a version that can only handle esports low settings? As for the optimization level of modern games: it is what it is. Now people are just saying developers should hold games back. Don't use the big texture pack... make sure it fits into 8GB. OR create your game in a way that I don't notice the textures popping in on those distant trees. I mean, what are we really asking for? All the modern games have 11-14GB of high-res textures. How do you optimize that to fit into an 8GB buffer exactly? And OK, so the 5060 Ti isn't intended to play at ultra settings... OK, BUT the 16GB version CAN. Say what I may about Nvidia, their 5060 Ti hardware is capable of playing all the games HUB looked at at 100 FPS, ultra-type settings, at 1080p and 1440p. Maybe at 4K it's not an ultra-settings card... but it actually is a very capable die for the mass-market, ~100 FPS, ultra-settings, single-player gamer.

The 16GB 5060 Ti CAN in fact use the 16GB. Watch the HUB video: plenty of games where the 5060 Ti 16GB is using its full RAM, 12GB+, with no issues. You can also look at multiple games where 8GB vs 16GB is like 30 FPS vs 80 FPS, the only difference being that the 16GB version is using like 9GB of RAM. It's JUST over. But just over is over, and now the 8GB version has to swap to system memory, which kills performance. They also gave it the best-case scenario as far as the board it was on; a lot of people will buy these for older mobos, and they will see even worse numbers when their games at high 1080p settings break the 8GB by even just a little.
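
To see why going "just over" is so brutal: once the working set spills past the 8GB buffer, the overflow has to be served over PCIe, which is an order of magnitude slower than GDDR7. Here's a crude effective-bandwidth sketch (the 448 GB/s figure is the 5060 Ti's rated memory bandwidth; the PCIe numbers and the uniform-access assumption are illustrative, not HUB's measurements):

```python
# Crude model: if a fraction of memory accesses spill from VRAM to system
# RAM over PCIe, effective bandwidth is the weighted harmonic mean of the
# two paths. Assumes accesses are spread uniformly over the working set.

def effective_bw(working_set_gb, vram_gb, vram_bw_gbs, pcie_bw_gbs):
    spill = max(0.0, working_set_gb - vram_gb) / working_set_gb
    return 1.0 / ((1.0 - spill) / vram_bw_gbs + spill / pcie_bw_gbs)

VRAM_BW = 448.0  # GB/s, the 5060 Ti's rated GDDR7 bandwidth
for label, pcie_bw in (("PCIe 5.0 x8", 32.0), ("PCIe 4.0 x8 (older board)", 16.0)):
    bw = effective_bw(9.0, 8.0, VRAM_BW, pcie_bw)
    print(f"{label}: ~{bw:.0f} GB/s effective (~{VRAM_BW / bw:.1f}x slower)")
# Spilling just 1GB of a 9GB working set costs more than half the bandwidth,
# and roughly 4x on an older PCIe 4.0 board -- the 80 FPS -> 30 FPS cliff.
```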
 
And yet here we are...

View: https://www.youtube.com/watch?v=AdZoa6Gzl6s

It's not a coding issue... or poor optimization. Sure, some games may choose to just skip textures or implement annoying texture pop-in techniques rather than crash, but the result is the same: unplayable games.
Is anyone spending $500 or so on a GPU really going to be happy having to play at medium settings at 1080p? Even if they are, this is still a marketing issue, big time. I mean, go to Nvidia's 5060 site... does it say "enjoy the best medium-settings 1440p experience $500 can buy"? Or does it say enjoy RTX, ray tracing, DLSS 4, multi frame gen? All the cool things like RT require VRAM. So even if your game's textures neatly fit into 8GB of RAM... using all the extra stuff is a no-go.

Nvidia did do us a favor in a way though... and provided two versions of the 5060 Ti: same exact die, two RAM configs. The difference is very obvious when you're not comparing something like a 2080 to a new GPU; you can just look at the two versions of the 5060 Ti and see the obvious difference.
From the HUB testing:
The Last of Us... 1080p is good up to high settings; higher settings or RT put it over. At 1440p, medium settings max, no RT, without running out of RAM.
Alan Wake II... at 1080p, again, high settings are fine; ray tracing OR frame gen pushes the RAM over 10GB. At 1440p an 8GB card can't fit more than low-quality textures.
Cyberpunk... 1080p high is fine; RT uses 11GB. At 1440p, medium settings.
Forza... 1080p can't handle more than high settings; RT pushes past 10GB. 1440p, same thing.
Ghost of Tsushima... is a bit better; you can play at 1080p with high and RT and even frame gen. At 1440p that is pushing it.
Horizon Forbidden West... 1080p high settings max; 1440p high is also doable. RT and frame gen will push past 10GB.
Hogwarts... same thing, 1080p high; forget RT or frame gen unless you want texture pop-in.
Ratchet & Clank... 1080p medium settings are over 8GB.

[And to be clear, in all these examples the 16GB version was happily gaming away at 80-120 FPS and had headroom to flip on extra VRAM-intensive features like frame gen and RT, while the 8GB version was producing a slide show, crashing, or presenting games with badly missing textures.]

The 5060 Ti is more than capable of playing all those games at pretty decent frame rates with max settings and some reasonable RT settings. The die is capable; the 8GB version can't handle any of it due to the lack of frame buffer.

The only argument people make is esports titles... or just not playing unoptimized games. But those are both bad-faith arguments, aren't they? I mean, esports players may want to sell their card at some point. Why pay 90% of the price of the real product for a version that can only handle esports low settings? As for the optimization level of modern games: it is what it is. Now people are just saying developers should hold games back. Don't use the big texture pack... make sure it fits into 8GB. OR create your game in a way that I don't notice the textures popping in on those distant trees. I mean, what are we really asking for? All the modern games have 11-14GB of high-res textures. How do you optimize that to fit into an 8GB buffer exactly? And OK, so the 5060 Ti isn't intended to play at ultra settings... OK, BUT the 16GB version CAN. Say what I may about Nvidia, their 5060 Ti hardware is capable of playing all the games HUB looked at at 100 FPS, ultra-type settings, at 1080p and 1440p. Maybe at 4K it's not an ultra-settings card... but it actually is a very capable die for the mass-market, ~100 FPS, ultra-settings, single-player gamer.

The 16GB 5060 Ti CAN in fact use the 16GB. Watch the HUB video: plenty of games where the 5060 Ti 16GB is using its full RAM, 12GB+, with no issues. You can also look at multiple games where 8GB vs 16GB is like 30 FPS vs 80 FPS, the only difference being that the 16GB version is using like 9GB of RAM. It's JUST over. But just over is over, and now the 8GB version has to swap to system memory, which kills performance. They also gave it the best-case scenario as far as the board it was on; a lot of people will buy these for older mobos, and they will see even worse numbers when their games at high 1080p settings break the 8GB by even just a little.



I'm not sure how you missed the majority of my post's point. I'm going to assume I just stated my points poorly.
  1. I said that 8GB should be for cards below a ~$300 MSRP at most. I don't know if you noticed, but the 5060 Ti is $380-400+. That would exclude it from the cards that I think are fine at 8 GB. The 2080 was just illustrative because I've played many recent titles on it myself, even up to 4K. It does okay, especially with DLSS performance on. I'm not actually sure how I should have stated this more clearly.
    At this point 8 GB should not be pushing above the $300 range at absolute most, IMO. If you're pushing anywhere near half a grand, then it's dumb to get a card that handicapped. Just adding 4 GB more opens up a lot of possibilities, and with DLSS, 12 GB is plenty for a lot of titles
  2. Just because I said that 16GB was stupid for a card at this performance level doesn't mean I think that 8GB was fine for it. I guess I didn't put this down explicitly (enough), but I think 12GB should have been what it released with. Likewise, I think the 5070 should have released with a 16GB variant available... which I also said... explicitly.
All of your examples show it pushing just past 10-11GB, and in edge cases. 12GB would have been perfectly fine.

But to put things in perspective, even on it, gaming at 8GB vs 16GB:
[TechPowerUp 5060 Ti 8GB vs 16GB relative performance charts]

https://www.techpowerup.com/review/gainward-geforce-rtx-5060-ti-8-gb/33.html

The only place 16GB actually makes a noticeable difference on average is 4K. I'm sure some people are gaming at 4K on it, but I think they would be much better served by anything higher up the stack, especially considering the 5070 performs 38-43% better at 4K for only a 28% increase in price. I'm sure it would choke on some games that push past 12GB (which is rare; I have played Cyberpunk fully maxed and it manages memory well enough to be fine at 12GB), but one has to keep in mind that the memory bandwidth of the 5070 is much higher, probably due to its 192-bit vs 128-bit bus.
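
For what it's worth, the napkin math on that comparison (using Nvidia's $429 and $549 MSRPs; the 38-43% uplift is just the range cited above, so treat the output as illustrative):

```python
# Perf-per-dollar sketch: 5060 Ti 16GB vs 5070 at 4K, at MSRP.
# The uplift range is the rough figure quoted above, not fresh benchmark data.

ti_price, s70_price = 429.0, 549.0      # Nvidia MSRPs
price_ratio = s70_price / ti_price      # ~1.28 -> the "28% more" figure
for uplift in (1.38, 1.43):
    gain = uplift / price_ratio - 1.0
    print(f"{uplift - 1:.0%} faster for {price_ratio - 1:.0%} more money "
          f"-> {gain:+.0%} perf per dollar at 4K")
# ~+8% to +12% more performance per dollar for the 5070 at MSRP
```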

This is to say nothing of the idiotic prices that the 5060 Ti is actually going for sometimes right now, and we're not factoring the 9070 series into the equation which would skew things further... since (... at MSRP........) it can actually sort of use the 16GB it comes equipped with for roughly the same price.

If you want a good fucking laugh, look at some of this shit:

[screenshot of current 5060 Ti retail listings]


To put this in perspective, there are 5070s actually in stock right this moment for as low as $675. Which is a mere 10% more than the highest-end idiocy of this 5060 Ti chip. Outside of some clear edge cases, the 5070 will beat the everloving shit out of the 5060 Ti at almost anything, 16GB or not. This is to say nothing of the occasional 3080 Ti's which have been floating around for ~$400-500 and essentially match the 5070 in performance (while having a lot more compute power).
 
TPU charts don't show 1% lows. Frame stutter due to the GPU running out of VRAM isn't going to show up as easily in the average framerate. And I doubt anyone is going to say frame stutter is no big deal.

8GB is fine on a 'just a video adapter' / 50-series-style $150-200 product, but 1080p-or-higher 'gaming' cards should start at 12GB.

I also remember, 15 years ago or so, the issue was some cards had too much VRAM. I remember when people said you don't need 4GB of VRAM for gaming, 2GB is just fine, and GPU makers were ripping people off by selling 4GB cards. That'd be like a 4060 Ti shipping with 32GB of memory nowadays. It just kinda shows you how stagnant VRAM has been for the past decade.
 
To put this in perspective, there are 5070s actually in stock right this moment for as low as $675. Which is a mere 10% more than the highest-end idiocy of this 5060 Ti chip. Outside of some clear edge cases, the 5070 will beat the everloving shit out of the 5060 Ti at almost anything, 16GB or not. This is to say nothing of the occasional 3080 Ti's which have been floating around for ~$400-500 and essentially match the 5070 in performance (while having a lot more compute power).
To be clear, I didn't mean my post to sound like I was attacking you, which it probably does. o7 Nvidia (and potentially AMD, if they release an 8GB 9060) are the enemy. You're right that 12GB would be more acceptable at the 60 mark. Nvidia pulled the same crap last generation. I get that memory controller bit depths impose hardware constraints, but selling 12GB 70s and 16GB 60s is just BAD market segmentation. Pay more for less... it's not cool, and it's why 5070s can be found in stock: no one is scalping 12GB cards, and that should say it all.

Though I agree 12GB would have been better, I respectfully disagree that it should have been the goal. I think it's time for the in-between 12GB-type solutions to go away as well. I would say the 5070 12GB should never have been released either. I have to assume they're not selling 100% because they only have 12GB.

I agree that 12GB is generally enough in a 60-class card today. But I think everyone can see that it might not be in a year or two, and who wants to buy a product that is headed to the same place 8GB is in a year or two? I don't think anyone really wants to be spending $600 or so every year. The issue with VRAM as I see it isn't just textures; it's all the extra stuff Nvidia thinks people should be using. Multi frame generation actually uses more VRAM. Ray tracing uses VRAM. Even Reflex uses a little bit of extra VRAM.

12GB would for sure be a lot better... and it wouldn't catch hell from reviewers. Still, I think it's clear consumers have decided 12GB is not attractive either. Going through the testing HUB did, I would say half the games they looked at that were 100% playable on the 16GB 5060 Ti were using 13-14GB of RAM at 1440p. The 8GB cards clearly showed that a game using just a little over 8.5GB was enough to tank performance or cause issues/crashes: half the FPS of the 16GB version when a game goes over the VRAM pool by 500MB. The same thing would happen at 12GB, just at 1440p instead of 1080p.
 
To be clear, I didn't mean my post to sound like I was attacking you, which it probably does. o7 Nvidia (and potentially AMD, if they release an 8GB 9060) are the enemy. You're right that 12GB would be more acceptable at the 60 mark. Nvidia pulled the same crap last generation. I get that memory controller bit depths impose hardware constraints, but selling 12GB 70s and 16GB 60s is just BAD market segmentation. Pay more for less... it's not cool, and it's why 5070s can be found in stock: no one is scalping 12GB cards, and that should say it all.

Though I agree 12GB would have been better, I respectfully disagree that it should have been the goal. I think it's time for the in-between 12GB-type solutions to go away as well. I would say the 5070 12GB should never have been released either. I have to assume they're not selling 100% because they only have 12GB.

I agree that 12GB is generally enough in a 60-class card today. But I think everyone can see that it might not be in a year or two, and who wants to buy a product that is headed to the same place 8GB is in a year or two? I don't think anyone really wants to be spending $600 or so every year. The issue with VRAM as I see it isn't just textures; it's all the extra stuff Nvidia thinks people should be using. Multi frame generation actually uses more VRAM. Ray tracing uses VRAM. Even Reflex uses a little bit of extra VRAM.

12GB would for sure be a lot better... and it wouldn't catch hell from reviewers. Still, I think it's clear consumers have decided 12GB is not attractive either. Going through the testing HUB did, I would say half the games they looked at that were 100% playable on the 16GB 5060 Ti were using 13-14GB of RAM at 1440p. The 8GB cards clearly showed that a game using just a little over 8.5GB was enough to tank performance or cause issues/crashes: half the FPS of the 16GB version when a game goes over the VRAM pool by 500MB. The same thing would happen at 12GB, just at 1440p instead of 1080p.

No, I came off as unnecessarily combative too, sorry.

Anyway, I need to make a clear distinction. There's what I think is ideal, and there's what I think is "realistically achievable considering all the factors".

In other words, even with "Nvidia being Nvidia", I still think the 5060/Ti should have just straight up come with 12GB. Like for their own good in the press (and actually then they wouldn't have had to make 2 different model lines, too, which is silly), not just for gamers' sake. Then the 5070 should have either at least come with a 16GB variant or just straight up come with 16GB. The 5070 Ti is OK where it's at (20GB would have been nice, though). The 5080 needs 20-24GB to offer any meaningful differentiation between it and the 5070 Ti considering the price disparity. They also need a 5080 Super to actually fill in the stupidly large void in between the 5080 and the 5090. Currently the "better than 5080, worse than 5090" is I guess just a 4090. The void between the 5080 and the 5090 is even larger than the void between the 4080 and 4090 was.

Now, ideally, I think at this point frankly everything should just be 16GB+. Nvidia should know by now that it fucked up pretty bad with releasing the 5070 at 12GB, because some occasions already exist where it can surpass that (and at settings that it can actually push). But are they getting punished for it? Only a little. MSRP 5070s got bought out left and right. It kind of makes sense.
  • If you're looking at the ~500-600$ price range strictly (which is common at midrange), you may not have the means to jump to the 5070 Ti to get that 16GB.
  • 9070/XT don't exist anywhere near MSRP.
  • Even as good as it has gotten, the 9070/XT still has some issues with ray tracing and FSR4 isn't as good as DLSS if you're intending to play any older titles, so gamers sitting here weighing all of the options might still be buying these "shitty" cards.

Does Nvidia care? We can sit here and talk all day about what things should have come with. But then people shouldn't be buying 5090s at $3.5k+. People shouldn't be buying 5060 Ti's at 8GB. People shouldn't be buying 5060 Ti's at all at $600+. But they still do it. Voting with the wallet only works if it happens. Considering all these factors and market conditions, most of the time that's where I'm coming from. Just purely realistically. Tariffs are making things even worse now because companies do technically have justification for raising prices with all of this nonsense.
 
What AMD should do: use the naming and MSRP announcement the web pundits want to see so they can gather all the good press and reviews, then let the market actually sell them at a much higher price (it is not like people can go and click add-to-cart on a $250 B580...).

People shouldn't be buying 5060 Ti's at 8GB.
And that's what they are probably doing. It is hard to judge because so few markets have choice, but in those where people have some:
https://www.overclockers.co.uk/pc-components/graphics-cards?sort=order_count_desc

Not many 5060 Ti 8GB cards show up in the best-selling lists (the only 5060 Tis featured in the best-selling list are all 16GB). Here I need to reach page 9, after ~10 SKUs of the 16GB variant, before finding the first 5060 Ti 8GB, and the site's algorithm, when ranking 5060 Tis by relevance, shows twelve 16GB models before the first 8GB one.

It is possible the 8GB is a low-volume card meant to make the 16GB variant look good; the price gap between the two is smaller this time around. Or it could be more a card for OEM pre-builts than for DIY direct purchase, where its pricing gets masked in with all the other components.
 
Things are looking good for the direction of RTG’s leadership if these rumors are true. If what Tom from MLID said in a previous video about GDDR6 only costing AMD $2 per GB is true, then there’s absolutely no reason to release an 8GB variant. The market would want it to be at least ~$70 cheaper, but AMD would only save $16 on the VRAM and take a lot of criticism in the process. If they cancel the 8GB model even though it’s already in production, it will be the second time in a few months that Jack Huynh has been willing to pull the emergency brake to stop the company from shooting itself in the foot, the first being the 6-week delay of the 9070 launch to ensure that supply and drivers were ready. It would signal that RTG leadership might be up to the task of bringing real competition back to desktop GPUs, and it’s about damn time.
 
If what Tom from MLID said in a previous video about GDDR6 only costing AMD $2 per GB is true,
That's a really big if; not that we ever see those costs in public to have any idea ourselves... but:

https://www.alibaba.com/product-det...offerlist.normal_offer.d_title.18b313a0FytKu0
On Alibaba they seem to be around ~$18 each, for much slower GDDR6 RAM (14 Gbps).

$11.50 here:
https://www.zeusbtc.com/ASIC-Miner-Repair/Parts-Tools-Details.asp?ID=1476

$8.50 here ($13 if you want 24 Gbps):
https://www.aliexpress.com/item/1005006125455905.html

It moves around a lot in public-facing "deals"; buying in bulk like they do can get you a better deal, but maybe they pay a bit extra in exchange for a guarantee of volume delivered on time as well.

If they can get them at ~$7 each ($3.50 per GB), that's $28 more in raw cost; add ~$5 for the clamshell design (added PCB, cooling, assembly, failure rate, all the extras), apply a 40% margin, and you get around the $50 we tend to see in extra cost for the 16GB version.

If it is ~$4 each ($2 per GB), that's $16 more in raw RAM cost, plus $5 in everything it involves; at a 40% margin that's ~$35 more on what they charge OEMs for the basic BOM that includes memory and GPU. It could still be a significant price difference between the two.
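
As a quick sanity check on that math (the per-GB costs, the $5 adder, and the margin taken as 40% of the final price are this post's assumptions, not disclosed figures):

```python
# BOM sketch for the 8GB -> 16GB retail gap under the assumptions above.
# Margin is modeled as a fraction of the final price: price = cost / (1 - margin).

def retail_delta(cost_per_gb, extra_gb=8, extras=5.0, margin=0.40):
    cost = cost_per_gb * extra_gb + extras  # extra VRAM + clamshell/PCB/assembly
    return cost / (1.0 - margin)

print(f"$3.50/GB -> ~${retail_delta(3.50):.0f} retail gap")  # ~$55, near the ~$50 we see
print(f"$2.00/GB -> ~${retail_delta(2.00):.0f} retail gap")  # ~$35
```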
 
I've been saying this in the 5060 threads, and people scoffed calling me an Nvidia shill. Now people are saying the same fucking thing I was saying, but in an AMD thread and all of a sudden 8GB cards are fine.

No, they are not fine. If the card is sub $200, then sure. If the card is above $200, then not fine. Above $300 then you're smoking crack. Above $400, then god is smoking crack.
 
No, they are not fine. If the card is sub $200, then sure. If the card is above $200, then not fine. Above $300 then you're smoking crack. Above $400, then god is smoking crack.
I agree; I don't think anyone (any more than in the 5060/Ti thread) is saying '8GB is fine'.

Instead I think they're saying '8GB is fine IF IT'S CHEAP'

The RX 570 had 4GB back in the day and sold for pennies, and it kept a LOT of budget gamers happy for a long time. This was at a time when gamers thought 4GB was too small. And they were right. 4GB was too small... for anything other than an 'as cheap as can be' card that could still at least play the games of the time. It was priced as a competitor to the 1050 Ti and the 1650 Ti. It was a '50-class' card.

The 8GB card AMD should sell today is a '50-class' card, which is what gamers typically fall back on when they resort to searching the couch cushions for an extra few bucks. Anything lower than a '50-class' card and you're looking at something that no longer qualifies as a 'gaming' card, like a '40-class' or '30-class', and most likely will fall behind integrated solutions nowadays.

'30-class' GPUs have always been passively cooled and used for getting grandma's computer back online so she can video call.
 
I've been saying this in the 5060 threads, and people scoffed calling me an Nvidia shill. Now people are saying the same fucking thing I was saying, but in an AMD thread and all of a sudden 8GB cards are fine.

The reason for it is that Nvidia wants to sell you on their features, correct? However, a lack of VRAM even at 1080p can prevent you from using them at all. Once that happens you're on the raster train, and this is while they are charging you a premium for the card. Furthermore, you can't grade the card on features that may or may not be usable. You've got to look at native performance, and once you do, things start to get nasty. It's really a 50-class card, as 8GB isn't enough for 1440p anymore. If AMD releases the 9060 XT 8GB at the same price point it will get dunked on as well.
 
Nvidia would be called evil if they were giving them away free, as it would be some part of a greater plot for world domination.

But at $199 the price-performance ratio would at least be pretty fantastic.
Dude, and if it had a 16gb variant? LLM machine!!!
 
With the Radeon RX 7600 still in production, AMD can rebadge it as the Radeon RX 9050 XT.
The 7600 is 33% bigger than the upcoming 9060 XT, so I'm not sure it would be a way to save money; they went quite all-in on price cutting as much as they could, as they promised to do... (Is the 7600 really still in production? Nothing seems to be available under $50 over MSRP, in the US market at least, and not many SKUs seem to be left.)

If they start to launch 9000 series that do not support FSR 4.x, that could become a bit of a branding challenge as well.
 
No reason to make games efficient. South Park said it best: “Put 16GB in it and make it play”
 
The 7600 is 33% bigger than the upcoming 9060 XT, so I'm not sure it would be a way to save money; they went quite all-in on price cutting as much as they could, as they promised to do...
How do you know that? As far as I know, the die size has not been disclosed.

The Radeon RX 7600 uses TSMC N6, so it shouldn't be that expensive to produce.

(Is the 7600 really still in production? Nothing seems to be available under $50 over MSRP, in the US market at least, and not many SKUs seem to be left.)
Yes. AMD just launched a refresh called the Radeon RX 7650 GRE at $249.

If they start to launch 9000 series that do not support FSR 4.x, that could become a bit of a branding challenge as well.
At $249, I think the expectation will be low.
 
How do you know that? As far as I know, the die size has not been disclosed.
That the TPU spec:
https://www.techpowerup.com/gpu-specs/amd-navi-44.g1070

Maybe that's a best-guess rumour fill-in until actual numbers come in (it would be small versus the 5060), but as a general rule, if it were not cheaper to achieve 7600 performance using RDNA 4, they may as well just rename the 7700 XT, 7800 XT, and so on as well.

Yes. AMD just launched a refresh called the Radeon RX 7650 GRE at $249.
We would need to see the date on the die to have any idea too; they could be 2024 chips, and that could be a way to help clear out all the Navi 33 stock for when the 9050 launches.

At $249, I think the expectation will be low.
That would have been true of the $150 1650/2050 Turing in 2019; they still did not use the RTX and 2000 branding on it, being tensor- and RT-core-less. It is a bit cleaner if everything in your 9000 series does something than having a subclass. If they want to sell a $250 7600, bringing in a $250 7650 GRE and letting that be the tier under the 9060 could do very well.
 
I doubt that many would buy a Radeon RX 9060 with 96-bit bus GDDR6, never mind a Radeon RX 9050 with 64-bit bus GDDR6.
All depends on the price.

"There is no such thing as a bad video card just bad prices." Elmer Fudd
 
No, they are not fine. If the card is sub $200, then sure. If the card is above $200, then not fine. Above $300 then you're smoking crack. Above $400, then god is smoking crack.
When was the last time an "entry level" product was sub-$200? Nvidia did have the nerfed 3050 edition with only 6GB of RAM, though I'm not sure what markets that sold in; the normal 3050 was over $200. The 3060 was only available in the smoking-crack edition, ditto the 2060. The 1650 was sub-$200, but let's be honest, friends didn't let friends get a 16-series card. Hell, the only sub-$200 ($199) 1060 was the 3GB version; the normal 1060 was over $209. Hell, the 960 was $199. And these are all actual prices of the day, not inflation-adjusted prices.

So I get what you're saying, but your demands are for something that has never existed when you account for the value of the dollar. So sure, the entry-level product should be $199; now get into a time machine and you can make that happen.
 
Those are just placeholders

Maybe that's a best-guess rumour fill-in until actual numbers come in (it would be small versus the 5060), but as a general rule, if it were not cheaper to achieve 7600 performance using RDNA 4, they may as well just rename the 7700 XT, 7800 XT, and so on as well.


We would need to see the date on the die to have any idea too; they could be 2024 chips, and that could be a way to help clear out all the Navi 33 stock for when the 9050 launches.
When you put something new in production, it costs a lot of money.

AMD can cut the price of the Radeon RX 7600 because it has long been in production.

That would have been true of the $150 1650/2050 Turing in 2019; they still did not use the RTX and 2000 branding on it, being tensor- and RT-core-less. It is a bit cleaner if everything in your 9000 series does something than having a subclass. If they want to sell a $250 7600, bringing in a $250 7650 GRE and letting that be the tier under the 9060 could do very well.
I was discussing mullet's hypothetical scenario in which AMD launches the Radeon RX 9050 XT 8GB and how that would happen.

In that scenario, AMD would relaunch the Radeon RX 7600 as the Radeon RX 9050 XT.
 
Wouldn't a discrete graphics card be vast overkill for that use case?
Depends on whether you want to push high refresh rates at 4K HDR or higher with the latest DP or HDMI versions, and to be able to use it for encodes, for those of us who still like to buy physical media. It'd probably still run a lot better with a discrete card.
 