
[MLID] AMD wary of launching 8GB 9060 XT

Not quite that cheap; it depends on the chip type. You can get the slower, low-grade stuff for peanuts, but the faster, higher-quality stuff comes at a premium.
But GDDR is in a weird place: the cheap stuff is at record lows, while the good-quality stuff is still high, not peak-COVID high, but still higher than it should be. The manufacturers all shifted capacity to HBM and GDDR7, so good GDDR6 is in short supply and the price is up.

Funny how that happens: when a chip gets cheap, they stop making it, so there are fewer of them, so the price goes up. Almost like the two companies who can reliably make the stuff have a vested interest in keeping prices high. It's not collusion, though, because they don't communicate their production cuts; they each just do it on their own, because they need to keep prices in check…
 
8GB 9060XT. GDDR6 is less than $3/GB.
Hardware Unboxed said the 16GB model costs around $30 more to manufacture, including more complicated packaging.

That would already be 10% of the MSRP, assuming the Radeon RX 9060 XT 8GB is $299.
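For anyone wanting to check the math, a quick back-of-envelope sketch using only the figures quoted above (the per-GB price and HUB's ~$30 delta are the thread's numbers, not confirmed BOM data):

```python
# Rough cost math for the 8GB vs 16GB variants, using the figures quoted
# above; per-GB price and the $30 delta are the thread's numbers, not
# confirmed bill-of-materials data.
gddr6_price_per_gb = 3.00   # USD, upper bound quoted above
extra_vram_gb = 8           # 16GB model vs 8GB model

memory_cost = extra_vram_gb * gddr6_price_per_gb   # ~$24 in chips alone
board_overhead = 30 - memory_cost                  # HUB's ~$30 implies ~$6 for board/packaging
msrp_8gb = 299

print(f"extra memory cost:          ${memory_cost:.0f}")
print(f"implied packaging overhead: ${board_overhead:.0f}")
print(f"$30 delta as share of MSRP: {30 / msrp_8gb:.1%}")   # ~10%
```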

I am not saying the Radeon RX 9060 XT 8GB will be a good product: it won't be. But I can see why AMD would offer it.

Suppose AMD only launched the Radeon RX 9060 XT 16GB at $349 MSRP; that would leave the $299 GeForce RTX 5060 with no competition at its price.

My issue is the name. It should be the Radeon RX 9060 8GB, making clear that it's a different product from the Radeon RX 9060 XT 16GB.
 
But TSMC can't supply three times as many chips, so that strategy doesn't work.

You can all keep saying that things don't work this way, but time and time again AMD and Nvidia show us that they do. And until people stop paying for the GPUs, it won't change. But people keep buying them, so they won't.
I think we're saying the same thing, just emphasizing different points. I'm saying that TSMC's limited supply is the reason prices are so high. NV and AMD can't make enough chips to satisfy the full gaming and compute demand anyway, so why not charge more? When the supply stops being so constrained, prices will go down.
 
AMD then (on the 12GB Navi 22)

https://www.amd.com/en/products/graphics/gaming/vram.html

Having enough VRAM is important for smooth and detailed graphics in games.​

Headroom to Max Out Settings​

High quality textures. Beautiful shaders. Immersive raytracing. Having more video memory gives you more control over dialing your in-game graphics settings to max, enabling features like raytracing, all while maintaining good frame rates. Having ample VRAM reduces visual issues that can occur, like texture pop-in.​

AMD now

https://x.com/AzorFrank/status/1925651286998794443

Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn't build it if there wasn't a market for it. If 8GB isn't right for you then there's 16GB. Same GPU, no compromise, just memory options.​
 
AMD then (on the 12GB Navi 22)

https://www.amd.com/en/products/graphics/gaming/vram.html

Having enough VRAM is important for smooth and detailed graphics in games.​

Headroom to Max Out Settings​

High quality textures. Beautiful shaders. Immersive raytracing. Having more video memory gives you more control over dialing your in-game graphics settings to max, enabling features like raytracing, all while maintaining good frame rates. Having ample VRAM reduces visual issues that can occur, like texture pop-in.​

AMD now

https://x.com/AzorFrank/status/1925651286998794443

Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn't build it if there wasn't a market for it. If 8GB isn't right for you then there's 16GB. Same GPU, no compromise, just memory options.​
You quote that like there's some kind of conflict between those two statements, but there isn't. It's both true that if you want maxed-out settings you need more VRAM, and that the majority of people aren't playing that way. Geez, your second quote even says "if 8GB isn't right for you then there's 16GB". Compare that to Nvidia's "the 5060's great compared to our last-generation version, as long as you use upscaling and 3/4 of your frames are fake!"
 
...

My issue is the name. It should be the Radeon RX 9060 8GB, making clear that it's a different product from the Radeon RX 9060 XT 16GB.
When did that argument crop up? I blinked and suddenly it's being parroted everywhere on the internets, as if 8GB vs 16GB is not clear and honest enough. Even the most stoopid of consumers knows 16 is better than 8.
 
Comment:

People aren't buying 8 GB graphics cards because they want to, they are doing it because there isn't an alternative.

https://hole-in-my-head.blogspot.com/2025/05/so-whats-next-rate-of-advancement-in.html?m=1
This is exactly the problem. Just because AMD offers a 16GB variant for $350 doesn't mean people can afford to buy it. Let's be honest, you won't find it for $350 either. There's a reason why the RTX 3060 is currently the most popular GPU on Steam: it's cheaper and has 12GB of VRAM. If all you can buy is 8GB cards, then nobody should complain that nobody buys 1440p and 4K monitors. What would be the point when 8GB is the only affordable choice? 1080p isn't even entirely feasible with 8GB, as some games do require more. What's worse is that 8GB cards may be entirely reliant on upscalers like FSR4 and DLSS4 to reduce VRAM usage. So if you don't want upscaling, or can't get upscaling like in Elden Ring, then you're screwed.
 
, then nobody should complain that nobody buys 1440p and 4K monitors.
Would anyone ever complain about something like that (how would you even start to care)? 4K TVs make sure game designers don't forget about 4K resolution anyway…

Comment:

People aren't buying 8 GB graphics cards because they want to, they are doing it because there isn't an alternative.

https://hole-in-my-head.blogspot.com/2025/05/so-whats-next-rate-of-advancement-in.html?m=1
That's a bit of a strange statement:
we're shipping fewer dedicated GPUs (dGPUs) than even in 2021

Even 2021? Its own graph shows you need to go back to 2013 for a non-crypto-boom year with more GPUs shipped than 2021, which was a peak year in that regard, with COVID and giant Ethereum demand. Shipping less than 2019 is the interesting data point, which seems explained by AMD seemingly leaving the market, apparently unable to make and sell GPUs at a profit with their RDNA 3 design (or just making so much Epyc money, plus the PS5 Pro SoC obligation; probably a mix of both, considering how low their gaming-segment margin got over that time).

There's a reason why the RTX 3060 is currently the most popular GPU on Steam
And it's #2 on newegg.com's best-selling GPU list, but the 4060 and 6600 still sell well too, and the 7600 is #1 at Best Buy, so there's still a big market for 8GB cards. In general, though, when 80% of the SKUs are sold out (or their equivalent, priced out), we can't really use customer choice to judge what people want.
 
 
And it's #2 on newegg.com's best-selling GPU list, but the 4060 and 6600 still sell well too, and the 7600 is #1 at Best Buy, so there's still a big market for 8GB cards. In general, though, when 80% of the SKUs are sold out (or their equivalent, priced out), we can't really use customer choice to judge what people want.
Crazy thing is the RX 7600 and 7600XT don't even show up on the latest Steam hardware survey. The 7900XTX does along with the 7800XT and 7700XT, but the 7600 cards are AWOL. Somehow the 7900XTX is more common than the 7800XT, which in turn is more common than the 7700XT. It's completely backwards from NV where the cheaper SKUs are more common. Hopefully they'll make enough 9000 series cards to correct this. The 9070 & XT are a good value at MSRP or at least decently cheaper than a 5070Ti, and the 16GB 9060XT likely will be as well, but if they don't order enough chips from TSMC...

If AMD really wants to stick it in and twist they should make a 9060 or 9050 with 16GB. Just guessing based on the 9060XT having half the cores & mem bus and very slightly higher clocks than the 9070XT, I'm thinking it'll land a bit short of the 5060Ti unless you compare a 16GB model to an 8GB model and the 8GB model runs out of vram. That leaves room for a cut down version to use up the slightly imperfect dies. Cut it down a little, roughly match a 5060 in GPU performance, and offer a 16GB version.
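As a rough sanity check on that guess, here is a naive scaling sketch: performance approximated as compute units times boost clock. The CU counts and clocks are the published specs as I understand them, and the model deliberately ignores the halved memory bus, cache, and architectural effects, so treat the output as a ballpark, not a benchmark:

```python
# Naive throughput proxy: compute units x boost clock. Ignores memory
# bandwidth (also halved on the 9060 XT), cache, and everything else.
rx9070xt = {"cus": 64, "boost_ghz": 2.97}   # RX 9070 XT (published specs)
rx9060xt = {"cus": 32, "boost_ghz": 3.13}   # RX 9060 XT (published specs)

def naive_score(card: dict) -> float:
    return card["cus"] * card["boost_ghz"]

ratio = naive_score(rx9060xt) / naive_score(rx9070xt)
print(f"9060 XT at ~{ratio:.0%} of a 9070 XT by this crude proxy")  # ~53%
```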
 
Crazy thing is the RX 7600 and 7600XT don't even show up on the latest Steam hardware survey. The 7900XTX does along with the 7800XT and 7700XT, but the 7600 cards are AWOL. Somehow the 7900XTX is more common than the 7800XT, which in turn is more common than the 7700XT. It's completely backwards from NV where the cheaper SKUs are more common. Hopefully they'll make enough 9000 series cards to correct this. The 9070 & XT are a good value at MSRP or at least decently cheaper than a 5070Ti, and the 16GB 9060XT likely will be as well, but if they don't order enough chips from TSMC...

If AMD really wants to stick it in and twist they should make a 9060 or 9050 with 16GB. Just guessing based on the 9060XT having half the cores & mem bus and very slightly higher clocks than the 9070XT, I'm thinking it'll land a bit short of the 5060Ti unless you compare a 16GB model to an 8GB model and the 8GB model runs out of vram. That leaves room for a cut down version to use up the slightly imperfect dies. Cut it down a little, roughly match a 5060 in GPU performance, and offer a 16GB version.
The internal AMD slides put the 16GB AMD card a whole 6% faster than the Nvidia 8GB card at 1440p, there’s not much of a knife to twist…

Honestly, I'd assume that anybody who buys any of the Nvidia x060 8GB variants is letting the Nvidia app configure their game graphics settings, or letting the game auto-detect settings for their system. So those who are getting 8GB are probably not overly worried, as it's still likely much better than what they have and within their budget.

As to the 7900 XTX being comparatively high on the chart: most purchased it with a strong FU-Nvidia sentiment, because if you were overspending on a GPU anyway, you might as well go for broke.
 
The internal AMD slides put the 16GB AMD card a whole 6% faster than the Nvidia 8GB card at 1440p, there’s not much of a knife to twist…
Which one? NV has more than one 8GB card.
 
Would anyone ever complain about something like that (how would you even start to care)? 4K TVs make sure game designers don't forget about 4K resolution anyway…
Because who is going to buy a 1440p or 4K monitor when their 8GB video card can't drive them? It's one of the many reasons why VR isn't taking off either.
And it's #2 on newegg.com's best-selling GPU list, but the 4060 and 6600 still sell well too, and the 7600 is #1 at Best Buy, so there's still a big market for 8GB cards. In general, though, when 80% of the SKUs are sold out (or their equivalent, priced out), we can't really use customer choice to judge what people want.
Do you know how many stores there are to buy graphics cards from? Do you know how many Steams there are? Steam is ultimately the better source for what gamers are using to game.
 
Because who is going to buy a 1440p or 4K monitor when their 8GB video card can't drive them? It's one of the many reasons why VR isn't taking off either.

Do you know how many stores there are to buy graphics cards from? Do you know how many Steams there are? Steam is ultimately the better source for what gamers are using to game.
Realistically, most people don't think that way: they get the machine they can afford that they think will get them what they want. They use the display they already have, or they find one they can afford.

Somebody dropping $600+ on a monitor to pair with their $1,200 PC is unlikely. But chances are they already have a TCL 4K at home, or some $150-$300 monitor.
Maybe it's new, maybe it's 8 years old, and they're still happy with it.

The average user keeps their monitor for a solid decade; it probably makes its way through 2 or 3 PCs before it gets replaced.

And yeah, an 8GB card won't do max settings at 4K, not a chance, but mid-to-high with upscaling to get 60 fps? Not impossible.

The bulk of the 4K displays out there are still only 60Hz.

My coworker was still happy using an overclocked 1060 6GB until a few weeks ago, when it started artifacting. I dug a 1080 out of an old workstation in the parts pile for him and he's happy. He plays on their old TV, which he laments replacing because smart TVs piss him off.

Point is, most people view ultra and max settings as being for people dropping thousands of dollars on their rigs.

That said, I'll give him my 6750 XT 12GB before I let him spend money on an 8GB card.


Additional:
I currently find myself at a Costco where they are pushing an iBuyPower with a 4060 8GB running on a discount Hisense 4K TV. And Jesus, $1800 CAD for this system is a rip-off… the $700 for the TV is decent.
 
Because who is going to buy a 1440p or 4K monitor when their 8GB video card can't drive them? It's one of the many reasons why VR isn't taking off either.
I mean, I do not even know how you would start to care about gamers buying 1440p or 4K monitors. How is this even a topic of conversation? What do I lose if no one else in the world ever buys a 1440p monitor to play games, and what do I win if they do? I have never heard or read anyone complaining about that before... What does society win?

Do you know how many stores there are to buy graphics cards from? Do you know how many Steams there are? Steam is ultimately the better source for what gamers are using to game.
Of course, but there are 7 years of sold GPUs in there; it's not necessarily the best source for 2025 GPU sales... did you quote the right message? I fail to see the connection.
 
Because who is going to buy a 1440p or 4K monitor when their 8GB video card can't drive them?
People who also like to have the desktop real estate?

I've got an RX 6600 8GB on my work/play desktop. My plan is to run it at 720p with integer scaling if I encounter a new game that requires such a drastic solution.
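For anyone unfamiliar with why integer scaling is the fallback of choice here: it only stays crisp when the display resolution is an exact whole-number multiple of the render resolution. A minimal sketch of that check (the helper function is illustrative, not any driver API):

```python
# Check whether a render resolution integer-scales cleanly to a display.
# Illustrative helper, not a driver API.
def integer_scale(render: tuple[int, int], display: tuple[int, int]):
    rw, rh = render
    dw, dh = display
    if dw % rw == 0 and dh % rh == 0 and dw // rw == dh // rh:
        return dw // rw   # uniform factor: each pixel becomes an NxN block
    return None           # non-integer ratio would need blurry interpolation

print(integer_scale((1280, 720), (2560, 1440)))   # 2 -> 720p maps cleanly to 1440p
print(integer_scale((1920, 1080), (2560, 1440)))  # None -> 1080p does not
```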
 
Realistically, most people don't think that way: they get the machine they can afford that they think will get them what they want. They use the display they already have, or they find one they can afford.
That's why I'm trying to tell you how these people will direct the market.
Somebody dropping $600+ on a monitor to pair with their $1,200 PC is unlikely. But chances are they already have a TCL 4K at home, or some $150-$300 monitor.
Maybe it's new, maybe it's 8 years old, and they're still happy with it.
According to Steam, and yes, I'm quoting Steam's hardware survey, the primary desktop resolution is 1080p at 55%. 1440p is at 20%, and 4K doesn't exist. Well, it does, but at 4.57%. There are more people using 1366x768 than 4K.
The average user keeps their monitor for a solid decade; it probably makes its way through 2 or 3 PCs before it gets replaced.
It will likely last longer due to them never getting a graphics card to drive it.
And yeah, an 8GB card won't do max settings at 4K, not a chance, but mid-to-high with upscaling to get 60 fps? Not impossible.
I'm sure you could play some games at 4k like Hollow Knight and CS:GO.
I mean, I do not even know how you would start to care about gamers buying 1440p or 4K monitors. How is this even a topic of conversation? What do I lose if no one else in the world ever buys a 1440p monitor to play games, and what do I win if they do? I have never heard or read anyone complaining about that before... What does society win?
Why did you quote me twice to answer me twice? What you lose is nothing, but the market will stagnate in sales. Nobody is going to buy new GPUs or monitors, and we will likely see people holding onto their aging hardware even more. Gamers won't buy games they can't run at a playable frame rate and an enjoyable resolution. Why do you think e-sports titles tend to target lower-end hardware? They want mass-market appeal, and that means people without great PC hardware. What this means is that people with higher-end hardware are stuck playing games that run on potatoes. Of course, there are developers who do not care, for whom optimization is just a suggestion and not something they actually must do. In that case, we end up with people who can't find a playable setting for Stalker 2.
Of course, but there are 7 years of sold GPUs in there; it's not necessarily the best source for 2025 GPU sales... did you quote the right message? I fail to see the connection.
You started quoting store sales to counter my Steam statistic. For your statistic to work, you would need to add up the sales of all stores, while Steam alone is enough to determine the average user's buying habits. Newegg and Best Buy are big, but not enough to paint a picture with.
 
That's why I'm trying to tell you how these people will direct the market.

According to Steam, and yes, I'm quoting Steam's hardware survey, the primary desktop resolution is 1080p at 55%. 1440p is at 20%, and 4K doesn't exist. Well, it does, but at 4.57%. There are more people using 1366x768 than 4K.

It will likely last longer due to them never getting a graphics card to drive it.

I'm sure you could play some games at 4k like Hollow Knight and CS:GO.

Why did you quote me twice to answer me twice? What you lose is nothing, but the market will stagnate in sales. Nobody is going to buy new GPUs or monitors, and we will likely see people holding onto their aging hardware even more. Gamers won't buy games they can't run at a playable frame rate and an enjoyable resolution. Why do you think e-sports titles tend to target lower-end hardware? They want mass-market appeal, and that means people without great PC hardware. What this means is that people with higher-end hardware are stuck playing games that run on potatoes. Of course, there are developers who do not care, for whom optimization is just a suggestion and not something they actually must do. In that case, we end up with people who can't find a playable setting for Stalker 2.

You started quoting store sales to counter my Steam statistic. For your statistic to work, you would need to add up the sales of all stores, while Steam alone is enough to determine the average user's buying habits. Newegg and Best Buy are big, but not enough to paint a picture with.

Two years ago, 1080p stood at 65% and 1440p at 12%. Alternatively, you can look at the March-April 2025 change, which was -1.2%pt for 1080p and +0.8%pt for 1440p (though that can be more iffy due to "steam statistics").

The inescapable conclusion is that steam users are slowly but surely abandoning 1080p and moving to 1440p (or higher) when they replace their monitors. Even the so-called non-existent 4k saw a 1.8%pt growth of usage share in the past two years.

Source (2023 steam survey): https://web.archive.org/web/2023051...eam-Hardware-Software-Survey-Welcome-to-Steam
 
You started quoting store sales to counter my Steam statistic. For your statistic to work, you would need to add up the sales of all stores, while Steam alone is enough to determine the average user's buying habits. Newegg and Best Buy are big, but not enough to paint a picture with.
Counter? I fully reinforced your point about the 3060 being popular on Steam by saying that it is, even today, #2 on Newegg (how is that countering?). I used the "and" in "And it's #2 on newegg.com's best-selling GPU list" to add to the point, not counter it. Do you think a 2020 GPU, at original MSRP, still showing up in the top 2 of the best-seller list in 2025 was countering your point? ;)

Nobody is going to buy new GPUs or monitors, and we will likely see people holding onto their aging hardware even more.

I am not sure there is a limit to how much GPU power can be put to use at 1080p (it is not like games at 1080p look anywhere close to a Saving Private Ryan Blu-ray yet). GTA 6 renders natively as low as 1080p in its trailer, and GTA 5 on PS4 launched at 1080p back in the day; a studio with enough budget and talent can use all the extra power in interesting ways instead of on low-return, many-extra-watts increases in render resolution. Higher resolution just lets GPU makers copy-paste the same cores and raise memory bandwidth; nothing is easier or cheaper for a game developer than spending extra power on resolution or FPS (that is usually what the special PS5 Pro version of a game chooses to do, since it costs about nothing). But that is a different subject, and resolution stagnation could be more good than bad for the industry (i.e., I feel we should wish for stagnation, not a move to 6-8K, for a very long time).

According to Steam, and yes, I'm quoting Steam's hardware survey, the primary desktop resolution is 1080p at 55%. 1440p is at 20%, and 4K doesn't exist. Well, it does, but at 4.57%. There are more people using 1366x768 than 4K.
Yes, but about half of AAA games are played on consoles, most of them plugged into 4K TVs; in your image of the future, people are playing games on a PS6, on a 4K TV... and a 24GB unified-memory PS6 is likely. GDDR7 will have 3GB, 4GB, and eventually 8GB modules, and unified-memory systems/laptops will become good enough. So if someone fears that games won't make textures (or whatever else) take advantage of 4K because of people on 1080p monitors, that is not necessarily the case.
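On the module-size point, and on Azor's "same GPU, just memory options" line: GDDR6/GDDR7 devices have 32-bit interfaces, so a card's capacity options fall out of bus width times module density, doubled in clamshell mode. A small sketch of that arithmetic, assuming the commonly reported bus width and densities:

```python
# VRAM capacity options from bus width and module density.
# GDDR6/GDDR7 devices have a 32-bit interface, so a 128-bit bus hosts
# four of them; clamshell mode mounts two devices per channel.
def vram_options(bus_width_bits: int, module_gb: int) -> tuple[int, int]:
    channels = bus_width_bits // 32
    return channels * module_gb, channels * module_gb * 2  # (normal, clamshell)

print(vram_options(128, 2))  # (8, 16) -> the 9060 XT's two options with 2GB GDDR6
print(vram_options(128, 3))  # (12, 24) -> a hypothetical part on 3GB GDDR7 modules
```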
 
MLID put out a new video where he says 20 to 40% of the 9060 XT’s supply is the 8 GB model. That means AMD officially didn’t listen, and decided to confidently stomp their foot into the same shit-pile they just watched NVIDIA step in. You can lead a man to reason, but you can’t make him think. Especially if that man is Frank Azor.
 
MLID put out a new video where he says 20 to 40% of the 9060 XT’s supply is the 8 GB model. That means AMD officially didn’t listen, and decided to confidently stomp their foot into the same shit-pile they just watched NVIDIA step in. You can lead a man to reason, but you can’t make him think. Especially if that man is Frank Azor.
At this stage, just wait for the prices to drop.

$350 is a good price for the 16GB model. Worth waiting (or queuing up at Micro Center).
 
20 to 40% (a range spanning 2x) seems a bit like saying "we do not know"....

Seeing how many 8GB-or-less models selling above $300 sit in the top 5 best-selling GPUs at stores in many markets, a $299 9060 XT that is 50% faster than them should be easy to liquidate quickly, a non-issue.

If anything, the 5060 quickly selling out at MSRP was maybe more reassuring than it was an issue for the industry.
 
8GB models designated as gaming cards are a disservice to buyers and a setback for gaming in general: a huge share of the market will still be stuck on sub-10GB cards while consoles, particularly the next generation, will be able to use far more than that for graphics. I can understand an 8GB card for the very low end, the sub-$200 market, but at $299, which street prices bloat way above, it's a shame. 20% of supply and then cancelled would probably be OK, but 40% and not cancelled -> sad.
 
Two years ago, 1080p stood at 65% and 1440p at 12%. Alternatively, you can look at the March-April 2025 change, which was -1.2%pt for 1080p and +0.8%pt for 1440p (though that can be more iffy due to "steam statistics").

The inescapable conclusion is that steam users are slowly but surely abandoning 1080p and moving to 1440p (or higher) when they replace their monitors. Even the so-called non-existent 4k saw a 1.8%pt growth of usage share in the past two years.

Source (2023 steam survey): https://web.archive.org/web/2023051...eam-Hardware-Software-Survey-Welcome-to-Steam

Those are good stats. I think the Steam survey is the best measure of reality, since so many gamers get their games from Steam. So at the current rate, 1080p should only be dominant for a couple more years. One could probably plot the changes over time; it's likely a logarithmic gain, not linear, so quite possibly it will be less than that. Then again, shit happens, people can't afford a new card as often, and that will slow the rate of change, I think.
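A minimal sketch of that extrapolation, assuming the (admittedly too tidy) case of a constant linear rate applied to the survey numbers quoted above:

```python
# Linear extrapolation of the 1080p share from the quoted survey figures.
# Real adoption curves won't be this tidy; this is just the "current rate" math.
share_2023 = 65.0            # % of Steam users on 1080p, two years ago
share_2025 = 55.0            # % today
rate_per_year = (share_2025 - share_2023) / 2   # -5 %pt per year

years_to_minority = (50.0 - share_2025) / rate_per_year
print(f"1080p drops below 50% in ~{years_to_minority:.0f} year(s) at this rate")  # ~1
```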
 
I still haven't had a graphics card with more than 11GB VRAM, and I'm doing fine at 1440P (from GTX 1080 Ti 11GB, to RTX 3070 8GB, to RTX 3080 10GB). I don't see 8GB of VRAM as a problem, especially if you're sticking to 1080P with 1440P capability in some games. At 1440P, the RTX 3080 10GB still outperforms the RTX 4070 Ti 16GB and the RX 7800XT 16GB; and the RTX 3070 8GB still outperforms the RTX 4060 Ti 16GB.

The only reason I would ever buy an RTX 5060 8GB or 9060 XT 8GB is the low power consumption, for the kids to play games at 1080P on the HTPC. However, I have an RTX 4060 8GB, so there's no need for that.
 
I still haven't had a graphics card with more than 11GB VRAM, and I'm doing fine at 1440P (from GTX 1080 Ti 11GB, to RTX 3070 8GB, to RTX 3080 10GB). I don't see 8GB of VRAM as a problem, especially if you're sticking to 1080P with 1440P capability in some games. At 1440P, the RTX 3080 10GB still outperforms the RTX 4070 Ti 16GB and the RX 7800XT 16GB; and the RTX 3070 8GB still outperforms the RTX 4060 Ti 16GB.

The only reason I would ever buy an RTX 5060 8GB or 9060 XT 8GB is the low power consumption, for the kids to play games at 1080P on the HTPC. However, I have an RTX 4060 8GB, so there's no need for that.
I would say there are definite use cases where the extra cost of the 16GB makes little sense (business apps, esports gaming, old games, light video editing). But for the general-public buyer who wants to play recent and upcoming games, who is not so knowledgeable, or for complete-build buyers wanting a gaming machine -> terrible buys, which put developers at odds with advancing gaming in general and lower game sales as well, since newer worthwhile titles may play badly, if at all.
 
I would say there are definite use cases where the extra cost of the 16GB makes little sense (business apps, esports gaming, old games, light video editing). But for the general-public buyer who wants to play recent and upcoming games, who is not so knowledgeable, or for complete-build buyers wanting a gaming machine -> terrible buys, which put developers at odds with advancing gaming in general and lower game sales as well, since newer worthwhile titles may play badly, if at all.
8GB is going to be a nasty surprise for a whole lot of not so PC gaming hardware savvy people buying pre-builts. A lot of people look for a price point, have no idea what they're doing, and more or less just buy what's marketed at their price point... and NV is totally shoveling 8GB cards at the most popular price points. DIY builds I don't worry about. If you're building it's on you to know what's good. We'll see if AMD is any better than NV. If I see a bunch of pre-built offerings with 8GB 9060XTs I'm going to say no, they're not.

I think new AAA games will work on 8GB cards for quite some time, but it's going to be the new minimum for vram. In other words they'll be stuck on "low" or "medium" for texture quality. The installed base of 8GB cards is just too big for game devs to cut off, plus they're selling even more of them.
 
Having given this some thought, and having decided that Nvidia and AMD aren't complete morons, I think the reason for this SKU rests almost entirely with OEMs. Sure, at Micro Center it might be 50 to 70 bucks less, but OEMs? Those bastards charge a premium for going from 8GB to 16GB. So this SKU is important for AMD and Nvidia because it's volume sales; they are going to sell way more of the x60-series SKU than anything else. It's also the sweet spot in die size and cost per wafer. It could very well be that AMD and Nvidia are making way more than $50 on this SKU when looked at together with the 16GB model.

In a nutshell, the 8GB model serves as an upsell for OEMs, which both of them profit from. This is on top of the markup from going OEM.
 
If you take Frank's comments at face value, I don't think he is that wrong.

The issue, though, is that if you look at 8GB of VRAM through the lens of the complete stagnation of VRAM over the past several years, it tells a completely different story, and thinking like his, in this market, is holding back progress in a way we've never experienced before.

If these mid-range cards had 16GB of VRAM as a baseline with optional 24-32GB upgrades (the kind that are "worthless," like VRAM upgrades were years ago), and upper mid-tier cards were pushing 20-24GB as a baseline, the market would understand a discounted 8GB 50-series card designed for esports, with that boldly printed on the box: a cheap option that enthusiasts and reviewers would largely ignore. It would exist for the markets where it's truly needed as a low-cost option.
 