5070 will reportedly only have 12 GB of VRAM

Is this really acceptable for 2025?

And will the price of the 5070 also move up from the $599 price point of the 4070 to satisfy NGreedia?

Only 12GB feels really lackluster for a modern GPU. It should have a minimum of 16GB.

But what I hear is that the 5080 will have 16GB, which again feels backwards for a card that will likely cost at least $999, if not more.

So Nvidia's plan is to give less VRAM and a smaller percentage of the 5090's performance, and see how many of us will still buy?

Feels like video cards are moving BACKWARDS and not forwards!

Source: https://www.msn.com/en-us/news/tech...S&cvid=02dc2052b5874c848110c69e80118b89&ei=10
 
All this talk of "NGreedia" yet people still buys their GPUs anyway when AMD offers a great alternative at this price point. The ones buying such lame Nvidia cards are the ones to blame, because if Nvidia can get away with only giving 12GB on a 70 class card and people will buy it anyway then why not? I'm sure without a doubt that RDNA4 will offer at least 16GB if not more VRAM on their competing class GPU yet people will just buy a 5070 instead.
 
rumors are that the 5080 will launch with 16GB VRAM but a 24GB version will launch later on...it still sucks on Nvidia's part not to give the 5080 more VRAM, but blame the people who rush out to buy the 4090 and 5090...Nvidia is now gimping the 80 series
 
a lot of that is because of Nvidia's vastly superior ray tracing performance

RDNA4 supposedly makes big improvements to RT performance. It probably won't be on par with Blackwell, but hopefully there won't be such a massive gap between AMD and Nvidia in RT anymore. And this is just my opinion, but if I were shopping for a 70 class GPU then I wouldn't be using ray tracing, at least not at very high levels.
 
Blame GDDR7. No way was Nvidia going to make a 70 class card 256-bit.
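
A quick sketch of why, assuming the usual 32-bit channel per memory chip (the 2GB and 3GB densities are the shipping and rumored GDDR7 chip sizes):

```python
# Each GDDR7 chip occupies a 32-bit channel, so bus width fixes the chip count.
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_gb(192, 2))  # 12 -> rumored 5070: 192-bit bus, 2GB chips
print(vram_gb(256, 2))  # 16 -> rumored 5080: 256-bit bus, 2GB chips
print(vram_gb(192, 3))  # 18 -> same 192-bit bus once 3GB chips are in production
```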
 
This, plus DLSS upscaling quality and overall driver quality. 12GB of VRAM will remain enough for years to come at 1080p and 2560x1440, and 16GB for 4K.

I'm hitting the limit in some games that would otherwise perform well. 12GB on a $600 card that is supposed to last 2+ years isn't quite enough, IMO.

The issue is that with a 70 card you used to be able to turn up all the settings, for the most part, and not worry about hunting for settings that work or staying within VRAM limits. They're becoming more like the 60s, and the 80s are becoming more like the 70s. The supposedly leaked specs of the 5080 look more like a 5070, IMO.
 

That's insightful.
 
NV model numbers don't mean the same thing they used to. NV's numbering system is ~20 years old and they're changing what the numbers mean.

At the high end they're making the price/perf curve more linear. In the past, price/performance was reasonably sane until you got to the Titans, unless you go way back, like before 2010. The Titans were a rip-off for gaming; they didn't offer the performance to match their price. The big change at the high end is that the 4090 is no longer a rip-off in price/perf. It's about 2x the price of a 4070Ti, with roughly twice the cores, memory bus, and VRAM, and close to 2x the performance at 4k with RTX on. The 4080 was supposed to sit halfway in between at three-quarters the price of the 4090, but it missed the mark on performance, especially at 4k + RTX. NV did a price cut on it with the 4080 Super, which is probably as close as you'll ever get to hearing NV admit they screwed up without a bug or design defect. This is a big difference from Ampere, where the 3090 was 2x the price of the 3080 and really not nearly enough faster.
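
To put rough numbers on that, here's a sketch using launch MSRPs and ballpark performance multipliers (the "2.0" is this post's "close to 2x" characterization, and the Ampere figure is my own rough assumption, not a benchmark):

```python
# (launch MSRP in USD, rough relative 4K performance within its generation)
lovelace = {"4070 Ti": (799, 1.0), "4090": (1599, 2.0)}   # "close to 2x" perf
ampere   = {"3080":    (699, 1.0), "3090": (1499, 1.15)}  # assumed ~15% faster

for gen in (lovelace, ampere):
    for name, (price, perf) in gen.items():
        print(f"{name}: {1000 * perf / price:.2f} perf per $1000")
# 4070 Ti ~1.25 vs 4090 ~1.25 -> flat curve; 3080 ~1.43 vs 3090 ~0.77 -> rip-off
```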

At the lower end of the scale the 60 card has turned into an anchor. It jumped up from $250 for a 1060 6GB to $350 for a 2060 6GB when they added ray tracing and die sizes got much larger, pushing up costs. A $350 60 card caused a lot of rage, and they lowered the MSRP of the 60 card for the $330 3060 and $300 4060.

This makes a mess of NV's traditional numbering system. At one end you have $300 = 60, and at the other end they're trying to straighten out the price/perf ratio so top-end cards perform like their price tags, but they only have 7, 8, and 9 to work with. So they shifted the numbers around and added a bunch of Tis. Tis are full-fledged levels in the product stack now. Not sure why they don't just end model numbers in 5; they used to do that. For Lovelace it's 60-60Ti-70-70Ti-80-90, so the 70 card is now in the bottom half of the lineup. "Tis count now" may also explain why there's a lack of generational uplift in much of the RTX 4000 series. Instead of the 4070 beating the 3080, it roughly matches it, with the 4070Ti being assigned the job of beating it. The 4060Ti rather than the 4060 surpasses the 3070. Then there's the poor 4060, which can't even beat a 3060Ti.

So yeah, the 4070 is more like a 60 card of past generations in a lot of ways. You can also look at the 90 as an SLI replacement.

I don't see any help coming for the 12GB on the 5070 at launch. Samsung won't start production of the 3GB GDDR7 chips needed to put 18GB on a 192-bit bus until early next year, so not in time for a February launch. I wouldn't be surprised if we got a 16GB 5070Ti though. The 5080 is supposed to be the full GB203 chip, so they'll have to do something with all the duds.

As for just turning up settings and going, I don't think that's really a thing anymore if ray tracing is involved, unless your card is "bigger" than your screen. I've always had to fiddle with settings due to being a high-res/big-screen junkie: 21" CRT at 2048x1536, GDM-FW900 at 2304x1440, a 3x 1440p surround setup, and now a sorta normal 48" 4k OLED. My 3090 has never been able to just max out whatever at 4k, though a lot of games without ray tracing run ok at native res with the details cranked up. The catch is that AAA stuff these days pretty much has RT, so of course I'm fiddling with settings. I think it's actually gotten worse. I used to just turn stuff down until it ran well enough on an LCD. Now I'm mucking with DLSS too, so it's like being back in the CRT days where I'm messing with settings and changing resolution via DLSS. Based on what I've seen trying to run stuff at 4k native just for the hell of it, I don't think even a 5090 will eliminate messing with settings at 4k with RT.
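
On the "changing resolution via DLSS" bit, each quality mode is a fixed per-axis render scale, so picking a mode really is picking an internal resolution. A sketch with the commonly published scale factors:

```python
# Per-axis render scales for the standard DLSS quality modes.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    scale = DLSS_SCALES[mode]
    return round(width * scale), round(height * scale)

for mode in DLSS_SCALES:
    print(mode, internal_res(3840, 2160, mode))
# Quality (2560, 1440), Balanced (2227, 1253), Performance (1920, 1080), ...
```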
 
Let me preface this by saying: I believe this, I don't like this, and I want better.

It makes absolutely NO sense for Nvidia to give gamers any more than is absolutely necessary to ensure purchases at the highest margin.

Nvidia won't have any real competition. AMD's card could have 48GB of GDDR7, but if it "only" performs like a 4080 at BEST, lacking the RT performance of a 4080, lacking DLSS, lacking that Nvidia vibe and GeForce charm, the 5070 will have no competition.

Which means your options are to get a 5070 12GB or bitch about it; you won't buy the competition.

Next up is that the 5070's biggest competition is not AMD, but Nvidia from the past. I guarantee more old-stock/second-hand 4080s will be purchased by gamers than AMD cards (all of them) in 2025. So you'd think Nvidia would make sure they have a compelling option? NO! Wrong thinking. They are going to thread that needle so that the 5070 doesn't compete too well with the 6060. They need to make sure that once 2025's products become "Nvidia from the past" they don't have a ton of longevity. Nvidia really screwed up with the 1080Ti: WAAAY too much potential and longevity with that. Could you imagine how much extra they could have made if those pesky 1080Ti owners had upgraded in the 20X0 or 30X0 generation???

The other thing is that even if RAM were free, even if adding more RAM made the card CHEAPER to make, Nvidia would still limit it to what is ABSOLUTELY necessary, because every extra megabyte of RAM makes the card more and more suited to AI workloads, and Nvidia does NOT want AI customers to get the GeForce discount.

So you have two options: pay $4500 for a 5070 with 24GB of RAM that is sold out all the time because it can run large AI models

Or

Stick with the weak, artificially bottlenecked 5070 and deal with it, because y'all won't buy AMD to save your lives, and AMD's Radeon division won't help you because they are SO DAMN BAD at their job.
 
Let me preface this by saying: I believe this, I don't like this, and I want better.
Good way of putting it. The only way it's going to get better for consumers is competition. Without it NV will just do whatever they want.

Next up is that the 5070's biggest competition is not AMD, but Nvidia from the past.
True. NV has most of the market share. Their challenge is not to take customers from AMD, it's to get existing customers to upgrade, so NV from the past is their biggest competition. I'm not sure they'll get a lot of used-card buyers to get a brand new card, though. More likely they'll get them to buy another used card sooner, so the new-card buyers can recover part of their upgrade cost, but that works too. Then the buyer of the used card sells their old used card or just tosses it. It's ok as long as the cards keep moving and NV keeps selling. 1080Tis were bad for that. 4070s should be no problem.

Stick with the weak, artificially bottlenecked 5070 and deal with it, because y'all won't buy AMD to save your lives, and AMD's Radeon division won't help you because they are SO DAMN BAD at their job.
Sometimes I think Intel is a bigger threat to NV's vid card dominance than AMD is. AMD has been going at it with NV ever since they bought ATI in 2006 and they just keep failing. Meanwhile Intel shows up with their first try in a long time with XeSS and arguably better ray tracing than AMD. Still kinda crap compared to the competition, but you expect that on a first try. What happens if Battlemage is a lot better? Then Intel gets 18A working nicely and starts fabbing Arc Celestial GPUs on it while NV and AMD have to fight with Apple, Qualcomm, and AMD's own datacenter division for TSMC's top processes? At any rate, it's a track record of repeat failure vs. the new guy who doesn't know what he's doing, so who do you want to bet on? It's noob vs. fail up against NV, so "neither" is probably safe... but we can hope lol.
 
rumors are that the 5080 will launch with 16GB VRAM but a 24GB version will launch later on...it still sucks on Nvidia's part not to give the 5080 more VRAM, but blame the people who rush out to buy the 4090 and 5090...Nvidia is now gimping the 80 series
Correct. The upcoming 5080 would more accurately be badged the 5070 according to historic percentages of CUDA cores, bus width, and VRAM. Ngreedia wants to sell a super cut-down "5080" and see how many people it can fool into paying 80 series prices for 70 series silicon. TechPowerUp lists the same rumored specs.

If that's the case then the 5070 will be more on par with a 60 series part and so on down.

I expect an upcoming 5080 Ti to have 24GB of memory and the CUDA core percentage we used to expect of an 80 series card.

It's just more scummy Nvidia tactics because they believe consumers are dumb. They already know the 5090 will sell out no matter what because it's a HALO product, what they want to do is MILK the rest of us for as much money as they can get for as little silicon as they can give.
 
IMO the 5070 should be called the 5060 and the 5080 should be called the 5070. Remember that 4080 12GB version?.....yeah

Like I said before, Nvidia is good at fleecing customers.
Not quite, but close. You have to include all of the Tis now; they're full-fledged price levels. It's not 60-70-80-80Ti-Titan anymore, more like 60-60Ti-70-70Ti-80-90 for Lovelace. We'll probably get an 80Ti next time since the 90 is moving up to 2x the 5080 instead of 2x the 4070Ti, plus that $1200 4080 thing didn't work out so well. At any rate, yes, the numbers don't mean what they used to, but you have to treat "Ti" cards as full-fledged levels. And no, I don't know why NV is ending model numbers in 0 or 0Ti instead of just 0 or 5. They used to end model numbers in 0 or 5.
 
They tried to do this with what was initially going to be the 4080 12GB. But people (rightfully) threw a fit when they saw its specs were notably below the "real" 4080 (which was itself anemic relative to previous xx80 cards) in memory bus, shaders, and ROPs, not just VRAM. The fact that they had to step the "4080 12GB" back to a 4070 label was, I'm certain, a planned contingency that didn't catch nV as off guard as people thought: it was all designed to test the waters as to how far they could cut down hardware and still stay within the rails of the model conventions they'd established over the past 20 years.

This round, they're going all in on both widening the SKU segmentation and incremental, mid-cycle updates, the aims being to minimize production costs and rope their current consumers into an accelerated sales cycle. Oh, there will be a 16GB version of the 5070 (I'd guess in late 2025) -- but it will exist almost solely to entice people who skipped (or bought) the 12GB model to upgrade.
 
Is this really acceptable for 2025?
No

So Nvidia's plan is to give less VRAM and a smaller percentage of the 5090's performance, and see how many of us will still buy?

Yes, and people will still buy them and either:

a) Complain about the lack of VRAM on the internet for the card they bought anyway.
b) Try to justify it on the internet in order to validate the decision in their own mind.

That's basically how this works. Worked this way for the 4070, so too it will be for the 5070.
 
Weird, I bought a 4070 shortly after launch and I didn't do either a or b. Sorry to disappoint you, I've had a bunch of fun the past 1-1/2 years.
 
Weird, I bought a 4070 shortly after launch and I didn't do either a or b. Sorry to disappoint you, I've had a bunch of fun the past 1-1/2 years.
Why would I be disappointed? What you do is up to you. I'm not personally willing to spend money on an insufficient amount of VRAM and I've seen plenty of A and B since the launch. If it works for you, great, I'm glad you're enjoying it. That doesn't mean I suddenly think Nvidia is asking a reasonable price for a product with what I feel is insufficient VRAM, particularly while marketing features such as ray tracing which demand more VRAM.
 

Put me down for one of those people that will complain but may purchase it regardless. I wouldn't mind moving up to the 5080 but I know that will be priced $1000+, and honestly, I just don't value the extra performance that much.
 
Going down to lower-bit inference that uses less VRAM could be common in that generation (say for DLSS and ray tracing reconstruction: 8-bit first, and who knows, maybe 4-bit will be good enough), but my feeling is they will need higher-memory Super SKUs soon enough once the 3GB GDDR7 chips become available.
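
A rough sketch of why fewer bits means less VRAM: weight memory is just parameter count times bits per weight (the 7B-parameter model here is purely hypothetical, and activations/caches come on top):

```python
def weights_gb(params_billions: float, bits_per_weight: int) -> float:
    # bytes = params * bits / 8; params_billions * 1e9 / 1e9 cancels out to GB
    return params_billions * bits_per_weight / 8

for bits in (16, 8, 4):
    print(f"{bits}-bit: {weights_gb(7, bits):.1f} GB of weights")
# 16-bit: 14.0 GB, 8-bit: 7.0 GB, 4-bit: 3.5 GB
```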
 
Put me down for one of those people that will complain but may purchase it regardless. I wouldn't mind moving up to the 5080 but I know that will be priced $1000+, and honestly, I just don't value the extra performance that much.
better off just getting a used 4090 imo.
 
This, plus DLSS upscaling quality and overall driver quality. 12GB of VRAM will remain enough for years to come at 1080p and 2560x1440, and 16GB for 4K.
Driver quality argument has been busted 10000x over...

The ones who usually tout "NVIDIA has better drivers" are the ones who only buy NVIDIA, it seems, because "AMD drivers suckkkkkkkkkkk".

If NVIDIA had better drivers, why was it always NVIDIA cards I had problems with, BSODs caused by their own DLL files, back in the 2k/XP/early Win 7 days?

For most people, they install the card and the drivers and they are done. I feel like the people who have driver issues, from either camp, are the ones always fiddling with them, changing settings, doing custom configs (that most probably don't need)...
 
Cute, but no. They have consistently sucked since the 9700 Pro, every time I've tried ATI/AMD. Their reputation is well deserved and part of why 90% of cards sold are Nvidia.
 
better off just getting a used 4090 imo.

Depends on how much they sell for. Other issues would be heat/power usage and size. I have a mid-size case and I assume a 4090 would fit, but it would be quite a tight fit. But yes, that can be a good option for some.
 
Cute, but no. They have consistently sucked since the 9700 Pro, every time I've tried ATI/AMD. Their reputation is well deserved and part of why 90% of cards sold are Nvidia.
I have switched back and forth between Nvidia and AMD over the last 30 years, and the most issues I've had have been with Nvidia. Both have had problems, but Nvidia was way worse, and it was the only time I had to do a Windows reinstall, when NV hosed my Windows XP install.
 
Honestly, both camps have good drivers. With the millions of hardware combos, Windows versions, and other driver versions out there, I am surprised both work as well as they do. I do prefer team green for other reasons.
 
Cute, but no. They have consistently sucked since the 9700 Pro, every time I've tried ATI/AMD. Their reputation is well deserved and part of why 90% of cards sold are Nvidia.
Yea, it is especially dire if you play older or indie games, but they struggle on new releases and big names often enough too. Yup, that's still true in 2024. But is that a surprise when they have such a tiny portion of the desktop market? They have far fewer resources. That's why they should be cheaper; they are the budget brand, but their prices are too close to Nvidia's, so of course they are not growing. They are simply not attractive.

There are still common issues on WoW (AMD+DX12, over a year old), and SM2 on release was not smooth either (Steam forums show a ridiculous number of threads talking about it, despite AMD representing a tiny portion of the GPUs being used). That's just two recent examples.

Drivers are of course not the only reason AMD is doing poorly, but alongside inferior features and prices too close to nvidia, it is not a pretty picture. Having more VRAM won't cut it.
 
All this talk of "NGreedia" yet people still buys their GPUs anyway when AMD offers a great alternative at this price point. The ones buying such lame Nvidia cards are the ones to blame, because if Nvidia can get away with only giving 12GB on a 70 class card and people will buy it anyway then why not? I'm sure without a doubt that RDNA4 will offer at least 16GB if not more VRAM on their competing class GPU yet people will just buy a 5070 instead.
I bought an AMD RX 6800, so yes, I do use both brands. My favorite video card was the GTX 1080: performance and value. We know Nvidia can deliver that; they just don't want to.

Instead they are pushing subpar products at exorbitant prices and they will rightfully catch our criticism when they do so.

The 5080 they will be pushing is not worthy of the 80 badge, having only 50% of the CUDA cores and memory of the 5090. Similarly, the 5070 they propose, with only 12GB, is awful value for a GPU in 2024.
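
Putting the 50% claim in numbers with the rumored specs (leaked figures, not confirmed, so treat them accordingly):

```python
# Rumored Blackwell specs from the leaks -- not official numbers.
rumored = {
    "5090": {"CUDA cores": 21760, "bus width (bits)": 512, "VRAM (GB)": 32},
    "5080": {"CUDA cores": 10752, "bus width (bits)": 256, "VRAM (GB)": 16},
}

for spec, flagship_value in rumored["5090"].items():
    pct = 100 * rumored["5080"][spec] / flagship_value
    print(f"5080 has {pct:.0f}% of the 5090's {spec}")
# ~49% of the CUDA cores, 50% of the bus width, 50% of the VRAM
```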
 

You should be calling them out for it. But what I'm saying is that people will complain and then still buy it anyway, so why should Nvidia change? Let's say top RDNA4 is priced to compete against the 5070: the 5070 has 12GB while RDNA4 has 16GB, RDNA4 greatly improves RT performance as shown by the PS5 Pro, and FSR4 with AI catches up to DLSS. Will people then ignore the lame 5070 and buy AMD instead? Maybe a small percentage will, but the vast majority will just buy the 5070 despite complaining about VRAM.
 
Put me down for one of those people that will complain but may purchase it regardless. I wouldn't mind moving up to the 5080 but I know that will be priced $1000+, and honestly, I just don't value the extra performance that much.

If the 5080 has 16GB then that’s enough for the foreseeable future. Sure, it’s less than what AMD is offering, but RAM is something you need ENOUGH of. All measures I’ve seen are indicating 16GB is sufficient. An excess of RAM won’t do anything for you, but a deficiency of RAM will be noticeable, and Nvidia asking $800 for a card with 12GB of VRAM is unacceptable in my opinion. That said, they still sold out, so a lot of people don’t care, and if they want to spend their money there, it doesn’t bother me. I’m an Nvidia shareholder, be my guest.

Pricing-wise, I don’t see a way it really goes down much compared to last generation. The 3000 series was fairly priced in my view, but that was released after the last crypto crash. Now we have the AI ramp and a market where customers have said they’re happy to pay more. Nvidia will oblige, AMD will see what Nvidia did and say “ok that but 10% less”.
 
Unfortunately the market doesn’t have relevant competition to even make Nvidia provide decent value products anymore. Maybe intel will get there in a couple generations.
 
Correct. The upcoming 5080 would more accurately be badged the 5070 according to historic percentages of CUDA cores, bus width, and VRAM.
Damn, this is the truth, and the reason why we should all just use DLSS and buy graphics cards on the used market 3 years after release for what they're actually worth. Well said, brother. I am going to do my best to skip the 5000 series, and I mean it this time lmao 😂
 
5070Ti using GB203, a cut-down version of the 5080, probably with a 256-bit bus and 16GB. I was expecting that. NV isn't getting much in the way of process improvement this time, so they'll have to depend on bigger chips, watts and clocks, and GDDR7 for performance improvements. Last time around they got a big improvement on the process side, Samsung 8nm -> TSMC 4, so they could just lean on that. This time I bet most of the stack shifts half a notch back toward Ampere/historical norms. Go back a few generations and usually the 70 and 80 cards shared a chip, but Tis count as full-fledged price levels now, so I think it'll be the 5080 and 5070Ti sharing GB203, the 5070 and 5060Ti on GB205, etc. Also the 5090 and 5080Ti on GB202, maybe a 5080Ti Super for the refresh too. 5090 32GB/512-bit, 5080Ti 24/384, 5080Ti Super 28/448 maybe.
 
Go back a few generations and usually the 70 and 80 cards shared a chip

For what it's worth, the "4070 Ti Super" is based on the 4080 (which is why it has 16GB RAM), so there is precedent for this even in the current generation.
 
While the internet has been busy hating the 30 and 40 series Tis, we have been enjoying them in this house.

The 5070Ti will probably still be a good card, while the internet continues to hate them lol.
 