5070 will reportedly only have 12 GB of VRAM

I think the average gamer is happy to use "high" textures instead of "ultra"

That said, not happy with Nvidia aggressively targeting a minimum viable product at every level except the X090 😮‍💨
 
For what it's worth, the "4070 Ti Super" is based on the 4080 (which is why it has 16GB of VRAM), so there is precedent for this even in the current generation.

Which kind of makes it worse. A 5070 may come out at $600 with 12GB, but right now you can buy a 4070 Ti Super with 16GB for under $800. If the 5070 lands about where the 4070 did at launch relative to the last generation (around 3080 speeds or slower), it would be in the ballpark of a 4080 or slower, and Nvidia's performance per dollar won't change much. I have to assume the 5070 will probably be faster than a 4080 in some areas, because otherwise they would essentially be taking a 4070 Ti Super-class card, cutting the VRAM, and charging $200 less for it. That's not much of a jump from what current products offer.
 
Indiana Jones makes the 4080 Super crawl with its 16GB of VRAM when Nvidia path tracing is enabled. It's basically a slide show, and frame generation would add a further VRAM burden the card doesn't have headroom for. If you want to use that level of image quality, it takes VRAM, which Nvidia is intentionally limiting, making any useful, engaging step up for RT moot.

Now, if developers used Nvidia's proprietary AI-based texture compression, that could change, albeit you'd be limited to one vendor to serve you, at whatever price they set.

A 12GB 5070 priced under $500 would be OK, but really limiting for some games in terms of the settings you can use.
 
We'll be lucky if the 5070 maintains its $600 price point at this rate.

^ this...

4070s are not even under $500...

With current GPU stocks dwindling down, there are gonna be early adopters chomping at the bit with money burning a hole in their pocket once the new releases hit the shelves...
I wouldn't be surprised if pricing gets beyond stupid again lol
 
Both the Series X|S and the PlayStation 5 can use up to 10GB of their shared RAM for graphics. I think Sony increased it to 12GB in later firmware revisions.

Regardless, that was only true for generation 8 consoles, which could only use up to 5GB of the 8GB available for graphics, at a time when 8GB of VRAM was becoming the standard on high-end video cards. The real performance bottleneck holding games back, then and now, is the mobile-tier CPU cores.
 
^ this...

4070s are not even under $500...

With current GPU stocks dwindling down, there are gonna be early adopters chomping at the bit with money burning a hole in their pocket once the new releases hit the shelves...
I wouldn't be surprised if pricing gets beyond stupid again lol
We will see whether Nvidia can maintain a premium price once AMD, and maybe Intel (that is, if they can release a B770/B750), show up. I would not hold my breath. Nvidia just has outstanding mindshare, similar to a cult, in my view.
 
AMD has shown time and time again that they will be like a remora on a shark, or seagulls on a fishing barge, and price their products as a pathetic second choice to Nvidia's pricing. They will not try to change the industry to capture mindshare/market share; instead they just sit under Nvidia's table and pick up the scraps that fall off its plate.

AMD does not keep Nvidia prices low.

Nvidia keeps AMD prices high.

Intel, on the other hand, seems hungry and may actually make waves.
 
AMD has shown time and time again that they will be like a remora on a shark, or seagulls on a fishing barge, and price their products as a pathetic second choice to Nvidia's pricing. They will not try to change the industry to capture mindshare/market share; instead they just sit under Nvidia's table and pick up the scraps that fall off its plate.

AMD does not keep Nvidia prices low.

Nvidia keeps AMD prices high.

Intel, on the other hand, seems hungry and may actually make waves.
Pretty much agree lately; previously the exceptions were the HD 480 and HD 7870, and the 290 was not bad other than the reference model noise. If AMD does another launch-high-then-drop-the-price-shortly-after move, pissing off all the first buyers (does AMD even comprehend that?), then a 12GB 5070 from Nvidia will sell like hotcakes.
 
I think the average gamer is happy to use "high" textures instead of "ultra"

That said, not happy with Nvidia aggressively targeting a minimum viable product at every level except the X090 😮‍💨
But is the average gamer happy to abandon RT + frame generation? 16GB is barely enough for 1080p when you factor those in.
 
If the 5070 Ti gets 16GB of VRAM and beats out the previous-generation 4080, that might be THE card for gamers to target, as long as the price is reasonable. At $699 it would be a killer value.

And yes, I find it hard to say $699 is good value for a 70-series card. But if it beats out the 4080, which was a $1,000 card, then it would be.

The 4070 Ti Super has 16GB and is close to the 4080 in terms of performance.

[TechPowerUp relative performance charts at 2560x1440 and 3840x2160]


I wouldn't say a $100 price drop (4070 Ti Supers have been as low as $760) for a 15% improvement in performance is all that good of an upgrade. The 5070 Ti would need to perform closer to a 4090.
 
the 290 was not bad other than reference model noise
As well as heat/power draw, and the fact that you couldn't buy a non-reference model for over six months from launch. None were available at all.

It would throttle like crazy and was loud as heck when I tried one in my system, which had previously been running two GTX 970 cards in SLI, cool and quiet. I went back to the 970 pair and stayed with it despite the 3.5GB VRAM issue, and had a much better experience.
 
I'm still shaking my head that Nvidia is going to try and sell a 5070 mobile version with only 8GB of VRAM in 2025. It's absolutely nuts.

This is why I'm not planning on buying a gaming laptop and am buying a new gaming PC instead; the value just isn't there for a gaming laptop, IMHO. When I'm away from my gaming PC (vacation or work) I don't have time to game, nor do I want to be distracted by it if I'm on business. If I need temporary entertainment I'll use my phone.
 
But consider where these cards are meant to be placed, and what target resolutions they are meant for. A 5070 wouldn't be able to run 4K with RT, not mainly because it lacks memory, but because the GPU just isn't powerful enough.
If the amount of memory is so important, why is my 10GB 3080 faster than AMD's 16 GB products at 4K, as you can see in the TechPowerup chart a couple of posts above?

Of course memory can come into play, but should people bitch about the memory size when their 5070 isn't able to run Indiana Jones with full path tracing at 4K, if the product's sweet spot is meant to be 2560x1440?
 
But consider where these cards are meant to be placed, and what target resolutions they are meant for. A 5070 wouldn't be able to run 4K with RT, not mainly because it lacks memory, but because the GPU just isn't powerful enough.
If the amount of memory is so important, why is my 10GB 3080 faster than AMD's 16 GB products at 4K, as you can see in the TechPowerup chart a couple of posts above?

Of course memory can come into play, but should people bitch about the memory size when their 5070 isn't able to run Indiana Jones with full path tracing at 4K, if the product's sweet spot is meant to be 2560x1440?
Thing is, the 16GB limit is exceeded even at 1440p in many RT games.

5070 is 5060.
 
Let me preface this by saying: I believe this, I don't like this, and I want better.
(snip)
Y'all won't buy AMD to save your life, and AMD's Radeon division won't help you because they are SO DAMN BAD at their job.
Well, this is awkward, because as of summer 2024 all 9 PCs in my family (my household, my parents', and my sibling's) have flipped over to Radeon (4x RX 7000, 4x RX 6000, and 1x Z1 Extreme). I understand that's the size of an atom on Jensen's left testicle in terms of market share, but considering these were all running RTX 2000 and 3000 cards prior, it's a 100% swing in the other direction for my little corner. The best part is we're all gaming just fine; no one (except me) noticed a damned difference.
 
Thing is, the 16GB limit is exceeded even at 1440p in many RT games.

5070 is 5060.
But that wasn't my point. My point was that it's pointless to give a mid-level card 24GB of RAM so it can do ray tracing at 2560x1440 or 4K (if needed) when the actual GPU on board isn't capable of it anyway. All these cards are aimed at different market segments, so the memory size is adjusted to hit a price/performance point.
A 24GB 5070 would be much more expensive than a 12GB one, which would hurt Nvidia's earnings and draw complaints from buyers about the price.
 
I hope AMD comes back to give some competition. A friend has the 7800 XTX and it's doing great. But it's not as good for productivity, which I need some of. So while I hate how Nvidia has been handling things for years, I don't have much of a choice without making my situation more annoying lol.

Targeting a 5080 for my rebuild depending on the reality of it all tonight.
 
5070 is $549.
And it has neural AI rendering of materials, which will reduce VRAM needs. They claimed it's around the performance of the 4090, which I assume means while using DLSS, but that's still incredible. The 5070 Ti with 16GB and much beefier specs is only $749.
 
And it has neural AI rendering of materials, which will reduce VRAM needs. They claimed it's around the performance of the 4090, which I assume means while using DLSS, but that's still incredible. The 5070 Ti with 16GB and much beefier specs is only $749.

DLSS and probably ray tracing. I think for games without ray tracing the performance increase won't be quite as big. Of course ray tracing is what really kills frame rates and more games are using it, but there are still a number of games that don't. So I'm doubtful it will reach 4090 levels of performance in most games. The 5070 Ti seems decent, and it's quite close to the 5080 but $250 less; I assume it will be about ~20% slower. We'll have to see how those overclock compared to the 5080, but it looks like the best bang for your buck will be the 5090 if you need top-of-the-line performance, or the 5070 Ti otherwise.

TPU's 4070 benchmarks show the 4090 is roughly 70% faster, both with and without ray tracing, averaged across the games tested. Kind of doubting we'll see an average 70% jump from the 4070 to the 5070.

The main thing to watch is how much their new frame gen impacts VRAM. The current iteration uses a lot of VRAM, and 12GB on the 4070 can run into issues in some games with ray tracing, which is exactly when you'd want to use it.
 
And it has neural AI rendering of materials, which will reduce VRAM needs. They claimed it's around the performance of the 4090, which I assume means while using DLSS, but that's still incredible. The 5070 Ti with 16GB and much beefier specs is only $749.
Neural rendering is per-game and can ONLY be used by the 50 series, so I doubt it will take off this generation.

The claimed 4090-level performance is with 4x frame gen, as the 4090 can "only" use 2x frame gen, so the 5070 is closer to half the raw power of the 4090. If a game doesn't support DLSS, or it's an older title without RT, the 5070 is about 20% faster than a 4070.
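To put that in back-of-envelope terms, here's a purely illustrative sketch: the frame rates are made-up placeholders, and it assumes frame generation multiplies displayed frames roughly linearly with no overhead.

Code:
# Why "4090 performance" can be true with 4x multi frame generation
# even if the 5070 has roughly half the raw rendering throughput.
# The numbers below are placeholders, not benchmarks.
raw_fps_4090 = 60.0                  # assumed natively rendered fps
raw_fps_5070 = 0.5 * raw_fps_4090    # "closer to half the raw power"

displayed_4090 = raw_fps_4090 * 2    # 4090 limited to 2x frame gen
displayed_5070 = raw_fps_5070 * 4    # 5070 with 4x multi frame gen

print(displayed_4090, displayed_5070)  # both land around 120 displayed fps

Of course the latency and image quality of 30 rendered frames dressed up as 120 aren't the same as 60 rendered frames dressed up as 120, which is the whole point about raw power.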
 
The main thing to watch is how much their new frame gen impacts VRAM. The current iteration uses a lot of VRAM, and 12GB on the 4070 can run into issues in some games with ray tracing, which is exactly when you'd want to use it.
Less VRAM than before, I think, despite the added frames, higher quality, and more parameters:

Our new frame generation AI model is 40% faster, uses 30% less VRAM,
For example, in Warhammer 40,000: Darktide, this model provided a 10% faster frame rate, while using 400MB less memory at 4K, max settings, using DLSS Frame Generation.


https://www.nvidia.com/en-us/geforce/news/dlss4-multi-frame-generation-ai-innovations/
Maybe they went from 16 bits to 8 bits or something, or just from a CNN to a Transformer.
 
Jensen also talked about AI's effectiveness at compressing data, and how AI can create a coherent full image from a low pixel count. I'd argue compression efficiency has been a major part of Nvidia's success for a long time. Not that others don't do the same, but Nvidia's solutions have saved on hardware cost and silicon area, though you can always spend the saved space on new features, as they have. It's like how, in the Windows environment, their driver uses more of the CPU than the Radeon driver does, which is rarely an issue in my opinion, and hardly a practical one, but you could see it as an efficient use of the resources at hand: doing less on the GPU. That's how I interpret their approach.
 
Jensen also talked about AI's effectiveness at compressing data, and how AI can create a coherent full image from a low pixel count. I'd argue compression efficiency has been a major part of Nvidia's success for a long time. Not that others don't do the same, but Nvidia's solutions have saved on hardware cost and silicon area, though you can always spend the saved space on new features, as they have. It's like how, in the Windows environment, their driver uses more of the CPU than the Radeon driver does, which is rarely an issue in my opinion, and hardly a practical one, but you could see it as an efficient use of the resources at hand: doing less on the GPU. That's how I interpret their approach.
The issue with using ML data compression algorithms is that they require developers to integrate them.

And the REASON we need so much VRAM is that developers are rushed and overworked, so they take the easiest, quickest path to shipping a minimum viable product. They aren't going to integrate neural rendering unless Nvidia basically pays Epic to integrate it into Unreal and make it a hands-off tickbox that's on by default.
 
Less VRAM than before, I think, despite the added frames, higher quality, and more parameters:

Our new frame generation AI model is 40% faster, uses 30% less VRAM,
For example, in Warhammer 40,000: Darktide, this model provided a 10% faster frame rate, while using 400MB less memory at 4K, max settings, using DLSS Frame Generation.


https://www.nvidia.com/en-us/geforce/news/dlss4-multi-frame-generation-ai-innovations/
Maybe they went from 16 bits to 8 bits or something, or just from a CNN to a Transformer.



I'm seeing some games push 9 to 10GB with ray tracing and frame gen, and I'm at 2560x1440. An improvement is good, but going from 9GB to 8.6GB is not that big of a saving. Chances are it won't save you from hitting the limit in some newer games, unless other games see a bigger reduction in usage.
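Quick sanity check on that math, as a minimal sketch: the 9GB figure is my own observed usage, and the assumption is that Nvidia's "30% less VRAM" applies to the frame gen model's own footprint rather than to total game VRAM use.

Code:
# A 400MB saving against ~9GB of total usage is only a few percent,
# consistent with the 30% figure applying to the frame gen model itself
# rather than to overall consumption.
total_gb = 9.0    # observed at 1440p with RT + frame gen
saving_gb = 0.4   # Nvidia's Darktide example (400MB at 4K)

print(f"saving: {saving_gb / total_gb:.1%} of total usage")  # ~4.4%
print(f"new usage: {total_gb - saving_gb:.1f} GB")           # ~8.6 GB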
 
But that wasn't my point. My point was that it's pointless to give a mid-level card 24GB of RAM so it can do ray tracing at 2560x1440 or 4K (if needed) when the actual GPU on board isn't capable of it anyway. All these cards are aimed at different market segments, so the memory size is adjusted to hit a price/performance point.
A 24GB 5070 would be much more expensive than a 12GB one, which would hurt Nvidia's earnings and draw complaints from buyers about the price.
Nobody asked for a 24 GB 5070.

16GB should be the minimum amount for a graphics card capable of rendering 2k/4k games. And that should be most modern graphics cards.

Generational improvement should mean each new card's performance rivals the next tier up from the previous generation.

For example, the 4080 should match or beat the 3090.

So a 5070 should match or beat the 4080, and so on. The memory should also match, so it's capable of rendering those resolutions without running out of frame buffer.

A 12GB 5070 in 2025 is a joke.

The 16GB 5070 Ti might be where Nvidia wants to push gamers. We'll see what actually gets released and how the cards perform in benchmarks.
 
I hope AMD comes back to give some competition. A friend has the 7800 XTX and it's doing great. But it's not as good for productivity, which I need some of. So while I hate how Nvidia has been handling things for years, I don't have much of a choice without making my situation more annoying lol.

Targeting a 5080 for my rebuild depending on the reality of it all tonight.
Doubtful. Just heard about the Radeon RX 9070, and its goal is to replace the 7900 XT and 7800 XT tier. Meanwhile the RX 9060 is supposed to replace the 7700 XT and 7600 XT tier.

Notice how the 7900 XTX is not mentioned at all. This means AMD is going after the mid-tier. They will NOT release RDNA 4 to compete with the 4090 or 5090.

 
Doubtful. Just heard about the Radeon RX 9070, and its goal is to replace the 7900 XT and 7800 XT tier. Meanwhile the RX 9060 is supposed to replace the 7700 XT and 7600 XT tier.

Notice how the 7900 XTX is not mentioned at all. This means AMD is going after the mid-tier. They will NOT release RDNA 4 to compete with the 4090 or 5090.

I know they won't this time, but at some point, like they did with CPUs, I hope they come back at every level to drive some competition. They don't have to be the best. They never really challenged the Titan in the past, but they had solid cards that performed well. I loved the handful I had over the years, but I need something on the productivity side, which leaves me with Nvidia :(
 
Yeah, at 1440p I gotta say I doubt it as well and agree with you.

At 4K, I think 16GB is fine right now. Will it stay fine in 2-3 years? I don't know.
Indiana Jones at 4K with path tracing, DLSS Quality, and frame gen NEEDS over 16GB, as it's utilizing over 17GB and allocating over 19GB.
 
Indiana Jones at 4K with path tracing, DLSS Quality, and frame gen NEEDS over 16GB, as it's utilizing over 17GB and allocating over 19GB.
One game, with the highest texture setting, available only on 16GB+ cards, with full path tracing. I think you may be mixing up DLSS Quality with a DLAA setting... Got a link? How are you telling it uses 17GB when you're on a 16GB card????

Like he said, in a few years maybe that will be more common, but that is not today.
 
One game, with the highest texture setting, available only on 16GB+ cards, with full path tracing. I think you may be mixing up DLSS Quality with a DLAA setting... Got a link? How are you telling it uses 17GB when you're on a 16GB card????

Like he said, in a few years maybe that will be more common, but that is not today.
It always starts with "one game", and the point is that we have that game NOW, so we won't be waiting 2 or 3 years. It was the settings I said, not DLAA, and I saw it for myself on the 4090.
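If anyone wants to sanity-check device-level VRAM numbers themselves while a game is running, here's a minimal sketch using the pynvml bindings (assuming the nvidia-ml-py package is installed). It reports NVML's device-wide and per-process figures, which won't match every overlay's definition of "dedicated" usage, so treat it as a rough check rather than the tool reviewers use.

Code:
# Minimal VRAM check via NVML (pip install nvidia-ml-py).
# Prints device-wide memory use plus per-process usage where the
# driver exposes it; run it alongside the game and watch the numbers.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo,
                    nvmlDeviceGetGraphicsRunningProcesses)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = nvmlDeviceGetMemoryInfo(handle)
print(f"device: {mem.used / 2**30:.1f} GB used of {mem.total / 2**30:.1f} GB")

for proc in nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:  # can be unavailable on some drivers
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GB")

nvmlShutdown()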
 