Nvidia would like to correct the record on VRAM it seems lol

OK we'll play your game. Nvidia is putting 16GB on a "low end" card. How stupid does one have to be to keep telling people 8GB is all you need?
You have a quote from anyone here that says 8GB is all you need?

Also, why are you even bringing up a mid-tier 70's class card when debating this whole 8GB thing? The current 70's class isn't even 8GB.

Seriously, you're all over the place here...
 
If you can't beat the competition's shit you call up engineering and marketing to baffle the consumer with technological terms, fancy graphics, and other bullshit. I was going to get an RTX 4080, but I just choke on any video card that costs more than a grand. Bought an XFX Radeon RX 7900 XT reference card for $810 a few months ago instead. Nvidia is at the top of their game whereas AMD is just getting started. That means Nvidia < AMD in the long run in this cycle.
Wow, saving this one for later. I'm wagering it's going to age like curdled milk instead of fine wine.
 
You have a quote from anyone here that says 8GB is all you need?

Also, why are you even bringing up a mid-tier 70's class card when debating this whole 8GB thing? The current 70's class isn't even 8GB.

Seriously, you're all over the place here...
To be fair, the 4070ti WAS going to be the 4080 until we caught Nvidia lying, which would in fact make it technically top tier.
 
How quickly people forget about that heap of junk """4080""" 12GB. Apparently Nvidia thought it was high end enough :D
No one has forgotten, it's just not the first thing you think of since they haven't had competition. Nothing notable to think about in any of these releases except for the 4090 to be honest.
 
That's a good experiment, but many games will "allocate" vRAM without necessarily using it. I don't think Afterburner shows actual use; it shows what's allocated. It'd be cool if someone figured out how to measure vRAM usage accurately.
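For what it's worth, here's a rough sketch (my own, not anything from the video) of pulling those numbers programmatically through NVIDIA's NVML bindings (the pynvml package). Caveat: NVML, like Afterburner, still reports allocation rather than the working set a game actually touches, so it doesn't fully solve the allocated-vs-used problem either.

```python
# Rough sketch, assuming the pynvml package (NVIDIA's NVML bindings) is installed.
# Caveat: NVML reports *allocated* VRAM, not the memory a game actively uses,
# so this has the same limitation as Afterburner's readout.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total: {mem.total / 2**30:.1f} GiB, allocated: {mem.used / 2**30:.1f} GiB")

# Per-process allocations; the game shows up here with its own number.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    print(f"PID {proc.pid}: {proc.usedGpuMemory / 2**30:.2f} GiB allocated")

pynvml.nvmlShutdown()
```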

I play at 1440p as well. What $600 card? The 3070 was $500. It does suck that midrange card prices have crept upwards, but they all have.

I can agree with the sentiment. But those cards sell at those prices, so it apparently is acceptable. But it is a midrange card regardless of the price.

I think giving users the choice of either 8Gb or 16Gb is awesome. It could be in response to the HU noise, or it could have been planned from the beginning. Those launches are so close, and the vRam debate so recent, that I suspect it was already planned and not reactionary. That decision was probably made 4 or 5 months ago, if not longer.

AMD offering more vRAM is their marketing and PR guys doing their jobs, and doing them well. And of course they will try to play that up as an advantage. And in some cases it might very well be advantageous to them. They can hope that the extra vRAM allows their cards to reach performance parity with the nVidia counterpart product; it probably does in a few cases, but I doubt it will in all.
I see that as AMD doing what they have to to help sell their cards and help shrink the performance gap between them and the nVidia cards.
It's called responding to the competition.

And that was a cool experiment. But what titles showed improvement, and by how much? Did it turn it into a 3090? Let me search on YouTube... and find: lol, nope. In most titles there was very little difference:
He tests a modded 3070 with 16Gb vRam and compares with an 8Gb 3070:


View attachment 571451
View attachment 571452
View attachment 571453
And at 4k Ultra settings and Psycho raytracing:
View attachment 571454
LOL!
View attachment 571455
View attachment 571456
View attachment 571457
And he even tested The Last of Us!!
View attachment 571458

This result shows that HU was incorrect in their assumption that 8GB of vRAM was the cause of the performance issues they had with The Last of Us, which is what this whole fucking thread and other threads are about. Hahahahaa....
He did say that the 8Gb card was more stuttery on LoU
View attachment 571459
View attachment 571460
View attachment 571461
(Naughty Dog patch notes say they are still working on the stuttering some users have. They have already made significant progress.)
And finally a game that shows a decent performance difference!
View attachment 571462
The biggest difference I spotted was 70fps vs 97fps:
View attachment 571466
But even then it's still playable!! Hahaha, wtf!

He tested several other games, but those all showed under 8GB of vRAM usage and the same performance.
In the games where there actually was a difference, it ranged from less than 1% to 8% in one game, and maybe about 35% in RE4. He would need to show the average FPS in RE4 to accurately compare that title.
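For reference, here's the napkin math on those gaps (illustrative figures only, taken from the numbers quoted above); the 70 vs 97 FPS case works out to roughly 39% faster for the 16GB card, or about 28% slower for the 8GB one, depending on which card you treat as the baseline.

```python
# Quick sketch of the relative-difference math, using the FPS figures quoted above.
def pct_faster(fast_fps: float, slow_fps: float) -> float:
    """How much faster the higher result is, relative to the lower one."""
    return (fast_fps / slow_fps - 1.0) * 100.0

print(f"{pct_faster(97, 70):.1f}% faster")    # ~38.6% faster for the 16GB card
print(f"{(1 - 70 / 97) * 100:.1f}% slower")   # ~27.8% slower for the 8GB card
```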

It's worth noting that the 16GB 3070 in the above tests was a triple-fan card, and the 8GB loser 3070 was a Dell OEM with only 2 fans... which could be responsible for some portion of the performance differences.

Everyone wants lower priced cards including me. But wishing it was true isn't going to make it so.

That's not really what's happening. It's calling out inaccurate bullshit. But the above comparisons seem to indicate that the 8GB card had enough vRAM at the time it was new, over 3 years ago. I think that is enough justification. And look at that, it can still play games!

lol, maybe for some. My card has 24Gb so that isn't it in my case.

I thought the big deal was the improved 1% lows, which benefit from the larger vRAM, not average FPS, since the architecture of the card didn't change.
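For anyone unfamiliar with the metric, here's a rough sketch of how 1% lows are typically computed from a frame-time capture (conventions vary; some outlets use the 99th-percentile frame time instead of averaging the slowest 1% of frames). Big VRAM-induced hitches drag the 1% low down long before the average moves, which is why two cards can look identical on average and still feel different.

```python
# Rough sketch: average FPS vs. "1% low" FPS from a frame-time log (milliseconds per frame).
# The frame times below are made-up example data, not taken from the video above.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 33.5, 16.6, 17.0, 45.2, 16.7, 16.8] * 50

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)    # frames / total seconds

fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)        # instantaneous FPS, worst first
worst_slice = fps_per_frame[: max(1, len(fps_per_frame) // 100)]  # slowest 1% of frames
low_1pct = sum(worst_slice) / len(worst_slice)

print(f"Average: {avg_fps:.1f} FPS, 1% low: {low_1pct:.1f} FPS")
```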
 
The problem is, it isn't "1 shitty console port". I play Caliber from time to time. I've seen it using up to 10GB of VRAM. I'll HAPPILY play every game I have installed and then some and log my VRAM usage at 1440p ultra. I can assure you, at 1440p anyway (what I play at), 8GB doesn't cut it anymore for a LOT of titles. And I'm sorry, people don't buy a $600+ card to "turn settings down".

The number of games I've played that run out of VRAM at 1440p with 8GB is slim. The Last of Us and Hogwarts Legacy are about all I can think of. The Last of Us patches greatly reduced VRAM usage; I haven't tried it since. Textures are extremely high quality in that game, and you can turn down to the next level without much quality loss. Hogwarts was also a buggy mess. Generally playable, but that one does have some VRAM issues. Halo Infinite seems to push 8.5-9GB, but moving to a faster GPU with more VRAM did not help with the occasional stutters.

RE Village gives warnings about not enough VRAM, but it plays perfectly fine.

Dying Light 2 with DLSS Quality and some ray tracing uses 5.3-6GB of VRAM. My 3070 couldn't run with all ray tracing options enabled, so running out of VRAM with ray tracing was a moot point. I am not sure if it did, but I never bothered playing much because frame rates were around 40 anyway.

What games run out of VRAM for you?
 
3070 came out nearly 3 years ago. The only cards sold with 8Gb now are 4060's. Those are low end, so 8Gb is appropriate because people that buy those want to spend the least possible. 4050's will probably come with 8Gb as well.

3070 is not a high end part.

3050 - very low end
3060 - low end
3070 - mid tier
3080 - high end
3090 - Halo

I don't get why you are bitching about the last-gen mid-range card... now...
You are about 3 years late.
Reality is a bit different.

xx50: entry level
xx60: mainstream
xx70: mainstream+
xx80: high end
xx90: ultra high end
 
"I'd rather trust a corporation whose sole purpose is to make money off me about why their increased profit margins are due to technical reasons and not desire to make more money off me."

OK
Funny how you omit 2/3 of that whole statement to do what exactly? Make fun of me? Call me a shill? What?

I would rather trust a company that’s been making GPU’s for 25 years than a group of people regurgitating YouTube content as their source material for their arguments.

Considering they know all the ins and outs of how every single piece of hardware works, since, y'know, they've been designing and building the hardware for some time, I seriously doubt these tech tubers and the people sheepishly following their every word are going to magically know something Nvidia doesn't.

Yes, they are a business, very much like MS, Google, etc., with a vested interest in getting our hard-earned dollars, but if people are willing to ground their expectations in reality, then, just like Apple, Nvidia releases quality hardware loaded with features that are often imitated, while taking the performance crown more often than not.

So, yes, I’ll gladly give my money to Nvidia if they continue to innovate. I’m tired of waiting on AMD to try and gain parity with them while using copied features that aren’t nearly as good as the source material.
 
The number of games I've played that run out of VRAM at 1440p with 8GB is slim. The Last of Us and Hogwarts Legacy are about all I can think of. The Last of Us patches greatly reduced VRAM usage; I haven't tried it since. Textures are extremely high quality in that game, and you can turn down to the next level without much quality loss. Hogwarts was also a buggy mess. Generally playable, but that one does have some VRAM issues. Halo Infinite seems to push 8.5-9GB, but moving to a faster GPU with more VRAM did not help with the occasional stutters.

RE Village gives warnings about not enough VRAM, but it plays perfectly fine.

Dying Light 2 with DLSS Quality and some ray tracing uses 5.3-6GB of VRAM. My 3070 couldn't run with all ray tracing options enabled, so running out of VRAM with ray tracing was a moot point. I am not sure if it did, but I never bothered playing much because frame rates were around 40 anyway.

What games run out of VRAM for you?

The only one I had a direct issue with (a game crash) was No Man's Sky. How many games are coping well with 8GB but would be running better if there was more?
 
So, yes, I’ll gladly give my money to Nvidia if they continue to innovate,
Their biggest innovations lately are how to give as little as possible for as much as possible, while justifying it with meaningless techno babble. Uncontested leaders on that front. Do continue to support that if you want, I'd rather call them out on BS and put my money elsewhere.
 
You have a quote from anyone here that says 8GB is all you need?

Also, why are you even bringing up a mid-tier 70's class card when debating this whole 8GB thing? The current 70's class isn't even 8GB.

Seriously, you're all over the place here...
Well since you think 8GB is enough then I guess we aren't debating. Thanks for agreeing.
 
Two separate issues. One is dealing with fidelity at a specific setting. The other is an optimization specifically to deal with low VRAM situations that the original target spec didn't have a problem with.

Two separate issues under the larger umbrella of 'general optimization' that should have been done from the start/before release

I agree
 


My laptop is basically a 3070 (3080 mobile) level gpu with 8GB.

The low VRAM isn't a huge deal; worst case, in games like Harry Potter I changed textures from ultra to high, and DLSS helps too, and I can't notice its artifacts.

I don’t think there was a single game where I could tell the difference in the past decade or two between ultra and high on textures.

I do agree that at these performance levels and price points (more so the 4070 and higher), there should be more VRAM.
 
Two separate issues under the larger umbrella of 'general optimization' that should have been done from the start/before release

I agree
In no world is fixing a bug considered an optimization. Also you're describing every game release for the past 20 years or so. That's nothing new.
 
Well, I don't think it's just AMD that can apply pressure. Gamers are a loud bunch and can effect change too. The problem is we have far too many people not asking for more from nVidia when they should be. Instead we are literally seeing people ask developers to keep shoe-horning games into 8GB, which is mental. I've never seen that before. People are telling GPU makers "no no, 8GB is just fine, you don't have to give us more". WTF?!?!? We're on an enthusiast site. When in your memory have we asked manufacturers to stay the same? I can't remember when we've ever done this.

To put things into perspective, the Steam Deck has 4gb of memory for graphics.
 
All well and good, but what reason does Nvidia have to compete? So far, they have had nothing contested in any meaningful way.
AMD is competing fairly well, though with slightly lower prices and slightly faster performance, which for many people isn't enough to deter them from buying Nvidia. It does have a lot to do with drivers, features, etc., which really don't matter, but a lot of people here will tell you that DLSS is fantastic even though it reduces image quality.
As a corporation, they are acting specifically as they should, reducing cost while increasing profits.
Great for shareholders but bad for consumers. That's why AI. AI AI AI.
https://www.reddit.com/r/wallstreetbets/comments/13kztsd/early_release_of_the_nvda_earnings_call/
And it's as obvious as shit that AMD doesn't give a fuck and just wants to ride the gravy train by offering a second rate product at first rate prices just as much as Nvidia. The only thing keeping Nvidia ahead in those tiers is literally their brand recognition and their superior ecosystem, regardless of how anyone here feels about those.
AMD didn't give a shit because up until now they were riding in Nvidia's shadow and it worked for them. Because of the several years of crypto, AMD would bank on high-return GPUs because of the demand. The math worked out that if AMD were to compete against Nvidia, then it would lower prices. It was better that AMD matched those prices and rode the crypto train, because a GPU that cost AMD $200 to manufacture was selling for $1k. That's over, and now AMD has to seriously think about capturing market share.
Smart ones here are just waiting and hoping that Intel comes in to take their mid-tier lunch money and force these 2 to actually compete.
I'm hoping for Intel's next graphics card to be substantially better.
 
Their biggest innovations lately are how to give as little as possible for as much as possible, while justifying it with meaningless techno babble. Uncontested leaders on that front. Do continue to support that if you want, I'd rather call them out on BS and put my money elsewhere.
So… AI-based Frame Generation, Shader Execution Reordering, Opacity Micro-Maps, and Reflex aren't innovation… and amount to giving very little?

What's AMD done lately? Chiplets? Oh, that's right, they have their version of Frame Gen coming soon… let's be real, games are coming out with very little in the way of optimizations. It's evident when patches for these games make what was unplayable on certain cards playable. I say this because all everyone talks about now is VRAM capacity, arguing that 16GB is the new norm and Nvidia is greedy for not including it in all the mid-range cards.

The reason Nvidia isn’t adding gobs of VRAM to their consumer line-up is relatively simple—they use CUDA cores in both their workstation and consumer GPU’s. Gaming performance, VRAM and memory bandwidth are the main differences between the two. This is what makes the 4090 so good, it can game like a beast, and is capable of professional use.

The following is all speculation, but seems to be logical:

Let's say they made the 4080 with 24GB of VRAM; then professionals would be looking hard at those GPUs over the 4090 or their workstation offerings. So they have to keep it at 16GB to keep supply from being devoured by the professional workspace and push that category of buyer towards the 4090 or higher, which is probably why they priced it where it is, to push the workstation crowd upwards.

The same goes for the 4070/4070Ti: making them 16GB would devour 4080 sales, and it would bring about the same problem, professionals looking for budget alternatives.

Since professional work is far more bandwidth intensive, raw bandwidth is king. Given how they'd have to configure the memory, it's either 16GB on a 256-bit bus, which would appeal to professionals and harm 4080 sales, or 16GB on a 128-bit bus, which wouldn't appeal to anyone and would outright kill the card's performance; the only alternative was 12GB on a 192-bit bus to keep the GPU from being bandwidth starved. This is probably why Nvidia opted for gobs of L2 cache: it helps alleviate bandwidth issues with lower memory configurations for gaming while making the card less appealing to workstation users.
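A rough back-of-the-envelope on that bus-width argument (my own numbers, assuming 21 Gbps GDDR6X purely for illustration; actual memory speeds vary by SKU):

```python
# Rough sketch of the bandwidth math behind the bus-width argument above.
# Assumes 21 Gbps GDDR6X for illustration; real SKUs use different memory speeds.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # bits transferred per second across the bus, divided by 8 to get bytes: GB/s
    return bus_width_bits * data_rate_gbps / 8.0

for bus_bits in (256, 192, 128):
    print(f"{bus_bits}-bit @ 21 Gbps -> {bandwidth_gb_s(bus_bits, 21.0):.0f} GB/s")
# 256-bit -> 672 GB/s, 192-bit -> 504 GB/s, 128-bit -> 336 GB/s
```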

AMD doesn’t use the same hardware for their professional and consumer lines so they can add gobs of VRAM to their consumer line-up and not impact their professional line-up.

To me this is the only logical conclusion I can think of as to why Nvidia would keep VRAM where it’s at.
 
Well since you think 8GB is enough then I guess we aren't debating. Thanks for agreeing.
I have said no such thing. Your post here is about the lowest IQ post I've had the displeasure of replying to. I won't be entertaining moronic messages like this again.
AMD is competing fairly well, though with slightly lower prices and slightly faster performance, which for many people isn't enough to deter them from buying Nvidia. It does have a lot to do with drivers, features, etc., which really don't matter, but a lot of people here will tell you that DLSS is fantastic even though it reduces image quality.
Not as much as you would like to believe. AMD sells the equivalent of couch change in comparison to Nvidia, no matter what metric you use. So, no, not really. They are competitive enough to keep the lights on in the GPU department.
 
I have said no such thing. Your post here is about the lowest IQ post I've had the displeasure of replying to. I won't be entertaining moronic messages like this again.
Figured. LOL. Thank God. Please continue to argue for less RAM because nVidia. Who TF argues for less at higher prices? Satan? At least we know there's nothing that nVidia can't make y'all do. How anti-consumer are you? At least with the ray-tracing argument you're getting something (not much, but something). With this argument you have every nVidia shill trying to convince everyone that the REAL problem is "lazy developers".

At least nVidia released this CYA statement, which I believe is purely to prevent them from being sued.
(attached screenshot of Nvidia's statement)
 
Figured. LOL. Thank God. Please continue to argue for less RAM because nVidia. Who TF argues for less at higher prices? Satan? At least we know there's nothing that nVidia can't make y'all do. How anti-consumer are you? At least with the ray-tracing argument you're getting something (not much, but something). With this argument you have every nVidia shill trying to convince everyone that the REAL problem is "lazy developers".

At least nVidia released this CYA statement, which I believe is purely to prevent them from being sued.
View attachment 571632
Again, I have said no such thing. Do not put words in my mouth I have not said. Your trolling is getting out of hand and is neither wanted nor welcome. Stop.
 
Oh, so upscaling is fine for AMD but not dlss for nvidia. Right....... :rolleyes:.

Gimme a break, the GPU portion of the Deck is teensy; it's smaller than its current competition from AMD, and they still give it four gigs.

We're talking about a discount super-budget APU with more memory per instruction unit than a $300 GPU. The whole deck, stripped down to the cheapest model costs $400 and still has 16 gigs of RAM, CPU plus GPU.

I'm not saying AMD is better than Nvidia, I'm saying Nvidia can get stuffed. It's up to AMD to decide if they need stuffing, too.
 
Gimme a break, the GPU portion of the Deck is teensy; it's smaller than its current competition from AMD, and they still give it four gigs.

We're talking about a discount super-budget APU with more memory per instruction unit than a $300 GPU. The whole deck, stripped down to the cheapest model costs $400 and still has 16 gigs of RAM, CPU plus GPU.

I'm not saying AMD is better than Nvidia, I'm saying Nvidia can get stuffed. It's up to AMD to decide if they need stuffing, too.
Both of them can get stuffed currently. The current landscape is bleh as far as I'm concerned. The techtubers can also get stuffed for that matter; the garbage they have been releasing as content, or worse, as journalism, in the last few weeks has been eye-rolling.
 
Both of them can get stuffed currently. The current landscape is bleh as far as I'm concerned. The techtubers can also get stuffed for that matter; the garbage they have been releasing as content, or worse, as journalism, in the last few weeks has been eye-rolling.

Yeah, I'm not impressed with their copycat pricing, either. Which is an extremely polite way of saying they blew a serious opportunity to win over a swath of gamers just looking for decent hardware at a decent price. I know, servers and consoles are where the money's at, but long tail, fucking sell me a tits card for a tits price that will do for me what the 580, 290, and 2000 did, first.

I know that features move cards now; but RAM isn't a God damn feature, it's a hard wall.
 
Oh, so upscaling is fine for AMD but not dlss for nvidia. Right....... :rolleyes:.
Huh? FSR looks great in the latest Zelda game, want to talk about it? Are you allowed to?
Also, why are you making stuff up that ZeroBarrier never said?
Do you sit across from their desk? OSHA says you get a 15-minute break every 4 hours, so I'm glad ZB is taking advantage of that, thus allowing you one last anti-consumer quip. Either way, I don't believe I made anything up. nVidia corrected the record on VRAM and thank god they did. Once your nVidia Slack channel updates you'll be in line with their latest PR release, and won't that be great for everyone?
 
Not as much as you would like to believe. AMD sells the equivalent of couch change in comparison to Nvidia, no matter what metric you use. So, no, not really. They are competitive enough to keep the lights on in the GPU department.
AMD certainly needs to up their game when it comes to graphics. Right now if you wanted to buy a graphics card from AMD vs Nvidia, it's actually a hard choice. For example, the RTX 4070 is $600, has 12GB, and in most titles is slower than an RX 6950 XT except when Ray-Tracing is enabled. The RX 6950 XT is $600, has 16GB of VRAM, and is faster overall, but not with Ray-Tracing. AMD might have better performance in the long term, but does that long term include Ray-Tracing? Unless AMD magically increases their Ray-Tracing performance, a lot of people would go Nvidia due to familiarity.
Oh, so upscaling is fine for AMD but not dlss for nvidia. Right....... :rolleyes:.
DLSS and FSR are trash technologies. Their purpose is to increase gaming performance without lowering image quality "noticeably". DLSS may be the least trash of these upscalers, but it is still trash because it still lowers image quality. DLSS is the most problematic of these trash technologies because it is not only exclusive to Nvidia, but soon DLSS 3 will be only for the RTX 40 series. Again, if you're using DLSS or FSR then your graphics card isn't capable of performing. I use FSR for games like God of War and Spiderman, but that's on an R9 Fury and an RX 480 running Linux. You shouldn't need DLSS on an RTX 40 series GPU, other than for Ray-Tracing.
I say this because all everyone talks about now is VRAM capacity, arguing that 16GB is the new norm and Nvidia is greedy for not including it in all the mid-range cards.
You doubt Nvidia's greed?
The reason Nvidia isn’t adding gobs of VRAM to their consumer line-up is relatively simple—they use CUDA cores in both their workstation and consumer GPU’s. Gaming performance, VRAM and memory bandwidth are the main differences between the two. This is what makes the 4090 so good, it can game like a beast, and is capable of professional use.
You're spewing out techno babble without knowing what any of it means. Yes, Nvidia gaming GPUs and workstation GPUs have little difference, but that's not justification for not putting in enough VRAM. Also, the RTX 4090 has 24GB of VRAM, so I don't see your point.
Let's say they made the 4080 with 24GB of VRAM; then professionals would be looking hard at those GPUs over the 4090 or their workstation offerings. So they have to keep it at 16GB to keep supply from being devoured by the professional workspace and push that category of buyer towards the 4090 or higher, which is probably why they priced it where it is, to push the workstation crowd upwards.
Intel's A770 has 16GB, so there goes Nvidia's sales. AMD's RX 6800 and up have 16GB, so there really goes Nvidia's workstation sales. If professionals bought graphics cards entirely based on VRAM, then Nvidia would have been fucked for a while.
The same goes for the 4070/4070Ti: making them 16GB would devour 4080 sales, and it would bring about the same problem, professionals looking for budget alternatives.
What 4080 sales? You see 4080's selling? Also, why isn't the GPU's performance enough to sell? Stop defending Nvidia already.

To me this is the only logical conclusion I can think of as to why Nvidia would keep VRAM where it’s at.
You really can't think of a better reason? Occam's razor not kicking in for you?
 
DLSS and FSR are trash technologies. Their purpose is to increase gaming performance without lowering image quality "noticeably". DLSS may be the least trash of these upscalers, but it is still trash because it still lowers image quality. DLSS is the most problematic of these trash technologies because it is not only exclusive to Nvidia, but soon DLSS 3 will be only for the RTX 40 series. Again, if you're using DLSS or FSR then your graphics card isn't capable of performing.
FSR means you can play Diablo IV (for example) on Intel integrated Xe without stuttering, and with details turned down. I spent the Server Slam weekend playing it on an i5-1235U mini PC (instead of my normal gaming PC) to see how it would work and it was perfectly playable.
 
You doubt Nvidia's greed?
Never said they weren't greedy, but so is AMD. They aren't some magical white knight saving the day with $800-$1000 offerings and an 8GB entry level GPU. I don't mind them being greedy, since, y'know, that's how businesses operate and stay successful. Businesses aren't there to be our friends, they're there to take our money, plain and simple. Anyone who thinks Nvidia or AMD should be offering low prices on luxury items "otherwise they're greedy" obviously lives in la-la land.
You're spewing out techno babble without knowing what any of it means. Yes, Nvidia gaming GPUs and workstation GPUs have little difference, but that's not justification for not putting in enough VRAM. Also, the RTX 4090 has 24GB of VRAM, so I don't see your point.
How is it not justified? Simply put, the 4090 is the closest thing to that one can get on the consumer side, and it's mainly because of the VRAM/bus width. If people could get away with it, they'd be doing some content creation on a 4070Ti, but considering its memory bus and VRAM capacity it's a no-go for anything serious; if they could, there would probably be less demand for 4090's.
Intel's A770 has 16GB, so there goes Nvidia's sales. AMD's RX 6800 and up have 16GB, so there really goes Nvidia's workstation sales. If professionals bought graphics cards entirely based on VRAM, then Nvidia would have been fucked for a while.
I didn't say they did, but I figured that wasn't the implication, given the mention of CUDA cores and it being common knowledge that Nvidia cards are just flat out better than anything AMD has... Every major professional/workstation application scales extremely well with CUDA, making even a card like the 4070 a better content creation card than a 7900XTX. The point of the VRAM comment was that, yes, VRAM and memory bus are king: pair a decent memory bus with a decent amount of VRAM even on a mid-range Nvidia GPU and I can almost guarantee a lot of professionals would eyeball it as a cheap alternative to Nvidia's workstation GPUs that also doubles as a very capable gaming card.
What 4080 sales? You see 4080's selling? Also, why isn't the GPU's performance enough to sell? Stop defending Nvidia already.
View attachment 571701

You really can't think of a better reason? Occam's razor not kicking in for you?

Despite what you may assume, the 4080 does sell, maybe not as well as the 3080 did, but it is selling. I'm not defending Nvidia, just explaining what seems logical. You have to look at it from a business perspective, a business that has investors and shareholders. I'm confident that what I put down was probably discussed long ago and agreed upon by Nvidia... figuring out ways to keep regular consumers from just buying their consumer grade GPUs for professional grade workloads since they technically use the same hardware--CUDA/Tensor cores, and VRAM was probably at or near the top of the list of ways to differentiate the two.

You seem like a reasonable individual--why else do you think Nvidia would limit VRAM on consumer grade cards? To get you to buy their higher tier cards for more money? It's not like the higher tiers are performing the same, so charging more for higher tier cards also comes with better performance, not just more VRAM. You think Nvidia is being nefarious and screwing consumers, consumers who are capable of reading, researching, and making purchasing decisions based on the information they find? Nvidia's been making GPUs for something like 25 years now, I think they understand the underlying technology better than any of us do, and it's not like they're advertising the 4060Ti 8GB card to be more than what it really is. They basically stated themselves that this GPU can't run RE4 Remake with the RT preset, nor can it run Plague Tale Requiem with High settings, which I'm sure also includes RT, and it may not just be because of VRAM since they don't imply that; it could literally just be that performance was crappy with RT on.

At the end of the day--if people buy an 8GB GPU and get mad that it's not lasting longer than 2-3 years without them having to adjust some graphics settings... that's on them, you can't blame Nvidia. Realistically, even with 16GB the 6800XT will probably be good for another year or two before it really starts falling behind, and the 6950 shortly after. That would be especially true if UE5 becomes more common, since, as I've said before, VRAM is only as good as the GPU's performance. You can throw 24GB of VRAM on a 4070; does that mean it will run games as well as and last longer than a 4080 with 16GB? Nope, because when the time comes to utilize it all, the card will most likely buckle under pressure.
 
I still have a 2080 8GB on a secondary system. Never regretted it. Plays fine with the games that I play and I can always tweak the settings to get the performance I want. I also avoid playing buggy games at launch and wait until they get fixed and optimized so I don't run into the same scenario as what was shown by DF in the TLOU Pt. 1 revisit.
 
"Won't somebody please think of the shareholders?"

Great. They found a way to rationalize selling me gimped hardware. Makes it feel so much better. Wouldn't want to hurt sales of the workstation cards.
 
"Won't somebody please think of the shareholders?"

Great. They found a way to rationalize selling me gimped hardware. Makes it feel so much better. Wouldn't want to hurt sales of the workstation cards.
Lack of VRAM in the 3080 was a key component in my decision to switch to AMD for my gaming machine, after only buying Nvidia cards since the Riva 128. Combined with the inability to find cards at MSRP.

Although I suppose losing one customer doesn't matter that much.
 
Never said they weren't greedy, but so is AMD. They aren't some magical white knight saving the day with $800-$1000 offerings and an 8GB entry level GPU.
If AMD offered graphics cards at $200 and lower with 8GB of VRAM, then that's fine because it's entry level. Even though back in the day it was common to put a lot of VRAM on cheap cards to entice customers, because VRAM is cheap and consumers used to buy based on the amount of VRAM. The RTX 4060 is not entry level, but a mid-range product.
I don't mind them being greedy, since, y'know, that's how businesses operate and stay successful. Businesses aren't there to be our friends, they're there to take our money, plain and simple. Anyone who thinks Nvidia or AMD should be offering low prices on luxury items "otherwise they're greedy" obviously lives in la-la land.
The only reason they aren't low priced is because of the past several years of crypto. Also I wouldn't call these luxury items but instead toys, as they're mostly used for playing video games.
How is it not justified? Simply put, the 4090 is the closest thing to that one can get on the consumer side, and it's mainly because of the VRAM/bus width. If people could get away with it, they'd be doing some content creation on a 4070Ti, but considering its memory bus and VRAM capacity it's a no-go for anything serious; if they could, there would probably be less demand for 4090's.
The 4090 is in demand because it's the fastest, and there are people who will spend any amount of money on the fastest GPU. Offering a 4070Ti with 24GB of VRAM won't deter people from buying a 4090. It's the reason nobody bought a 4080 even though it has 16GB of VRAM.

I didn't say they did, but I figured that wasn't the implication, given the mention of CUDA cores and it being common knowledge that Nvidia cards are just flat out better than anything AMD has...
How do CUDA cores play into this?
Every major professional/workstation application scales extremely well with CUDA, making even a card like the 4070 a better content creation card than a 7900XTX. The point of the VRAM comment was that, yes, VRAM and memory bus are king: pair a decent memory bus with a decent amount of VRAM even on a mid-range Nvidia GPU and I can almost guarantee a lot of professionals would eyeball it as a cheap alternative to Nvidia's workstation GPUs that also doubles as a very capable gaming card.
AMD's lack of professional performance has more to do with their drivers. On Linux, for example, AMD's ROCm is just shit. This is why AMD hasn't been a good choice for productivity. But the reason a professional buys Nvidia's workstation cards over gaming cards is stability. The idea is that workstation cards have been verified and tested so that nothing goes wrong. Not that people haven't figured out that the gaming cards and workstation cards are the same, which is why Nvidia has put in methods to block people from buying gaming cards for productivity. Try running an Nvidia card in a VM and see how badly it goes.
You seem like a reasonable individual--why else do you think Nvidia would limit VRAM on consumer grade cards?
Greed. Planned obsolescence. Product segmentation.
To get you to buy their higher tier cards for more money? It's not like the higher tiers are performing the same, so charging more for higher tier cards also comes with better performance, not just more VRAM.
The problem is Nvidia is selling higher performing GPU's with low amounts of VRAM that limit their performance. You can't exactly go and buy more VRAM and add it to the GPU like you can with a CPU.
You think Nvidia is being nefarious and screwing consumers, consumers who are capable of reading, researching, and making purchasing decisions based on the information they find?
You're giving consumers way too much credit. To give you an idea how nefarious Nvidia has been, they will often take advantage of consumers' inability to read, research, and make good purchasing decisions. For example, the GTX 970 4GB had only 3.5GB of full-speed VRAM. The GT 1030 also had a DDR4 version that was massively slower. The GTX 1060 has a 3GB version that is also slower, not just lower in VRAM. And of course there's an 8GB RTX 3060 that is slower than the 12GB RTX 3060. The reason this is nefarious is that most people would Google 1030, 1060, or 3060 and find lots of reviews of the faster models, then go on Amazon or Newegg and buy the cheapest cards they can find, only to be saddled with a vastly slower 1030 with DDR4, a slower 1060 with 3GB, or a slower 3060 with 8GB, because Nvidia hardly announces the release of these cards and reviewers hardly review them.
So Nvidia being nefarious assholes is kinda their thing, especially with VRAM. Oh god they fucked around a lot with VRAM.
torvaldsnvidia-640x424.jpg

Nvidia's been making GPUs for something like 25 years now, I think they understand the underlying technology better than any of us do,
Better than you do, apparently. Some of us have been around long enough to remember, like Pepperidge Farm remembers.
and it's not like they're advertising the 4060Ti 8GB card to be more than what it really is. They basically stated themselves that this GPU can't run RE4 Remake with the RT preset, nor can it run Plague Tale Requiem with High settings, which I'm sure also includes RT, and it may not just be because of VRAM since they don't imply that; it could literally just be that performance was crappy with RT on.
Guess we'll never know since... oh wait, they're also making a 4060Ti 16GB model. We can simply test it and see if 16GB makes a difference. How much do you wanna bet it will?
At the end of the day--if people buy an 8GB GPU and get mad that it's not lasting longer than 2-3 years without them having to adjust some graphics settings... that's on them, you can't blame Nvidia.
You mean 2-3 minutes because 8GB isn't enough for a lot of new titles now at 1080p.
Realistically, even with 16GB the 6800XT will probably be good for another year or two before it really starts falling behind, and the 6950 shortly after. That would be especially true if UE5 becomes more common, since, as I've said before, VRAM is only as good as the GPU's performance. You can throw 24GB of VRAM on a 4070; does that mean it will run games as well as and last longer than a 4080 with 16GB? Nope, because when the time comes to utilize it all, the card will most likely buckle under pressure.
That really depends on which direction developers go with Ray-Tracing. With UE5 and Lumen we might see AMD compete with Ray-Tracing, but without Ray-Tracing I can see AMD cards lasting much longer. Especially when new demanding games either don't come with Ray-Tracing, or come with very limited Ray-Tracing.
 