[April Fools joke?] Nvidia 4050 to launch with 6GB VRAM, claims Twitter leaker

Marees

2[H]4U
Joined
Sep 28, 2018
Messages
2,091
Rumour has it that Nvidia's RTX 4050 will be based on the AD107 silicon, which is also rumoured to be the silicon behind the RTX 4060. The RTX 4050 appears to feature a 96-bit memory bus, with recent reports claiming that the graphics card will only feature 6GB of video memory, which is not enough to play many modern games without issues.


6GB of VRAM is not enough

The past week has seen The Last of Us Part 1 and Resident Evil 4 (Remake) arrive on PC, and both of these games can utilise a lot of graphics memory, even at low 1080p resolutions. With 8GB quickly becoming the minimum amount of memory advisable for 1080p gaming, Nvidia's RTX 4050 will be poorly positioned within today's PC market. This is especially true knowing that Nvidia's RTX 3050 has 8GB of graphics memory.



https://overclock3d.net/news/gpu_di...edly_launching_in_june_with_too_little_vram/1

https://twitter.com/Zed__Wang/status/1640986350986969089
 
6GB will still crush 1080p; this would be a 1660 Super replacement, if this is actually a thing.
 
Eh, 6GB will be rough for some games at high/ultra settings at 1080p. It really all depends on price. Anything over $200-250 for this would be the real joke.
Nah, with the exception of Hogwarts, a 1660S crushes 1080p max in the 60+ fps range. This is likely going to take that slot; the components for the 1660 are harder to come by and getting more costly to make. So this will be its successor, should this be real. I could see this in a lot of entry gaming laptops easily.
 
"With the exception of newer games that punish <8 GB cards, low memory cards aren't punished"
Imagine having to downscale 1080p to get 60 fps... even lowlier than consoles, for what is likely to be $300+.
 
$350 MSRP, $400 retail price. :p
nVIDIA can eat a big fat one at that price. EDIT: I feel I should add that this is coming from a (mostly) loyal nVIDIA patron from the GeForce2 MX 400/GeForce4 Ti 4400 days, up until things got nuts with the crypto booms + GPU shortages.
 
Last edited:
Nah, with the exception of Hogwarts, a 1660S crushes 1080p max in the 60+ fps range. This is likely going to take that slot; the components for the 1660 are harder to come by and getting more costly to make. So this will be its successor, should this be real. I could see this in a lot of entry gaming laptops easily.
There are quite a few games that will struggle: A Plague Tale: Requiem comes to mind, plus Guardians of the Galaxy, The Last of Us, and the RE4 Remake. The fact of the matter is the consoles are rocking 16GB of VRAM. Granted, it's shared, but at minimum the AAA titles are using 8GB, and likely more. This is 1060-class performance, and a 1660 is 20% faster, which isn't going to bring it into "crushing" territory. Yes, a theoretical 4050 will have more bandwidth than its predecessor, but not by much; not enough to change the fact that you will have to turn down IQ @ 1080p.

[Attachments: two benchmark screenshots]
 
The fact of the matter is the consoles are rocking 16GB of VRAM.
I will concede the 1660S is no longer sufficient, but this part is wrong: the consoles may have 16GB of GDDR, but it is shared between the CPU and GPU, and at most 12GB can be put towards the GPU, leaving 4GB for the rest of the console. On the PS5 you declare your split at game launch; on Xbox they were using direct access, but now I suspect the heap there will be a very big deal.
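Rough math on that split (a back-of-the-envelope sketch: the OS reservation and the CPU-side game budget are assumptions for illustration; only the 16GB total and the ~12GB GPU ceiling come from the posts above):

```python
# Back-of-the-envelope console memory split. The 16GB unified pool is the
# published PS5/Series X figure; the OS and CPU-side numbers are assumptions.
TOTAL_GDDR_GB = 16.0      # unified GDDR6 pool shared by CPU and GPU
OS_RESERVED_GB = 2.5      # assumed system/OS reservation
GAME_CPU_SIDE_GB = 3.0    # assumed game logic, audio, streaming buffers

gpu_budget = TOTAL_GDDR_GB - OS_RESERVED_GB - GAME_CPU_SIDE_GB
print(f"GPU-usable budget: ~{gpu_budget:.1f} GB of the {TOTAL_GDDR_GB:.0f} GB pool")
# ~10.5 GB, which is why estimates in this thread land between 10 and 12 GB.
```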
 
The fact of the matter is the consoles are rocking 16GB of VRAM.
Isn't it more around 10GB? Consoles are rocking 16GB of RAM in total; you need to run a limited OS and a game on that RAM budget as well.

This is a case of no bad product, only bad price.

There would be a place for a 6GB card (if the card is not powerful enough to run more than 1080p medium-high titles anyway), for which the only sacrifice would be limiting texture size because of the VRAM, and the 4050 would be a good slot for it (as an xx40 tier does not exist and it would be too big for an xx30 product), a kind of successor to the GTX 950 2GB / GTX 750 Ti.

But it would need to be priced at a price point they will not hit: $200 MSRP, $205-210 in the real world (considering that 3050s are $285 on Newegg, that would obviously not happen). At that price it would be a nice improvement over Nvidia's current offering and a nice product.

At the price it would actually launch at if the rumour is true, probably not a good product.
 
Last edited:
6GB isn't enough for a gaming card at this point, IMO. The last 50-series card that I would consider a decent (budget) gaming card was the 750 Ti, and even then its biggest selling point was being a gaming card that didn't require a power connector.
 
“Some” modern games coming out this year need more than 8GB; even at low texture settings they require at least 10GB. Nvidia's and AMD's next offerings had better include at least that much on their cheaper products. Maybe not at straight entry level (Nvidia 1650/1600-series level), but most low/mid-range cards should.

I have a funny feeling games these days will require a lot of VRAM, because I don't think they can just patch in direct access after the fact (and even if they could, I doubt they will).
 
For what it's worth... I'm thinking the TLOU VRAM usage outrage is a bit overblown. In my 3 hours with the game, I've found that the game estimates a good 1GB of VRAM usage over what's actually used.

Playing at 1440P with Balanced DLSS, mix of high/medium textures.

Game's estimate:
[Screenshot: in-game VRAM estimate]

MSI Afterburner:
[Screenshot: MSI Afterburner VRAM readout]

Specs in sig.
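If you want to sanity-check what's actually allocated without Afterburner, here's a minimal sketch using NVIDIA's NVML Python bindings (pip package nvidia-ml-py; note NVML reports total memory allocated on the GPU, not just the game's share):

```python
# Minimal VRAM readout via NVML -- the same counters nvidia-smi reports.
# This is device-wide allocation, not a single process's working set.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)
    print(f"VRAM used: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")
finally:
    nvmlShutdown()
```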
 
For what it's worth... I'm thinking the TLOU VRAM usage outrage is a bit overblown. In my 3 hours with the game, I've found that the game estimates a good 1GB of VRAM usage over what's actually used.

They already mentioned the game has memory issues; how much can be fixed, well, we'll have to wait and see. Another hotfix is coming today, with a larger patch later in the week. I couldn't use this or Hogwarts as a good benchmark because both have noted technical issues. Hogwarts was mostly fixed when I played it; not sure how it is now.
 
Games made for consoles will, over time, push the hardware; for the PS5 and Xbox Series X that means the VRAM and CPU threads: 12GB of VRAM and 16 threads.

If you have lesser hardware, with less VRAM, expect issues on ports beyond the normal less-than-stellar ones. Plus the overhead of Windows just adds to the requirements.

To have a better-than-console experience you will need more than the consoles: something like 16GB of VRAM and a better CPU with at least 16 threads, for better textures, higher resolution, higher level of detail on objects, plus any added PC-exclusive features.

The Last of Us is pushing the CPU: at least 12 threads to 100%, and 16-thread CPUs may be pushed towards 100% as well with high-end GPUs.

So what is going on with The Last of Us PC version? It was built around the console, pushing graphics to the next level. The modern API is being leveraged to use multiple threads to push the draw calls needed for the very complex environments: added objects, textures, compute shaders, high-resolution lightmaps, shadow maps, and level of detail tuned to minimize pop-in (meaning less culling), all of it hammering the CPU. You could not do this with DX11's minimal threading capability for draw calls, as sketched below.
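A toy sketch of that idea (purely structural, with made-up names; a real engine records DX12/Vulkan command lists on worker threads, it doesn't literally build strings in Python):

```python
# Structural illustration of multi-threaded draw-call recording: each worker
# "records" its own command list, and everything is submitted in one go.
# DX11 would instead funnel every draw call through a single driver thread.
from concurrent.futures import ThreadPoolExecutor

draw_calls = [f"mesh_{i}" for i in range(10_000)]  # hypothetical scene objects
WORKERS = 8

def record_command_list(chunk):
    # Stand-in for per-thread command-buffer recording.
    return [f"draw({mesh})" for mesh in chunk]

step = -(-len(draw_calls) // WORKERS)  # ceiling division
chunks = [draw_calls[i:i + step] for i in range(0, len(draw_calls), step)]

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    command_lists = list(pool.map(record_command_list, chunks))

# One submission of all pre-recorded lists, analogous to a queue submit.
total = sum(len(cl) for cl in command_lists)
print(f"{total} draw calls recorded across {len(chunks)} threads")
```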

Not to say the engine used is as efficient as it could be, but there are reasons for the issues on GPUs with less than 12GB of VRAM, and even 12GB would be limited to console-level graphics in this game. Cards with 16GB and above should do well, that is, if more of the other bugs are worked out.

As a minimum, you always want a graphics card with more VRAM than the console can use, that is, if you want to play the games later designed around those consoles with better graphics than the consoles.

I would like to see PCIe testing with this game: if the CPU is being hammered, what is going on with PCIe transfer rates? I normally wait for games to get the wrinkles ironed out and better optimized, except for a few from great developers like id Software, but I may have to get this one early for some testing. I do want to play this series anyway.
 
Last edited:
The whole Nvidia lineup is off. The 4090 at $1599, OK, I can see that.

The 4080 is in the 4080 Ti spot; it should have been 20GB on a 320-bit bus, an AD102 GPU, at $1299. AD102 in only one consumer product? Meanwhile the Ampere GA102 was in four, two initially.

The 16GB 4080 should have been in the $899 spot, and the 4070 Ti at $699 with 12GB, which would still be a high price for VRAM merely matching console levels. Everything further down makes no sense at their inflated prices. You get more for much less cost with a console.

Any game that pushes current-gen consoles will perform worse on cards with less than 12GB. Potentially we are talking about a lot of games in the end, unless the ports get dumbed down for Nvidia's lower-VRAM cards.
 
Last edited:
They'll release it for only very slightly less than a 4060 to push people upmarket.
 
6GB isn't enough for a gaming card at this point, IMO. The last 50-series card that I would consider a decent (budget) gaming card was the 750 Ti, and even then its biggest selling point was being a gaming card that didn't require a power connector.
Yeah, from what I recall the '50s have always been shitty budget cards that are one step above integrated graphics and struggle with the current games at their release. Basically good for esports games and 20-40 fps in everything else that isn't a three-year-old game.
 
Yeah, pretty much. It's just that they usually didn't cost as much as a high-performance CPU, lol.
True, but almost everything that's a new model has gone up in price dramatically; mobos, PSUs, GPUs, etc. are all way up. I guess we just need to wait and see what AMD does. Do they release a competitive card at a reasonable price, or do they go with their normal tactic of 'whatever Nvidia sets their price at, minus $50'?
 
True, but almost everything that's a new model has gone up in price dramatically; mobos, PSUs, GPUs, etc. are all way up. I guess we just need to wait and see what AMD does. Do they release a competitive card at a reasonable price, or do they go with their normal tactic of 'whatever Nvidia sets their price at, minus $50'?
With how many 6000-series cards are still in the wild and selling, there's no real incentive to release a mid-range 7000-series card that's just barely faster than the 6000 series for more money. I think they'll wait and see what Nvidia does before doing anything else.
 
Well, anything above $300 and this card is DOA. Going by Nvidia's current pricing from high to low ($1600-$1200-$800-$600), I'd imagine the 4060 hopefully follows the trend where the 4070 is $200 cheaper than the 4070 Ti, so maybe the 4060 would be $399 and the 4050 would be $199.
 
Well, anything above $300 and this card is DOA. Going by Nvidia's current pricing from high to low ($1600-$1200-$800-$600), I'd imagine the 4060 hopefully follows the trend where the 4070 is $200 cheaper than the 4070 Ti, so maybe the 4060 would be $399 and the 4050 would be $199.
Considering the prices we see now, I just don't see that at all. I would say the 4060 will be $479, the 4060 Ti $529, and the 4050 $299. This makes more sense in the climate of overpriced video cards.

The 4070 should be $499 since it only has 12GB of memory.
 
Last edited:
Games made for consoles will, over time, push the hardware; for the PS5 and Xbox Series X that means the VRAM and CPU threads: 12GB of VRAM and 16 threads.
Correct. Consoles should be your minimum spec. We are still in the first, maybe second, generation of games for these consoles, so the engine and OS are probably taking up more than they will once we get further into their lifespan. When that happens, VRAM usage will go up.
Hardware Unboxed linked a video with a UE5 dev literally saying you want 10 to 12GB at least.
 
There are quite a few games that will struggle: A Plague Tale: Requiem comes to mind, plus Guardians of the Galaxy, The Last of Us, and the RE4 Remake. The fact of the matter is the consoles are rocking 16GB of VRAM. Granted, it's shared, but at minimum the AAA titles are using 8GB, and likely more. This is 1060-class performance, and a 1660 is 20% faster, which isn't going to bring it into "crushing" territory. Yes, a theoretical 4050 will have more bandwidth than its predecessor, but not by much; not enough to change the fact that you will have to turn down IQ @ 1080p.

[Attachments: two benchmark screenshots]
Not sure what this proves. I don't see terrible lows at 1080p medium. Both screens show 6GB of VRAM, but that may be allocation. I highly doubt this game needs more than 6GB of VRAM at 1080p medium.
 
This will likely match up well with a Series S. The Series S has 10GB of shared memory, so likely closer to 6GB for the GPU. It has a 128-bit GDDR6 bus even though it should be 160-bit (a weird hybrid thing the consoles do). So the 4050 should have similar bandwidth as well if clocked higher.
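Rough math on that bandwidth comparison (the Series S fast-pool figure is its published spec; the 18 Gbps GDDR6 speed for the rumoured 4050 is an assumption):

```python
# Peak memory bandwidth = bus width in bytes x effective data rate per pin.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(128, 14.0))  # Series S fast pool: 224.0 GB/s (published)
print(bandwidth_gb_s(96, 18.0))   # rumoured 4050 at assumed 18 Gbps: 216.0 GB/s
```

So a 96-bit card would only need modestly faster GDDR6 to land in the same bandwidth ballpark as the Series S.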
 
Games made for consoles will, over time, push the hardware; for the PS5 and Xbox Series X that means the VRAM and CPU threads: 12GB of VRAM and 16 threads.

If you have lesser hardware, with less VRAM, expect issues on ports beyond the normal less-than-stellar ones. Plus the overhead of Windows just adds to the requirements.
Last time I checked, the Series S exists. 80% of Steam users have 8GB or less of VRAM; about 50% have 6GB or less. THIS IS AN xx50-SERIES CARD. It's a 96-bit card. 6GB kind of sucks, but 12GB would have been a bit ridiculous for the performance. We knew this was coming when they pushed the xx60 cards down to 128-bit.

I think you guys are blowing this out of proportion.
 
Correct. Consoles should be your minimum spec. We are still in the first, maybe second, generation of games for these consoles, so the engine and OS are probably taking up more than they will once we get further into their lifespan. When that happens, VRAM usage will go up.
Hardware Unboxed linked a video with a UE5 dev literally saying you want 10 to 12GB at least.


Seems like AI is the answer. It's had success upscaling; seems like it would do even better downscaling. It's great that UE5 developers are pushing photogrammetry, million-pixel geometry, normal-map layering, and whatever the heck else he said, but they know darn well that the game will need to be able to run on a much wider range of hardware if they want to sell software.
 
For mainstream 1080p gamers, it all comes down to HOW you want to play. It sounds counterintuitive, but if you are shooting for higher fps, you actually need LESS VRAM, as you will have to play at low/medium settings instead of high/ultra.

Take Harry Potter, for example. At 1080p medium, even a 3GB card looks to be sufficient. On ultra, 6GB looks to be borderline and a no-go for the 5600 XT. Add in RT and it's 12GB or gtfo.

[Attachment: Hogwarts Legacy VRAM benchmark chart]

https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/
 
My youngest 2 kids are on older low-memory cards, a GTX 950 and a 1050 Ti, and I'm just recently looking to upgrade those, since as they get older it's more than Minecraft and Roblox these days. Of course, for budgeting I'll keep them at 1080p and at least a 6-8GB VRAM target for the next cards. PC gaming can be cheap as hell if you mix in used parts, 1080p, and realistic expectations. VRAM creep is definitely a thing, and for 1440p and up, shelling out $500+ for your new GPU is the reality now. Kind of sucks for us old heads, but remember it's PC gaming, so we have decades of games and settings galore to tweak to get things playable. If that's not your thing, then buy a console.
 