So much for NVIDIA claiming 10 gigs of vram would not be a limitation

They just listed the VRAM to match the 2070/2080 Ti. I doubt the game will require all of that VRAM, or even close to it.
That would make perfect sense if they'd released the specs a few weeks ago, but they've known about the current cards for a while now. You'd think Nvidia would have at least told them to put 10 gigs instead of 11 when releasing the specs now.
 

This is Ubisoft we're talking about here. I'm surprised they didn't manage to confuse RAM with VRAM.
 
I wish people would stop complaining that 10GB of VRAM is not enough.
Don't buy the 3080 with 10GB then. There are alternatives: the 2080 Ti is cheaper, a bit slower, and has more VRAM. Buy that.
And I wish people would stop complaining about those who are complaining about 10 gigs not being enough.

If it doesn't bother you, then great, but it's going to be a talking point for quite a while, so deal with it if you're going to be on a public forum.
 
So the game won't run well with a 2080/3070? Nonsense. No game dev is watching memory usage closely enough to spec the exact gigabyte of VRAM needed. 11GB is an odd number; the only reason they picked it was to essentially say that a 2080 Ti gives the best experience.
 

Your complaining won't change the amount of vram. You're just wasting energy and spreading negativity.
 
What a stupid thing to say. This is a public forum where we discuss things, and discussion can turn into debate or argument when weighing the pros and cons. That's how it works when you discuss the specifications of anything, whether it's computers, cars, televisions, or whatever else. So again, if you don't want to see or hear any arguments, then stay off of public forums.
 
In my day we had 256MB and we were happy
In my day we had 8MB and were frigging ecstatic that it was a Voodoo2!

Kids these days. Pfft. I've gamed at 4K since the 980 Ti (which only had SIX GB of VRAM!). But noooo, they have to whine on about how 10GB isn't enough when they could just buy the 24GB 3090 for an exorbitant cost and shut up already!
 
One thing that I have learned over the years is that recommended and minimum specs are often further from the truth than you might think.
 

In my day GPUs were not even a thing..... 286 CGA FTW!

 
I remember when having a Diamond Stealth or Viper was hot stuff because pushing 2D graphics in 256 colors was HARD. Hell, I remember playing games in 16 colors on my Tandy 1000 like F-19 Stealth Fighter and all the Carmen Sandiego games.
I remember building a 386 with leftover parts (obsolete by then) from my dad's work. It was so darn cool playing Doom and Castle Wolfenstein! (Less cool having to learn the command line and slogging through DOS 6.21 and (gasp) making sure jumpers were set correctly on the motherboard and drives.) Then I tried Doom II and found out what a slideshow was...
 
After the Tandy 1000, we had a 386/25 built. I think we had 4MB of RAM then. Doom was not really possible on it; it wasn't until we upgraded to a 486 DX2/80 that it started to run better. The days of Turbo buttons that you would rarely ever turn off. That machine started out with 8MB of RAM. Back then sound cards were the thing to have. I remember convincing my dad that a Sound Blaster AWE32 was a must-have.
 

F-19 Stealth Fighter was the shit! I remember mastering carrier landings and winning the Congressional. My teenage future was so bright and promising... sadly I ended up a professional manwhore for dollars :-(


Tandy, you were RICH! We had a franken-286 at 12MHz that averaged about 2 fps!
Sierra games made that rig their bitch more than once!

I remember the day in '92 my old man brought home a Zeos 486 at 50MHz that he paid $2500 for, and it was OFF LIMITS to our mortal child hands. Yeah okay dad, the second his ass was in his VAN to work down by the river, I was loading Stunts, Aces of the Pacific, A-10 Tank Killer, and Red Baron on that puppy for some sweet-ass fluid frame rates!
 
Politics is a dog eat dog world. ;)
 
Meh, my 2070 Super will run everything I need it to run, and it will run Cyberpunk 2077 (if not, I will cry). Playing Dual Universe at 2K high, I get a playable 30-40 fps in the large areas filled with ships and 90-150 fps anywhere else. Can't complain.
 
We get reviews in 13 hours and hopefully this is put to rest!

But honestly, maybe there will be one game in a thousand where you have to turn textures down from ultra to high because of VRAM. I'd rather that than pay an extra $200-300 for double the VRAM.
 
LOL, how in the hell could this issue possibly be put to rest just from the games that get tested tomorrow? If most of the reviewers follow the Nvidia guidelines, they're not going to be able to test a game in a scenario that would even remotely have a problem with 10 gigs of VRAM. And the problem is not the games of today; it's the upcoming games that people are worried about. I don't think it's going to be a problem in 99% of cases for the next year or two, but it will eventually be a problem, and sooner than some people would like to hear. And anyone ignorantly claiming that 10 gigs will never be a problem won't have shit to say when Nvidia comes out with a 20 gig card and proclaims its advantages over the 10 gig version.

Anyway, I'll be getting the 3080 10 gig as it's really my only upgrade path, since there's no way I'm going to pay 1500 or 1600 bucks for a 3090. The only way I would wait on the 20 gig version is if there were a concrete time frame for it, which there isn't.
 

Who buys a high-end card and worries that some niche case will make you drop a setting a notch two-plus years from now, when the next gen is out?

When asked about VRAM Gamers Nexus clearly said it’s a non-issue and I believe them.

Would you really spend ~$1k for a 20GB version that has practically no gain? Hell, at least the 3090 gives you an extra 20% all the time.
 
Yeah, nobody ever worries about that. I mean, hell, I don't think anybody else has ever even discussed the VRAM being a potential issue at all, have they? It's clearly just me worrying for no reason. :rolleyes:

Really, what in the hell are people like you going to say once they come out with a 3080 that doubles the VRAM? :p
 

I am getting a 3090....
 
I remember building a 386 with leftover parts (obsolete by then) from my dad's work. It was so darn cool playing Doom and Castle Wolfenstein! (Less cool having to learn the command line and slogging through DOS 6.21 and (gasp) making sure jumpers were set correctly on the motherboard and drives.) Then I tried Doom II and found out what a slideshow was...
6.21? I distinctly remember upgrading to 6.22; I don't remember ever using 6.21, must've skipped right over it, lol. Yup, Wolfenstein, Doom, OG Warcraft (ok, that one wasn't 3D, but it sure tied up the phone lines for days). Fun times. I actually still drop to the command prompt often and can still navigate DOS without issue. Not that I have a need anymore, nor is it a useful skill, but hey, some things are hard to forget ;). Heck, I was still using edit until they finally removed it and I was forced to start using notepad. I still catch myself trying to open files from the command prompt/terminal in edit instead of notepad or nano.

Anyways, 10GB... Who really knows at this point. We most likely won't know until benchmarks and/or larger-VRAM cards come out and are tested. It seems a few games are already borderline, so I am sure in the next few years with 4K + RT they'll be hitting that limit. Hopefully Intel will have a working PCIe 4.0 implementation and it won't be so noticeable when it happens, lol.
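To put rough numbers on that last PCIe point: when a game overflows VRAM, the overflow sits in system RAM and has to be fetched across the PCIe bus, which is a small fraction of the 3080's on-card bandwidth. A quick back-of-the-envelope sketch, using published spec-sheet values rather than anything measured:

[CODE]
# Back-of-the-envelope bandwidth comparison (published spec-sheet values).
# RTX 3080 GDDR6X: 19 Gbps per pin * 320-bit bus / 8 bits = 760 GB/s.
# PCIe x16 usable bandwidth: roughly 15.75 GB/s (3.0) and 31.5 GB/s (4.0).

VRAM_GBPS = 760.0
BUSES = {"PCIe 3.0 x16": 15.75, "PCIe 4.0 x16": 31.5}

for name, bw in BUSES.items():
    ratio = VRAM_GBPS / bw
    print(f"{name}: {bw:5.2f} GB/s -> spilled assets ~{ratio:.0f}x slower than VRAM")
[/CODE]

Even with PCIe 4.0 doubling the pipe, anything that spills is still roughly 24x slower to reach than on-card memory, which is why overflow tends to show up as hitching rather than a clean fps drop.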
 
So the cards haven't even been released yet and Watch Dogs Legion is recommending 11 gigs of vram for 4K ultra even without ray tracing. Pretty ironic since the game comes with the video card too. So much for NVIDIA claiming that 10 gigs would not be a limitation at 4K.

https://news.ubisoft.com/en-us/article/WCiLJPAN9QHWwb9JBc1Wj/watch-dogs-legion-pc-specs-revealed

You've never heard of exaggerated recommended specs to get people to upgrade their hardware? Of all the upcoming games, I certainly don't expect a Ubisoft game to push the envelope in terms of graphics hardware.
 
LOL, exaggerated specs to get people to buy high-end hardware? If that happens, it sure must be rare as hell, because it's actually the opposite in reality for most recommended specs. By that I mean most recommended specs won't get you max settings or even 60 FPS. I can link you all day long to laughable recommended specs that will not even get you a good experience in many games.
 
I would be hard pressed to pay $700 for a 10GB card for what might be a two-year install.
I went from an 11GB 1080 Ti to an 8GB 2080 Super and consider it a total and complete side-grade (thankfully I got it for free). I haven't hit any VRAM walls at 4K yet with 8GB - got any suggestions to test that dreaded VRAM wall with?
 
No, and I don't think much will right now in real-world gaming, but as I said, a two-year install term is making me give this some thought, especially if I am going to water cool. When I install a WC'd card I don't like to change it out frequently. I have a 2080 Ti WC'd right now and I am in no rush to get rid of it, now that prices have cratered.

I want to see the AMD reviews first, and then see if anything attractive shows up during Black Friday.

10GB is great for right now, but I am not sure it will be 8 months from now, and I do use a 4K display.

If you are doing 1440p or lower, go for it if it feels right.
 
I went from an 11GB 1080 Ti to an 8GB 2080 Super and consider it a total and complete side-grade (thankfully I got it for free). I haven't hit any VRAM walls at 4K yet with 8GB - got any suggestions to test that dreaded VRAM wall with?
Wolfenstein Youngblood absolutely cannot run properly on 8 gigs of VRAM when fully maxed out with ray tracing, even at only 1440p. And every time somebody tries to argue with me about that, they don't realize that they don't actually have the game fully maxed out. That's because this is one of many games that don't actually apply the highest possible settings when you choose the highest preset. If you set image streaming to Uber with ray tracing and all other settings maxed out at 1440p and DLSS on Quality, the game will eventually lock up. You have to at a minimum turn image streaming down to Ultra, and even then you'll still get some hitching until you drop DLSS from Quality to Balanced.

Other than that, Rise of the Tomb Raider is the only other game I know of that has any issue with 8 gigs of VRAM on playable settings at 4K. It hitches in a few areas of the game; it's not too bad, but those same areas were perfectly smooth on the 1080 Ti compared to the 2080 Super. Both of the games I just mentioned were also noticed by Digital Foundry, so no, that's not just me making stuff up; I have also tested it firsthand.
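For anyone who wants to hunt for that wall themselves, here's a minimal sketch of one way to do it: log VRAM usage once a second while you play, then line the log up against wherever the hitching started. It assumes a single Nvidia card with nvidia-smi on the PATH; the output file name is just an example.

[CODE]
# Minimal VRAM logger: polls nvidia-smi once per second while you play and
# appends used/total MiB to a CSV you can compare against hitching later.
# Assumes a single GPU; Ctrl+C to stop.
import subprocess
import time

LOG = "vram_log.csv"  # example file name, call it whatever you like

with open(LOG, "w") as f:
    f.write("time_s,used_mib,total_mib\n")
    start = time.time()
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        used, total = (v.strip() for v in out.split(","))
        f.write(f"{time.time() - start:.0f},{used},{total}\n")
        f.flush()
        time.sleep(1)
[/CODE]

One caveat: nvidia-smi reports what the game has allocated, not what it actually touches every frame, so a full-looking number on its own isn't proof of a wall; the hitching is the real tell.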
 