RPGWiZaRD
[H]ard|Gawd
Joined: Jan 24, 2009
Messages: 1,217
Oh God! NO! Not TWO games! (one of which was coded with AMD's help and isn't released yet and hasn't been tested)
Man, setting textures down a notch, with (likely) no discernible difference in quality, is gonna REALLY suck for the plebs with 11GB of VRAM or less in those (maybe) two games!
I hope someone compares, for example, the 3070 vs the 2080 Ti at 4K vs 1440p/1080p, since we know roughly how they compare on average in other games, at least until these supposedly larger-VRAM 3000-series cards come out from Nvidia.
Somehow I don't think the 3070 will perform significantly worse, but we'll see. I just find people are so quick to scream that 10GB isn't enough just because A) a developer says so (likely sponsored by AMD to say so), or B) they've seen game X allocate Y amount of VRAM as some kind of proof, when in reality VRAM allocation isn't that clear-cut and games often allocate more VRAM than they actually need.
Maybe I haven't looked, but hasn't anyone tried to debunk the VRAM myth already? It feels like the proper time for a full, in-depth analysis of how much VRAM is actually needed before you see considerable performance loss, tested in a couple of exceptionally VRAM-demanding titles. I'd happily admit I'm wrong; I just want to see actual benchmark comparisons, as this VRAM discussion is getting old and could be ended with a good test.
In my view, GPU processing capability relative to VRAM capacity seems much more of a deciding factor than the game's stated VRAM usage needs. And is the difference just a couple of percent, or more like 10 or 20%? But yeah, I'd gladly see some analysis on it.
Last edited: