Battlefield 4 recommended system requirements: 3 GB VRAM

Varmint

[H]ard|Gawd
Joined
Apr 10, 2006
Messages
1,854
LOL at those who said 2GB would be enough for years. :D

 
Since we all know 'recommended' is the real minimum, and you should add at least 50% to that to play the game remotely smoothly...
 
... just a month ago I was given a hard time for telling someone that 4GB of VRAM was a good idea to future proof, rather than the 2GB model.
 
And the recommended cards are 7870s and 660s, which come with 2GB of VRAM.

The fps will be unplayable by the time you are near 3GB of usage with the majority of current cards (if not all, at least for single-card users).
 
... just a month ago I was given a hard time for telling someone that 4GB of VRAM was a good idea to future proof, rather than the 2GB model.

Hell, I'm getting tore up for recommending 6GB minimum (and 8GB-12GB on the next-gen cards) now that the consoles will actually have some memory on them.

If a game that runs on a current console with <512MB total memory can use >2GB on a desktop card, it's pretty easy to accept that games released next year or the year after will be able to make use of >6GB of VRAM.

I'd say your 4GB recommendation was shooting low, but you do what you can, right?
 
This is more a thread about, "LOL to those who said the 8800 couldn't play modern games!" than a "LOL to those that said 2GB would be more than enough for 10 years!" if you ask me.
 
This is more a thread about, "LOL to those who said the 8800 couldn't play modern games!" than a "LOL to those that said 2GB would be more than enough for 10 years!" if you ask me.

I'm pretty sure the person who created the thread stated clearly what the thread is about, LOL.
 
I will believe it when I see it, since BF3 works just fine on 1GB VRAM cards.
 
AMD 6 core??? Does that mean it is one of the first games that can use more than 4 cores?
 
AMD 6 core??? Does that mean it is one of the first games that can use more than 4 cores?

BF3 multiplayer took advantage of extra cores/threads (Intel six-cores). This game will be no different.
 
AMD 6 core??? Does that mean it is one of the first games that can use more than 4 cores?

BF3 can use more than 4 cores, as can most games built on a common engine (UE, Crytek, Frostbite, to name a few...). Will it make a difference in playability? Don't know the answer to that- but I do know that there isn't a six-core AMD CPU using off-the-shelf cooling that can outrun my aging 2500k at 4.5GHz, so it's a moot point, really.
 
Well shit.......and windows 8 as well?? Dammit. I'll see when the beta is out I suppose.

Or is this an AMD-sponsored/recommended high-end thing?
 
Well shit.......and windows 8 as well?? Dammit. I'll see when the beta is out I suppose.

It doesn't say 8.1- so I don't think that there'd be any advantage to running an OS newer than fully patched Vista. Windows 8 may run the game statistically better, but I think it'd be a stretch to claim that anyone would notice the difference in actual gameplay.
 
Guess I am good on VRAM then.. Although my 660 Ti might not be enough.. Hoping for 60fps :O
 
Guess I am good on VRAM then.. Although my 660 Ti might not be enough.. Hoping for 60fps :O

Framerate's going to be dependent on settings (and the rest of your system)- but if you want to crank them...
 
Looks like my plan to upgrade to a 7950 3GB next month is right on time....
 
Hell, I'm getting tore up for recommending 6GB minimum (and 8GB-12GB on the next-gen cards) now that the consoles will actually have some memory on them.

If a game that runs on a current console with <512MB total memory can use >2GB on a desktop card, it's pretty easy to accept that games released next year or the year after will be able to make use of >6GB of VRAM.

I'd say your 4GB recommendation was shooting low, but you do what you can, right?

Unless you are doing SLI, 4GB or more is pretty useless considering you will run out of GPU power before you run out of VRAM, except maybe on a Titan, which should still be good on its own for a while. A GTX 770 with 2GB is a better investment than a GTX 760 with 4GB; people need to chill the fuck out with the VRAM. If someone is playing at 1080p they do not need 4GB+, and even at 2560x1600 it isn't that useful since, again, the card will run out of power before it does VRAM.
 
people are acting like 3GB will be the new norm...the Battlefield games have always been the exception...what other games require 3GB VRAM?...modded Skyrim...no need to panic over 1 game...unless you plan on playing BF4 religiously for the next year then 2GB (or maybe even my GTX 580 1.5GB) will be fine for the masses (with a single monitor)
 
Hell, I'm getting tore up for recommending 6GB minimum (and 8GB-12GB on the next-gen cards) now that the consoles will actually have some memory on them.

If a game that runs on a current console with <512MB total memory can use >2GB on a desktop card, it's pretty easy to accept that games released next year or the year after will be able to make use of >6GB of VRAM.
This. A simple concept that historically has always favored PC future-proofing. As history goes, console ports (to PC) require orders of magnitude more VRAM to max out than their console counterparts.

Games that are made for the next-gen consoles (XB1 and PS4) will have up to 5GB of unified memory available. How much of that will be VRAM, and how will that scale to the PC? I don't know, but a good guess is far more than the 2GB available on most GPUs today, and even with my pair of GTX 680s with 4GB each, I doubt it will be enough for next-gen maxes on 1080p.

With next-gen games becoming increasingly open-world, there's little doubt in my mind that a 2GB VRAM limit will be an increasing liability in the future. Of course, it wouldn't be the end of the world if I had a 2GB card; I'd just have to compromise certain settings to keep VRAM usage within acceptable limits.

But then again, I'm [H]ard, not oft... so 8GB VRAM min is also my recommendation for next-gen.
 
The 6GB lolol circlejerkers are here in force I see. I am an AMD fan, currently have a 7950, but I can tell you this is more than likely an AMD ploy, it being an AMD Evolved game. The game will perform fine 1080p on 2GB I would expect.
 
The game is not very different from BF3, and this is all just corporate circle-jerking, as EA's expansion "game" is only going to require driver updates.
 
Pshhh... 3gb VRAM is for pussies. I'm going to start recommending 12gb VRAM as the MINIMUM for your basic Internet browsing machine...
 
Was right about to pull the trigger on a W230ST with a 765m (they have 2GB VRAM).

Recommended is typically geared towards medium settings no? Would this not mean that such a laptop would be insufficient to play BF4 at medium? :eek:
 
God damn it, my 590 only has 1.5GB of memory per GPU. The card is beastly, but is it possible I will need to upgrade? Oh lawd have mercy, what am I gonna replace that beast with??
 
Was right about to pull the trigger on a W230ST with a 765m (they have 2GB VRAM).

Recommended is typically geared towards medium settings no? Would this not mean that such a laptop would be insufficient to play BF4 at medium? :eek:

Depends on resolution. I highly doubt 3gb is necessary for a single display.
 
Was right about to pull the trigger on a W230ST with a 765m (they have 2GB VRAM).

Recommended is typically geared towards medium settings no? Would this not mean that such a laptop would be insufficient to play BF4 at medium? :eek:

You should be able to play high settings at 1080p with decent FPS
 
You should be able to play high settings at 1080p with decent FPS

It'll be a mix, I think- at 1600p and 2GB/GPU, I know I'll be cranking back both detail settings that affect VRAM usage and rendering-intensive settings. Hell, I can't even turn on MSAA in BF3 now...
 
3gb vram for what resolution? There's no way they would REQUIRE 3 GB vram just to play a game at 1080P unless this game is leaps and bounds beyond anything we've yet seen in the gaming world. If it's true then it could be the true next gen game but I highly doubt it.

Also if 3gb of vram is required for 1080P then I bet none of our cards could even play it with the graphics power we currently have. Hell Titan SLI would probably be a struggle as well.

I bet this is all a marketing ploy to get you out to buy the next-gen cards coming from AMD. :eek:
 
3gb vram for what resolution? There's no way they would REQUIRE 3 GB vram just to play a game at 1080P unless this game is leaps and bounds beyond anything we've yet seen in the gaming world. If it's true then it could be the true next gen game but I highly doubt it.

Also if 3gb of vram is required for 1080P then I bet none of our cards could even play it with the graphics power we currently have. Hell Titan SLI would probably be a struggle as well.

I bet this is all a marketing ploy to get you out to buy the next-gen cards coming from AMD. :eek:



It does not say 3gb Required.

It says 3gb Recommended.

512mb Required.
 
Playing with multiple monitors and running tri 7970s, I used up 3GB of VRAM. Given the changes in BF4, I can see using between 3 and 4GB of VRAM. I wouldn't be surprised if I exceed 4GB. Who knows though, Oct 29th is around the corner.
 
--holy mother of wall-of-text warning- there is no TL;DR yet!--


Guys, remember frame buffers- the amount of memory needed for the final rendered 'frame' that is output to a monitor or to an array of monitors- are actually extremely small, on the order of 32MB for 4k (that's 3840x2160x32bpp). It's not like increasing resolution by itself has any real, immediate effect on VRAM usage until the game starts performing hugely memory intensive operations on a per-pixel basis like MSAA, some shaders, and super-sampling based AA.

This is even more true for the incoming consoles, which will likely have the most efficient forms of AA enabled and accelerated so that it can always be left on.
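To put numbers on the framebuffer claim above, here's a minimal Python sketch. The helper name is mine, and the MSAA line uses a simplified model (4 bytes color + 4 bytes depth per sample); real drivers allocate additional buffers and padding, so treat these as lower-bound illustrations:

```python
# Back-of-the-envelope framebuffer arithmetic: width x height x bytes-per-pixel.
# All figures are illustrative; real drivers add extra buffers and padding.

def framebuffer_mb(width, height, bpp=32):
    """MiB for a single frame at the given resolution and bit depth."""
    return width * height * (bpp // 8) / (1024 ** 2)

# A single 4K (3840x2160, 32bpp) frame is tiny: ~31.6 MiB.
print(round(framebuffer_mb(3840, 2160), 1))  # 31.6

# Even a triple-buffered, triple-monitor 4K surround setup stays under 300 MiB.
print(round(3 * framebuffer_mb(3 * 3840, 2160), 1))  # 284.8

# Where memory use grows fast is per-pixel work like MSAA: with 4x samples
# (4 bytes color + 4 bytes depth each), the same 4K frame balloons ~8x.
msaa_mb = 3840 * 2160 * 4 * (4 + 4) / (1024 ** 2)
print(round(msaa_mb))  # ~253 MiB
```

The takeaway matches the post: resolution alone barely dents VRAM; sample-multiplied buffers and assets are what eat it.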

Now, what's going to eat VRAM in next generation games is going to be assets and those per-pixel shader routines, both of which are necessary in order to take the next step towards photo-realistic real-time rendering, and neither of which we've really seen pushed to a consumer-level scale to date.

Essentially, if a game can even be released to run on the current console generation, including BF4, it does not fit into this category- so yeah, all of us, including myself, will probably be fine with our current generation cards.

BUT, and here's the real point- games designed around the resources available on next-generation consoles will have FIFTEEN to TWENTY times the graphics resources of those designed around the current generation of consoles. That's a tremendous jump, and it's far larger than the jump we saw from the generation before last (the PS2, Xbox, and GameCube) to the current PS3 and Xbox 360.

Now, consider for a moment what happens when a new console generation arrives. It usually takes a year or two for developers to really get comfortable with the new systems and start bumping into their limits, even though consoles are usually between one and two generations behind current PC technology, since the overall hardware and software architectures are usually so different, and because there are usually tremendous memory limitations that need to be creatively minimized.

Once the developers get comfortable, though, they can start focusing on PC games again. They know how to keep the game engines in check, but they can let the artists, level designers, character designers, and effects engineers (shader coders) go wild, since all of these things can be quickly scaled back selectively for each platform that the game will be released for. Hell, most commercial game engines these days can be used to scale the same game from high-end PCs down to the most limited consoles (Wii U) and even most tablets, and across platforms like Linux, Mac OS, iOS, and Android, at the same time.

So there's no real reason for the developers to hold back; they can set their 'creative' limits as high as they can afford, and then tailor the assets for each platform release, getting the most out of what each platform is capable of. And that's where the unprecedented resources available on these incoming consoles from Sony and Microsoft have me excited.

Essentially, if history holds true, developers will go buck wild with game assets for truly next-generation games. Cross platform games like BF3, Crysis 3, and even the much older Dragon Age II are good examples here. These games ran plenty well on the current geriatric consoles, yet the PC releases were and are capable of bringing even the most high-end PCs to their knees. You think that 1GB of VRAM is enough? Try turning on the high-resolution textures in DA2. It's not even a terribly complex or wide-open game like Skyrim.

And that's my point here- Game developers haven't had to develop 'to a point' for quite some time. If they have 3GB or 4GB of VRAM allocated for a particular game's graphics, they'll typically overshoot that by 50% during asset development and then scale it back as they put together the different versions for each platform they're releasing on. Almost all AAA cross-platform titles that hit the shelf in the last couple of years are examples of this process, where the PC versions shipped with far more graphics assets than the console versions.

And I don't expect this trend to stop- rather, since it's the dream of graphics artists to put as much detail as they can into the games that they work on, I expect it to only get 'worse' from a VRAM usage standpoint. If their final target is in the 3GB to 4GB range like I mentioned above, I expect them to build to a 6GB to 8GB asset range, and while that will be scaled down for the console releases, I fully expect developers to give PC gamers the option to try and play their games with all of the assets they created. And why shouldn't they? Everyone still wins- the graphics artists get to see more of their work in the final product, the developer gets to tout a better looking game, graphics card vendors will have a new, more demanding game that they can throw into bundles and use as a marketing tool to sell more cards, and we, the gamers, get better looking games!
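As a purely illustrative sketch of that "overshoot by 50%, then scale back per platform" budgeting process (the function name, the overshoot factor, and the console fraction are all my assumptions, not anything from an actual developer):

```python
# Hypothetical illustration of the asset-budgeting process described above:
# author assets to ~150% of the PC target, then scale down per platform.

def platform_budgets(pc_target_gb, overshoot=1.5, console_fraction=0.25):
    """Return illustrative VRAM asset budgets in GB; all numbers are assumptions."""
    authored = pc_target_gb * overshoot  # artists build past the shipping target
    return {
        "authored": authored,                        # raw asset pool created
        "pc_max": pc_target_gb,                      # shipped PC 'ultra' assets
        "console": pc_target_gb * console_fraction,  # heavily scaled-down port
    }

budgets = platform_budgets(4.0)
print(budgets)  # {'authored': 6.0, 'pc_max': 4.0, 'console': 1.0}
```

Under those made-up numbers, a 4GB PC target implies a 6GB authored pool, which is exactly the headroom the post argues PC gamers could be offered as an optional tier.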

Final point- this whole situation arises because memory is cheap, which itself arises both because foundries get better at making memory wafers all the time, and because memory die densities constantly increase as new nodes come online, which the foundries relentlessly pursue.

So you can expect that making an HD7870 or GTX770 with 8GB of RAM is actually pretty easy and cheap to do, as would an HD7970 or GTX780/Titan with 12GB. Actually, we know that it's easy because the professional versions of these cards already have double the memory of the consumer versions; so it really just comes down to cost!

And cost is far less of an issue than most think. Yes, GPU vendors usually charge significant premiums for models that have above-average amounts of memory, just like Apple will happily charge you $100 for that extra 32GB of flash on your iDevice even though it costs them all of $3 to $5 more to produce.

But there's plenty of other factors here that can quickly nullify the cost issue- demand, cost of goods, competition, and economy of scale all work together to bring the cost to the consumer down faster than you'd expect. In reality, when games start shipping that can actually make use of ~6GB of VRAM on the PC, the early adopters at that time (and now, if you bought an HD7970 or GTX Titan with 6GB) will pay a higher price, but competition will very quickly erase that premium.

So, here's what I expect. By the time the holiday season rolls around next year, a smattering of games PC gamers actually want to play that can make use of 6GB to 8GB of VRAM per GPU will be on the market. GPUs with 6GB+ of VRAM will be available at reasonable prices, based both on respins of the GPUs coming out this fall (at least from AMD) and on GPUs from both AMD and Nvidia built on TSMC's next node, whatever that turns out to be.

In the interim, and in summary, my advice to anyone who has a competent graphics setup today is to wait. When AMD releases their new cards in the next month or so, and then that stuff goes on sale for the holidays, that's the time to buy- just make sure you buy something with at least 6GB of VRAM, only settling for 4GB cards if you absolutely must. And good luck!
 
The first thing I notice about the recommended specs is that nowhere is a resolution listed. That kinda matters in terms of VRam...

Also funny to see everyone who talks about consoles conveniently forget that the VRam and System RAM are shared every time it suits their argument. Let's all forget that high-end computer systems have shipped with 8-16GB system ram for years now and circle-jerk each other about the comparatively small chunk of shared RAM found in the new consoles :rolleyes:
 