BF4 - Any GPU info yet?

y0bailey · Gawd · Joined Mar 31, 2003 · Messages: 558
2 years ago when BF3 came out I finally got my PC specs semi-up-to-snuff.

Now I am sitting here anxiously awaiting BF4 (which I might not even buy if no VOIP), and planning my budget for an upgrade. The main upgrade will be GPU related.

My CPU is still a Phenom X4 @ 4.0GHz, but I honestly don't think it will be my bottleneck in that game.

Have I missed any info that should influence my decision? I know I am going to give it 1-2 months before I buy, and all I want is maxed out 1920x1080 @>60fps, but I am getting antsy in my panties.

Too early to tell?
 
2 years ago when BF3 came out I finally got my PC specs semi-up-to-snuff.

Now I am sitting here anxiously awaiting BF4 (which I might not even buy if no VOIP), and planning my budget for an upgrade. The main upgrade will be GPU related.

My CPU is still a Phenom X4 @ 4.0GHz, but I honestly don't think it will be my bottleneck in that game.

Have I missed any info that should influence my decision? I know I am going to give it 1-2 months before I buy, and all I want is maxed out 1920x1080 @>60fps, but I am getting antsy in my panties.

Too early to tell?

A single 570 will likely struggle. At 1920x1200 my 570 had problems and crazy dips due to VRAM issues, and that was in BF3; considering BF4 is a new iteration of the graphics engine, it's safe to assume it will have more features that are a bit more cumbersome. I would get a 7970 if I were you, especially considering how cheap you can find them. CPU should be okay.
 
Sweet jumping Jesus, that makes me incredibly depressed. My savings budget just went way the hell up.

They seriously expect the "average gamer" like myself to drop $500+ on a GPU to actually enjoy decent graphics?

The 3 buddies I built budget PCs for so they could join in BF3 are going to hate me.

Haha, it's alpha, man, come on! Nobody should take benchmarks from a game at that stage of development seriously.
 
Haha, it's alpha, man, come on! Nobody should take benchmarks from a game at that stage of development seriously.

Agreed. But still making me a little pessimistic about how much money I am going to have to dump into this thing.

Thanks for the info though folks...keep it coming!
 
Sweet jumping Jesus, that makes me incredibly depressed. My savings budget just went way the hell up.

They seriously expect the "average gamer" like myself to drop $500+ on a GPU to actually enjoy decent graphics?

The 3 buddies I built budget PCs for so they could join in BF3 are going to hate me.

How are you defining "decent graphics", though?

BF4 on the new consoles will only be running at 1280x720, for example, without even going into the other possible differences in quality.

You do not really need to run at "max" settings for decent graphics. Just look at BF3 at the moment and the difference between FXAA on High vs. 4xMSAA on Ultra. Would you say the game looks horrible on the former? You do get a lot of performance savings, though.
 
It's not the end of the world; you've managed to play BF3 with a 570?
I wouldn't be happy with that.

Grab a 770 when they're $350 or less, or a 7970...

BF4 with one of those will most likely be better than BF3 with a 570.
 
How are you defining "decent graphics", though?

BF4 on the new consoles will only be running at 1280x720, for example, without even going into the other possible differences in quality.

You do not really need to run at "max" settings for decent graphics. Just look at BF3 at the moment and the difference between FXAA on High vs. 4xMSAA on Ultra. Would you say the game looks horrible on the former? You do get a lot of performance savings, though.

I thought these consoles were locking it in at 1080p, but I haven't followed them very closely (I'm not buying one, see sig :)).

But yeah, I don't run any MSAA, not at 1600p. FXAA Medium is good enough, really, but I prioritize gameplay in my multiplayer shooters over graphics.
 
It's not the end of the world; you've managed to play BF3 with a 570?
I wouldn't be happy with that.

Grab a 770 when they're $350 or less, or a 7970...

BF4 with one of those will most likely be better than BF3 with a 570.

If an HD4870 1GB card could handle BF:BC2 at 1600p with the settings dialed down, I'll bet a GTX570 could handle BF4 at 1080p, with close attention paid to the settings. It won't be as pretty as the commercials, but it should be playable.

*The example above was pre-BF3 while I waited for my first HD6950 2GB card, and I was pleasantly surprised as to how well it ran.
 
I thought these consoles were locking it in at 1080p, but I haven't followed them very closely (I'm not buying one, see sig :)).

But yeah, I don't run any MSAA, not at 1600p. FXAA Medium is good enough, really, but I prioritize gameplay in my multiplayer shooters over graphics.

The target is 1080p for the next-gen consoles, but there isn't really a hard, forced requirement that I am aware of, just like with the current gen. As such, developers will be flexible depending on what they are trying to achieve with what they have to work with.

The likelihood is that DICE prioritized 60fps, as well as maintaining the same feature set (64 players and map sizes, for instance), over the extra resolution. The game will be further cut down for the current consoles (including no 64 players). BF3, for example, is also cut down quite a bit on the current consoles compared to the PC version; the resolution is not even 720p.

This is something that is often overlooked when discussing optimizations. Significant optimization on games is actually done from a more content-oriented perspective; by this I mean they choose trade-offs on visuals based upon performance demands. This is a large reason why console games are perceived to run a lot better on lower-end hardware. Anyway, this is off topic :p

Back to BF4 another thing to keep in mind is you shouldn't compare BF4 on max settings to BF3 on max settings, the former will be more demanding but it will also look better. If you like how your hardware runs BF3 at the moment it is not likely BF4 will look worse (at whatever settings it is capable of on said hardware).
 
I thought these consoles were locking it in at 1080p, but I haven't followed them very closely (I'm not buying one, see sig :)).

But yeah, I don't run any MSAA, not at 1600p. FXAA Medium is good enough, really, but I prioritize gameplay in my multiplayer shooters over graphics.

Yeah, I play at 1440p. I would just leave MSAA off before I got my 2nd 670, and if necessary I will leave it off again when BF4 comes out. Super cumbersome with little benefit if you ask me.

That's all preference though.
 
I'm hoping they dump FXAA for SMAA options; their FXAA implementation in BF3 was horrible.
 
I'm hoping they dump FXAA for SMAA options; their FXAA implementation in BF3 was horrible.

It's certainly not pretty, but it's definitely better than nothing. And you're right, there are much better options out there with a similar performance hit. BF3 was early in that game, and I'm sure that DICE has gotten a chance to test out the effectiveness of the newer routines since its release. I'm hopeful :).
 
I would jump on the Intel train before getting a new GPU... Haswell may not be the best overclocker, but man, it makes my old 1055T look like a joke and a half lol.
 
7970. Find them used all the time here for ~$300. I bought mine months ago for $320.

Awesome card. Handles anything and everything.
 
I'm hoping they dump FXAA for SMAA options; their FXAA implementation in BF3 was horrible.

Ask @repi on Twitter if you're curious; he's an engine dev who responds quite often.
 
If these benches hold true, it looks like I gotta start saving up for another 7970 or just hope AMD releases the 9000 series before BF4 comes out.
 
I thought these consoles were locking it in at 1080p, but I haven't followed them very closely (I'm not buying one, see sig :)).

But yeah, I don't run any MSAA, not at 1600p. FXAA Medium is good enough, really, but I prioritize gameplay in my multiplayer shooters over graphics.

^this. Who the hell runs that much anti-aliasing in a competitive MP game? If the jaggies bug you just inject SMAA or FXAA with radeonPro/nvidia etc.

frames >>>>>>> graphics
 
^this. Who the hell runs that much anti-aliasing in a competitive MP game? If the jaggies bug you just inject SMAA or FXAA with radeonPro/nvidia etc.

frames >>>>>>> graphics

I haven't tried SMAA yet, but I haven't really bothered to 'tune' my BF3 experience recently. I have fast, sharp vision, but the reality is that I'm not looking for jaggies. I hit the ground running and I stay moving, and at 1600p with FXAA Medium set in-game, they just don't bother me, even though I can readily see them.
 
^this. Who the hell runs that much anti-aliasing in a competitive MP game? If the jaggies bug you just inject SMAA or FXAA with radeonPro/nvidia etc.

frames >>>>>>> graphics

A PC that has exceeded the monitor's refresh rate. No need to turn down settings at that point.
I still come in the top five in points, and often in first, playing with max settings. I don't go to gamer-athlete extremes to increase frames.

I think when speaking about FPS, we need to ask whether we're talking about a 60Hz monitor or a 120-144Hz monitor. Big difference between the two for in-game settings.
 
A PC that has exceeded the monitor's refresh rate. No need to turn down settings at that point.
I still come in the top five in points, and often in first, playing with max settings. I don't go to gamer-athlete extremes to increase frames.

I think when speaking about FPS, we need to ask whether we're talking about a 60Hz monitor or a 120-144Hz monitor. Big difference between the two for in-game settings.

It's all relative, but the main point is that the 'median-low' framerate needs to be at or near the refresh rate of the monitor; different review houses express that differently, but that's probably the best generic explanation.

I do believe that is the perspective you're coming from; I'm just clarifying for the sake of the thread.
 
Haha, it's alpha, man, come on! Nobody should take benchmarks from a game at that stage of development seriously.

Okay, but unless miracles happen in the driver department, it seems that a single 7970/770, or even a 780, won't handle this game @ 1080p with 4xMSAA (>60fps).
 
Okay, but unless miracles happen in the driver department, it seems that a single 7970/770, or even a 780, won't handle this game @ 1080p with 4xMSAA (>60fps).

Alpha: feature complete, but not optimized or debugged. There will be miracles in the drivers, but those likely won't happen until after the game is released; the real improvements will be in the engine tuning now that they have it running fairly well.
 
So I just pre-ordered and pre-downloaded, and I am excited.

I still haven't figured out my upgrade yet. Waiting for some actual good info from [H] to decide between the GTX770, R9 290, and R9 280X to hit my $300-400 sweet spot at 1080p.
 
Well, DICE developer Johan has been at both AMD's show and NVIDIA's, so I expect both versions to be well optimized.

He is also the most vocal Mantle supporter, so I expect Mantle to be significant in the Frostbite engine.
 
So I just pre-ordered and pre-downloaded, and I am excited.

I still haven't figured out my upgrade yet. Waiting for some actual good info from [H] to decide between the GTX770, R9 290, and R9 280X to hit my $300-400 sweet spot at 1080p.

In that sample, the AMD cards should certainly be faster. A custom-cooled R9 290 non-X will probably hit the sweet spot of price, performance, and noise.

A GTX770 with 4GB of VRAM and the Titan cooler, though, is a pretty compelling argument if you want a blower and/or guaranteed G-Sync support.
 
In that sample, the AMD cards should certainly be faster. A custom-cooled R9 290 non-X will probably hit the sweet spot of price, performance, and noise.

A GTX770 with 4GB of VRAM and the Titan cooler, though, is a pretty compelling argument if you want a blower and/or guaranteed G-Sync support.

My fear is that the 290 with custom cooling will be months away, and I find it hard to believe it won't come in at over $400. It was included on the list as wishful thinking, but realistically it probably won't make the cut.

I am just keeping my fingers crossed that my 570 won't be too disappointing and make me rush-purchase a 280X, because it is taking a lot of willpower not to buy a 280X today.
 
Well, DICE developer Johan has been at both AMD's show and NVIDIA's, so I expect both versions to be well optimized.

He is also the most vocal Mantle supporter, so I expect Mantle to be significant in the Frostbite engine.

Until we see some real investigation into Mantle, I'm remaining skeptical. All we have right now is the set of marketing slides AMD released and platitudes from developers.

My worries are twofold.

First, it's starting to look like Mantle is more focused on quick porting from the consoles to AMD hardware on the PC, or easier co-development of releases, which is great for developers but mostly meaningless to gamers.

Further, after seeing just how hard AMD pushed the Hawaiian Islands GPU, with >300W load power draw and a 95C target temperature, I'm not sure that Mantle, even if it makes the cards more efficient, will actually result in better performance. If it winds up just getting more work to the GPU faster, which has been the main selling point so far concerning reduced driver overhead, then the GPU may just have to throttle more to stay within its thermal envelope, which brings the possibility of no net gain in performance, or even a performance loss.

The exciting part remains, though: whether or not Mantle makes a difference or takes off, it's gotten everyone's attention focused on driver overhead again, and other companies have already signaled that they're working on improving their products in a similar fashion.
 
Sweet jumping Jesus, that makes me incredibly depressed. My savings budget just went way the hell up.

They seriously expect the "average gamer" like myself to drop $500+ on a GPU to actually enjoy decent graphics?

The 3 buddies I built budget PCs for so they could join in BF3 are going to hate me.

For "decent graphics" at 1080p a 280X-7970 ($300 plus Mantle performance boost down the line) would do the job, but you mentioned "maxed out at 60+ FPS" well then yes, you're talking about high end video cards, decent and maxed out are pretty different things, keep in mind BF4 will be one of the most demanding games out there. From the beta I can tell you, the game in medium settings looks incredible.
 
Until we see some real investigation into Mantle, I'm remaining skeptical. All we have right now is the set of marketing slides AMD released and platitudes from developers.

My worries are twofold.

First, it's starting to look like Mantle is more focused on quick porting from the consoles to AMD hardware on the PC, or easier co-development of releases, which is great for developers but mostly meaningless to gamers.

Further, after seeing just how hard AMD pushed the Hawaiian Islands GPU, with >300W load power draw and a 95C target temperature, I'm not sure that Mantle, even if it makes the cards more efficient, will actually result in better performance. If it winds up just getting more work to the GPU faster, which has been the main selling point so far concerning reduced driver overhead, then the GPU may just have to throttle more to stay within its thermal envelope, which brings the possibility of no net gain in performance, or even a performance loss.

The exciting part remains, though: whether or not Mantle makes a difference or takes off, it's gotten everyone's attention focused on driver overhead again, and other companies have already signaled that they're working on improving their products in a similar fashion.



I agree the thermals right now are higher than some are used to. That's a major reason I'm very curious about the AIB custom cards, which should be available before Mantle is enabled in the game. That should be enough for the card not to need to throttle.

The other part of Mantle I'm intrigued by is CPU usage. With (mostly) bypassing the HAL, the CPU may (should?) also have less to do and be more efficient. This would mean even more performance on the CPU front as well.
 
Further, after seeing just how hard AMD pushed the Hawaiian Islands GPU, with >300W load power draw and a 95C target temperature, I'm not sure that Mantle, even if it makes the cards more efficient, will actually result in better performance. If it winds up just getting more work to the GPU faster, which has been the main selling point so far concerning reduced driver overhead, then the GPU may just have to throttle more to stay within its thermal envelope, which brings the possibility of no net gain in performance, or even a performance loss.

FYI: the 290X is not using over 300W.
Mantle will not cause a performance loss vs. non-Mantle APIs.
 
FYI: the 290X is not using over 300W.
Mantle will not cause a performance loss vs. non-Mantle APIs.

Reviewers have indicated that it does hit or exceed 300W, and AMD hasn't published the card's TDP. It does have the 150W 8-pin, 75W 6-pin, and 75W PCIe slot connections necessary to draw 300W according to the PCIe spec.

And please let me know if you have a way of testing Mantle performance, thanks!
 
Reviewers have indicated that it does hit or exceed 300W, and AMD hasn't published the card's TDP. It does have the 150W 8-pin, 75W 6-pin, and 75W PCIe slot connections necessary to draw 300W according to the PCIe spec.

And please let me know if you have a way of testing Mantle performance, thanks!

At peak, yes, it is possible in some situations.
In normal gaming situations you are looking at <300W.

Can you please run/test Mantle and show me this potential zero or negative gain in performance? Thanks!
 