Battlefield 3 Multiplayer Performance and IQ Review @ [H]

Lol Kyle has always reminded me of Randal from Clerks.

"This job would be great, if it weren't for the fucking customers"

Lol! I didn't know Kyle even cared! I thought they just did reviews because they "liked" doing them. Might want to not make comments available on your reviews from now on! :p
 
Lol at all the comments on Kyle's remark. For one, it sounded more like a joke; for two, his statement for the most part reflects the previous remarks rather accurately. :D

I dig these types of reviews. While maybe less objective in nature, they seem to give a good feel for what you will get at a given GPU price point.
 
It's the best BF3 bench/review since it's the only one to focus on 64-player MP. Who buys BF3 for SP? Really.

[H] is easily the best review site out there since they approach it from a gamer's perspective.
 
I think Kyle is bang on the money. Don't like the review? Fine. Don't even like the game? DON'T post a comment in the thread related to that very game. 'Nuff said.
 
Only because I previously used it and still have it lying around:
any idea if this: http://hardforum.com/showpost.php?p=1034705931&postcount=1
would handle the FXAA for me?

As far as I know and can tell, BF3 does not use PhysX at all; it relies on your CPU's FPU for most if not all physics calculations. FXAA is a shader-based AA method (AMD has their own version of this), and all it pretty much does is slightly blur the entire frame to reduce the appearance of jaggy edges.
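For anyone curious what a shader-based, blur-the-jaggies filter looks like in practice, here is a toy sketch of the general idea. It is NOT the actual FXAA algorithm; the function name and threshold are made up purely for illustration.

[code]
import numpy as np

# Toy illustration of the *idea* behind shader-based AA such as FXAA:
# estimate luma, find strong local contrast (likely jaggy edges), and blend
# those pixels toward a local average. This is NOT the real FXAA algorithm,
# just a sketch of "selectively blur where edges are detected".
def toy_edge_blur(rgb, contrast_threshold=0.1):
    luma = rgb @ np.array([0.299, 0.587, 0.114])          # per-pixel luma
    # Local contrast: difference between max and min luma of the 4 neighbours.
    padded = np.pad(luma, 1, mode="edge")
    neighbours = np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                           padded[1:-1, :-2], padded[1:-1, 2:]])
    contrast = neighbours.max(axis=0) - neighbours.min(axis=0)
    edge_mask = (contrast > contrast_threshold)[..., None]

    # Simple 3x3 box blur as the stand-in for FXAA's directional filtering.
    p = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(p[i:i + rgb.shape[0], j:j + rgb.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0

    return np.where(edge_mask, blurred, rgb)

frame = np.random.rand(8, 8, 3)           # stand-in for a rendered frame
smoothed = toy_edge_blur(frame)
[/code]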
 
Good review with an honest gameplay assessment. Kyle, have you by chance heard from AMD on the non-smooth fast mouse movement or slowdowns around debris? Using the rig in my sig and the 11.11 driver with CAP2 at 1920x1200, I still notice it as well. I would have thought my rig, pushing a single monitor, would have been able to provide much smoother gameplay even with everything at Ultra and blur disabled (it's annoying). I'm in the process of trying different settings to see if it's anything specific but have not found a smoking gun yet. Hopefully AMD addresses the issue soon.
 
Anyone getting low GPU usage on Crossfire setups? I'm only getting 50% with the rig in my sig in BF3. Other benchmarks run both at 100%. Running 11.11a + 11.11 CAP 2.
 
Anyone getting low GPU usage on Crossfire setups? I'm only getting 50% with the rig in my sig in BF3. Other benchmarks run both at 100%. Running 11.11a + 11.11 CAP 2.

With BF3, it's probably your CPU that is holding back your GPUs.
 
Happy Thanksgiving. I am grateful for this game. Bugs and all. At least it pushes the hardware. Unlike other weak sauced, limp wristed, ... ... m ...... w. .....sucky....3

Thanks guys. Good review.
 
Anyone getting low GPU usage on Crossfire setups? I'm only getting 50% with the rig in my sig in BF3. Other benchmarks run both at 100%. Running 11.11a + 11.11 CAP 2.

Your CPU is bottlenecking your video cards. The Q9550 is showing its age with BFBC2 and BF3.
 
If you're running around engaging people at 20 yards with your F2000 using IRNV so the entire world is green and yellow, graphics settings don't matter.

But when you're trying to pick someone out of the background at 900m for the perfect headshot, low settings don't cut it.
I wasn't saying graphics settings don't matter, nor was I saying that low settings are always better. I was saying that if you are forced to choose between eye candy and frame rate, the latter *is* always better, and that lowering detail (but not resolution) settings to maintain 60 fps (or more, on 120 Hz LCDs or on CRTs) is a much more popular choice for a competitive online FPS.

The higher your frame rate, the fewer milliseconds it takes to see that guy standing up or zooming down his scope, the faster your reactions can be to enemy movements and the more accurately you can lead them. That part is not a matter of opinion; it's just fact.
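To put rough numbers on that claim, here is a minimal sketch; the frame rates are just illustrative examples, not figures from the article.

[code]
# Time between rendered frames at a few illustrative frame rates.
for fps in (30, 45, 60, 120):
    print(f"{fps:3d} fps -> {1000.0 / fps:5.1f} ms per frame")
# 30 fps -> 33.3 ms, 45 -> 22.2 ms, 60 -> 16.7 ms, 120 -> 8.3 ms
[/code]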

Do note that low settings do not cut down view distance in this game. And in my experience, I'm going to get a lot more kills (and have more fun) running at 60 fps minimum without ambient occlusion than I would running at 45 fps *with* FXAA, HBAO or SSAO. Your opinion may differ, but in a competitive FPS, I think you'll find that the vast majority of players will be maximizing frame rates over things like antialiasing and ambient occlusion. That is why I take issue with the methodology in this article.

And before you come at me with the "I want to enjoy the game and not just get kills" argument like someone else already did: that's fine. Enjoy your single player campaign with infinitely respawning AI robots to plink down like you're in a shooting range. We'll be over here playing against real people.
 
I can't agree more. +1
 
And before you come at me with the "I want to enjoy the game and not just get kills" argument like someone else already did: that's fine. Enjoy your single player campaign with infinitely respawning AI robots to plink down like you're in a shooting range. We'll be over here playing against real people.

You can still play against real people and enjoy quality visuals - not everyone is in it just to get the highest possible scores. It's a game, not a job.
 
You can still play against real people and enjoy quality visuals - not everyone is in it just to get the highest possible scores. It's a game, not a job.
Work hard, play hard. If I get killed because my frame rate isn't up to par, that doesn't suddenly become acceptable just because the dude that shot me has some cool shadows underneath him, or a collar smoothed out by antialiasing. On a site so dedicated to squeezing performance out of games, I'm a little baffled at responses like this.

Please continue to attack my position and motivation when all I'm suggesting is consideration for both perspectives.
 
You can still play against real people and enjoy quality visuals - not everyone is in it just to get the highest possible scores. It's a game, not a job.

I play to win. It's how I was brought up: sports, games, etc. No half efforts.
 
Excellent review, [H]. It really covered everything I was hoping to learn/find out about the game; job well done, again. To the lot of you arguing over this "play to win" nonsense, to each his own, but it's just a video game (and this is coming from an ex-CAL player).
 
Work hard, play hard. If I get killed because my frame rate isn't up to par, that doesn't suddenly become acceptable just because the dude that shot me has some cool shadows underneath him, or a collar smoothed out by antialiasing. On a site so dedicated to squeezing performance out of games, I'm a little baffled at responses like this.

Please continue to attack my position and motivation when all I'm suggesting is consideration for both perspectives.

Were you really suggesting consideration of both sides when you said "if you are forced to choose between eye candy and frame rate, the latter *is* always better"? Didn't really sound like it - sounded more like you think your way is the only right way.

In any case, aren't you going to be limited by the LCD refresh rate rather than the game frame rate (unless you are using a 120 Hz LCD)?
 
Were you really suggesting consideration of both sides when you said "if you are forced to choose between eye candy and frame rate, the latter *is* always better"? Didn't really sound like it - sounded more like you think your way is the only right way.

In any case, aren't you going to be limited by the LCD refresh rate rather than the game frame rate (unless you are using a 120 Hz LCD)?

No, you're not. The reason horizontal tearing occurs is that for each screen refresh your video card is sending multiple frames to the display, or they are out of sync. Because LCDs refresh top to bottom, updating in rows, you end up seeing part of one frame, then part of another, and so on, depending on how high your frame rate is. If you think I'm wrong about LCDs refreshing pixel state in this manner, I can show you photos taken with a high shutter speed that prove it (I've done extensive input lag testing on LCD TVs in my quest to find a TV with low enough input lag to use as a PC monitor).
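Rough arithmetic for why you see slices of several different frames; the numbers are illustrative assumptions, not measurements from the article.

[code]
# Toy arithmetic for the tearing description above (illustrative numbers):
# with vsync off, roughly fps / refresh_rate different frames get scanned
# out during one refresh, so you see that many "slices" of different frames.
refresh_hz = 60
for fps in (60, 90, 150):
    frames_per_refresh = fps / refresh_hz
    print(f"{fps} fps on a {refresh_hz} Hz panel -> "
          f"~{frames_per_refresh:.1f} frames per refresh")
[/code]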
 
Were you really suggesting consideration of both sides when you said "if you are forced to choose between eye candy and frame rate, the latter *is* always better"? Didn't really sound like it - sounded more like you think your way is the only right way.
You have the right of it - I do think my way is the right way. That's how opinions work, and a whole ton of people who play competitive FPS games helped me form that opinion. Others' opinions may differ, but I'd love it if the serious players that pick hardware for online FPS games could depend on [H] for results that meet their needs, especially since most other sites lazily benchmarked the single player mode.

In any case, aren't you going to be limited by the LCD refresh rate rather than the game frame rate (unless you are using a 120 Hz LCD)?
The numbers in this article show quite a few results coming in with minimums in the mid-40s - and worse.
 
No, you're not. The reason horizontal tearing occurs is that for each screen refresh your video card is sending multiple frames to the display, or they are out of sync. Because LCDs refresh top to bottom, updating in rows, you end up seeing part of one frame, then part of another, and so on, depending on how high your frame rate is. If you think I'm wrong about LCDs refreshing pixel state in this manner, I can show you photos taken with a high shutter speed that prove it (I've done extensive input lag testing on LCD TVs in my quest to find a TV with low enough input lag to use as a PC monitor).

No matter how it is refreshing, if the monitor has already drawn the position of the enemy player (say the middle of the screen) it is going to take 16 ms before it will refresh it again (60 Hz), no matter how many frames are being rendered by the GPU - so that enemy's position isn't going to change on your screen any faster. It may move farther when it does update (say, two frames' worth instead of just one, if you have 120 FPS) but it isn't going to update any sooner.
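A toy sketch of the two timings being debated here, i.e. refresh interval versus how stale the displayed frame is; the frame rates and refresh rate are illustrative assumptions, not figures from the article.

[code]
# Toy model: a 60 Hz panel refreshes a given region roughly every 16.7 ms.
# A higher GPU frame rate does not make refreshes come sooner, but (with
# vsync off) the image available at each refresh is newer.
refresh_ms = 1000.0 / 60

for fps in (60, 120):
    frame_ms = 1000.0 / fps
    print(f"{fps:3d} fps: refresh every {refresh_ms:.1f} ms, "
          f"frame shown is at most ~{frame_ms:.1f} ms old")
[/code]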

You have the right of it - I do think my way is the right way. That's how opinions work, and a whole ton of people who play competitive FPS games helped me form that opinion. Others' opinions may differ, but I'd love it if the serious players that pick hardware for online FPS games could depend on [H] for results that meet their needs, especially since most other sites lazily benchmarked the single player mode.

Normally I don't agree with the way [H] does their testing either, but in this case, if you want to know what card to get, can't you just extrapolate from what they have provided? If a GTX 580 gives middling FPS with Ultra settings at 2560x1600, then you can kind of figure that it is going to do better than that at High settings, right? I get that you don't know exactly what settings you can use, but if you are going to just turn them down and down and down until you get the FPS you want, does it really matter?
 
No matter how it is refreshing, if the monitor has already drawn the position of the enemy player (say the middle of the screen) it is going to take 16 ms before it will refresh it again (60 Hz), no matter how many frames are being rendered by the GPU - so that enemy's position isn't going to change on your screen any faster. It may move farther when it does update (say, two frames' worth instead of just one, if you have 120 FPS) but it isn't going to update any sooner.

If you're lucky and the timing is right, the card could be sending new frame data to just the right part of the screen, giving you a very slight advantage. We're talking about less than 16 ms here, so it's honestly not a big deal, but every bit can count when it comes to lag of any sort in online FPS games, because you have a lot of lag to contend with: network lag is often the biggest, but you also have input lag on the PC (this game will actually tell you how many milliseconds your CPU and GPU are taking to render frames; the faster they render, the more frames they can produce and the lower the input lag), and then you have the input lag of your digital display on top of that.

All of these things stack on top of each other, so when you look at them separately none of them seems like a big deal, until you realize you need to add them up to get the real lag figure.

For example:
network lag on a good local server: say 30 ms
running the game with vsync on, with a PC that can keep up with it all the time: 16.6 ms
your display in total, between image-processing lag and pixel lag: 10 ms

That's 56.6 ms of total lag. Not terrible, but not stellar either, especially when you think back to the old days of QuakeWorld and 3dfx cards in the real SLI on a CRT monitor: the only delay you had to contend with was network lag, you were getting 100-200+ FPS all the time, and you had an analog display with truly less than 1 ms of lag.

My point is, if I can shave 8 ms of input lag off my game by running without vsync at slightly less than the maximum settings my video cards can handle, I prefer to run that way for two reasons: #1, that slight reduction in input lag could help from time to time; #2, I know I'll never get degraded performance when the going gets tough in game. When frame rates dip into the 30s and 40s it gets choppy, and you get killed trying to figure out what just happened, because now your system is taking 30-40 ms or more to render a frame, and don't tell me you can't visually see that happen. :)
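Putting that addition into a tiny sketch, using roughly the same illustrative numbers as above (with 60 fps giving 1000/60 ≈ 16.7 ms); these are assumed example values, not measurements.

[code]
# Rough lag-budget sketch using the illustrative numbers from the post above.
# All values are example assumptions, not measurements.
def frame_time_ms(fps):
    """Time between rendered frames, in milliseconds."""
    return 1000.0 / fps

lag_sources_ms = {
    "network (good local server)": 30.0,
    "render + vsync at 60 fps": frame_time_ms(60),       # ~16.7 ms
    "display processing + pixel response": 10.0,
}

total = sum(lag_sources_ms.values())
for source, ms in lag_sources_ms.items():
    print(f"{source:38s} {ms:5.1f} ms")
print(f"{'total':38s} {total:5.1f} ms")                   # ~56.7 ms
[/code]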
 
Great review. I've been very happy with my CF setup. I can run everything maxed (no MSAA) at 1920x1200 and get high 90s to mid 130s, depending on the conditions. I've occasionally seen it dip below 90, but it almost never does. I'm considering upgrading to 2560x1600 and I'm hoping I can keep the same settings and still get around 60 fps. Is that possible?

Here are a couple screens I took. Downsized obviously.

[Screenshots: bf320111126114031929.jpg, bf320111126114044032.jpg, bf320111126114123782.jpg]
 
Great review. I've been very happy with my CF setup. I can run everything maxed (no MSAA) at 1920x1200 and get high 90s to mid 130s, depending on the conditions. I've occasionally seen it dip below 90, but it almost never does. I'm considering upgrading to 2560x1600 and I'm hoping I can keep the same settings and still get around 60 fps. Is that possible?

Here are a couple screens I took. Downsized obviously.

With two 6970s at 2560x1600 the game is fluid in multiplayer except during the very largest explosions (always above 60 fps), but I've had to turn deferred AA and HBAO off to do this. I haven't experimented with motion blur.
 
I wonder what those AMD problems are that the review mentions.
I run an Asus 6950 2GB unlocked to 6970 shaders and overclocked to 900 MHz on the core and 1400 MHz on the memory.
When I put all my settings on Ultra except no MSAA, high FXAA and no motion blur, at a resolution of 1920x1080, my fps never dips below 30 and is often above 60. I haven't noticed any sudden lag when stuff explodes and debris flies around, even on 64-player servers with a lot of chaos.
I bet I could raise the minimum fps to 60 by turning some of the settings down as well.

I'm really puzzled by these findings. (And why do those poor AMD GPUs constantly get paired up with much more expensive Nvidia ones? :p )
 
^^^ I've seen what he is talking about. Even with a steady 59 fps (I use a console command to limit mine to 59 fps, as with vsync alone I get microstutter), when the mouse is moved very quickly or there is a lot of "action" with debris it gets choppy sometimes, which is odd given the steady fps, but it's noticeable. I hope AMD is working on this issue.

Oh, and about the AMD cards getting matched up against more expensive NV models: it makes sense, as the card comparisons are based on top tier vs. top tier and then matched equally on down the line. He does list the prices for each card in the article so folks are aware of the $$$ disparity, and anyone intelligently shopping for a GPU will be well aware of the $$$ differences. Most 580s cost about the same as I paid for my two 6950s, and my Crossfire setup crushes what a single 580 can do. That said, BF3 is my main game ATM and I needed to do a lot of homework to get rid of the microstutter. I nearly gave up and was looking at a pair of 570s or a single 580 before finding the fix.
 
Do you guys have the FOV at 90? And is post-processing supposed to be on Low, Medium or High? I don't want it to be blurry, so I just left it on Medium. I also notice slowdowns when a tank explodes near me or there is a lot of fire; otherwise it's in the 50s, at worst 39 or so.
 
I just purchased two 6950s to run BF3 at 2560x1600 and have been having some problems. For the most part it's still playable, but I've been unable to set everything to Ultra. I set textures to Ultra and all others to High with deferred AA off, and I get probably 60 to 70 fps average. The problem is that when dealing with large amounts of destruction, or viewing large scenes of action, it drops down into the 40s; the worst case I've seen is the 30s. I tried changing everything to High and had the exact same drops. The 40s never seemed so bad before, but it feels much worse here. The other thing is that if I change my resolution to 1920x1080 I get pretty much the same minimum frame rates and drops, with just a bit of a boost in the averages.

I'm running an i5 750, an SSD, and 8 GB of RAM, so I don't think that is holding it back. Based on the article, I'm wondering if it would be worth it to look at two of the 2.5GB GTX 570s from EVGA for better performance? The 570 wasn't reviewed in SLI, let alone the 2.5GB version, but I'm wondering if anyone is using that setup with a 2560x1600 display? I can't really afford two GTX 580s, but I could probably stretch for the 570s if they would be worth the upgrade over this setup. How far off would the performance of the 570s be compared to the 580s in the article?
 
I just purchased two 6950s to run BF3 at 2560x1600 and have been having some problems. For the most part it's still playable, but I've been unable to set everything to Ultra. I set textures to Ultra and all others to High with deferred AA off, and I get probably 60 to 70 fps average. The problem is that when dealing with large amounts of destruction, or viewing large scenes of action, it drops down into the 40s; the worst case I've seen is the 30s. I tried changing everything to High and had the exact same drops. The 40s never seemed so bad before, but it feels much worse here. The other thing is that if I change my resolution to 1920x1080 I get pretty much the same minimum frame rates and drops, with just a bit of a boost in the averages.

I'm running an i5 750, an SSD, and 8 GB of RAM, so I don't think that is holding it back. Based on the article, I'm wondering if it would be worth it to look at two of the 2.5GB GTX 570s from EVGA for better performance? The 570 wasn't reviewed in SLI, let alone the 2.5GB version, but I'm wondering if anyone is using that setup with a 2560x1600 display? I can't really afford two GTX 580s, but I could probably stretch for the 570s if they would be worth the upgrade over this setup. How far off would the performance of the 570s be compared to the 580s in the article?

Your dual-core i5 is probably the bottleneck; outdoor scenes and object destruction are very hard on the CPU in this game, especially in multiplayer. Go check out the benchmarks I did on my i5-2500K at various speeds and core counts:
http://hardforum.com/showthread.php?t=1654043
In that thread, also check out the performance-graph screenshots I took in game showing the CPU bottleneck, especially the one I took outdoors, regardless of what speed I was running my CPU at.
To display this graph in game, open the console with the ~ key and enter render.perfoverlayvisible 1
You can use the Tab key to auto-complete the command, and it also shows up as you start typing it in the console, which makes it easier to enter.
Remember, the graph displays how long it takes your CPU and GPU to render each frame, so higher is not better; lower is.
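As a rough way to read those overlay numbers: the sample milliseconds below are made-up examples, and treating the larger of the two times as the limiter is a simplification rather than anything stated in the article.

[code]
# Interpreting the perf overlay described above: the CPU and GPU frame times
# are in milliseconds, and (roughly) the larger of the two is what limits
# your frame rate. Sample values below are made-up examples.
cpu_ms, gpu_ms = 14.0, 9.5

bottleneck = "CPU" if cpu_ms > gpu_ms else "GPU"
achievable_fps = 1000.0 / max(cpu_ms, gpu_ms)
print(f"CPU {cpu_ms} ms, GPU {gpu_ms} ms -> roughly {bottleneck}-bound, "
      f"about {achievable_fps:.0f} fps")
[/code]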
 
Your dual-core i5 is probably the bottleneck; outdoor scenes and object destruction are very hard on the CPU in this game, especially in multiplayer. Go check out the benchmarks I did on my i5-2500K at various speeds and core counts:
http://hardforum.com/showthread.php?t=1654043
In that thread, also check out the performance-graph screenshots I took in game showing the CPU bottleneck, especially the one I took outdoors, regardless of what speed I was running my CPU at.
To display this graph in game, open the console with the ~ key and enter render.perfoverlayvisible 1
You can use the Tab key to auto-complete the command, and it also shows up as you start typing it in the console, which makes it easier to enter.
Remember, the graph displays how long it takes your CPU and GPU to render each frame, so higher is not better; lower is.

The i5 750 isn't dual core...
 
I just purchased two 6950s to run BF3 at 2560x1600 and have been having some problems. For the most part it's still playable, but I've been unable to set everything to Ultra. I set textures to Ultra and all others to High with deferred AA off, and I get probably 60 to 70 fps average. The problem is that when dealing with large amounts of destruction, or viewing large scenes of action, it drops down into the 40s; the worst case I've seen is the 30s. I tried changing everything to High and had the exact same drops. The 40s never seemed so bad before, but it feels much worse here. The other thing is that if I change my resolution to 1920x1080 I get pretty much the same minimum frame rates and drops, with just a bit of a boost in the averages.

I'm running an i5 750, an SSD, and 8 GB of RAM, so I don't think that is holding it back. Based on the article, I'm wondering if it would be worth it to look at two of the 2.5GB GTX 570s from EVGA for better performance? The 570 wasn't reviewed in SLI, let alone the 2.5GB version, but I'm wondering if anyone is using that setup with a 2560x1600 display? I can't really afford two GTX 580s, but I could probably stretch for the 570s if they would be worth the upgrade over this setup. How far off would the performance of the 570s be compared to the 580s in the article?

The 6950s are not enough to run at that res with everything on Ultra. If you can replace them with at least 570s, that would be better, though more expensive. If you look at benchmarks online, you need two 580s to be able to run everything at max with AA. AMD GPUs do not perform well when AA is on.
 
The i5 750 isn't dual core...

You are correct; I forgot there was a quad-core i5 before the Sandy Bridge ones.

As Haiku214 just said, he probably needs to turn the graphics settings down further then. I know if I turn them up too high on my 2x 560 Ti setup, I will run out of VRAM, and that causes massive lag spikes.
 
I know other people have microstuttering problems, but I want to see if there are others that have it with my setup.

I'm running an i5 2500K @ 4.5 GHz and GTX 570s in SLI @ 1080p. I normally get 80 fps outdoors in large 64-player games, but sometimes it stutters severely, making it impossible to play. The stutter lasts about 2 seconds, and those 2 seconds are crucial, as I may die. It goes from 80 fps down to 30 fps and jumps erratically, making it almost unplayable. This happens in the most intense battles, with explosions of tanks, jets and what not, so I think it's understandable for it to lag. I, however, want to find the culprit.

Is it because my CPU isn't able to keep up with the physics of so many things happening, hence the stutter? Are my GPUs not able to keep up? Or is my connection the culprit? I'm only running a 2 Mbps connection, i.e. I download at 300 KB/s.

I tried disabling SLI, and I average around 40-50 fps with the same thing happening. It drops down to an unplayable 10 fps in these situations when it stutters. I just want to figure out what's wrong, since I spent $700 on my GPUs and I can't even play smoothly at times: the most important times, when I need my monster rig to kick in.
 