Battlefield 3 Open Beta Performance and Image Quality @ [H]

Running a 2500K at 4.7GHz with a stock 6870 at 1920x1080 on Ultra (with blur off/0, post-processing AA off), FRAPS shows me a 31 FPS average outdoors and 50s indoors.
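If you want more than FRAPS's on-screen counter, per-frame times tell you a lot more about smoothness than a single average. A minimal sketch of the arithmetic (plain Python, sample numbers made up; note the honest average is total frames over total time, not the mean of instantaneous FPS):

```python
def fps_stats(frame_ms):
    """Min / avg / max FPS from a list of per-frame render times in ms."""
    fps = [1000.0 / ms for ms in frame_ms]
    # Frames divided by total time, i.e. what FRAPS-style benchmarks report;
    # averaging the instantaneous FPS values instead would inflate the number.
    avg = len(frame_ms) * 1000.0 / sum(frame_ms)
    return min(fps), avg, max(fps)

# Example: mostly ~16.7 ms frames (about 60 FPS) with one 50 ms hitch
times = [16.7] * 59 + [50.0]
lo, avg, hi = fps_stats(times)
print(f"min {lo:.1f}  avg {avg:.1f}  max {hi:.1f}")
```

A single 50 ms hitch barely moves the average but shows up immediately in the minimum, which is why min FPS (or a frame-time plot) is the better stutter indicator.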

I might hold out until the 7000 series.
 
How much VRAM did those NVIDIA cards have? The 560 Ti comes in 1GB and 2GB, the 570 in 1.25GB and 2.5GB, and the 580 in 1.5GB and 3GB. It would be interesting to compare those cards and see if VRAM is running out...
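One way to actually answer the "is VRAM running out" question is to log usage while playing and look at the peak. You can get such a log from the driver's own tool (for NVIDIA, something like `nvidia-smi --query-gpu=memory.used --format=csv -l 1 > vram.csv`; that invocation is an assumption about your driver version). A small sketch that pulls the peak out of that kind of CSV:

```python
def peak_vram_mib(csv_text):
    """Return the highest memory.used sample (MiB) from an nvidia-smi-style CSV log."""
    peak = 0
    for line in csv_text.strip().splitlines()[1:]:  # skip the header row
        value = line.split(",")[0].strip()          # e.g. "981 MiB"
        if value.endswith("MiB"):
            peak = max(peak, int(value[: -len("MiB")].strip()))
    return peak

# Hypothetical log excerpt
sample = """memory.used [MiB]
512 MiB
981 MiB
870 MiB"""
print(peak_vram_mib(sample))  # -> 981
```

If the peak sits right at the card's capacity (e.g. ~1000 MiB on a 1GB 560 Ti) while the 2GB version of the same card goes higher, that's a strong hint the smaller card is VRAM-limited.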
 
Anyone get a crash to desktop (without any error) when switching to full-screen mode with CrossFire enabled?

Win 7 x64, 16GB RAM, dual 6950s, 11.10 preview drivers, no CAPs installed
 
I've been pretty surprised at how well my Phenom II X4 920 and NVIDIA GTX 260 OC have been running this game at 1680x1050. Most of the settings are at Medium with a few things turned off (mostly stuff the card doesn't support). Can't wait till I get to build a new rig for it, though; if it looks this good on my dinosaur of a computer, I can just imagine how good it looks on something more modern.
 
Anyone get a crash to desktop (without any error) when switching to full-screen mode with CrossFire enabled?

Win 7 x64, 16GB RAM, dual 6950s, 11.10 preview drivers, no CAPs installed

No crashes to desktop in all my playing. I had one serious slow-down and one lock-up.
 
No CTDs in Tri-Fire. The only problem I have is flashing polygons; that, and to me the colors aren't as vibrant as in the alpha. Does anyone notice that as well?
 
With my specs below, do you think I'd be CPU or GPU limited in this game at 1680x1050? I'm planning on upgrading the GTX 285 to a 560 Ti to get a DX11 image quality bump. Would that be a good move?

Currently I'm running at 720p with everything on High and low AA, with the HBAO stuff off. This is smooth, but it's freaking 720p. =(

I also get a lot of tearing / screen-flashing, along with a random line connecting my viewpoint to some vector in the distance.
 
No CTDs in Tri-Fire. The only problem I have is flashing polygons; that, and to me the colors aren't as vibrant as in the alpha. Does anyone notice that as well?

High/Ultra turn the contrast down. Try running Low and see if that compares better to the alpha.
 
I'm running 460s in SLI and the game is very smooth with everything maxed except AA, though I have post AA set to High. The primary issue I have is that the game is not yet optimized for three monitors. Even before pushing the graphics settings, the game was completely unplayable for me when I enabled the three screens, and most of the UI elements (minimap, ammo counter, etc.) would not show up. Once I switched it to run on a single screen, everything worked and the game ran smoothly.

Thanks for this info! I run 460s in SLI as well; haven't tried the beta yet!
 
That's a pretty outdated CPU as well, to be honest. Why would you SLI such low-end cards? You'd be better off with one faster card, like a GTX 570.

I hardly call a Core 2 Quad Extreme Edition CPU outdated... I see no reason to replace the CPU at all; that's not where the bottleneck is in my config with this game, the GPU is. And are you going to pay for my two GTX 570s??? That card is expensive as hell...

The GTX 550 Ti is in no way a low-end card, bro... I've had low-end cards before and these are not low end. Why buy one when I can have two for the same price as the one?
 
Four years old is outdated. It's the 6 series, which literally means a current i5 2500K is going to be nearly 70% faster than it before it gets overclocked.
The 9600GT actually sat in a higher position in the product lineup than the GTX 550 Ti does now.
Here's NVIDIA's current 5 series lineup:

GT 520 - bottom end
GT 545 - very low end
GTX 550 Ti - low-midrange
GTX 560 - midrange
GTX 560 Ti - upper-midrange
GTX 570 - high end
GTX 580 - top end, single GPU
GTX 590 - top end, dual GPU

As you can see, the 550 Ti is pretty far down.
Also, I wasn't telling you to go with two 570s; just one would be about as fast as two 550 Tis, without the complications of SLI to deal with.

As a comparison, here it is placed against the old GeForce 9 series lineup:
9300GS - bottom end
9400GT - low end
9500GT - low-midrange
9600GSO - low-midrange
9600GT - midrange
9800GT - high end, single GPU
9800GTX - top end, single GPU
9800GX2 - top end, dual GPU
 
Makes sense. When the full game's released, though, I dread to think what performance is going to be like on the worst maps. This is one of the most demanding multiplayer games I've ever seen.

It really is. I never thought a multiplayer game would require SLI to run over 60 FPS.
I can only hope the game performs better at retail, or I may just buy two 6970s.
 
My Experience With BF3 Beta

I have two 6970s and a 6950 in CrossFireX; the 6950 is running with the 6970 BIOS hack. My FPS never drops below 110 and maxes at 199 playing at 1080p on Ultra in the open field. I did notice, and it became very annoying, that in the open field I get weird artifacts when CrossFireX is enabled. In the subway everything is sweet and my FPS is between 135 and the 199 max. I found that when I disabled CrossFire the artifacts went away (so did my great frame rate). I thought maybe one of my cards was dying, so I tested every card by itself and in different CrossFire configs. It only happens when I have CrossFire enabled, whether it's 2 cards or 3.

I was glad to get a quick feel for the gameplay, which I like, and to know that my system will allow me to play the game in all its glory except for PhysX. I know it's a beta and just wanted to post my thoughts. Also, when I started to get artifacts, it allowed me to glitch under the board in the open field, which was very annoying. Some people like ranking up by cheating, but not me; it just made me stop playing the stupid thing till retail comes out. I know once they get most of the bugs worked out and we get some decent drivers, the game will be epic.

Specs:
CrossFireX: two MSI 6970s and an MSI 6950 @ 920/1400 (normally runs @ 960/1450, but the BF3 beta crashes; fine for every other game I play). I used to have two HIS 6950s and the MSI 6950, but lost the two HIS cards to a water-cooling leak and replaced them with true 6970s because I needed reference cards for my water blocks.
MB: Asus Maximus IV Extreme
CPU: 2600K @ 5.1GHz, rock solid (can never get 5.2 stable for more than 20 hrs)
Memory: 8GB Corsair Dominator GT @ 1600, 7-8-7-20
Hard drives: two 1TB WD Blacks (I will complete the system with SSDs once I get some more money)
OS: Win7 Ultimate x64
Display: Samsung 55" LED LCD
 
It really is. I never thought a multiplayer game would require SLI to run over 60 FPS.
I can only hope the game performs better at retail, or I may just buy two 6970s.

Actually, I disagree. I hope the level of detail involved with Ultra is so demanding that it CAN'T be run on a single card at 19x12. I believe tech like this could spur hardware evolution and reignite another race for the fastest. It seems like over the last few years the graphics industry has really slowed down and the competition is gone. Granted, it's because we are really down to just two players now, but still... every six months for a while there we were getting new cards that pushed the envelope just a bit more. Hell, multi-monitor gaming was AMD's response to games no longer needing a doubling of graphics horsepower every 6 to 12 months; they tried to evolve to give a reason for another generation. Software like this could give a reason for a doubling or two now.

My beta experience is that Tri-Fire 6970s can max everything and run well over 100 FPS at 1920x1200, though I haven't tried enabling MLAA; I've just been using the in-game 4x MSAA option. I didn't play a whole lot at 5760x1200, but I was getting pretty consistently around 40 FPS with everything but AA on. I was experiencing microstutter at that low an FPS, though, so I decided to back off that resolution.
 
It's not really for lack of demand. Moore's law is coming to an end: it's getting harder to produce ever-smaller silicon, and big new architectures without smaller silicon... well, look at Fermi.
 
If it ends up looking much better, then I have no issue with high-end cards struggling.

I will certainly bitch about low framerates if we still have crap like this in the final game, though. Really, wtf? :eek:

 
If it ends up looking much better, then I have no issue with high-end cards struggling.

I will certainly bitch about low framerates if we still have crap like this in the final game, though. Really, wtf? :eek:


Hoping they have tessellation in the retail build to make those sandbags at least appear like sandbags and not a wall with a texture on it.
 
I don't even care if they use tessellation on it, but it certainly needs to look like a modern 3D image instead of a 2D cardboard cutout. If Far Cry 2 could give us a modern-looking sandbag, then surely a brand-new 2011 game hyped to push graphics can.


 
Wow, a 6950 is beating a GTX 570. I may return my 570 and pick up a 6950...
 
Actually, I disagree. I hope the level of detail involved with Ultra is so demanding that it CAN'T be run on a single card at 19x12. I believe tech like this could spur hardware evolution and reignite another race for the fastest. It seems like over the last few years the graphics industry has really slowed down and the competition is gone. Granted, it's because we are really down to just two players now, but still... every six months for a while there we were getting new cards that pushed the envelope just a bit more. Hell, multi-monitor gaming was AMD's response to games no longer needing a doubling of graphics horsepower every 6 to 12 months; they tried to evolve to give a reason for another generation. Software like this could give a reason for a doubling or two now.

Except it's a little risky doing that in a primarily multiplayer game, especially if, as seems to be the case, lower quality settings give you a competitive advantage in the game.

Hoping they have tessellation in the retail to make those sandbags at least appear like sandbags and not a wall with a texture on it.

I think the broken-off trees are worse. Really, really crappy.
 
[Image: H3llsGamingRig.jpg]
 
You might be able to run BF3 at a very low res and low settings with your setup there :)
 
[Image: settings.jpg]


Getting this framerate for those settings @ 4800x900:
min / max / avg
39 / 114 / 62.315

Looks like an additional card (at least) is needed for 5760x1080 and higher settings =/
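That estimate lines up with simple pixel-count arithmetic: when you're GPU-bound, FPS falls roughly in proportion to pixels drawn (a crude model that ignores CPU load and VRAM limits):

```python
def scaled_fps(fps, old_res, new_res):
    """Naive fill-rate estimate: FPS drops in proportion to pixels rendered."""
    old_px = old_res[0] * old_res[1]
    new_px = new_res[0] * new_res[1]
    return fps * old_px / new_px

# 5760x1080 has 1.44x the pixels of 4800x900, so the 62 FPS average
# would land somewhere around 43 FPS before raising any settings.
print(round(scaled_fps(62.3, (4800, 900), (5760, 1080)), 1))  # -> 43.3
```

With the 39 FPS minimum dropping toward the high 20s under the same scaling, another card (or lower settings) looks necessary for playable minimums.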
 
I experienced stuttering with the NVIDIA beta drivers, especially in the last section of the Metro map: the final part in the town after the subway section. I have GTX 460s in SLI on a 1680x1050 monitor. It doesn't stutter with the regular NVIDIA drivers, but I lost about 10 FPS (from 60 FPS to about 50 FPS with everything on Ultra) and I have water textures in places where they shouldn't be.
 
I'm pretty sure the only Ultra setting that's working in the demo is shadows; all the other settings don't change picture quality between High and Ultra.
 
After playing the BF3 beta there's no way in heck I'd get this game.

I feel like I'm playing the Plan of Attack mod for HL1 or 2.

Bring on CS: Global, whatever.
 
The Metro map is a horrible example, IMHO.
The Caspian map was a whole different feeling; that's the one they should be using.
 
Can anyone explain why you'd want both MSAA and MLAA (deferred and post-processing) turned on at the same time? I thought MLAA was a replacement for MSAA - so I'm not sure what benefit you get from having two AA routines running.

I want to know the answer to that as well. MLAA actually blurs the screen, which defeats the purpose of MSAA.
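A toy way to see why stacking the two is questionable: MSAA resolves extra geometry samples into each pixel, while MLAA-style post-process AA only blends neighbouring pixels of the finished frame, so running it after MSAA re-smears edges that were already resolved. The sketch below is a 1-D grayscale stand-in, not either algorithm's real implementation:

```python
def msaa_resolve(subsamples, n):
    """Average n coverage subsamples into each output pixel (supersample-style resolve)."""
    return [sum(subsamples[i:i + n]) / n for i in range(0, len(subsamples), n)]

def post_aa(pixels):
    """Crude stand-in for post-process AA: blend a pixel with its
    neighbours wherever it sees a strong local contrast step."""
    out = list(pixels)
    for i in range(1, len(pixels) - 1):
        if abs(pixels[i + 1] - pixels[i - 1]) > 0.3:      # "edge detected"
            out[i] = (pixels[i - 1] + pixels[i] + pixels[i + 1]) / 3
    return out

# A hard black-to-white edge, 4 subsamples per pixel
sub = [0.0] * 8 + [0.0, 0.0, 1.0, 1.0] + [1.0] * 8
resolved = msaa_resolve(sub, 4)
print(resolved)            # [0.0, 0.0, 0.5, 1.0, 1.0] -- a clean one-pixel ramp
print(post_aa(resolved))   # ramp widens to three pixels: extra blur, no new information
```

The post filter can only redistribute pixels that already exist, which is why it softens texture detail as well as edges; MSAA, by contrast, adds real edge information at resolve time.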
 
I run the game on Ultra @ 1920x1080 and it gives me a consistent 50+ FPS. Then I tried custom settings with all AA off, 8x AF and SSAO, which raises the minimum FPS to 60.

The trouble is that after 15-20 minutes the GPU goes to full load, giving me stuttering and freezing, especially when I face enemies. Then it runs smoothly for 30 seconds to a minute, then the cycle of stuttering and freezing starts again. VRAM also goes up to 980MB.

Anyone have the same trouble? I use Catalyst 11.9; I can't try 11.5 because the game requires 11.7 or higher, and I hate 11.7.
 
I run the game on Ultra @ 1920x1080 and it gives me a consistent 50+ FPS. Then I tried custom settings with all AA off, 8x AF and SSAO, which raises the minimum FPS to 60.

The trouble is that after 15-20 minutes the GPU goes to full load, giving me stuttering and freezing, especially when I face enemies. Then it runs smoothly for 30 seconds to a minute, then the cycle of stuttering and freezing starts again. VRAM also goes up to 980MB.

Anyone have the same trouble? I use Catalyst 11.9; I can't try 11.5 because the game requires 11.7 or higher, and I hate 11.7.

Running the rig in my sig... two 580s in SLI on my Dell U3011, all maxed out. I get around 50-70 FPS using FRAPS. After a while of gaming, though, it feels like a memory leak or something else, as my FPS goes into the tank and I hover around 20-25. Then out of nowhere it will pick back up, but then dip down again... I haven't checked VRAM usage, though... I hope WHQL drivers and the final game with its first few patches will iron everything out...
 
Well, you may want to check other reviews, because that's only happening here. Or did I miss the sarcasm?

I'm sure the i7 920 is to blame here; AMD cards have always run better with older/slower CPUs.

I always gave AMD props for that: their GPUs require less CPU speed and fewer cores to hit their peak.

Fermi is NVIDIA's biggest CPU hog so far; the thing requires a third CPU core in some games where AMD cards show no CPU usage at all beyond two cores.
 
BTW, is it illegal these days to compare image quality between AMD and NVIDIA? :confused:

I don't see anybody doing it anymore.
 