Star Wars Battlefront Performance Preview

Stoly
Supreme [H]ardness — Joined Jul 26, 2005 · 6,713 messages
Guru3D has a performance preview with top- and mid-level cards.

Impressive AMD results, mainly the 290X/290 besting the GTX 980/GTX 970 respectively.

And the Fury X is about as fast as a 980 Ti. If only it were the same in more games...
 
Wouldn't be surprised if NVIDIA had a driver update around launch that fixes this.
 
Those graphics look really amazing, although it looks like my 2 280xs should be able to manage 1080p reasonably well. It's been a long time since I've seen screenshots and wanted to buy a game just because of how it looks!

Still waiting for reviews though. Oh wait, they're running an open beta. I'm getting in on that action for sure.
 
Those graphics look really amazing, although it looks like my 2 280xs should be able to manage 1080p reasonably well. It's been a long time since I've seen screenshots and wanted to buy a game just because of how it looks!

Still waiting for reviews though. Oh wait, they're running an open beta. I'm getting in on that action for sure.

Yup, I'm running 2x 280X cards also, and the Frostbite engine can usually run both at 100% in CrossFire. Eager to try this one!
 
Looks like my old 4GB 670 should be around 50fps at 1080p with max settings. I've been itching to upgrade to a 970/980 but I think I can hang until Pascal.
 

The guru3d review already uses that driver...

The original post was referring to the game's official release date (November-something). As for "fixing" the game, he's probably comparing it to BF4's performance. No doubt AMD & EA have spent the last 2 years optimizing GCN to the limits for this game; it clearly shows in the benchmarks.

It's too bad they only use AMD GPUs over there at EA, otherwise we might have got some Maxwell optimizations. Nvidia needs to work on their outreach program! :p
 
The original post was referring to the game's official release date (November-something). As for "fixing" the game, he's probably comparing it to BF4's performance. No doubt AMD & EA have spent the last 2 years optimizing GCN to the limits for this game; it clearly shows in the benchmarks.

It's too bad they only use AMD GPUs over there at EA, otherwise we might have got some Maxwell optimizations. Nvidia needs to work on their outreach program! :p

Well, to be fair, Johan Andersson was on an Nvidia panel presentation a couple of years ago, so I'm sure Nvidia is working with them plenty as well. Wouldn't be surprised if the engine was pretty highly optimized all around. Maybe the additional effects are more compute-heavy (e.g. DirectCompute work), which would favor AMD.
 
Ohes noes, my GTX 960 is slower than the AMD equivalent in a beta! It's the end of the world as we know it! They'll optimize the drivers shortly, just like they did with Battlefield 4.

Until then, I'm fine. Last time I checked 46fps all ultra was plenty playable.
 
Boycott EA. We demand unbiased developers in the PC gaming industry, this is unacceptable.
AMD favoritism must end.

Electronic Arts' recommendation of a GeForce GTX 970 will give gamers 60 frames per second and above at 1920x1080, with medium detail settings, in the Walker Assault's 40-player battles. Higher detail levels will require more powerful GPUs, such as the GeForce GTX 980 and 980 Ti, as will higher resolutions.
 
960 gets trashed pretty hard. :eek:

Otherwise, most of the stuff should run this title quite well.
 
Isn't this game, when you boil it down, BF4 with blasters and lightsabers? I would think it would perform similarly.
 
Isn't this game, when you boil it down, BF4 with blasters and lightsabers? I would think it would perform similarly.

Looks a lot better than BF4 to me. Worlds look more detailed. Well it is Star Wars so maybe I saw it through rose colored glasses when I played the Alpha awhile ago.
 
Ohes noes, my GTX 960 is slower than the AMD equivalent in a beta! It's the end of the world as we know it! They'll optimize the drivers shortly, just like they did with Battlefield 4.

Until then, I'm fine. Last time I checked 46fps all ultra was plenty playable.

Pretty easy to OC a 960 as well, I haven't met one that didn't do at least 1500MHz
 
Pretty easy to OC a 960 as well, I haven't met one that didn't do at least 1500MHz

OCing might help average a bit but AMD's frame times appear to be much more consistent. That's what I really care about - buttery smooth.
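For what it's worth, "buttery smooth" can be put in numbers: average fps hides the spikes that percentile frame times expose. A minimal sketch with made-up capture data (not from these benchmarks):

```python
# Sketch: judging smoothness from frame times rather than average fps.
# The sample captures below are invented for illustration.

def frame_stats(frame_times_ms):
    """Return average fps and the 99th-percentile frame time."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    ordered = sorted(frame_times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]
    return avg_fps, p99

# Two captures with nearly the same average but different consistency:
steady = [16.7] * 100                 # every frame ~60 fps
spiky = [12.0] * 90 + [60.0] * 10     # similar average, big stutter spikes

for name, capture in [("steady", steady), ("spiky", spiky)]:
    fps, p99 = frame_stats(capture)
    print(f"{name}: {fps:.1f} avg fps, p99 frame time {p99:.1f} ms")
```

Both captures average near 60 fps, but the spiky one has a 99th-percentile frame time of 60 ms, which is exactly the kind of stutter an fps average hides.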
 
Pretty easy to OC a 960 as well, I haven't met one that didn't do at least 1500MHz

Oh, that's no issue. Mine's overclocked quite happily to 1628. So if those benchmarks use stock speeds, then the actual game should play at 60 fps on my card :D
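As a sanity check on that math: if fps scaled linearly with core clock (an optimistic assumption; real games rarely scale 1:1, and the stock boost clock below is approximate), the estimate looks like:

```python
# Naive linear clock-scaling estimate (optimistic upper bound; memory
# bandwidth and CPU limits mean real-world scaling is usually lower).
stock_mhz = 1178   # approximate reference GTX 960 boost clock
oc_mhz = 1628      # the overclock mentioned above
bench_fps = 46     # benchmarked fps at stock

estimated_fps = bench_fps * oc_mhz / stock_mhz
print(f"~{estimated_fps:.0f} fps if performance scaled with clock")
```

That lands around 64 fps as a best case, so 60 fps is at least plausible if the benchmark really was run at stock speeds.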

And I wasn't kidding when I said the previous Battlefield game had massive AMD optimization on-release:

http://www.techspot.com/review/734-battlefield-4-benchmarks/page3.html



Is that a lowly HD 7870 rolling all over the GTX 670, even though the 670 is 20% faster in almost every other game? Why, yes!

Is that an R9 290X rolling all over the Titan, a card that normally matches it? Why, yes!

Performance has increased about 35% for the GTX 770, and 45% for the GTX 760 since then, while AMD has improved about 20%:

http://www.techpowerup.com/reviews/Gigabyte/GTX_950_Xtreme_Gaming/10.html



I expect something similar from this game. AMD gets the early start on optimizations, then the difference disappears after a couple months.
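The arithmetic behind that expectation can be sketched out. Only the gain percentages come from the posts above; the launch-day lead below is a hypothetical figure for illustration:

```python
# Sketch: how a launch-day AMD lead shrinks as drivers mature.
# Gain percentages are the ones quoted above; the 15% launch lead
# is a hypothetical illustration, not measured data.

def after_gain(baseline, gain_pct):
    """Performance index after a percentage driver gain."""
    return baseline * (1 + gain_pct / 100)

nvidia_launch = 100      # index Nvidia's launch-day score at 100
amd_launch_lead = 15     # hypothetical: AMD ~15% ahead at launch

nvidia_now = after_gain(nvidia_launch, 35)                  # GTX 770: +35%
amd_now = after_gain(nvidia_launch + amd_launch_lead, 20)   # AMD: +20%

gap = amd_now / nvidia_now - 1
print(f"remaining AMD lead after driver maturation: {gap:+.1%}")
```

Under those assumed numbers a double-digit launch lead shrinks to roughly 2%, which is the "difference disappears after a couple months" pattern described above.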
 
Wouldn't be surprised if NVIDIA had a driver update around launch that fixes this.

Over time, AMD improves performance through drivers more than Nvidia does. There was a performance-gain-over-time chart somewhere a while back showing AMD averaging about 10% more gain than Nvidia from time of release.
 
Ohes noes, my GTX 960 is slower than the AMD equivalent in a beta! It's the end of the world as we know it! They'll optimize the drivers shortly, just like they did with Battlefield 4.

Until then, I'm fine. Last time I checked 46fps all ultra was plenty playable.

I guess when you are in as rough a spot as AMD, you will take any crumb tossed your way. Even if it's just in benchmarks from unreleased games. As long as those benchmarks are from "fair" sites of course.
 
Is that a lowly HD 7870 rolling all over the GTX 670, even though the 670 is 20% faster in almost every other game? Why, yes!

Is that an R9 290X rolling all over the Titan, a card that normally matches it? Why, yes!
Everybody's in a fuss over DX12, so the Battlefront benchmarks are getting more play. I remember the initial BF4 benchmarks but at the time nobody really cared because it was an AMD game in the first place.

Performance has increased about 35% for the GTX 770, and 45% for the GTX 760 since then, while AMD has improved about 20%
No you're confused, Nvidia cripples drivers over time.

Kinda makes me wonder what changed from BF4 to Battlefront to cause Nvidia to lose so much performance or vice versa.
 
Kinda makes me wonder what changed from BF4 to Battlefront to cause Nvidia to lose so much performance or vice versa.

Two years of engine refinement, perhaps?

You can be sure they have not been standing still. This is probably more like a Frostbite 3.5 than the 3.0 that debuted on BF4. Would also explain why cards like the 285 that can almost play BF4 at 1440p smoothly are stuck at 1080p.

This is as opposed to Battlefield Hardline, which was released only a year after and felt like an expansion.
 
Any crossfire numbers? About to download this and I'll give it a whirl
 
Impressive AMD results, mainly the 290X/290 besting the GTX 980/GTX 970 respectively...

A game built on an engine made in partnership with AMD is favoring AMD hardware over Nvidia...shocking! :D
 
Can anyone that is playing the Beta tell me if you can change that insanely narrow 75 degree FoV to something more reasonable like 90~100? I get motion sick if I use such a narrow FoV :(..
 
If this is the only game you play, then it looks like AMD is great bang for buck.

I'm on a 980ti, but I don't begrudge anyone for getting good performance from AMD cards... weirdly it doesn't affect me at all.
 
Why don't reviewers use non-reference cards for main reviews? I don't see many people on this forum using a reference GTX 980 Ti.
 
Why don't reviewers use non-reference cards for main reviews? I don't see many people on this forum using a reference GTX 980 Ti.

Because then they'd have to test every non-reference card on the market for every product. They should, and do, use only reference cards to give REFERENCE numbers for everyone. Otherwise you'd get things like "why are you guys so biased? Using a Classified 980 Ti just to inflate numbers and make the Fury X look worse" or "why use an Asus Strix 970? It makes the 290 look worse with inflated numbers, damn biased Nvidia-paid sites." Reference cards give reference numbers and reference performance. If you have an aftermarket-cooled card, great, you'll get even better numbers than the review shows. Isn't that good?
 