Star Wars Battlefront Performance Preview

I see 1050/1500 on the graph for the 390X. Fury X leads Titan X by 0.6 fps at 4K; that's basically margin of error. My point stands: bring in an aftermarket 980 or even a mild OC on a Titan X or 980 Ti and AMD would lose handily, because their cards are already pretty much maxed out from the factory. Considering this, it will only get worse for them after a few driver updates.

Here you go: http://www.pcgameshardware.de/Star-...950/Specials/Beta-Technik-Benchmarks-1173656/

A 1.3 GHz 980 vs. a 1070 MHz 390X and they are practically tied at 1080p. A 1 GHz 290 vs. a 1.3 GHz 970 and the 970 loses at 1080p. The 1.3 GHz 980 Ti is 15% faster than a regular Fury; too bad they didn't have a Fury X. At 1080p + 200% res scale (effectively 4K), that 15% gap for the custom 980 Ti shrinks to just 2% vs. the Fury, and the 390X is 11% faster than the 980.

And the "maxed out from the factory" point is only partly valid, mainly for the Fury lineup. Those custom 290 / 390 series cards do have some OC potential left in them (the MSI 390X repeatedly reaches the 1150-1200 MHz range, not some lousy 1100 MHz, which I could hit on a stock 290X without voltage tweaking), and the 79xx / 280 series can OC really well.

But why do I even bother with this obvious Nvidia fanboy :confused:

The good thing is that performance is good across the board, even if AMD seems to have a small edge in this game. That hasn't happened in a while, so good for them. This is how games should be done, not with glue and gum: first make the game run properly, then add those fancy effect libraries :p
 
At 1440p the 980 gets 54.9 fps avg and the 290X gets 55.3 fps; that's not even 1 fps higher. And as I said about the 390X, it's just a factory OC'd 290X with some tweaks, so a simple 980 OC (or a factory OC'd version) would make up that small difference. That's why I said it looks more or less like a wash, and it's bad news for AMD considering these are all stock NVIDIA cards with nowhere to go but up given the OC headroom they have. The Fury line is pretty much already maxed, so that's out of the equation, and at best you see a 390X top out around 1100 MHz, which isn't enough to catch a 1500 MHz clocked 980. These reviews only tell one part of the story, stock cards, not every factor a smart consumer typically considers. You can argue that not all cards OC alike, but I haven't seen any 980s that couldn't hit 1450-1500 MHz.
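
Quick back-of-the-envelope on those 1440p numbers (a rough sketch only; the fps figures are the stock averages quoted above, and the "performance tracks core clock" part is my assumption, not something the review measured):

# Stock 1440p averages quoted above
fps_980, fps_290x = 54.9, 55.3
gap = (fps_290x - fps_980) / fps_980 * 100
print(f"290X leads by {fps_290x - fps_980:.1f} fps ({gap:.2f}%)")
# -> 290X leads by 0.4 fps (0.73%), so if performance tracked core clock even
#    loosely, a few percent of 980 OC would erase that lead (an assumption, not measured).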

You mostly just repeated yourself and ignored the 970 vs. 390.
 
Works great here. Never dips below 60, hits 70 in a lot of spots, all settings on high.
i5-2400 @ 3.1 GHz, 16 GB RAM, Samsung 500 SSD, and a GTX 970. 1440p G-Sync (ROG Swift).

Enjoying the game. Lots of fun. :)
 
Here you go: http://www.pcgameshardware.de/Star-...950/Specials/Beta-Technik-Benchmarks-1173656/

A 1.3 GHz 980 vs. a 1070 MHz 390X and they are practically tied at 1080p. A 1 GHz 290 vs. a 1.3 GHz 970 and the 970 loses at 1080p. The 1.3 GHz 980 Ti is 15% faster than a regular Fury; too bad they didn't have a Fury X. At 1080p + 200% res scale (effectively 4K), that 15% gap for the custom 980 Ti shrinks to just 2% vs. the Fury, and the 390X is 11% faster than the 980.

And the "maxed out from the factory" point is only partly valid, mainly for the Fury lineup; those 290 / 390 series cards do have some OC potential left in them, and the 79xx / 280 series can OC really well.

But why do I even bother with this obvious Nvidia fanboy :confused:

The 390X models that run around 1070-1100 MHz are pretty much maxed out already; if you check oc.net, the guys there aren't cracking 1200 even on water, and the guys on air are in the 1100s. It's even worse for the Fury X, whose scaling when OC'd is beyond terrible. Crank that 980 to even 1400 MHz and it would leave the 390X behind, since they're already neck and neck. And 1080p + 200% res scaling, really? I don't even know why they included it, since every card was under 60 fps; who is going to bother? Even guys with 4K displays will set lower resolutions to get playable fps. The only NVIDIA card in the midrange not doing as well as it should is the 970. AMD's best card in all this is the 290X, which has held up really well.
 
I swear this is the last time I will answer this troll, but why the heck would anyone want to lower the resolution from 4K rather than just turn down some other settings, like ambient occlusion, to get playable fps?

Why even buy a 4K monitor if you are going to play at 1080p, unless you like the blurry picture you get when running anything other than native resolution on a 4K monitor?
 
I swear this is the last time I will answer this troll, but why the heck would anyone want to lower the resolution from 4K rather than just turn down some other settings, like ambient occlusion, to get playable fps?

Why even buy a 4K monitor if you are going to play at 1080p, unless you like the blurry picture you get when running anything other than native resolution on a 4K monitor?

Glad you recognize that the resolution scaling numbers are worthless.
 
I have to retract my last statement about not answering joker, so here we go.

Ofc I recognize that those res scaling numbers are worthless when the best card gets 45 fps, which is borderline playable (for some). Why can't review sites test games at 4K with playable settings, so that at least some cards reach a 60 fps average, the way [H] does it and the way my favorite review site (Muropaketti, a Finnish hardware site) does it?

I added those results to my post because they were there, but it seems I should have added the 1440p results too; I was lazy. I only answered your post in the first place because you said the AMD lineup would lose handily against merely aftermarket or mildly OC'ed Nvidia cards, and judging by the PCGH results, that doesn't happen. Yes, an aftermarket 1.5 GHz 980 will most likely be a little faster than, let's say, an 1150 MHz MSI 390X, but my bet is that the difference is measured in single digits. Maxwell also stops scaling properly around the 1450 MHz mark, and Hawaii does so around 1200 MHz, though most Hawaii chips can't even reach that.
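
To put rough numbers on that bet (a sketch only; the clocks are the ones discussed above, and the assumption that fps scales linearly with core clock is mine and flatters both cards):

# Clocks from the PCGH config and the OC scenario discussed above
pcgh_980, pcgh_390x = 1300, 1070     # MHz, roughly tied at 1080p per PCGH
oc_980, oc_390x = 1500, 1150         # MHz, aftermarket 980 vs. MSI 390X OC
gain_980 = oc_980 / pcgh_980 - 1     # ~15.4%
gain_390x = oc_390x / pcgh_390x - 1  # ~7.5%
lead = (1 + gain_980) / (1 + gain_390x) - 1
print(f"estimated 980 lead: {lead:.0%}")
# -> estimated 980 lead: 7%, i.e. single digits, and that's before Maxwell's
#    scaling tapers off around 1450 MHz as mentioned above.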
 
Glad you recognize that the resolution scaling numbers are worthless.

??? Nvidia and AMD cards can both do GPU scaling and keep the monitor at its native resolution, meaning 1080p on a 4K monitor would look like a 1080p monitor and not like the blurry mess you can get when the monitor does the scaling. (Some monitors scale just fine, as a note.)
 
Battlefront is being a bit misunderstood (or you could say overrated) in terms of how well "optimized" it actually is (a term that gets misused a lot).

It looks very good because the design and the technology work in synergy; everything is a good fit, basically.

You don't even need to compare it to Dragon Age: Inquisition. If you look at it in more detail, Battlefront is really being asked to do much less than Battlefield 4. The game, at least what is available in the beta so far, is much more static, with less going on. The environment is extremely sparse compared to BF4 in terms of objects, and 64 vs. 40 players should also tell you something about the scale of the two games. In BF4 you have more players and more vehicles in combat, with dynamic destruction of an environment littered with buildings, trees, and other objects. Battlefront, at least the Battle of Hoth, is a very sparse and static open snow field with comparatively nothing on it (very detailed textures thanks to the new technology, though), far fewer players, and far fewer vehicles (you could have more tanks alone on a BF4 map than all the vehicles on Hoth combined).

But this is a great game for showing the importance of design and how that transfers to visuals, as opposed to just ratcheting up fidelity.

Edit: I just want to add that the one thing I don't like about DICE's direction since Bad Company is that we seem to be going backwards in terms of dynamic environments. Even moving from Bad Company 2 to Battlefield 3/4, they scaled back how dynamic the environments were in terms of destruction. The difference doesn't really show up if you are just taking screenshots or admiring the visuals, but in terms of the actual feel when playing, it makes a large difference in my opinion. Battlefront feels rather static by comparison.

Exactly. It feels like I'm playing Quake CTF with a texture mod. The game puts me to sleep, but it is very pretty :D

Also, as anticipated, it runs at 60 fps on my GTX 960 @ 1630 MHz.

I went back to playing BF4 after I finished this snooze fest.
 
Is there an in-game FPS counter, or do you need to use Fraps/Afterburner to see it?
 
oh man, the setup was flawless... well done, you two.

/applaud

The 390X models that run around 1070-1100 MHz are pretty much maxed out already; if you check oc.net, the guys there aren't cracking 1200 even on water, and the guys on air are in the 1100s. It's even worse for the Fury X, whose scaling when OC'd is beyond terrible. Crank that 980 to even 1400 MHz and it would leave the 390X behind, since they're already neck and neck. And 1080p + 200% res scaling, really? I don't even know why they included it, since every card was under 60 fps; who is going to bother? Even guys with 4K displays will set lower resolutions to get playable fps. The only NVIDIA card in the midrange not doing as well as it should is the 970. AMD's best card in all this is the 290X, which has held up really well.

Are you going to continue? Dude, is Nvidia paying you something, at least? :confused:
 
I'm running 2560x1600 with ultra settings and getting 105-135 fps; it looks and runs great. The game does crash, though.
 
The 390X models that run around 1070-1100 MHz are pretty much maxed out already; if you check oc.net, the guys there aren't cracking 1200 even on water, and the guys on air are in the 1100s. It's even worse for the Fury X, whose scaling when OC'd is beyond terrible. Crank that 980 to even 1400 MHz and it would leave the 390X behind, since they're already neck and neck. And 1080p + 200% res scaling, really? I don't even know why they included it, since every card was under 60 fps; who is going to bother? Even guys with 4K displays will set lower resolutions to get playable fps. The only NVIDIA card in the midrange not doing as well as it should is the 970. AMD's best card in all this is the 290X, which has held up really well.


WTF, you post in the thread about power savings and saving the environment... then throw all that out the window in this post, talking about maxing out the OC, which sucks down power 290/290X style... a 980 OC adds 70-100 W more power draw.
 
Is there a major difference between keeping the resolution at default and using supersampling (like 150% resolution scale) versus using 1.5x DSR?
 
I'm running 2560x1600 with ultra settings and getting 105-135 fps; it looks and runs great. The game does crash, though.

Oh good! I am not the only one!

I ran into crashes several times. I was worried it was my newly OC'ed processor, but my other game worked great.

Using the latest Catalyst beta drivers (the ones that fixed the memory leak).
 
I'm not crashing, and I don't really need to Crossfire the cards, but I would like Crossfire to work properly so I can test higher DSR.
 
WTF, you post in the thread about power savings and saving the environment... then throw all that out the window in this post, talking about maxing out the OC, which sucks down power 290/290X style... a 980 OC adds 70-100 W more power draw.

LOL, adding 100 W to a 980 still wouldn't come close to a stock 290X or 390X: http://tpucdn.com/reviews/MSI/R9_390X_Gaming/images/power_maximum.gif

Is there a major difference between keeping the resolution at default and using supersampling (like 150% resolution scale) versus using 1.5x DSR?

Visually, it's really debatable in this game. I tested the difference between 1440p and 1440p + 130% and found it wasn't a big enough difference to warrant the drop in performance. Also, for NVIDIA users, SLI utilization isn't great right now; with both my cards at 1.4 GHz I rarely cross 70% utilization, which means there's a lot of untapped performance left over.
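
One more note on the question itself (my own rough math; I'm assuming the in-game resolution scale is per axis, which matches "1080p + 200% = effectively 4K" earlier in the thread, while NVIDIA's DSR factors multiply the total pixel count):

# Pixel workload at a 2560x1440 base, comparing the two "1.5x" settings
base_pixels = 2560 * 1440
in_game_150 = base_pixels * 1.5 ** 2   # 150% resolution scale: 1.5x per axis = 2.25x pixels
dsr_1_5x = base_pixels * 1.5           # DSR 1.5x factor: 1.5x total pixels
print(in_game_150 / base_pixels, dsr_1_5x / base_pixels)   # 2.25 vs 1.5
# So the two are not the same workload: the in-game 150% scale renders half again
# as many pixels as DSR 1.5x, which is part of why the performance hit feels steeper.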

 
That's a burn load, FurMark probably. There's no way a 390X consumes 400W+ by itself in actual games.
 
That's a burn load, FurMark probably. There's no way a 390X consumes 400W+ by itself in actual games.

It's max power consumption, but their test uses FurMark for that. Peak in games is Metro: Last Light: http://tpucdn.com/reviews/MSI/R9_390X_Gaming/images/power_peak.gif 370 W for a stock 390X at peak vs. 184 W for the 980. Still a big difference even if you toss in another 70-100 W like he claims. There are new games that push the GPU close to 100% utilization if they're coded right; I know a single GPU of mine stays around 90-100% in GTA V almost the entire time with certain settings.
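
Spelling out the arithmetic with the TPU peak numbers linked above (the 70-100 W OC penalty is the figure he quoted, not something I've measured):

# Peak gaming power draw from the TechPowerUp review linked above
peak_390x = 370   # W, stock 390X (Metro: Last Light)
peak_980 = 184    # W, stock 980
oc_extra = 100    # W, upper end of the claimed 980 OC increase
print(f"{peak_980 + oc_extra} W vs {peak_390x} W")
# -> 284 W vs 370 W: even granting the full claimed OC penalty, the 980 still
#    draws well under the stock 390X's peak.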

Oh good! I am not the only one!

I ran into crashes several times. I was worried it was my newly OC'ed processor, but my other game worked great.

Using the latest Catalyst beta drivers (the ones that fixed the memory leak).


If you're crashing, it's because your overclock is unstable. Mine is rock solid and hasn't crashed once, so you should probably dial back the CPU OC or add more voltage.
 
Is there an in-game FPS counter, or do you need to use Fraps/Afterburner to see it?

Open the console and type PerfOverlay.DrawFps 1 or PerfOverlay.DrawGraph 1.

One does the obvious; the other plots the frame times of both the CPU and the GPU.
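
If you don't want to type those every round: other Frostbite games (BF3/BF4) pick these commands up from a user.cfg dropped into the game's install folder. I haven't verified that the Battlefront beta honors it, but the equivalent file would just be:

user.cfg (next to the game executable; untested in the beta, works in BF4):
PerfOverlay.DrawFps 1
PerfOverlay.DrawGraph 1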
 
Had my first BSOD in six months today while playing. Bumped the vcore on my 5960X from 1.35 to 1.38 V (4.6 GHz core/uncore). We'll see if it happens again.
 
Anybody getting a weird screen tearing effect? Like a small tear, maybe just a few pixels. I am getting this with VSYNC on.
 
Runs fine on Windows 8.1 and a 290 DD at 290X clocks. About the same as BF4. Still not buying unless a bunch of changes and improvements are made.
 
Windows 10 is free... you should blame Nvidia for their mediocre W10 drivers.

As long as you paid for every single copy of Windows 7 you have installed.

And yes, you can still have a "genuine" install with it on multiple machines.

Ask me how I know :D
 
Windows 10 is free... you should blame Nvidia for their mediocre W10 drivers.
Uh huh.

https://www.reddit.com/r/Amd/comments/3o7wmq/lets_do_something_about_these_dx11_issues_weve/

https://www.reddit.com/r/Amd/comments/3o3u6f/looking_to_make_a_list_on_the_r9_2853xx_driver/

My favorite part is the people saying "Don't spam social media, it makes AMD look like they have bad drivers" in a thread specifically about bad drivers... nice.

For fair play, here's the Nvidia equivalent.

https://www.reddit.com/r/nvidia/comments/3nvg44/i_cant_stand_these_shit_drivers_man/

I think I'll keep my house on Windows 7 for a bit longer.
 
Uh huh.

https://www.reddit.com/r/Amd/comments/3o7wmq/lets_do_something_about_these_dx11_issues_weve/

https://www.reddit.com/r/Amd/comments/3o3u6f/looking_to_make_a_list_on_the_r9_2853xx_driver/

My favorite part is the people saying "Don't spam social media, it makes AMD look like they have bad drivers" in a thread specifically about bad drivers... nice.

For fair play, here's the Nvidia equivalent.

https://www.reddit.com/r/nvidia/comments/3nvg44/i_cant_stand_these_shit_drivers_man/

I think I'll keep my house on Windows 7 for a bit longer.


Better do that with Nvidia; it's been smooth sailing on my 290 and Windows 10.
 
Do you have a link saying this??
Yes, WDDM 2.0. It's not like they can really do anything about it now, unless you're expecting a miracle after Windows 10 has already been released, which makes no sense.

If you have an AMD card, you should upgrade to Win10 immediately. The same is apparently true for Nvidia (their performance also went up in the Battlefront benchmark), but by a much smaller amount.
 
The better question is "where's the fire?" Why did people rush to downgrade to a half baked Windows 10 before there are even any playable DX12 games? By the time the next Deus Ex comes out there will already have been a major update, and hopefully MS will have also pulled their heads out on the privacy issues.

So I don't mind that Nvidia and AMD stay focused on 7 and 8 drivers; 10 is like 5% of PCs, and DX12 games are MIA.
 