Star Wars Battlefront Performance Preview

Can anyone who is playing the beta tell me if you can change that insanely narrow 75-degree FoV to something more reasonable like 90-100? I get motion sick if I use such a narrow FoV. :(

Frostbite gives you the vertical FOV, unless they've suddenly decided to change that for this game.

At 16:9, a 70-degree vertical FOV is effectively 102 degrees horizontal; 90 vertical would be 121 horizontal.

It's not very narrow at all.
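For anyone who wants to sanity-check those numbers, here's a minimal sketch of the standard vertical-to-horizontal FOV conversion (assuming a 16:9 aspect ratio; this is just the usual trig identity, not anything specific to Frostbite):

[CODE]
import math

def vertical_to_horizontal_fov(vfov_deg, aspect=16 / 9):
    """Convert a vertical field of view (degrees) to the equivalent horizontal FOV."""
    vfov_rad = math.radians(vfov_deg)
    hfov_rad = 2 * math.atan(math.tan(vfov_rad / 2) * aspect)
    return math.degrees(hfov_rad)

print(round(vertical_to_horizontal_fov(70)))  # ~102 degrees horizontal
print(round(vertical_to_horizontal_fov(90)))  # ~121 degrees horizontal
[/CODE]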
 
Ohes noes, my GTX 960 is slower than the AMD equivalent in a beta! It's the end of the world as we know it! They'll optimize the drivers shortly, just like they did with Battlefield 4.

Until then, I'm fine. Last time I checked, 46 fps at all-Ultra settings was plenty playable.

Sarcastic defense mechanism to hide butt hurt.

That Avalanche guy needs to be put on suicide watch as well - that was brutal.
Both Nvidia and AMD run this game well; AMD just runs it a little better. Everyone should be happy. This game engine also scales well and is not a VRAM hog like so many other games, while still having great visuals. What more can one ask for?
 
Does the Frostbite 3 engine work in Linux / are there any Linux plans? I read some articles talking about the possibility of a Linux port, but that was from 2013. It would be interesting to see how this performs on something other than Windows.
 
I wouldn't say the game is biased toward AMD if you're judging it by the performance of the 290X... AMD's partners have redesigned the Hawaii board layout so it doesn't throttle, used faster-clocked memory (Samsung 1350 MHz), and some layouts use two 8-pin connectors for a real 375 watts of usable power, with the GPUs clocked higher, around 1020 MHz.

So if you take what the partners have done for Hawaii and add AMD's driver improvements, it's not hard to believe the 290X can now compete with the GTX 980 and win some benchmarks at a much lower price; my Sapphire Tri-X 290X New Edition cost me $269 and has the improved layout.
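The 375-watt figure lines up with the PCIe power limits (75 W from the slot plus 150 W per 8-pin connector); a quick back-of-the-envelope check, assuming spec limits rather than measured draw:

[CODE]
# PCIe power budget for a dual 8-pin card (spec limits, not measured draw)
slot_power = 75          # watts available from the PCIe x16 slot
eight_pin_power = 150    # watts per 8-pin PCIe power connector
total_budget = slot_power + 2 * eight_pin_power
print(total_budget)      # 375 watts
[/CODE]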
 
AMD is the leader in pre-release or beta benchmarks (like DX12)...congrats!...once final versions hit then Nvidia almost always surpasses AMD :D
 
AMD is the leader in pre-release or beta benchmarks (like DX12)...congrats!...once final versions hit then Nvidia almost always surpasses AMD :D
Eventually, maybe. If you think Nvidia is going to roll out a magical driver on launch day that boosts their performance by 20%ish then you're going to be disappointed.
But who knows, launch is a month away, maybe they have an ace up their sleeve. :rolleyes:
 
Eventually, maybe. If you think Nvidia is going to roll out a magical driver on launch day that boosts their performance by 20%ish then you're going to be disappointed.
But who knows, launch is a month away, maybe they have an ace up their sleeve. :rolleyes:

Battlefront, like all DICE games, is sponsored by AMD, so this will probably be one of the rare AAA titles AMD can brag about performance-wise... That being said, Nvidia performance is excellent across the board as well, so everyone wins; AMD just gets bragging rights... I do expect Nvidia to close the gap before launch, because that's what Nvidia does, and this beta will only help them with that... No doubt Nvidia will have a new Game Ready driver on day one.
 
Can anyone who is playing the beta tell me if you can change that insanely narrow 75-degree FoV to something more reasonable like 90-100? I get motion sick if I use such a narrow FoV. :(

It's adjustable in the game's video settings, up to 110.
 
Just played a quick round of the wave-based survival mode since the servers appear to be jacked. It looks pretty nice and runs really well. I set a 125% resolution scale at 1440p and it was completely smooth; I imagine it might drop a little in a proper game, though.
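For context on what that slider actually costs, here's a rough sketch of the internal render resolution at 125% scale, assuming the setting scales each axis linearly (which is how I understand Frostbite's resolution scale to work, but treat that as an assumption):

[CODE]
# Internal render resolution at 125% resolution scale (assuming per-axis scaling)
base_w, base_h = 2560, 1440
scale = 1.25

render_w, render_h = int(base_w * scale), int(base_h * scale)
pixel_ratio = (render_w * render_h) / (base_w * base_h)

print(render_w, render_h)     # 3200 1800
print(round(pixel_ratio, 2))  # ~1.56x the pixel count of native 1440p
[/CODE]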
 
Getting between 40 and 50 fps on my setup at Ultra settings at 4K, which seems decent enough, and the graphics look quite nice. However, I'm not impressed with the game.
 
Wouldn't be surprised if NVIDIA had a driver update around launch that fixes this.

If by "driver" you mean free NVidia swag delivered to Hillbert's door, then yes, this preview will be soon "fixed" :)
 
I expect something similar from this game. AMD gets the early start on optimizations, then the difference disappears after a couple months.

AMD is the leader in pre-release or beta benchmarks (like DX12)...congrats!...once final versions hit then Nvidia almost always surpasses AMD :D

Sorry, you have it backwards, gentlemen. Both Tahiti and Hawaii have shown great increases in performance over time.
Here is Dying Light when it first came out:
http://www.hardocp.com/article/2015/03/10/dying_light_video_card_performance_review/4#.VhaOuXpViko
A GTX 970 was beating up on a 290x!

Now more recently - an R9 390 matching a GTX 970:
http://www.hardocp.com/article/2015...sipation_8gb_video_card_review/6#.VhaOeXpViko
 
It's a GameWorks game and ran quite poorly on AMD hardware at launch. Driver improvements over time will of course improve performance; it's pretty typical. There were some BF4 benchmarks linked on the previous page that show the same results for Nvidia.

Vanilla Dying Light was incredibly CPU-bottlenecked with the max view distance slider; Techland ended up cutting that slider in half with one of the earlier patches. The game still runs very badly on AMD CPUs.
 
Eventually, maybe. If you think Nvidia is going to roll out a magical driver on launch day that boosts their performance by 20%ish then you're going to be disappointed.
But who knows, launch is a month away, maybe they have an ace up their sleeve. :rolleyes:

Except NVIDIA doesn't need a 20% boost.
 
Well I just played the beta, my first Battlefield game in over 10 years.
It's garbage. People pay $60 for this? Yikes.
 
Hitting 80-90+ fps on one 980 Ti (99% usage)... SLI enabled, but not using the second card. Guess I will have to make a profile.

1440p - 144 Hz
Ultra preset
110% FOV
100% resolution scale

EDIT: Couldn't find a profile for Battlefront, but found a tutorial on how to find it...
 
It's different, but I'm having fun with it. I like running around with Vader :)
 
Well I just played the beta, my first Battlefield game in over 10 years.
It's garbage. People pay $60 for this? Yikes.

More like $100+, as you need all the packs or season passes or elite-something junk to get the content cut from the day-one edition.
 
Sorry, you have it backwards, gentlemen. Both Tahiti and Hawaii have shown great increases in performance over time.
Here is Dying Light when it first came out:
http://www.hardocp.com/article/2015/03/10/dying_light_video_card_performance_review/4#.VhaOuXpViko
A GTX 970 was beating up on a 290x!

Now more recently - an R9 390 matching a GTX 970:
http://www.hardocp.com/article/2015...sipation_8gb_video_card_review/6#.VhaOeXpViko

I'm talking about an AMD Gaming Evolved game here. You would expect Nvidia to be slower out of the gate, and then speed up at a much higher rate than AMD. AMD just had a few months' head start on engine optimizations, is all.

AMD has also closed the gap in The Witcher 3, another GameWorks title. I'm not saying they don't optimize, just that Nvidia will close the gap here as well.
 
Very informative article.

Especially considering I'll be picking up either a 380 or 960 within the next week.....

Still on the fence about which way to go though.
 
I'm pretty sure the 380 beats up on a 960, but I've been out of the game for about a year now...
 
The speed differences look insignificant on the high-end cards. The 290X does look like the best bang for the buck, though.

Well if the movie is a tremendous smash hit (likely) then this game will most likely sell like candy - especially on the consoles.
 
Can someone please explain to me why this game looks SO good and ALSO performs so well?

Dragon Age Inquisition did not look this good and performed much worse, on the same engine.


Is it that the guys have had more time to optimize and tune Frostbite 3? Was there just a lot more stuff going on visually in a game like Dragon Age Inquisition?


I suppose a lot of the detail is confined to detailed vistas and decent-looking rocks and terrain. But I want an RPG to run this well and look this good; that is where this kind of fidelity is needed most. Neither The Witcher 3 nor Dragon Age Inquisition ran this smoothly.
 
Finally able to try this game. I like it, it's gorgeous and smooooooth to play!

Looks like just a single 280X is enough to push 60 fps @ 1080p on Ultra. Impressive!
 
Typed PCLab randomly into Google, benchmarks up:

http://pclab.pl/art66213.html

Similar to Guru3D's results, AMD has a respectable lead of about 10%.

A lead where? The top cards are the Titan X and 980 Ti. Fury doesn't really have an NVIDIA equivalent, so the closest thing would be an aftermarket 980, which isn't measured in this test. The 390X leads the 980, but the 390X is just a 290X with some tweaks + OC, so again, the lack of an aftermarket 980 here skews the results. Toss in overclocking and AMD again loses; I'm pushing 1440p + 130% scaling @ 1.4 GHz with my Titan Xs and never dropping below 60 fps.
 
I'm a little surprised how well an OC'd 960 runs this game (granted, at 1920x1080). It never dips below 60 FPS.
 
A lead where? The top cards are the Titan X and 980 Ti. Fury doesn't really have an NVIDIA equivalent, so the closest thing would be an aftermarket 980, which isn't measured in this test. The 390X leads the 980, but the 390X is just a 290X with some tweaks + OC, so again, the lack of an aftermarket 980 here skews the results. Toss in overclocking and AMD again loses; I'm pushing 1440p + 130% scaling @ 1.4 GHz with my Titan Xs and never dropping below 60 fps.
Check the clocks; it looks like they're running everything at stock, so their 300-series cards have their factory OCs removed.
Fury X leads the 980 Ti @ 1440p, and both the 980 Ti & TX @ 4K.
 
A lead where? The top cards are the Titan X and 980 Ti. Fury doesn't really have an NVIDIA equivalent, so the closest thing would be an aftermarket 980, which isn't measured in this test. The 390X leads the 980, but the 390X is just a 290X with some tweaks + OC, so again, the lack of an aftermarket 980 here skews the results. Toss in overclocking and AMD again loses; I'm pushing 1440p + 130% scaling @ 1.4 GHz with my Titan Xs and never dropping below 60 fps.

R9 390 > 970 pretty comprehensively; that's a lead. In fact, it's trading blows with the 980 depending on resolution. Even the 290X is faster than the 980 above 1080p. The top cards are pretty much a wash, but it's painfully (for you, it seems) obvious that AMD leads in this particular game, regardless of what elaborate set of criteria you'd like to invent to make this not the case.

(I'm not trying to say this is some significant event; it is an AMD game and it is in beta, but to say AMD doesn't lead at this stage is just...)
 
Check the clocks; it looks like they're running everything at stock, so their 300-series cards have their factory OCs removed.
Fury X leads the 980 Ti @ 1440p, and both the 980 Ti & TX @ 4K.

I see 1050/1500 on the graph for the 390X. Fury X leads the Titan X by 0.6 fps at 4K; that's basically margin of error. My point stands: bring in an aftermarket 980 Ti, or even a mild OC on the Titan X or 980 Ti, and AMD would lose handily because their cards are already pretty much maxed out from the factory. Considering this, it will only get worse for them after a few driver updates.

R9 390 > 970 pretty comprehensively; that's a lead. In fact, it's trading blows with the 980 depending on resolution. Even the 290X is faster than the 980 above 1080p. The top cards are pretty much a wash, but it's painfully (for you, it seems) obvious that AMD leads in this particular game, regardless of what elaborate set of criteria you'd like to invent to make this not the case.

(I'm not trying to say this is some significant event; it is an AMD game and it is in beta, but to say AMD doesn't lead at this stage is just...)

At 1440p the 980 gets 54.9 fps average and the 290X gets 55.3 fps; that's not even 1 fps higher. And as I said about the 390X, it's just a factory-OC'd 290X with some tweaks, so a simple 980 OC (or a factory-OC'd version) would make up that small difference. That's why I said it looks more or less like a wash, and it's bad news for AMD considering these are all stock NVIDIA cards with nowhere to go but up given the OC headroom they've got. The Fury line is pretty much already maxed, so that's out of the equation, and at best you see the 390X top out at 1100ish, which isn't enough to catch a 1500 MHz-clocked 980. These reviews only tell one part of the story, stock cards, not every factor that is typically considered by smart consumers. You can argue that not all cards OC alike, but I haven't seen any 980s that couldn't hit 1450-1500 MHz.
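To put rough numbers on that argument (treating the reference 980 boost clock of roughly 1216 MHz as the baseline, and remembering that performance never scales perfectly with clock speed):

[CODE]
# Gap between the stock 980 and 290X at 1440p in that benchmark
gap_pct = (55.3 - 54.9) / 54.9 * 100
print(round(gap_pct, 1))       # ~0.7%

# Clock headroom if a 980 sustains ~1500 MHz vs a ~1216 MHz reference boost clock
oc_headroom_pct = (1500 - 1216) / 1216 * 100
print(round(oc_headroom_pct))  # ~23% (real-world gains will be smaller than this)
[/CODE]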
 
390X is 1050/1500 stock according to this: https://www.techpowerup.com/gpudb/2663/radeon-r9-390x.html

My point stands: bring in an aftermarket 980, or even a mild OC on the Titan X or 980 Ti, and AMD would lose handily because their cards are already pretty much maxed out from the factory.
Yeah, but you could make the same argument about literally every benchmark. There are very few sites that do OC vs OC comparisons.
People who aren't retarded know how well each GPU overclocks and should be able to calculate performance themselves. I'd rather see how everything performs at stock to use as a baseline for comparison.
 
Can someone please explain to me why this game looks SO good and ALSO performs so well?

Dragon Age Inquisition did not look this good and performed much worse, on the same engine.


Is it that the guys have had more time to optimize and tune Frostbite 3? Was there just a lot more stuff going on visually in a game like Dragon Age Inquisition?


I suppose a lot of the detail is confined to detailed vistas and decent-looking rocks and terrain. But I want an RPG to run this well and look this good; that is where this kind of fidelity is needed most. Neither The Witcher 3 nor Dragon Age Inquisition ran this smoothly.

Battlefront is being a bit misunderstood (or you could say overrated) in terms of how well "optimized" it actually is (this term gets misused a lot).

It looks very good due to a combination of design and technology working in synergy. Everything is a good fit, basically.

You don't even need to compare it to Dragon Age Inquisition. If you look at it in more detail, Battlefront really is being asked to do much less than Battlefield 4. The game, at least what is available in the beta so far, is much more static, with less going on. The environment is also extremely sparse in terms of objects compared to BF4. 64 vs 40 players should also tell you something about the scale of the two games. In BF4 you'd have more players and more vehicles in combat, with dynamic destruction of an environment littered with buildings, trees, and other objects. Battlefront, at least the Battle of Hoth, is really a very sparse and static open snowfield with comparatively nothing on it (very detailed textures thanks to the new technology, though), far fewer players and far fewer vehicles (you could have more tanks alone on a BF4 map than all the vehicles on Hoth combined).

But this is a great game in terms of showing the importance of design and how that transfers to visuals, as opposed to just ratcheting up fidelity.

Edit: I just want to add that the one thing I don't like about DICE's direction since Bad Company is that we seem to be going backwards in terms of dynamic environments. Even moving from Bad Company 2 to Battlefield 3/4, they ended up scaling back how dynamic the environments were in terms of destruction. The difference doesn't really show up if you are just taking screenshots or admiring the visuals, but in terms of the actual feel when playing, it makes a large difference in my opinion. Battlefront feels rather static by comparison.
 
390X is 1050/1500 stock according to this: https://www.techpowerup.com/gpudb/2663/radeon-r9-390x.html


Yeah, but you could make the same argument about literally every benchmark. There are very few sites that do OC vs OC comparisons.
People who aren't retarded know how well each GPU overclocks and should be able to calculate performance themselves. I'd rather see how everything performs at stock to use as a baseline for comparison.

I think we misunderstood one another; I meant that the 390X is an OC'd 290X, not that the cards in the review were overclocked. The problem I have with these stock-vs-stock reviews is that they ignore half the reason people buy Maxwell-based cards, which is the generous overclocking headroom. I'm not sure why reviewers are so reluctant to do OC vs OC. They don't even have to max those clocks out; just use what they think is a typical overclock that anyone could achieve. OC results in addition to baseline results give a much clearer picture than just one or the other.
 