Star Wars Battlefront Performance Preview

Discussion in 'Video Cards' started by Stoly, Oct 7, 2015.

  1. socK

    socK 2[H]4U

    Messages:
    3,669
    Joined:
    Jan 25, 2004
    Frostbite gives you the vertical FOV, unless they've suddenly decided to change it for this game.

    At 16:9, that 70 fov is effectively 102 horizontal.
    90 would be 121.

    It's not very narrow at all.
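    The vertical-to-horizontal conversion quoted above follows from the standard tangent relation between FOV axes. A minimal sketch (the function name and defaults are mine, not from the thread):

    ```python
    import math

    def vfov_to_hfov(vfov_deg, aspect_w=16, aspect_h=9):
        """Convert a vertical FOV in degrees to the horizontal FOV
        for a given aspect ratio: hfov = 2*atan(tan(vfov/2) * w/h)."""
        vfov = math.radians(vfov_deg)
        hfov = 2 * math.atan(math.tan(vfov / 2) * aspect_w / aspect_h)
        return math.degrees(hfov)

    print(round(vfov_to_hfov(70)))  # 102 horizontal at 16:9
    print(round(vfov_to_hfov(90)))  # 121
    ```

    This reproduces the 70 → 102 and 90 → 121 figures above for a 16:9 display.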
     
  2. Nihilus1

    Nihilus1 Limp Gawd

    Messages:
    441
    Joined:
    Jun 7, 2015
    Sarcastic defense mechanism to hide butt hurt.

    That Avalanche guy needs to be put on suicide watch as well - that was brutal.
    Both Nvidia and AMD run well on this game. AMD just runs a little better. Everyone should be happy. This game engine also scales well and is not a vram hog like so many other games while still having great visuals. What more can one ask for?
     
  3. workshop35

    workshop35 Gawd

    Messages:
    576
    Joined:
    Nov 24, 2013
    Does the Frostbite 3 engine work in Linux, or are there any Linux plans? I read some articles discussing the possibility of a Linux port, but those were from 2013. It would be interesting to see how this performs on something other than Windows.
     
  4. fanboy

    fanboy [H]ard|Gawd

    Messages:
    1,057
    Joined:
    Jul 4, 2009
    I wouldn't say the game is biased toward AMD if you're judging it by the performance of the 290X. AMD's partners have redesigned the layout for Hawaii so it doesn't throttle, used faster-clocked memory (Samsung 1350 MHz), and on some layouts used two 8-pin connectors for a real 375 watts of usable power, with the GPUs even clocked higher, like 1020 MHz.

    So if you take what partners have done for Hawaii and add AMD's driver performance improvements, it's not hard to believe the 290X can now compete with the GTX 980 and win some benchmarks at a much cheaper price. My Sapphire Tri-X 290X New Edition cost me $269 and has the improved layout.
     
  5. polonyc2

    polonyc2 [H]ard as it Gets

    Messages:
    16,736
    Joined:
    Oct 25, 2004
    AMD is the leader in pre-release or beta benchmarks (like DX12)...congrats!...once final versions hit then Nvidia almost always surpasses AMD :D
     
  6. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,419
    Joined:
    Aug 5, 2013
    Eventually, maybe. If you think Nvidia is going to roll out a magical driver on launch day that boosts their performance by 20%ish then you're going to be disappointed.
    But who knows, launch is a month away, maybe they have an ace up their sleeve. :rolleyes:
     
  7. polonyc2

    polonyc2 [H]ard as it Gets

    Messages:
    16,736
    Joined:
    Oct 25, 2004
    Battlefront like all DICE games is sponsored by AMD, so this will probably be one of the rare AAA titles that AMD can brag about performance...that being said Nvidia performance is excellent across the board as well so everyone wins but AMD just gets bragging rights...I do expect Nvidia to close the gap before launch because that's what Nvidia does...this beta will only help them with that...no doubt Nvidia will have new Game Ready drivers on Day 1
     
  8. HybridHB

    HybridHB [H]ard|Gawd

    Messages:
    1,248
    Joined:
    Dec 15, 2002
    It's adjustable in the game's video settings up to 110.
     
  9. Kor

    Kor 2[H]4U

    Messages:
    2,176
    Joined:
    Mar 31, 2010
    Just played a quick round of the wave-based survival since the servers appear to be jacked. Looks pretty nice and runs really well. I set a 125% resolution scale at 1440p and it was completely smooth; I imagine it might drop a little in a proper game though.
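    For anyone wondering what a resolution-scale slider actually costs: assuming the percentage applies linearly to each axis (which is how Frostbite's scaler is generally understood to work), a quick sketch of the effective render resolution and pixel load:

    ```python
    def render_resolution(width, height, scale_pct):
        """Effective render resolution for a resolution-scale setting,
        assuming the scale applies linearly to each axis."""
        w = round(width * scale_pct / 100)
        h = round(height * scale_pct / 100)
        pixel_factor = (w * h) / (width * height)  # relative GPU pixel load
        return w, h, pixel_factor

    print(render_resolution(2560, 1440, 125))  # (3200, 1800, 1.5625)
    ```

    So 125% at 1440p renders at 3200x1800, roughly 56% more pixels per frame than native.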
     
  10. ChronoDetector

    ChronoDetector 2[H]4U

    Messages:
    2,577
    Joined:
    Apr 1, 2008
    Getting between 40 and 50 FPS on my setup at Ultra settings at 4K, which seems decent enough, and the graphics look quite nice. However, I'm not impressed with the game.
     
  11. harmattan

    harmattan [H]ardness Supreme

    Messages:
    4,240
    Joined:
    Feb 11, 2008
    If by "driver" you mean free NVidia swag delivered to Hillbert's door, then yes, this preview will be soon "fixed" :)
     
  12. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,811
    Joined:
    Nov 5, 2010
  13. Nihilus1

    Nihilus1 Limp Gawd

    Messages:
    441
    Joined:
    Jun 7, 2015
    Sorry, you have it backwards, gentlemen. Both Tahiti and Hawaii have shown great increases in performance over time.
    Here is Dying light when it first came out:
    http://www.hardocp.com/article/2015/03/10/dying_light_video_card_performance_review/4#.VhaOuXpViko
    A GTX 970 was beating up on a 290x!

    Now more recently - an R9 390 matching a GTX 970:
    http://www.hardocp.com/article/2015...sipation_8gb_video_card_review/6#.VhaOeXpViko
     
  14. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,419
    Joined:
    Aug 5, 2013
    It's a GameWorks game and ran quite poorly on AMD hardware at launch. Driver improvements over time will of course improve performance, it's pretty typical. There were some BF4 benchmarks linked on the previous page that show the same results for Nvidia.

    Vanilla Dying Light was incredibly CPU bottlenecked with the max view distance slider, Techland ended up cutting that slider in half with one of the earlier patches. The game still runs very badly on AMD CPUs.
     
  15. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,410
    Joined:
    May 18, 1997
    I literally just snorked here in the waiting room.
     
  16. DPI

    DPI Nitpick Police

    Messages:
    10,957
    Joined:
    Apr 20, 2013
    Except NVIDIA doesn't need a 20% boost.
     
  17. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,419
    Joined:
    Aug 5, 2013
    Well I just played the beta, my first Battlefield game in over 10 years.
    It's garbage. People pay $60 for this? Yikes.
     
  18. BroHamBone

    BroHamBone [H]ard|Gawd

    Messages:
    2,021
    Joined:
    Apr 6, 2013
    Hitting 80-90+ fps on one 980 Ti (99% usage). SLI enabled, but not using the second card. Guess I will have to make a profile.

    1440p - 144hz
    Ultra preset
    110% FOV
    100% resolution scale

    EDIT: Couldn't find a profile for Battlefront, but found a tutorial on how to find it....
     
    Last edited: Oct 8, 2015
  19. jackstar7

    jackstar7 Gawd

    Messages:
    568
    Joined:
    Jan 7, 2012
    It's not Battlefield... but that's okay, sounds like it isn't for you.
     
  20. HybridHB

    HybridHB [H]ard|Gawd

    Messages:
    1,248
    Joined:
    Dec 15, 2002
    It's different but I'm having fun with it. I like running around with Vader :)
     
  21. AceCR42

    AceCR42 2[H]4U

    Messages:
    2,948
    Joined:
    Jan 10, 2008
    Battlefront != Battlefield.
     
  22. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,465
    Joined:
    Jan 14, 2006
    Yeah, the only reason I'm interested is that it's free. I'm not sure I'd drop the cash for this versus Battlefield 4, which I am still playing.

    But if it's awesome, maybe I will :D
     
  23. Michaelius

    Michaelius [H]ardness Supreme

    Messages:
    4,684
    Joined:
    Sep 8, 2003
    More like $100+, as you need all the packs or season passes or elite-something junk to get the content cut from the day-one edition.
     
  24. Savoy

    Savoy 2[H]4U

    Messages:
    2,471
    Joined:
    Oct 1, 2005
    I was LMAO running alongside Vader helping him wreck people. It was funny and fun at the same time. :D
     
  25. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,465
    Joined:
    Jan 14, 2006
    I'm talking about an AMD Gaming Evolved game here. You would expect Nvidia to be slower out the gate, and then speed up at a much higher rate than AMD. AMD just had a few months head start on engine optimizations is all.

    AMD has also closed the gap in The Witcher 3, another Gameworks title. I'm not saying they don't optimize, just saying that Nvidia will close the gap here as well.
     
  26. horrorshow

    horrorshow [H]ardness Supreme

    Messages:
    7,053
    Joined:
    Dec 14, 2007
    Very informative article.

    Especially considering I'll be picking up either a 380 or 960 within the next week.....

    Still on the fence about which way to go though.
     
  27. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,843
    Joined:
    Sep 7, 2011
    I'm pretty sure the 380 beats up a 960, but I've been out of the game for about a year now..
     
  28. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,465
    Joined:
    Jan 14, 2006
    Game is boring. There's not even a prone button!
     
  29. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,419
    Joined:
    Aug 5, 2013
    Typed PCLab randomly into Google, benchmarks up:

    http://pclab.pl/art66213.html

    Similar to Guru3D's results; AMD has a respectable lead of about 10%.
     
    Last edited: Oct 8, 2015
  30. noko

    noko [H]ardness Supreme

    Messages:
    4,324
    Joined:
    Apr 14, 2010
    The speed differences look insignificant on the high end cards. The 290x does look like the best bang for buck though.

    Well if the movie is a tremendous smash hit (likely) then this game will most likely sell like candy - especially on the consoles.
     
  31. tybert7

    tybert7 2[H]4U

    Messages:
    2,634
    Joined:
    Aug 23, 2007
    Can someone please explain to me why this game looks SO good and ALSO performs so well?

    Dragon Age Inquisition did not look this good and performed much worse, on the same engine.


    Is it that the developers have had more time to optimize and tune Frostbite 3? Was there just a lot more going on visually in a game like Dragon Age Inquisition?


    I suppose a lot of the detail is confined to detailed vistas and decent-looking rocks and terrain. But I want an RPG to run this well and look this good; that is where this kind of fidelity is needed most. Neither The Witcher 3 nor Dragon Age Inquisition ran this smoothly.
     
  32. Yakk

    Yakk [H]ardness Supreme

    Messages:
    5,811
    Joined:
    Nov 5, 2010
    Finally able to try this game. I like it, it's gorgeous and smooooooth to play!

    Looks like just a single 280X is enough to push 60 fps @ 1080p on Ultra. Impressive!
     
  33. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,139
    Joined:
    Aug 1, 2005
    A lead where? Top cards are Titan X and 980 Ti. Fury doesn't really have an NVIDIA equivalent so the closest thing would be an aftermarket 980 which isn't measured in this test. The 390x leads the 980 but the 390x is just a 290x with some tweaks + OC so again, the lack of an aftermarket 980 here skews the results. Toss in overclocking and AMD again loses, I'm pushing 1440p + 130% scaling @1.4 ghz w/my Titan X's and never dropping below 60 fps.
     
  34. DPI

    DPI Nitpick Police

    Messages:
    10,957
    Joined:
    Apr 20, 2013
    I'm a little surprised how well an OC'd 960 runs this game (granted, at 1920x1080). It never dips below 60 FPS.
     
  35. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,419
    Joined:
    Aug 5, 2013
    Check the clocks; it looks like they're running everything at stock, so their 300-series cards have their factory OCs removed.
    Fury X leads the 980 Ti @ 1440p and both the 980 Ti & TX @ 4K.
     
    Last edited: Oct 9, 2015
  36. AllBlackFan

    AllBlackFan [H]Lite

    Messages:
    89
    Joined:
    Sep 16, 2011
    R9 390 > 970 pretty comprehensively, that's a lead. In fact it's trading blows with the 980 depending on resolution. Even the 290x is faster than the 980 above 1080p. Top cards are pretty much a wash but it's painfully (for you it seems) obvious that AMD leads in this particular game regardless of what elaborate set of criteria you'd like to invent to make this not the case.

    (I'm not trying to say this is some significant event, it is an AMD game and it is in beta but to say AMD doesn't lead at this stage is just...)
     
    Last edited: Oct 9, 2015
  37. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,139
    Joined:
    Aug 1, 2005
    I see 1050/1500 on the graph for the 390x. Fury X leads Titan X by 0.6 fps at 4k, that's basically margin of error. My point stands, bring in an aftermarket 980 Ti and even a mild OC on Titan X or 980 Ti and AMD would lose handily because their cards are already pretty much maxed out from the factory. Considering this, it will only get worse for them after a few driver updates.

    At 1440p the 980 gets 54.9 fps avg and the 290x gets 55.3 fps; that's not even 1 fps higher. And as I said about the 390X, it's just a factory OC'd 290X with some tweaks, so a simple 980 OC (or a factory OC'd version) would make up that small difference. That's why I said it looks more or less like a wash, and it's bad news for AMD considering these are all stock NVIDIA cards with nowhere to go but up given the OC headroom they have. The Fury line is pretty much already maxed, so that's out of the equation, and at best you see the 390X top out at 1100ish, which is not enough to catch a 1500 MHz clocked 980. These reviews only tell one part of the story, stock cards, not every factor that's typically considered by smart consumers. You can argue that not all cards OC alike, but I haven't seen any 980s that couldn't hit 1450-1500 MHz.
     
    Last edited: Oct 9, 2015
  38. TaintedSquirrel

    TaintedSquirrel [H]ardForum Junkie

    Messages:
    9,419
    Joined:
    Aug 5, 2013
    390X is 1050/1500 stock according to this: https://www.techpowerup.com/gpudb/2663/radeon-r9-390x.html

    Yeah but you could make the same argument about literally every benchmark. There are very few sites who do OC vs OC comparisons.
    People who aren't retarded know how well each GPU overclocks and should be able to calculate performance themselves. I'd rather see how everything performs at stock to use as a baseline for comparison.
     
  39. limitedaccess

    limitedaccess [H]ardness Supreme

    Messages:
    7,493
    Joined:
    May 10, 2010
    Battlefront is being a bit misunderstood (or you could say overrated) in terms of how well "optimized" it actually is (that term is getting misused a lot).

    It looks very good due to a combination of design and technology working in synergy. Everything is a good fit, basically.

    You don't even need to compare it to Dragon Age Inquisition. If you look at it in more detail, Battlefront really is being asked to do much less than Battlefield 4. The game, at least what is available in the beta so far, is much more static with less going on. The environment is also extremely sparse compared to BF4 in terms of objects, and 64 vs 40 players should tell you something about the scale of both games. In BF4 you'd have more players and more vehicles in combat, with dynamic destruction of an environment littered with buildings, trees, and other objects. Battlefront, at least the Battle of Hoth, is really a very sparse and static open snow field with comparatively nothing on it (very detailed textures thanks to the new technology, though), far fewer players, and far fewer vehicles (you could have more tanks alone on a BF4 map than all the vehicles on Hoth combined).

    But this is a great game in terms of showing the importance in design and how that transfers to visuals as opposed to just ratcheting up fidelity.

    Edit: I just want to add that the one thing I don't like about DICE's direction since Bad Company is that we seem to be going backwards in terms of dynamic environments. Even moving from Bad Company 2 to Battlefield 3/4, they scaled back how destructible the environments were. The difference doesn't really show up if you are just taking screenshots or admiring the visuals, but in terms of the actual feel while playing, it makes a large difference in my opinion. Battlefront feels rather static by comparison.
     
    Last edited: Oct 9, 2015
  40. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,139
    Joined:
    Aug 1, 2005
    I think we misunderstood one another; I meant that the 390X is an OC'd 290X, not that the cards in the review were overclocked. The problem I have with these stock-vs-stock reviews is that they ignore half the reason people buy Maxwell-based cards, which is the generous overclocking. I'm not sure why reviewers are so reluctant to do OC vs OC. They don't even have to max the clocks out, just use what they think is a typical overclock anyone could achieve. OC results in addition to baseline results give a much clearer picture than just one or the other.