BioWare: Anthem Won't Run at 1080p/60 FPS on PS4 Pro, Xbox One X

Discussion in 'HardForum Tech News' started by Megalith, Jan 20, 2019.

  1. Megalith

    Megalith 24-bit/48kHz Staff Member

    Messages:
    13,004
    Joined:
    Aug 20, 2006
    The most powerful consoles available today won’t be running Anthem at 60 FPS, as BioWare has opted to prioritize visuals over framerate. Lead producer Michael Gamble suggests his team may revisit the latter “maybe as a future thing,” but at the very least, both the PS4 Pro and Xbox One X versions can run the game in 4K at 30 FPS.

    Whether BioWare can optimize it in the future to hit 60 FPS and 1080p on Xbox One X and PS4 Pro, perhaps sacrificing some visual fidelity in the process, remains to be seen. For those anxiously awaiting the demo, Gamble did confirm that it would be around 30 GB to download. The VIP access demo runs from January 25th to 27th with a public demo going live on February 1st. Anthem is out on February 22nd for Xbox One, PS4 and PC.
     
  2. horrorshow

    horrorshow [H]ardness Supreme

    Messages:
    6,409
    Joined:
    Dec 14, 2007
    Are the Jaguar cores really THAT bad??

    I pictured the Xbone X tearing it up at 1080p/60....
     
    Ironchef3500 and Red Falcon like this.
  3. RobCalleg

    RobCalleg [H]Lite

    Messages:
    96
    Joined:
    Nov 15, 2018
    Imagine trying to fly around turning your camera in 360 directions to spot enemies and shit at 30 FPS. That’s like playing Xenoverse 2 at 30fps.

    Playable but not worth playing.
     
    Sufu and Ironchef3500 like this.
  4. Jim Kim

    Jim Kim 2[H]4U

    Messages:
    3,231
    Joined:
    May 24, 2012
    The only way we will see Anthem, or any other game for that matter, in Ultra textures running at 60+ frames per second in 4K will be when the Mad Box from Slightly Mad Studios hits the market.
    This new console will blow your socks off and amaze everyone, I promise.

    The bad part about the Mad Box is it's expected to only hit 30 fps in 8K. :(
     
  5. Red Falcon

    Red Falcon [H]ardForum Junkie

    Messages:
    9,832
    Joined:
    May 7, 2007
    If the consoles were GPU-bound, then 4K @ 30fps wouldn't have been possible. But since both 1080p and 4K are limited to 30fps, that just shows how CPU-bound the current-gen consoles really are; not saying this is bad, the hardware limitation just is what it is.
    Not trying to beat a dead horse as this has been discussed numerous times, but 2020 can't come fast enough for the next consoles equipped with Ryzen or equivalent-class CPUs to remove that 30fps limitation once and for all.
     
  6. TangledThornz

    TangledThornz Gawd

    Messages:
    583
    Joined:
    Jun 12, 2018
    Current consoles aren't close to being 1080p ready. Next gen perhaps.
     
  7. dewbak75

    dewbak75 Limp Gawd

    Messages:
    251
    Joined:
    Nov 12, 2012
    "1080p ready"? Did you mean 4K-ready?
     
    Vercinaigh and Nightfire like this.
  8. Kor

    Kor 2[H]4U

    Messages:
    2,197
    Joined:
    Mar 31, 2010
    Always were, always will be. That and Bioware aren't anywhere near as technically proficient as other developers.
     
    defaultluser and Vercinaigh like this.
  9. Red Falcon

    Red Falcon [H]ardForum Junkie

    Messages:
    9,832
    Joined:
    May 7, 2007
    There are quite a few titles that run at native (not upscaled) 1080p at 30fps or 60fps, so it isn't like the consoles aren't capable of doing so.
    The CPU is just such a limiter on keeping the GPU fed with data, even with fully optimized code, that consistently hitting 60fps (depending on the game) is beyond its physical capabilities.

    The GPUs themselves in the Pro and X consoles are more than capable of 1080p 60fps and even 4K 60fps, but in most demanding games it isn't possible without upscaling and/or soft-limiting the frame rate to 30fps.
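    For anyone unfamiliar with the term, "soft-limiting" here just means the engine caps its own render loop rather than relying on v-sync or hardware. A minimal sketch of the idea in Python (`run_frame_limited` and `render_frame` are made-up names for illustration, not anything from Frostbite):

    ```python
    import time

    def run_frame_limited(render_frame, target_fps=30, duration_s=0.5):
        """Render frames in a loop, sleeping off the remainder of each
        frame's time budget so the loop never exceeds target_fps."""
        frame_budget = 1.0 / target_fps
        frames = 0
        start = time.perf_counter()
        while time.perf_counter() - start < duration_s:
            frame_start = time.perf_counter()
            render_frame()  # stand-in for the real per-frame work
            elapsed = time.perf_counter() - frame_start
            if elapsed < frame_budget:
                # Frame finished early: wait out the rest of the budget.
                time.sleep(frame_budget - elapsed)
            frames += 1
        return frames
    ```

    The point is that a fast GPU gains you nothing under a cap like this: frames that finish early just sleep longer.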
     
    Last edited: Jan 21, 2019
    Revdarian likes this.
  10. Red Falcon

    Red Falcon [H]ardForum Junkie

    Messages:
    9,832
    Joined:
    May 7, 2007
    The Jaguar is about 4% faster than Piledriver clock-for-clock, but being stuck at between 1.6GHz and 2.3GHz in the consoles isn't doing it any favors.
    Outside of that, the CPU really isn't that bad for what it is, considering it was a low-power CPU from 2013 designed for embedded systems and thin clients - it just wasn't enough for AAA gaming beyond 30fps most of the time.
     
    lostin3d and Revdarian like this.
  11. bigdogchris

    bigdogchris [H]ard as it Gets

    Messages:
    17,787
    Joined:
    Feb 19, 2008
    The inevitable PS5/Xbox Two version will.
     
  12. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    15,262
    Joined:
    Apr 29, 2005
    Try turning RTX & DLSS ON.....
     
    Sufu and Ironchef3500 like this.
  13. Lakados

    Lakados [H]ard|Gawd

    Messages:
    1,311
    Joined:
    Feb 3, 2014
    Honestly, for a game like Anthem on a console, if it held a consistent 45fps and never dropped lower while looking as great as it does, most would be happy, and the ones who are unhappy with that should be playing it on a PC. I would rather the game look great and run slightly slower (assuming it never ever drops into the 30s) than look subpar and do 60.
     
  14. HeadRusch

    HeadRusch [H]ard|Gawd

    Messages:
    1,102
    Joined:
    Jun 8, 2007
    You guys aren't going to need 60fps to get bored flying around doing nothing for hours :)
     
    lostin3d and Krenum like this.
  15. gamerk2

    gamerk2 [H]ard|Gawd

    Messages:
    1,547
    Joined:
    Jul 9, 2012
    Yes. Case in point: The Jaguar cores in the PS4/XB1 can push ~105 GFLOPS. By comparison, the X360 CPU can push ~110 GFLOPS and the PS3 CPU can push ~240 (theoretical maximum) GFLOPS, with ~170 GFLOPS being more typical.

    So yeah, the CPUs in these consoles were chosen due to power/heat concerns, not performance.
     
  16. OblivionLord

    OblivionLord Limp Gawd

    Messages:
    168
    Joined:
    Apr 16, 2004
    I'd rather have more content than a steady 60fps. I played BotW at less than 30fps on Wii U and it was an OK experience compared to the Switch, which had a steady 30fps. The game just got repetitious for me after a while, and honestly I couldn't care less about playing it on Cemu at 1080p or 4K.
     
    Nolan7689 likes this.
  18. Ironchef3500

    Ironchef3500 [H]Lite

    Messages:
    64
    Joined:
    Sep 26, 2006
    Maybe we can get it unlocked above 30?
     
  19. MaZa

    MaZa 2[H]4U

    Messages:
    2,694
    Joined:
    Sep 21, 2008
    As mentioned, Jaguar is a glorified Piledriver, only a little bit more than a Bulldozer. Those CPUs were garbage when they were new; in 2018 they are a worthless steaming pile of manure. So yeah, they are really that bad, and an overhaul was long overdue. This gen has been lukewarm; hell, the Pro and One X were what the PS4 and XBone should have been in the first place, if you ask me. Next gen cannot come fast enough, and then we will see a true upgrade and not just an improved graphics chip and memory like this current gen was.
     
  20. exlink

    exlink [H]ardness Supreme

    Messages:
    4,155
    Joined:
    Dec 16, 2006
    Those Jaguar cores are aging pretty poorly. They were mediocre (at best) at launch in 2013, and they only increased the clock speed a little for the mid-cycle refreshes in the PS4 Pro and Xbox One X. Hopefully both Sony and Microsoft won't skimp on the CPU this next generation.
     
    Armenius likes this.
  21. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,911
    Joined:
    Oct 13, 2016
    Not too surprised by this. It wasn't until I got a 2080 Ti that I was able to get a mostly smooth 60fps in MEA at 4K. The other rig with the 1080 Ti worked well for 1440p. Seems they're being pretty consistent performance-wise, but let's hope the art is a bit more consistent than MEA's. There were some parts where I could understand the performance demands and others that seemed unwarranted. I'll also be curious about the VRAM usage, as MEA in DX11 consumed a lot at 4K when maxed, so it'll be interesting to see how Anthem does, whether DX11 or DX12. This too would affect console performance.
     
  22. Shadowed

    Shadowed Limp Gawd

    Messages:
    468
    Joined:
    Mar 21, 2018
    I can't wait for Ryzen to show up in consoles. Jaguar needs to die off.

    Jaguar really is holding back the gaming industry.
     
    Armenius likes this.
  23. Bigdady92

    Bigdady92 [H]ardness Supreme

    Messages:
    5,773
    Joined:
    Jun 20, 2001
    4k/30FPS? I'm fine with that. That's some GORGEOUS textures going across my screen at a frame rate my eyes can handle vs some nebulous number that doesn't mean jack shit to my gameplay.
     
  24. Boggins

    Boggins Limp Gawd

    Messages:
    193
    Joined:
    Dec 30, 2010
    Not surprising considering NO new high profile release plays at 60FPS on these consoles, and hasn't for... ever?
     
  25. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    16,516
    Joined:
    Jan 28, 2014
    There are plenty of games for both refreshed consoles that run at 60 FPS or have the option of doing so. Forza Motorsport 7 runs at 60 FPS in native 4K on the Xbox One X.
     
  26. darckhart

    darckhart Limp Gawd

    Messages:
    237
    Joined:
    Jun 15, 2013
    Ugh, I guess those of us without the refresh models are even worse off... I have noticed my PS4 non-Pro struggling at 1080p in many games already, sigh.
     
  27. Merc1138

    Merc1138 2[H]4U

    Messages:
    2,089
    Joined:
    Sep 25, 2010
    Sure, but the problem with 30fps being the target is that it's generally not the minimum. It looks fine most of the time; other times it ends up as a "cinematic experience".
     
  28. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    2,042
    Joined:
    Aug 27, 2010
    I doubt they'll hit 30/1080 on base PS4/Xbox One hardware.
     
  29. Sindalis

    Sindalis n00b

    Messages:
    28
    Joined:
    Aug 27, 2014
    There is a reason I completely skipped this console generation. They simply aren't worth it, and in many ways they were worse than the prior gen (such as pure CPU power). Waiting till the next-gen systems for me. Hopefully by then they can get at least 1080p/60 down as a standard.
     
  30. OblivionLord

    OblivionLord Limp Gawd

    Messages:
    168
    Joined:
    Apr 16, 2004
    In a controlled case, you'd be able to tell. Usually if you are 'playing' a game then you aren't going to notice much. Depends on how much of a videophile someone is.
     
  31. Red Falcon

    Red Falcon [H]ardForum Junkie

    Messages:
    9,832
    Joined:
    May 7, 2007
    Well, there are a few exclusives that are absolutely worth it, especially on the PS4/Slim/Pro - Spider-Man and Bloodborne were two 10/10 games that were both limited to 30fps on any PS4 model.
    Really, the refresh models mostly just let games hold a more consistent 30fps when they're soft-limited to it; with Bloodborne that really helps, but it's certainly not a deal breaker.
     
  32. another-user

    another-user Gawd

    Messages:
    964
    Joined:
    Dec 27, 2006
    Did we play the same game? In docked mode, BotW drops down to the sub-20s in some areas.
     
  33. El Derpo

    El Derpo Limp Gawd

    Messages:
    220
    Joined:
    Dec 5, 2018
    Poorly optimized games are fun for everyone!
     
  34. defaultluser

    defaultluser I B Smart

    Messages:
    12,147
    Joined:
    Jan 14, 2006
    It wasn't just the power consumption. AMD didn't have any other CPU validated on bulk CMOS before Kaveri (2014 release date).

    People forget that before that, all AMD CPUs were Silicon On Insulator.

    It took GlobalFoundries and AMD several years to unfuck themselves out of that mess. It's the number one reason why it took them four years after the merger to start producing good APUs (ATI used bulk CMOS, and everything had to be translated).

    So, if they wanted an APU they could build at TSMC in early 2013, they were forced to hack together two Jaguar modules, and just took the performance hit getting them to talk to each other.
     
    Last edited: Jan 22, 2019
    Armenius and Red Falcon like this.
  36. focbde

    focbde Gawd

    Messages:
    561
    Joined:
    Jan 31, 2008
    I can understand their reasoning: for the masses, visuals sell games more than framerate. However, the solution is simple - default to high details / 30fps, but have an option for medium details / 60fps or similar.
     
    Armenius likes this.
  37. gamerk2

    gamerk2 [H]ard|Gawd

    Messages:
    1,547
    Joined:
    Jul 9, 2012
    My point was, if the consoles were built to maximize performance rather than minimize power/heat concerns, Sony and Microsoft would have gone Intel+NVIDIA, not AMD.
     
  38. defaultluser

    defaultluser I B Smart

    Messages:
    12,147
    Joined:
    Jan 14, 2006
    COST was the most important factor here. Power was secondary (you can down-clock a more powerful processor to get the same perf/watt, while still giving you the same performance as twice-as-many Jaguar cores).

    They wanted an APU; otherwise they could have gone with ANYONE (i.e. Intel for CPU and Nvidia for GPU).

    An APU with a single shared memory space is a whole lot cheaper to manufacture than a PS3 (with separate main and GPU memory).

    You can shrink the chips, but the PCB will always be the same complexity (if you have two separate memory spaces).

    It's only $10-20 per-unit, but when you sell a hundred million of them, you tend to notice these things. It also simplifies your future die shrinks (just one chip instead of two).

    And unifying the memory type installed on each board makes it even easier to do cost reductions (e.g., 8Gbit GDDR5 chips cut the number of memory chips on the PS4 PCB in half).

    Remember, they launched PS3 at $500, and they were not about to repeat that. $400 was their target, and they had to cut things that would not matter to most gamers to reach that price.
     
    Last edited: Jan 24, 2019