BioWare: Anthem Won't Run at 1080p/60 FPS on PS4 Pro, Xbox One X

Megalith

The most powerful consoles available today won’t be running Anthem at 60 FPS, as BioWare has opted to prioritize visuals rather than framerate. Lead producer Michael Gamble suggests his team may revisit the latter “maybe as a future thing,” but at the very least, both the PS4 Pro and Xbox One X versions can run the game at 4K and 30 FPS.

Whether BioWare can optimize it in the future to hit 60 FPS and 1080p on Xbox One X and PS4 Pro, perhaps sacrificing some visual fidelity in the process, remains to be seen. For those anxiously awaiting the demo, Gamble did confirm that it would be around 30 GB to download. The VIP access demo runs from January 25th to 27th with a public demo going live on February 1st. Anthem is out on February 22nd for Xbox One, PS4 and PC.
 
Imagine trying to fly around, turning your camera 360 degrees to spot enemies and shit, at 30 FPS. That’s like playing Xenoverse 2 at 30fps.

Playable but not worth playing.
 
The only way we will see Anthem, or any other game for that matter, with Ultra textures running at 60+ frames per second in 4K will be when the Mad Box from Slightly Mad Studios hits the market.
This new console will blow your socks off and amaze everyone, I promise.

The bad part about the Mad Box is it's expected to only hit 30 fps at 8K. :(
 
Lead producer Michael Gamble was asked on Twitter if the game would support 1080p and 60 frames per second on the Xbox One X and PS4 Pro.

His response? “No, we’ve prioritized visuals. We can look at prioritizing frame rate maybe as future thing.” In a follow-up tweet responding to whether both consoles would output at 4K resolution and 30 frames per second, Gamble said, “Yes.” In other words, Anthem will run at 30 frames per second regardless of the console, which seems to apply to the base Xbox One and PS4 as well.

If the consoles were GPU-bound, dropping from 4K down to 1080p would have bought frame rate; since both 1080p and 4K are limited to 30fps, that just shows how CPU-bound the current-gen consoles really are. Not saying this is bad; the hardware limitation just is what it is.
Not trying to beat a dead horse, as this has been discussed numerous times, but 2020 can't come fast enough for the next consoles equipped with Ryzen or equivalent-class CPUs to remove that 30fps limitation once and for all.
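The frame-budget math behind that is easy to sketch. Here's a minimal illustration in Python; every millisecond figure is an invented placeholder, not a measured number for Anthem or for either console:

```python
# Illustrative frame-budget sketch of why a CPU bottleneck caps the frame rate
# at both 1080p and 4K. All millisecond costs below are made up for illustration.

def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    """In this simplified model CPU and GPU work overlap, so the slower
    of the two sets the pace of each frame."""
    return max(cpu_ms, gpu_ms)

CPU_MS = 30.0                            # hypothetical per-frame CPU cost (resolution-independent)
GPU_MS = {"1080p": 14.0, "4K": 28.0}     # hypothetical GPU cost, scaling with pixel count

for res, gpu_ms in GPU_MS.items():
    ft = frame_time_ms(CPU_MS, gpu_ms)
    print(f"{res}: {ft:.1f} ms/frame -> ~{1000 / ft:.0f} fps")

# With a ~30 ms CPU cost, dropping from 4K to 1080p frees up GPU time but the
# frame rate stays ~33 fps either way: a CPU-bound game can offer 4K/30
# essentially "for free", yet can't reach 60 fps just by lowering resolution.
```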
 
Current consoles aren't close to being 1080p ready. Next gen perhaps.
There are quite a few titles that run at native (not upscaled) 1080p at 30fps or 60fps, so it isn't like the consoles aren't capable of doing so.
The CPU is just such a limiter on keeping the GPU fed with data that, even with fully optimized code, consistently hitting 60fps (depending on the game) is beyond its physical capabilities.

The GPUs themselves in the Pro and X consoles are more than capable of 1080p 60fps and even 4K 60fps, but in the most demanding games that isn't achievable without upscaling and/or soft-limiting the frame rate to 30fps.
 
Always were, always will be. That, and BioWare isn't anywhere near as technically proficient as other developers.
The Jaguar is about 4% faster than Piledriver clock-for-clock, but being stuck between 1.6GHz and 2.3GHz in the consoles isn't doing it any favors.
Outside of that, the CPU really isn't that bad for what it is, considering it was a low-power CPU from 2013 designed for embedded systems and thin clients - it just wasn't enough for AAA gaming beyond 30fps most of the time.
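A rough back-of-the-envelope comparison using that 4% clock-for-clock figure; the 4.0GHz desktop Piledriver reference (think FX-8350) is just an assumed yardstick, not a benchmark:

```python
# Rough per-core throughput comparison based on the figures quoted above.
# The 4% IPC edge is the claim from the post; 4.0 GHz is an assumed desktop
# Piledriver clock (FX-8350-ish), used purely for scale.

PILEDRIVER_IPC = 1.00                 # normalized work per clock
JAGUAR_IPC = PILEDRIVER_IPC * 1.04    # "about 4% faster clock-for-clock"

for clock_ghz in (1.6, 2.3):          # console Jaguar clock range
    ratio = (JAGUAR_IPC * clock_ghz) / (PILEDRIVER_IPC * 4.0)
    print(f"Jaguar @ {clock_ghz} GHz ~= {ratio:.0%} of a 4.0 GHz Piledriver core")

# Roughly 42% at 1.6 GHz and 60% at 2.3 GHz per core: it's the clock deficit,
# more than the architecture, that hurts the consoles.
```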
 
Honestly, for a game like Anthem on a console, if it did a consistent 45fps and never dropped lower than that while looking as great as it does, most would be happy, and the ones who are unhappy with that should be playing it on a PC. I would rather the game look great and run slightly slower (assuming it never ever drops into the 30s) than look subpar and do 60.
 
Are the Jaguar cores really THAT bad??

I pictured the Xbone X tearing it up at 1080p/60....

Yes. Case in point: The Jaguar cores in the PS4/XB1 can push ~105 GFLOPS. By comparison, the X360 CPU can push ~110 GFLOPS and the PS3 CPU can push ~240 (theoretical maximum) GFLOPS, with ~170 GFLOPS being more typical.

So yeah, the CPUs in these consoles were chosen due to power/heat concerns, not performance.
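For anyone wondering where ballpark numbers like those come from, peak FLOPS is usually just cores x clock x FLOPs-per-cycle. The per-cycle values below are my assumptions about how such figures are typically derived, not official spec-sheet data:

```python
# Back-of-the-envelope peak single-precision FLOPS: cores * clock * FLOPs/cycle.
# The FLOPs-per-cycle values are assumptions (128-bit SIMD units), chosen to
# show how ballpark figures like the ones above are usually derived.

def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    return cores * clock_ghz * flops_per_cycle

# PS4's Jaguar: 8 cores at 1.6 GHz, assuming 8 SP FLOPs/cycle per core
print(f"Jaguar (PS4): ~{peak_gflops(8, 1.6, 8):.0f} GFLOPS")

# Xbox 360's Xenon: 3 cores at 3.2 GHz, assuming ~12 SP FLOPs/cycle per core
print(f"Xenon (X360): ~{peak_gflops(3, 3.2, 12):.0f} GFLOPS")

# PS3's Cell: each SPE peaks around 25.6 GFLOPS (4-wide FMA at 3.2 GHz); the
# ~240 vs ~170 spread above roughly matches counting all SPEs plus the PPE
# versus only the SPEs games actually get to use.
print(f"One Cell SPE: ~{peak_gflops(1, 3.2, 8):.1f} GFLOPS")
```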
 
I'd rather have more content than a steady 60fps. I played BotW at less than 30fps on the Wii U and it was an OK experience compared to the Switch, which had a steady 30fps. The game just got repetitious for me after a while, and honestly I couldn't care less about playing it on Cemu at 1080p or 4K.
 
Are the Jaguar cores really THAT bad??

I pictured the Xbone X tearing it up at 1080p/60....

As mentioned, Jaguar is a glorified Piledriver, which itself was only a little better than Bulldozer. Those CPUs were garbage when they were new; in 2018 they are a worthless steaming pile of manure. So yeah, they are really that bad, and an overhaul was long overdue. This gen has been lukewarm; hell, the Pro and One X were what the PS4 and Xbox One should have been in the first place if you ask me. Next gen cannot come fast enough; then we will see a true upgrade and not just an improved graphics chip and memory like this current gen was.
 
Those Jaguar cores are aging pretty poorly. They were mediocre (at best) at launch in 2013 and they only increased the clock speed a little for the mid-cycle refreshes in the PS4 Pro and Xbox One X. Hopefully both Sony and Microsoft won't skimp out on the CPU this next generation.
 
Not too surprised by this. It wasn't until I got a 2080 Ti that I was able to get a mostly smooth 60fps in MEA at 4K. The other rig with the 1080 Ti worked well for 1440p. Seems they're being pretty consistent performance-wise, but let's hope the art is a bit more consistent than MEA's. There were some parts where I could understand the performance demands and others that seemed unwarranted. I'll also be curious about the VRAM usage, as MEA in DX11 consumed a lot at 4K when maxed, so it'll be interesting to see how Anthem does, whether DX11 or DX12. This too would affect console performance.
 
I can't wait for Ryzen to show up in consoles. Jaguar needs to die off.

Jaguar really is holding back the gaming industry.
 
4k/30FPS? I'm fine with that. That's some GORGEOUS textures going across my screen at a frame rate my eyes can handle vs some nebulous number that doesn't mean jack shit to my gameplay.
 
Not surprising considering NO new high profile release plays at 60FPS on these consoles, and hasn't for... ever?
 
Not surprising considering NO new high profile release plays at 60FPS on these consoles, and hasn't for... ever?
There are plenty of games for both refreshed consoles that run at 60 FPS or have the option of doing so. Forza Motorsport 7 runs at 60 FPS in native 4K on the Xbox One X.
 
Ugh, I guess those of us without the refresh models are even worse off... I have noticed my non-Pro PS4 struggling at 1080p in many games already, sigh.
 
4k/30FPS? I'm fine with that. That's some GORGEOUS textures going across my screen at a frame rate my eyes can handle vs some nebulous number that doesn't mean jack shit to my gameplay.
Sure, but the problem with 30fps being the target is that it's generally not the minimum. It looks fine most of the time; other times it ends up feeling like a "cinematic experience".
 
There is a reason I completely skipped this console generation. They simply aren't worth it and in many ways were worse than the prior gen (such as in pure CPU power). Waiting till the next-gen systems for me. Hopefully by then they can get at least 1080p/60 down as a standard.
 
Ugh, I guess those of us without the refresh models are even worse off... I have noticed my non-Pro PS4 struggling at 1080p in many games already, sigh.

In a controlled case, you'd be able to tell. Usually if you are 'playing' a game, then you aren't going to notice much. Depends on how much of a videophile someone is.
 
There is a reason I completely skipped this console generation. They simply aren't worth it and in many ways were worse than the prior gen (such as in pure CPU power). Waiting till the next-gen systems for me. Hopefully by then they can get at least 1080p/60 down as a standard.
Well, there are a few exclusives that are absolutely worth it, especially on the PS4/Slim/Pro - Spider-Man and Bloodborne were two 10/10 games that were both limited to 30fps on any PS4 model.
Really, the refresh models mostly just allow games that are soft-limited to 30fps to hold a more consistent 30fps; with Bloodborne that really helps, though it's certainly not a deal breaker.
 
I'd rather have more content than a steady 60fps. I played BotW at less than 30fps on the Wii U and it was an OK experience compared to the Switch, which had a steady 30fps. The game just got repetitious for me after a while, and honestly I couldn't care less about playing it on Cemu at 1080p or 4K.
Did we play the same game? In docked mode, BotW drops down to the sub-20s in some areas.
 
Yes. Case in point: The Jaguar cores in the PS4/XB1 can push ~105 GFLOPS. By comparison, the X360 CPU can push ~110 GFLOPS and the PS3 CPU can push ~240 (theoretical maximum) GFLOPS, with ~170 GFLOPS being more typical.

So yeah, the CPUs in these consoles were chosen due to power/heat concerns, not performance.

It wasn't just the power consumption. AMD didn't have any other CPU validated on bulk CMOS before Kaveri (2014 release date).

People forget that before that, all AMD CPUs were Silicon On Insulator.

It took GlobalFoundries and AMD several years to unfuck themselves out of that mess. It's the number one reason why it took them four years after the merger to start producing good APUs (ATI used bulk CMOS, so everything had to be translated).

So, if they wanted an APU they could build at TSMC in early 2013, they were forced to hack together two Jaguar modules, and just took the performance hit getting them to talk to each other.
 
I can understand their reasoning, for the masses, visuals sell games more than framerate. However the solution is simple - default to high details / 30fps, but have an option for medium details / 60fps or similar.
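A minimal sketch of what such a preset toggle could look like; the settings here are entirely hypothetical, not anything BioWare has announced for Anthem:

```python
# Hypothetical quality/performance preset toggle illustrating the suggestion
# above. None of these values come from BioWare; they're placeholders.

PRESETS = {
    "quality":     {"resolution": "2160p", "detail": "high",   "fps_cap": 30},
    "performance": {"resolution": "1080p", "detail": "medium", "fps_cap": 60},
}

def apply_preset(name: str) -> None:
    p = PRESETS[name]
    print(f"{name}: {p['resolution']}, {p['detail']} detail, capped at {p['fps_cap']} fps")

apply_preset("quality")      # the default: prioritize visuals
apply_preset("performance")  # the opt-in mode: prioritize frame rate
```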
 
It wasn't just the power consumption. AMD didn't have any other CPU validated on bulk CMOS before Kaveri (2014 release date).

People forget that before that, all AMD CPUs were Silicon On Insulator.

It took GlobalFoundries and AMD several years to unfuck themselves out of that mess. It's the number one reason why it took them four years after the merger to start producing good APUs (ATI used bulk CMOS, so everything had to be translated).

So, if they wanted an APU they could build at TSMC in early 2013, they were forced to hack together two Jaguar modules, and just took the performance hit getting them to talk to each other.

My point was, if the consoles were built to maximize performance rather than to minimize power/heat concerns, Sony and Microsoft would have gone Intel+NVIDIA, not AMD.
 
My point was, if the consoles were built to maximize performance rather than to minimize power/heat concerns, Sony and Microsoft would have gone Intel+NVIDIA, not AMD.

COST was the most important factor here. Power was secondary (you can downclock a more powerful processor to get the same perf/watt while still matching the performance of twice as many Jaguar cores).

They wanted an APU; otherwise they could have gone to ANYONE (e.g. Intel for the CPU and Nvidia for the GPU).

An APU with a single shared memory space is a whole lot cheaper to manufacture than a PS3 (with separate main and GPU memory).

You can shrink the chips, but the PCB will always be the same complexity (if you have two separate memory spaces).

It's only $10-20 per unit, but when you sell a hundred million of them, you tend to notice these things. It also simplifies your future die shrinks (just one chip instead of two).

And unifying the memory type installed on each board makes cost reductions even easier (8Gbit GDDR5 chips, for example, cut the number of memory chips on the PS4 PCB in half).

Remember, they launched PS3 at $500, and they were not about to repeat that. $400 was their target, and they had to cut things that would not matter to most gamers to reach that price.
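The scale of that saving is easy to put in perspective; the per-unit range is the one quoted above, and the 100 million lifetime units is just a round assumption:

```python
# Lifetime BOM savings from the $10-20 per-unit figure quoted above.
# 100 million lifetime units is a round assumption, not a sales statistic.

LIFETIME_UNITS = 100_000_000

for saving_usd in (10, 20):
    total_billions = saving_usd * LIFETIME_UNITS / 1e9
    print(f"${saving_usd}/unit over {LIFETIME_UNITS:,} units = ${total_billions:.0f} billion")

# The memory-chip point works the same way: 8 GB of GDDR5 needs sixteen
# 4 Gbit (0.5 GB) packages but only eight 8 Gbit (1 GB) packages.
print("4 Gbit chips needed:", int(8 / 0.5))
print("8 Gbit chips needed:", int(8 / 1.0))
```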
 