Fallout 4 Benchmarks

is your draw distance @ 100%?

no, they were all at lowest. windowed borderless is also broken, framerate stuck at 15 fps. high framerate breaks subtitles and lip sync; i didn't notice lockpicking being messed up, though. i don't really know what the deal is, honestly. i changed all my settings back to the ones i posted at the top of the last page and my framerate in that same area dropped by... 6.
 
look at my framerate. that's everything at lowest except textures, taa, and the extra stuff that doesn't cost anything. :/

That's the current state of buying games at release. Every blockbuster AAA is damn near guaranteed to be broken in some way that will keep you from enjoying yourself. That's why buying games at launch is a recipe for a headache. Why spend $60 for someone to annoy you?

I know that you really want to play certain games at release. You want to reward the developers for making your favorite game world come alive again. The best way to reward them and yourself is to have the self control to wait on the purchase. Read threads from people that can't wait and are suffering for being the early bird.

No way in hell I would play the game like that, with all the settings on Low / Medium. I also wouldn't reward AMD or Nvidia for having bad drivers for an AAA game by upgrading to a faster video card that is obviously not needed for the quality of the textures shown. Since you bought it early, I would at least drop a note about the issue on Nvidia's forums and on Bethesda's.

Maybe a combination of patches and drivers will have it running well soon after Christmas. I really doubt much will be done with Thanksgiving and Christmas coming up.

Sorry that the game runs like crap for you. :( It also runs like crap on the PS4 and XBONE if that makes you feel better. Gamebryo is trash.
 
Just played for roughly an hour and the performance is great on my 290X. Turned god rays to low and capped tessellation at 8x. Shadow Quality is set to High and everything else is at Ultra with TAA. Getting around 60-90fps at 1080p. I did not disable vsync, but the framerate clearly shows I am getting 90 at max on my 144Hz monitor. When I switch over to 1440p, however, my framerate seems locked at 48fps.
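One plausible explanation for that 48fps lock (an assumption on my part, not something I've confirmed): with vsync on, a game that can't hold the full refresh rate will often step down to an integer divisor of the monitor's refresh rate. On a 144Hz panel the divisors work out to:

```python
# Hypothetical illustration: double/triple-buffered vsync tends to lock the
# framerate to refresh_rate / n when the GPU can't keep up with the display.
refresh_rate = 144
divisor_caps = [refresh_rate // n for n in range(1, 5)]
print(divisor_caps)  # [144, 72, 48, 36]
```

So a 48fps lock at 1440p would be consistent with vsync falling back to every third refresh. Disabling vsync would rule this in or out.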
 
Sorry that the game runs like crap for you. :( It also runs like crap on the PS4 and XBONE if that makes you feel better. Gamebryo is trash.

Ya, Bethesda really, REALLY needs a new engine. Probably a new engine development team. They just have waaay too many performance and stability issues. I love their games, but it gets real old all the issues.

If their games were solid, they'd be day 1 buys for me. As it stands, I won't be getting this until some time midway through next year because I want patches/mods to make it more playable.

Say what you will about EA, and I can say a lot, but Dragon Age Inquisition on Frostbite 3 looks gorgeous and runs quite well. I want a Bethesda game that looks n' runs like that.
 
If we look at PCgamehardware.de numbers for Dragon Age inquisition and Fallout 4 at 1080p using a common AMD measuring point the R9 290 Tri-X -

FO4: 62 fps
DAI: 49 fps

And DAI was an AMD partnered game using Mantle whereas FO4 is I guess what you might consider a Nvidia partnered game.

It also had its fair share of issues (Frostbite games in general are not problem-free either; Battlefront still has bugs that were there in BF3) - http://kotaku.com/months-later-dragon-ages-pc-version-is-still-disappoin-1681031116
 
If we look at PCgamehardware.de numbers for Dragon Age inquisition and Fallout 4 at 1080p using a common AMD measuring point the R9 290 Tri-X -

FO4: 62 fps
DAI: 49 fps

And DAI was an AMD partnered game using Mantle whereas FO4 is I guess what you might consider a Nvidia partnered game.

It also had its fair share of issues (Frostbite games in general are not problem-free either; Battlefront still has bugs that were there in BF3) - http://kotaku.com/months-later-dragon-ages-pc-version-is-still-disappoin-1681031116

people like to go on about how technically "perfect" DAI is. however, Bioware did a lot of tricks to make that possible: each map has fixed weather, so even though the engine is able to do global lighting and illumination, they use fixed shadows, fixed godrays, etc. nothing is in real time, which helps a lot with performance. a game with the visuals present in DAI would be extremely heavy with real-time lighting and weather cycles.
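the baked-vs-realtime point above is easy to show in a toy sketch. this is just an illustration (the function names and the falloff formula are made up, nothing to do with Frostbite's actual code): the expensive sum over all lights runs once offline, and at runtime the renderer only does a cheap lookup.

```python
# Toy sketch of baked lighting vs realtime lighting. All names are invented
# for illustration; a real bake would use surface position/normal, occlusion, etc.

def accumulate_lighting(lights):
    """The expensive part: sum every light's contribution (per frame if realtime)."""
    total = 0.0
    for intensity, distance in lights:
        total += intensity / (1.0 + distance * distance)  # simple falloff
    return total

def bake_lightmap(surfels, lights):
    """Offline pass: run the expensive sum once per surface point and store it."""
    return {s: accumulate_lighting(lights) for s in surfels}

# Build time / load time (only possible because lights and weather are fixed):
lights = [(10.0, 2.0), (5.0, 1.0)]  # (intensity, distance) pairs
lightmap = bake_lightmap(["floor", "wall"], lights)

# Per frame, the runtime cost is a dict lookup instead of the loop above:
frame_light = lightmap["floor"]
print(frame_light)  # 4.5
```

the moment you want day/night cycles or dynamic weather, the bake is invalid and you're back to running the loop every frame, which is exactly the trade-off described above.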
 
AMD: We can't run today's games adequately, but we have you covered for DX12*

*By covered we mean in one shitty game where we match NVidia's performance.

Back on topic, the game seems to be running great on a 980 Ti with max settings (for the most part). Any SLI reports?
 
anyone expecting more from gamebryo than what bethesda has already shown they're capable of is a fool.



that's a slippery slope friend. 60 fps is unacceptable for shooters, period. especially in 2015.

Fallout is hardly a shooter. And what's unacceptable to you is ok for others. I'll happily play at 30 fps. I prefer visual quality over framerate every time. It's not like my life depends on achieving a perfect score in the game.
 
AMD: We can't run today's games adequately, but we have you covered for DX12*

*By covered we mean in one shitty game where we match NVidia's performance.

Back on topic, the game seems to be running great on a 980 Ti with max settings (for the most part). Any SLI reports?

Ignored the rant, but in response to your question, here is how you enable SLI. From the video it doesn't scale well, but it still makes a big difference in performance. Nvidia drivers are crippled, but this is better than nothing, right?

How to enable SLi in Fallout 4.
https://www.youtube.com/watch?v=MmkZ_JkJibU
 
It's $46.99 at cdkeys.com for digital download. Not bad.

That price was steadily climbing as we got closer to the release date. It was 39.99 a few weeks back, then 43.99 when I purchased it last week, and now it's 46.99. Maybe after release it will be 49.99.
 
That's the current state of buying games at release. Every blockbuster AAA is damn near guaranteed to be broken in some way that will keep you from enjoying yourself. That's why buying games at launch is a recipe for a headache. Why spend $60 for someone to annoy you?

I know that you really want to play certain games at release. You want to reward the developers for making your favorite game world come alive again. The best way to reward them and yourself is to have the self control to wait on the purchase. Read threads from people that can't wait and are suffering for being the early bird.

No way in hell I would play the game like that, with all the settings on Low / Medium. I also wouldn't reward AMD or Nvidia for having bad drivers for an AAA game by upgrading to a faster video card that is obviously not needed for the quality of the textures shown. Since you bought it early, I would at least drop a note about the issue on Nvidia's forums and on Bethesda's.

Maybe a combination of patches and drivers will have it running well soon after Christmas. I really doubt much will be done with Thanksgiving and Christmas coming up.

Sorry that the game runs like crap for you. :( It also runs like crap on the PS4 and XBONE if that makes you feel better. Gamebryo is trash.

Yeah. Gamebryo was decent for a while, but sadly this is expected. I'm not really a Fallout fan, more of an Elder Scrolls fan, so I likely won't be getting this. I still haven't played more than two hours of Fallout 3, but I have 500+ hours in each of Skyrim and Oblivion with mods.

The other thing I wonder about is the OP. Does Prime1 actually play Fallout 4, or any video games, or does he just post when Nvidia does better than AMD? Sometimes the posts are informative, but the agenda is clear. I guess, along with Gamebryo, I couldn't have expected anything else.
 
Never seen him post anything but "I hate AMD" rhetoric. Which just makes him a troll, and ignored.
 
people like to go on about how technically "perfect" DAI is. however, Bioware did a lot of tricks to make that possible: each map has fixed weather, so even though the engine is able to do global lighting and illumination, they use fixed shadows, fixed godrays, etc. nothing is in real time, which helps a lot with performance. a game with the visuals present in DAI would be extremely heavy with real-time lighting and weather cycles.

I wasn't aware of those specifics regarding DAI, but I understand the general sentiment: one needs to look beyond the surface to better understand what is going on.

I remember posting similar commentary regarding Battlefront, when everyone was awed by the performance increase over BF4 while disregarding how considerably it was scaled down in many areas. Although, to be fair, not without some improvements in others.

Certainly I don't think using certain "tricks" is bad optimization; it can actually be great. But there should be an understanding of what is actually going on.

One thing with Bethesda's games (well, the ones we are discussing) that I think is often overlooked is how many interactive objects are in the environments. If you consider this, and how static many other game worlds are, it's easier to understand why they face certain performance challenges.

Yeah. Gamebryo was decent for a while, but sadly this is expected. I'm not really a Fallout fan, more of an Elder Scrolls fan, so I likely won't be getting this. I still haven't played more than two hours of Fallout 3, but I have 500+ hours in each of Skyrim and Oblivion with mods.

Why do you feel the engine has taken a step back, presumably from Skyrim?
 
I wasn't aware of those specifics regarding DAI, but I understand the general sentiment: one needs to look beyond the surface to better understand what is going on.

I remember posting similar commentary regarding Battlefront, when everyone was awed by the performance increase over BF4 while disregarding how considerably it was scaled down in many areas. Although, to be fair, not without some improvements in others.

Certainly I don't think using certain "tricks" is bad optimization; it can actually be great. But there should be an understanding of what is actually going on.

One thing with Bethesda's games (well, the ones we are discussing) that I think is often overlooked is how many interactive objects are in the environments. If you consider this, and how static many other game worlds are, it's easier to understand why they face certain performance challenges.



Why do you feel the engine has taken a step back, presumably from Skyrim?

I haven't played Fallout 4 and I'm not saying it's a step back, but after seeing some animations in gameplay videos it certainly looks a little dated. The comparison I like to make is with Killing Floor: a game from 2009 compared to one from 2015 is just night and day in physics and animations, and I don't see that jump from Fallout 3 to 4. It's not exactly a fair comparison, because Fallout is just so huge, and being on Gamebryo, I understand it functions pretty well for the world it's set in. I'm sure it will be a pretty fun game for Fallout fans regardless.
 
killing floor is a game from 2005, not 2009.

this game runs like ASS, but i'm really enjoying the aesthetic that pbr creates. i've noticed there is some serious input lag even at high framerates with gsync; it feels like regular vsync and makes aiming insanely difficult. not sure if it's an nvidia or a gamebryo bug.
 
The mod was 2005, you're right. I don't really see how that changes my point. I'm sure after a few driver updates the game will run fine. It was just released.
 
Just played for roughly an hour and the performance is great on my 290X. Turned god rays to low and capped tessellation at 8x. Shadow Quality is set to High and everything else is at Ultra with TAA. Getting around 60-90fps at 1080p. I did not disable vsync, but the framerate clearly shows I am getting 90 at max on my 144Hz monitor. When I switch over to 1440p, however, my framerate seems locked at 48fps.

Well that's awesome. Let us know if the game continues to run well for you as you progress through the game.
 
Installed the game tonight and played for an hour, but I couldn't get past the feeling that there was some glitchiness and choppiness despite the specs on my machine.

Did a few things to tweak the performance, and now the game is running a lot smoother and more fluid.

I'm running the game at 4K, and it was capped at 48fps. I edited the ini file to uncap the fps, and also changed a few other things, like the FOV to 100. I find AA really isn't needed at 4K.

I also did a trick someone posted on Steam to enable SLI, since the game didn't seem to utilize the second 980 Ti.

Now the game is running 60+ fps without a problem.

System specs

5930K at 4.4GHz
2x 980 Ti in SLI
4K monitor at 60Hz

I'm about to try it out on the acer xb270hu to see if the game runs okay at 144hz (Bethesda games tend to flip their shit at anything higher than 60).

Here's a screenshot - http://imgur.com/fOvwJyK
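For reference, the ini edits described above usually go in Fallout4Prefs.ini (under Documents\My Games\Fallout4). The variable names below are the commonly circulated ones, not something I've verified line by line, so treat this as a sketch:

```ini
[Display]
iPresentInterval=0        ; uncap the framerate (removes the vsync cap)
fDefaultWorldFOV=100      ; third-person FOV
fDefault1stPersonFOV=100  ; first-person FOV
```

And keep in mind the warning earlier in the thread: the engine ties physics and scripting to framerate, so uncapping it can break lip sync, subtitles, and similar things at high fps.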
 
Certainly I don't think using certain "tricks" is bad optimization; it can actually be great. But there should be an understanding of what is actually going on.

Tricks are actually rather necessary and are what one also might call good optimization. We can't do photo realistic rendering in realtime on current systems. We can't do a full radiosity calculation for everything, etc, etc. So what developers need to do is figure out ways to make things look good, while staying in the constraints of the hardware people have.

Most of the games really famous for their visuals were that way. Doom is a great example: nobody thought you could do any kind of realtime 3D back then, and really, you couldn't. However, id figured out ways to fake it. They used a ton of clever tricks to make that game happen on what, looking back, was extremely limited hardware.

What impresses isn't the most technically correct implementation of things, but the one that gives the best experience to the end user. I use DAI as an example because the visuals in that game were good enough to make me go "wow" on multiple occasions.

The engine influences what tricks/optimizations are available as well. A really good engine is capable of doing a lot of things to help designers and artists make the most of what they have.
 
Both this game and Black Ops III run poorly for me at 1440p and Ultra settings on my system. I think I will try to replace my overclocked GTX 980 with a 980 Ti.
 
I played for like an hour, and if performance weren't so temperamental, it'd be a lot more pleasant.

Like just walking around the house at the start: performance would be a solid 60, then magically drop to like 45. Two seconds later it's back at 60 in the exact same place. There are also spots where my FPS will dip that I can't make any sense of, like when I'm looking at the corner of a tiny room. Move the camera a smidge, or take a step, and back to 60 it goes.

Other platforms don't exactly seem to fare much better.
 
Looks like some CPU optimization issues for AMD GPUs, overhead probably. Some 390 owners reporting they can't reach 100% GPU usage.
https://www.reddit.com/r/Amd/comments/3s8wa2/optimal_fallout_4_settings_for_r9_390_users/

And yet the game's performance issues are clearly Nvidia's fault.

join the gpu-usage-fluctuating club, my friend. something is dead wrong with how the drivers are handling 390s. i see this in tons of games, along with lower fps than what the hardware SHOULD be giving. just wait for the Omega.
R9 390 here. Can someone explain to me why my GPU isn't at 100% usage while I play the game? This can't be a CPU bottleneck problem. http://imgur.com/jo0Ah6O This seems really odd...
a friend of mine has a Sapphire Fury; he gets 70-100fps with all settings maxed at 1080p, with 60% gpu usage,
so the cpu bottleneck is definitely there and amd needs to fix it with a driver
Digital Foundry did a performance comparison between the 390 & 970 at 1080p. They found similar issues to yours.

At this point it seems to be an issue with both volumetric lighting tessellation (which can be disabled/reduced) and driver overhead.
 
I can understand why game developers don't like working with AMD. Every time a game performs poorly on AMD hardware (sadly, all the time these days), AMD fans say that the game is bad or the developer is bad. They will also go on to attack the website that did the benchmarks, or the person who posted them.

I have never seen a more toxic fan base and what's worse is that AMD actively promotes this kind of behavior.

Looking forward to getting this game, looks like it plays well on 80% of the cards being sold these days.
 
Looks like some CPU optimization issues for AMD GPUs, overhead probably. Some 390 owners reporting they can't reach 100% GPU usage.
https://www.reddit.com/r/Amd/comments/3s8wa2/optimal_fallout_4_settings_for_r9_390_users/

And yet the game's performance issues are clearly Nvidia's fault.

AMD is becoming totally irrelevant in the GPU space, just like they became totally irrelevant in the (x64) CPU space. They are bleeding market share, selling obviously inferior products (look at performance vs power consumption), and are unable to provide the driver support and value-added features (game sharing) that Nvidia has. The LARGEST title of Q4'15 just launched, and they did not have optimized drivers released before the game went live.

I can understand why game developers don't like working with AMD. Every time a game performs poorly on AMD hardware (sadly, all the time these days), AMD fans say that the game is bad or the developer is bad. They will also go on to attack the website that did the benchmarks, or the person who posted them.

I have never seen a more toxic fan base and what's worse is that AMD actively promotes this kind of behavior.

Looking forward to getting this game, looks like it plays well on 80% of the cards being sold these days.

Fanboys on the AMD subreddit are claiming things are overly tessellated; tessellation is a core DX11 feature. In the same breath, these fanboys claim that AMD will be the best at DX12, which is laughable given AMD's track record.
 
Fanboys on the AMD subreddit are claiming things are overly tessellated; tessellation is a core DX11 feature. In the same breath, these fanboys claim that AMD will be the best at DX12, which is laughable given AMD's track record.
It's probably over-tessellated. We've seen it before.
Gotta check Kepler vs Maxwell in the benchmarks. That's a good indicator.

780 Ti beating 970. 770 beating 960. Everything looks clean from where I'm sitting, so...
 
Installed the game tonight and played for an hour, but I couldn't get past the feeling that there was some glitchiness and choppiness despite the specs on my machine.

Did a few things to tweak the performance, and now the game is running a lot smoother and more fluid.

I'm running the game at 4K, and it was capped at 48fps. I edited the ini file to uncap the fps, and also changed a few other things, like the FOV to 100. I find AA really isn't needed at 4K.

I also did a trick someone posted on Steam to enable SLI, since the game didn't seem to utilize the second 980 Ti.

Now the game is running 60+ fps without a problem.

System specs

5930K at 4.4GHz
2x 980 Ti in SLI
4K monitor at 60Hz

I'm about to try it out on the acer xb270hu to see if the game runs okay at 144hz (Bethesda games tend to flip their shit at anything higher than 60).

Here's a screenshot - http://imgur.com/fOvwJyK

how did you get 980 ti sli to work? the nvidia sli change to force alternative frame rendering 2 drops my fps. I also tried the nvidia inspector changes but it apparently does nothing. Did you compare vs single gpu?
 
how did you get 980 ti sli to work? the nvidia sli change to force alternative frame rendering 2 drops my fps. I also tried the nvidia inspector changes but it apparently does nothing. Did you compare vs single gpu?

To be honest, I didn't fully test it. The game still dropped under 60fps in Concord and a few other places, so I ended up just putting it at 1080p, and now it hasn't had a hitch. I can't confirm full SLI usage, but I really don't see a major increase with 2 cards... I just didn't have the major crap fps loss some people had.
 
Well that's awesome. Let us know if the game continues to run well for you as you progress through the game.


Played from 10PM MST until 3:30AM MST on an old (2006) HP workstation with 2 dual-core Xeon CPUs and a Windforce 7950. Game ran great; I can't tell you my frame rates, but overall it looked good. I'll check the frame rate tonight and report back.
 
how did you get 980 ti sli to work? the nvidia sli change to force alternative frame rendering 2 drops my fps. I also tried the nvidia inspector changes but it apparently does nothing. Did you compare vs single gpu?

A single 980 Ti is faster for me than AFR2 SLI.
 
Looks like some CPU optimization issues for AMD GPUs, overhead probably. Some 390 owners reporting they can't reach 100% GPU usage.
https://www.reddit.com/r/Amd/comments/3s8wa2/optimal_fallout_4_settings_for_r9_390_users/

And yet the game's performance issues are clearly Nvidia's fault.

At this point it seems to be an issue with both volumetric lighting tessellation (which can be disabled/reduced) and driver overhead.

tainted, please go back to the anandtech forums.
 
how did you get 980 ti sli to work? the nvidia sli change to force alternative frame rendering 2 drops my fps. I also tried the nvidia inspector changes but it apparently does nothing. Did you compare vs single gpu?

I have 2x GTX Titan (similar to 700 series cards) and AFR2 works wonders for me. It seems like this setting does not play well with the 900 series cards. How does the stock SLI profile that comes with the latest driver do for you?
 
Going to throw this in: the game runs reasonably well on Medium @ 1080p on my 750 Ti. I haven't gotten very far in the game, but it seems to hover around 40 fps and dips down to the low 30s here and there.

I think I am going to run into some frame drop sooner or later that will make me turn down some settings.

Overall, the game is fun, the gfx are decent for a Fallout game, and the character animations are subpar, but still better than Skyrim or Fallout 3/NV.
 
it seems like, as with most games these days, it's fine on lower-end stuff but just fails to scale up properly. from what i saw, my brother is able to hold a solid 50-60 at 1080p on his 570, even with godrays on (low). i imagine that will probably drop into the 40s or mid-high 30s once he goes to the Corvega factory.
 
I have 2x GTX Titan (similar to 700 series cards) and AFR2 works wonders for me. It seems like this setting does not play well with the 900 series cards. How does the stock SLI profile that comes with the latest driver do for you?

Seems like it works with the Kepler cards but not Maxwell (Titan X SLI probably will not work either). I think too many people are going by graphics card usage in SLI, but I clearly see a downgrade in performance when running SLI with both cards at 70-90% usage. I tried many different settings. The only SLI profile that increased performance was the Assassin's Creed: Black Flag profile; however, it causes a massive amount of flicker and graphics issues that make it not worth using. I am just going to play single-card and turn down some of the more minor graphics settings to get near a stable 60 fps. Hoping a new SLI profile will come out so I can start maxing everything and maybe even DSR a little.
 
Looks like some CPU optimization issues for AMD GPUs, overhead probably. Some 390 owners reporting they can't reach 100% GPU usage.
https://www.reddit.com/r/Amd/comments/3s8wa2/optimal_fallout_4_settings_for_r9_390_users/

Yep

Benchmarks in cpu dependent place:

http://www.purepc.pl/karty_graficzn...c_bombowa_optymalizacja_czy_niewypal?page=0,5

Same configuration in gpu bound place

http://www.purepc.pl/karty_graficzn...c_bombowa_optymalizacja_czy_niewypal?page=0,4
 