Fallout 4 Performance and Image Quality Preview @ [H]

Check to make sure SLI is actually working because the default profile is broken. I know you can use the FO3 SLI bits and force AFR2, but results have been mixed from what I've read.

It actually dropped my FPS when I did that, so I went back to a single card.
 
Anyone have a clue as to why the AMD cards fare so poorly in this game if it's a bad console port and AMD hardware powers those?

Because it was fanboy wishful thinking to begin with that AMD chips being in consoles would somehow translate into any benefit for AMD GPUs in PCs. It's as silly as all the people who assumed Macs would be running x86 Windows apps inside OS X after the transition to Intel.

The low-level APIs on consoles and DX11 on PC are completely different beasts; there's practically no crossover, regardless of the fact that Microsoft marketing-named it "DirectX" on Xbox - it's a highly customized fork.
 
AMD cards have never liked Fallout or The Elder Scrolls titles... it has always been the same... AMD cards tend to perform poorly in those titles. Why? Nobody knows.

I find that Bethesda titles tend to be CPU bound, regardless of CPU. So with AMD drivers having much higher CPU overhead, that might explain why.
 
Zarathustra[H];1041968512 said:
I read earlier in this thread that this engine relies on being at 60fps for the physics to work correctly. If you disable vsync and let the fps run wild, apparently the physics calculations can go haywire.

Wouldn't that be a kicker. Hardcoding to the max, lol.

If there was ever a crime against object-oriented design laws like high cohesion/low coupling between system components, tying the entire game's physics engine to the refresh rate may get the death penalty. In fact, I don't even know of a game in recent memory where that was the case. Older games may go into fast-forward mode with more powerful hardware, but actually breaking the mechanics of the engine itself??! Ha!
 
Wouldn't that be a kicker. Hardcoding to the max, lol.

If there was ever a crime against object-oriented design laws like high cohesion/low coupling between system components, tying the entire game's physics engine to the refresh rate may get the death penalty. In fact, I don't even know of a game in recent memory where that was the case. Older games may go into fast-forward mode with more powerful hardware, but actually breaking the mechanics of the engine itself??! Ha!

id Software engine games are guilty of this, and they're also part of Bethesda. Go figure.
 
C'mon 770 4GB, don't fail me now. Just want a bit over 30fps at 4K at high-ish settings + ultra textures + SMAA.

Handled GTA5 no problem...
 
C'mon 770 4GB, don't fail me now. Just want a bit over 30fps at 4K at high-ish settings + ultra textures + SMAA.

Handled GTA5 no problem...

Hardly believable... that card is barely able to run older games at 2560x1440 at high settings with a decent frame rate.
 
Hardly believable... that card is barely able to run older games at 2560x1440 at high settings with a decent frame rate.

Alright silly goose, here are my GTA5 fly-through benchy results (using the compy in my sig):

FPS range: 29-32

DX11
3840x2160
VRAM 2987/4095
60Hz
FXAA on
MSAA off
VSync on
Pop. density max
Pop. variety max
Distance scaling max
Texture quality very high
Shader quality very high
Shadow quality very high
Reflection quality high
Reflection MSAA 8x
Water quality very high
Particles quality very high
Grass quality normal
Soft shadows softer
PostFX high
Motion blur strength very low
In-game DOF off
AF 16x
Ambient occlusion high
Tessellation high

Nvidia Control Panel settings (ones of interest, at least):
Shader cache on
AF sample optimization off
Negative LOD bias: clamp
Texture filtering quality: high quality
Triple buffering on

Not too shabby!

Edit 2: forgot to say that exactly 0 of the 'advanced graphics settings' are turned on.
 
Zarathustra[H];1041968512 said:
I read earlier in this thread that this engine relies on being at 60fps for the physics to work correctly. If you disable vsync and let the fps run wild, apparently the physics calculations can go haywire.

I did not have time to test tonight, but that is exactly what I think it is.
 
I've got the framerate capped at 60fps for exactly that reason. Same thing happened in Skyrim.
 
Another voice for the ReShade TAA tweak. Looks amazing with a very minor performance hit. Highly recommended.
 
Kind of wanting to get this game, kind of wanting to wait for the bugs to be ironed out first...
 
Zarathustra[H];1041968512 said:
I read earlier in this thread that this engine relies on being at 60fps for the physics to work correctly. If you disable vsync and let the fps run wild, apparently the physics calculations can go haywire.

Not really. It relies on being at the monitor refresh rate for the physics (and in fact everything else tied to the refresh rate) to work correctly... it can be 60 or 120 or 144 or whatever you have; it just needs to stay as stable as possible at the refresh rate. It was always the same in FO3, FO: New Vegas and Skyrim.
 
Not really. It relies on being at the monitor refresh rate for the physics (and in fact everything else tied to the refresh rate) to work correctly... it can be 60 or 120 or 144 or whatever you have; it just needs to stay as stable as possible at the refresh rate. It was always the same in FO3, FO: New Vegas and Skyrim.

This is incorrect; it bugs out above 60. One bug specific to this game is getting stuck at computer terminals. The game clock also goes haywire. These things happen even with vsync if the framerate is above 60.
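For anyone wondering what that bug class actually looks like, here's a minimal sketch in Python (purely illustrative, not anything from the actual Creation Engine) of physics stepped once per rendered frame with a hard-coded 1/60s timestep, versus a fixed timestep fed by real elapsed time:

Code:
# Toy illustration only -- not Creation Engine code.
ASSUMED_DT = 1.0 / 60.0  # hard-coded "one frame = 1/60s" assumption
GRAVITY = -9.8           # m/s^2

def per_frame_physics(velocity, frames_rendered):
    """Buggy: one physics step per rendered frame. At 120fps this runs
    120 steps per real second, so the simulation runs at double speed."""
    for _ in range(frames_rendered):
        velocity += GRAVITY * ASSUMED_DT
    return velocity

def fixed_timestep_physics(velocity, elapsed_seconds):
    """Correct: run whole fixed-size steps for the real time that passed,
    independent of how many frames were rendered. (A real game loop keeps
    the fractional remainder in an accumulator for the next frame.)"""
    whole_steps = int(elapsed_seconds / ASSUMED_DT + 1e-9)  # guard against float error
    for _ in range(whole_steps):
        velocity += GRAVITY * ASSUMED_DT
    return velocity

# One real second of gameplay rendered at 120fps:
print(per_frame_physics(0.0, frames_rendered=120))       # ~ -19.6: double speed (wrong)
print(fixed_timestep_physics(0.0, elapsed_seconds=1.0))  # ~ -9.8: correct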
 
I find that Bethesda titles tend to be CPU bound, regardless of CPU. So with AMD drivers having much higher CPU overhead, that might explain why.

Nah, the game hardly takes my older i5 above 75%. The game is bogging down some part of the graphics card's render pipeline. In the low-FPS areas the GPU will only be at 50-60% and the CPU will be around 50% as well.
 
Nah, the game hardly takes my older i5 above 75%. The game is bogging down some part of the graphics card's render pipeline. In the low-FPS areas the GPU will only be at 50-60% and the CPU will be around 50% as well.

What that isn't showing is when the processor stalls while having to go out to system memory.

These games have historically benefited from cache size and memory speed. Some preliminary data shows that is still the case with Fallout 4, particularly in data sets that look at worst-case areas.

What I would be interested in seeing is benchmarks against a 5775C. Sadly it is a rare product, so that is unlikely.
 
Also, some AMD driver news: I was able to ask AMD today about upcoming optimizations for Fallout 4. There will be a driver update coming with Fallout 4 optimizations; I can't say which driver it will be, as that is under NDA.

Did they mention whether it includes a CrossFire profile as well?
 
Zarathustra[H];1041969117 said:
What would be the benefit of that? Vsync is great. I much prefer it to any tearing.

This.
In non-competitive gaming it's a must.

For a few people it can introduce large mouse lag, but that isn't the fault of vsync itself; there are quite a few ways to tweak it out if you have problems.
I have seen some people really struggle, but I've always managed to get a good experience.
YMMV.
 
Zarathustra[H];1041969117 said:
What would be the benefit of that? Vsync is great. I much prefer it to any tearing.

Some of us have no need for vsync as we've got this new tech called G-Sync. ;) And even if I didn't have G-Sync, the mouse lag from vsync alone would put me off it.
 
For anyone with an AMD card wanting better performance, simply cap tessellation at 8x in the driver and set god rays to low. I'm getting about what the Fury X was getting in this article at 1440p on a 290X.

I'm amazed this was not tried as part of the review; it's well known in the community that whatever (over-?)tessellation is being done in the game is really not good for AMD setups. Like the Crysis 2 invisible oceans all over again. It also has practically zero visual impact in a game with very low-complexity objects in most cases.
Why was this not addressed?

Frankly, for a 2015 AAA title, I think this game looks like crap: ~mid-2000s graphics (texture/object-wise) with some fancy lighting and some modern effects. Good old regurgitated Morrowind engine.

Will be waiting some months until good mods are available and drivers are working for both camps. Again Nvidia gets a free pass in the community for not having SLI support, but AMD is demonised on another Gamesdontworks title.

I think the card requirements for 'lowly' 1440p are a joke. Terrible optimisation from Methesda, considering the crap that the engine pumps to the screen.

Other than that, I loved the review with the sliders; it was a really nice way to compare the various settings.
 
Nvidia gets a free pass in the community for not having SLI support, but AMD is demonised on another Gamesdontworks title.
Yep, this side of the fence is more civilised.
The attitude of AMD nutters has put me off them for life now.
The last card I had was a 290X; I had an issue with it that forced me to stop using AMD cards.
Every time it's mentioned I get trolled by fanboys and called an Nvidia fanboy.
No, I am anti "AMD fanboys".
They can keep AMD to themselves. Way to make their favourite company less relevant :p
 
Nvidia gets a free pass in the community for not having SLI support

Brent mentioned in a previous post that SLI requires a game update to enable it. Other than that, Nvidia has delivered their end of the bargain by providing a driver update right on time for the game.
 
This seems to be working pretty well for me with SLI. I had been playing for a couple of hours before it crashed on me. F5 is your friend!

[Image: Fallout4-SLI Settings.png]
 
Some of us have no need for vsync as we've got this new tech called G-Sync. ;) And even if I didn't have G-Sync, the mouse lag from vsync alone would put me off it.

Never understood the hate for vsync. Screen tearing is so intrusive and distracting. Enable vsync and triple buffering and be on your way.

Can't remember the last time vsync introduced noticeable lag in a game.
 
Some of us have no need for vsync as we've got this new tech called G-Sync. ;) And even if I didn't have G-Sync, the mouse lag from vsync alone would put me off it.

Placebo effect. There is no mouse lag from vsync, at least if you turn off triple buffering and reduce pre-rendered frames.

You gain a little better response time as the framerate goes up, but it is marginal. I mean, the total frame delay at 60fps is 16.67ms. Let's say you turn off vsync and can race up to 100fps; now you have a 10ms frame time. You have just gained an improvement of 6.67ms, which is not visually perceivable to a human being (you can detect differences that small in hearing, but that's a different topic). In fact, if you rendered at INFINITE fps and that render time disappeared altogether, it still wouldn't be significant.
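Spelling out that arithmetic (nothing here beyond the numbers in the paragraph above, done in Python for clarity):

Code:
# Frame time in milliseconds for a given fps
def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(60))                       # ~16.67 ms at 60fps
print(frame_time_ms(100))                      # 10.00 ms at 100fps
print(frame_time_ms(60) - frame_time_ms(100))  # ~6.67 ms gained going 60 -> 100
# Even at "infinite" fps (frame time -> 0), the most you could ever
# shave off is the full ~16.67 ms.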

Instead you are stuck with a shitty small G-Sync monitor (wake me when they have a 40" 4K panel with G-Sync).

That, and you are producing much more heat and noise. Much better to use adaptive vsync and let it go up to 60fps and not above, as your GPUs will be quieter and produce less heat.

This is especially true if you are using multiple GPUs to reach those high framerates, as both SLI and CFX in AFR mode will always increase your input lag (or the portion of it contributed by frame time) by an average of 2.5x with two GPUs (3.5x with three GPUs and 4.5x with four), way more than you can compensate for by disabling vsync and cranking up the fps.
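Putting those claimed multipliers into numbers (the 2.5x/3.5x/4.5x figures are the post's own estimates, not measurements):

Code:
# Rough AFR frame-lag estimate using the multipliers claimed above:
# 2.5x for two GPUs, 3.5x for three, 4.5x for four. Illustrative only.
def afr_frame_lag_ms(fps, num_gpus):
    frame_time = 1000.0 / fps
    multiplier = 1.0 if num_gpus == 1 else num_gpus + 0.5
    return frame_time * multiplier

print(afr_frame_lag_ms(60, 1))   # ~16.7 ms: single GPU, vsynced at 60fps
print(afr_frame_lag_ms(100, 2))  # 25.0 ms: two GPUs in AFR at 100fps is still worse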

60fps is enough, even for the most intense competitive gamer. Anything above that falls into placebo territory, just like all that audiophile junk.

IMHO:
  • Under 30fps: Looks terrible
  • Under 60fps, but over 30fps: Looks indistinguishable from higher frame rates, but mouse feel is off due to frame-time lag
  • 60 fps: Ideal
  • Over 60fps: Waste of money, power, heat and fan noise :p
 
never understood the hate for vsync. screen tearing is so intrusive and distracting. enable vsync and triple buffering and be on your way.

cant remember the last time vsync introduced noticeable lag in a game.

Yeah, it's all placebo effect.

Little kiddies who think they are super-sensers and can perceive humanly impossible differences at the single-digit millisecond level.

Very similar to audiophile snobs. It's all in their heads.

That's not to say that input lag can't be a problem, but if you get a decent screen with low input lag, stick with one GPU (multi-GPU inherently worsens input lag) and make sure you can hit and stay at 60fps flat, you are good to go.

I, for one, can't wait to go back to a single GPU. I have high hopes for the next gen from AMD/Nvidia being able to drive my 4K screen with one GPU in the games I enjoy.

I don't give a fuck. I'll pay next-gen Titan prices to avoid SLI/CFX and sell my 980 Tis.
 
Zarathustra[H];1041969495 said:
  • Under 60fps, but over 30fps: Looks indistinguishable from higher frame rates, but mouse feel is off due to frame-time lag
  • Over 60fps: Waste of money, power, heat and fan noise :p

You're way off base on these two, but I suppose that's a conversation for another thread.
 
Same deal as Skyrim, although that game in practice (for me at least) was playable up to about 90fps/90Hz.

In addition to the crazy physics, it did some wonky things with in-game time in Skyrim. Traders were closed when they should have been open, stuff like that.
 
In addition to the crazy physics, it did some wonky things with in-game time in Skyrim. Traders were closed when they should have been open, stuff like that.

I played hundreds of hours of that and never had any issue unless it saw 100+ fps. It was pretty obvious when things started happening; I never had any problems with the day/night or mission timers until that same 100+ mark. I was really hoping they would change that for FO4, but I guess they didn't learn much.
 
This game is also heavily reliant on RAM speed. There is a noticeable difference in frame rate between 1600MHz and 2400MHz RAM.
 
Zarathustra[H];1041969509 said:
Yeah, it's all placebo effect.

Well, I have to agree.
The last time I noticed input lag was on my retro rig with Glide and a Voodoo5 card. It was up to a SECOND of lag... driver issue for sure. Had to revert to the original drivers for the card.
 