Fallout 4 Performance and Image Quality Preview @ [H]

FrgMstr

Fallout 4 Performance and Image Quality Preview - Fallout 4 is out on the PC. In this preview we compare performance between the GeForce GTX 980 Ti and the Radeon R9 Fury X, along with some in-game feature performance comparisons. We'll also take a look at some in-game feature screenshots and find out which settings are best for an enjoyable gaming experience.
 
Thanks again [H] for doing this preview. The god rays and TAA comparisons were exactly what I was looking for.
 
Those IQ sliders are quite awesome, especially when you zoom your browser window in.
 
Nice.

Looking forward to the full review.

I bought a Fallout 3 & New Vegas pack in an extreme sale in early 2013, and played through 3, but never started New Vegas.

I liked 3, but I wasn't as crazy about it as some of you on here. (I found the S.T.A.L.K.E.R. series to be a similar open-world FPS/RPG crossover in style, but much, much more compelling, in a dark, grim way.)

I'll probably pick up this installment once it falls in price to the ~$10 level. I have to admit to being drawn in just a little bit more due to it being set locally though :p I wonder what landmarks might be recognizable :p
 
Nice.

I do agree that it could have been released in 2012. What is really disappointing is how the game performs on GOOD hardware, i.e. why do you need a 980 Ti to play at 1440p?

Credit to NVIDIA for having a game-day driver. AMD? Ugh, late to the party per usual.

I am disappointed in the game because it could have been so much better. I will still play the crap outta it cause it is a fun game. For now though, it sits in my Steam library with 2 hours played; I decided I'm just going to wait for AMD to release proper drivers for the game.
 
Nice.
I do agree that it could have been released in 2012. What is really disappointing is how the game performs on GOOD hardware, i.e. why do you need a 980 Ti to play at 1440p?

What games DO play well at 1440p on a 980ti? (I mean, Civ 5 does, but that hardly counts)

I found that my 2013 Titan could play many titles well at 2560x1600, but most were slightly unsatisfactory in performance.

Now with two 980ti's in SLI at 4k resolution, I find I'm at about the same level. Many titles are OK, but the (to me) requirement of everything turned to max, and never dropping below 60fps, is still not met in most.
 
Nice run-thrus. Interesting that the FPS dips so low on these cards at 1440p because the graphics in NO way merit framerates that low. As stated already, this game could have been released in 2012.

As a side note, Crossfire is broken in this game out of the box, but I hear you can use the Skyrim profile as a fix.
 
"New consoles will make PC games better because they are easier to port" ~ Everyone who doesn't know better
 
Zarathustra[H];1041965275 said:
What games DO play well at 1440p on a 980ti? (I mean, Civ 5 does, but that hardly counts)

I found that my 2013 Titan could play many titles well at 2560x1600, but most were slightly unsatisfactory in performance.

Now with two 980ti's in SLI at 4k resolution, I find I'm at about the same level. Many titles are OK, but the (to me) requirement of everything turned to max, and never dropping below 60fps, is still not met in most.

Nearly all games? I've been playing at 1440p on my R9 290 just fine.
 
Zarathustra[H];1041965275 said:
What games DO play well at 1440p on a 980ti? (I mean, Civ 5 does, but that hardly counts)

I found that my 2013 Titan could play many titles well at 2560x1600, but most were slightly unsatisfactory in performance.

Now with two 980ti's in SLI at 4k resolution, I find I'm at about the same level. Many titles are OK, but the (to me) requirement of everything turned to max, and never dropping below 60fps, is still not met in most.

I'm playing at 2560x1440 with the specs in my sig. I have everything on Ultra except god rays and AO, which are on Normal. I get about 45-55 fps in the wasteland area and 60+ indoors.
 
Nice.

I do agree that it could have been released in 2012. What is really disappointing is how the game performs on GOOD hardware, i.e. why do you need a 980 Ti to play at 1440p?

Credit to NVIDIA for having a game-day driver. AMD? Ugh, late to the party per usual.

I am disappointed in the game because it could have been so much better. I will still play the crap outta it cause it is a fun game. For now though, it sits in my Steam library with 2 hours played; I decided I'm just going to wait for AMD to release proper drivers for the game.

Is this not yet another TWIMTBP game?

I cannot wait to try this game in a year, after the bugs get hammered out. As for now, I have not had time to game in nearly 15 weeks. When I finally get a little time in December, I will need to finish GTA V and start Forza 6.
 
Great article.

From the images, Ultra god rays look like over-AA'd edges: really blurry, vastly reducing detail.
But I haven't seen it in action, which may give a different impression.

edit: forgot to mention...
It's a shame they didn't include options to use DSR and VSR for AA, because I have found 1440p DSR @ 1080p to be very effective and it uses less GPU than 4xMSAA.
Useful for people with lower-end hardware in this game.

Fixed - Kyle
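For the curious, here's the rough pixel math behind why 1440p DSR on a 1080p panel can be cheaper than 4xMSAA. This is just an illustrative back-of-the-envelope sketch; the real cost depends on the engine and where it does its shading:

```python
# Rough pixel/sample comparison for DSR vs MSAA (illustrative only; actual
# cost depends heavily on the engine and renderer).

def dsr_factor(render_res, native_res):
    """Ratio of pixels shaded when rendering at render_res and downsampling to native_res."""
    return (render_res[0] * render_res[1]) / (native_res[0] * native_res[1])

native = (1920, 1080)     # 1080p panel
dsr_1440p = (2560, 1440)  # render target used for 1440p DSR

print(f"1440p DSR on 1080p shades ~{dsr_factor(dsr_1440p, native):.2f}x the pixels")
# -> ~1.78x shaded pixels, versus 4xMSAA which stores and resolves 4 samples
#    per pixel (though MSAA only super-samples geometry edges, so costs differ).
```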
 
I think the game's graphics look good with TAA (I'm assuming from the name Temporal AA). I was a bit hesitant to use it once I noticed it blurs the scene in my peripheral vision when I move, but I find I don't care as long as when I stop, everything is clear. FXAA in the game is much worse, with hideous aliasing in foliage.

I do need to do a quick test tonight between that and DSR 1440p to see which one has the clearest textures. I do stand behind my post last night where I said the textures were better than Skyrim's official HD pack, even with TAA enabled.
 
It seems weird that the game requires so much power to run.

Do you plan to look at CPU and VRAM usage in the full review?
 
The fact that they force V-Sync ON and have no option in-game to disable it tells me these people still don't understand the full spectrum of PC gaming... Some of us want to choose options for the absolute lowest input latency because we are competitive minded. I started my gaming life playing CS on a team that was sponsored by Subway and Alienware back in 2002. The idea of always having stupid high (100-200+) fps with a high refresh rate for the absolute lowest input latency has always been my first priority in any game. If I can't play modern games at the IQ I want with 100+ fps, I'll just wait for the hardware that can. Luckily 2x Titan X's on water is still doing just fine @ 1080p @ 150 Hz (cuz 4K is still for suckers).
 
Zarathustra[H];1041965357 said:
LOL, then set your refresh rate to 24hz and enable vsync :p

His frame rate might dip below 24 and get stuck at 12fps for the "Ultra cinematic feel." :D

@Kyle and the crew. Thanks for the preview. I hope that the readership here will be enlightened by the preview and skip this title for now until some patches are issued for the game.
 
Fuck this game. I have to manually edit the config files to be able to move my mouse, but doing so makes the game reset its settings and max everything out (which gives me a constant frame rate of 15) in such a manner that nothing I do afterwards will change it except deleting the config files altogether and letting it respawn new ones. I'm very close to just asking Steam for a refund and never bothering to look at it again. Goddamn Bethesda.
 
5930K with dual 980 Tis in SLI and the game still feels better at 1080p. I'm not complaining though. Bethesda games always have a bunch of shit to deal with. Skyrim was a major shit storm of issues if you broke 2K res and 60 Hz, but after modding the shit out of it, it was amazing.


I enjoy the game now and probably will extra enjoy it later with mods.
 
It seems weird that the game requires so much power to run.

Do you plan to look at CPU and VRAM usage in the full review?

But it doesn't need that much power to run. [H] just didn't bother to test anything except the top-end cards (PREVIEW), and last time I checked the jump up from 1440p to 4k was too massive for the 980 Ti to handle solo.

Read the other reviews out there: on medium you can run on a GTX 750 Ti, or a GTX 660 at 1080p, and the GTX 950 runs high just fine. I'm currently running my GTX 960 overclocked at 1080p Ultra.

As you can see from the reviews, the biggest performance sucker is ultra godrays. In the worst case, performance fell by 40% compared to running with High godrays (I assume everything else was left Ultra). Last time I checked, you need a beefcake GPU to max out godrays in every other game they appear in.

I'll be happy to crank Godrays down a notch or two so I can run DSR 1440p, if it results in sharper textures. Thanks for doing this performance test [H] - now I know what to kick to the curb first :D
 
Hey Kyle/Steve, any chance you could whip up comparison shots of some of the more telling Godray scenes with lower Godray settings? As big of a hit as the Godray option seems to be, it would be interesting to see what's lost with the lower settings.

FXAA in the game is much worse, with hideous aliasing in foliage.

Agreed. I find the "crawling jaggies" from having no AA or FXAA to be much worse than the "Vaseline" effect of TAA.

I ran a combination of DSR (4k down to 1440p) and SMAA in FNV and that did wonders for anti-aliasing and sharpness, so I might have to give that a try.
 
It's basically Skyrim with a Fallout total conversion done, plus all the console port fun we've come to know in this industry.
 
Nice preview. I appreciate the god rays and AA comparison shots. To my eyes TAA looks better, though obviously with its own downsides. Like you said, it's a subjective thing, so thanks for the comparisons so I can come to my own conclusion. I definitely share your disappointment in the lack of AA options. Game is damn fun though.
 
The fact that they force V-Sync ON and have no option in-game to disable it tells me these people still don't understand the full spectrum of PC gaming... Some of us want to choose options for the absolute lowest input latency because we are competitive minded. I started my gaming life playing CS on a team that was sponsored by Subway and Alienware back in 2002. The idea of always having stupid high (100-200+) fps with a high refresh rate for the absolute lowest input latency has always been my first priority in any game. If I can't play modern games at the IQ I want with 100+ fps, I'll just wait for the hardware that can. Luckily 2x Titan X's on water is still doing just fine @ 1080p @ 150 Hz (cuz 4K is still for suckers).

WTH are you talking about? You sound like a child speaking of things you can't even understand. You think high FPS is just about having 2x Titan X's water cooled? You think input lag is just about FPS? LOLing really bad...

So you are concerned about V-Sync? Have you never played any Fallout before, or New Vegas, or Skyrim? They have always used the same engine with physics tied to the refresh rate; that's why they use that refresh-rate cap. It's not 60 fps as many say, the cap is the refresh rate of your monitor... Everything in those games is tied to the refresh rate: motion, movement speed, physics, particles, etc. That's why the game behaves crazy when you disable V-Sync (.ini file) and have FPS all over the place...

On the other hand, thanks Brent for the preview, great as always. Waiting for the full review.
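For anyone who hasn't run into it before, here's a generic illustration of what "physics tied to refresh rate" means. This is not the Creation Engine's actual code, just the usual frame-tied pattern next to a fixed-timestep one:

```python
# Generic illustration of why uncapping FPS breaks frame-rate-dependent
# simulation. This is NOT Bethesda's code, just the common pattern.

MOVE_SPEED = 5.0  # units moved per frame at the "expected" refresh rate

def update_frame_tied(position):
    # Moves a fixed amount every frame, so at 144 fps the simulation runs
    # ~2.4x faster per second than at 60 fps (movement, physics, etc.).
    return position + MOVE_SPEED

def update_fixed_timestep(position, dt, units_per_second=300.0):
    # Scales movement by elapsed time, so simulation speed is independent
    # of how fast frames are rendered.
    return position + units_per_second * dt

# One second of simulation at two different frame rates:
for fps in (60, 144):
    pos_bad, pos_good = 0.0, 0.0
    for _ in range(fps):
        pos_bad = update_frame_tied(pos_bad)
        pos_good = update_fixed_timestep(pos_good, dt=1.0 / fps)
    print(f"{fps} fps: frame-tied = {pos_bad:.0f}, fixed-timestep = {pos_good:.0f}")
# 60 fps:  frame-tied = 300, fixed-timestep = 300
# 144 fps: frame-tied = 720, fixed-timestep = 300
```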
 
WTH are you talking about? You sound like a child speaking of things you can't even understand. You think high FPS is just about having 2x Titan X's water cooled? You think input lag is just about FPS? LOLing really bad...

So you are concerned about V-Sync? Have you never played any Fallout before, or New Vegas, or Skyrim? They have always used the same engine with physics tied to the refresh rate; that's why they use that refresh-rate cap. It's not 60 fps as many say, the cap is the refresh rate of your monitor... Everything in those games is tied to the refresh rate: motion, movement speed, physics, particles, etc. That's why the game behaves crazy when you disable V-Sync (.ini file) and have FPS all over the place...

On the other hand, thanks Brent for the preview, great as always. Waiting for the full review.

Bingo, this guy gets it. Also, babochee, let me know who your 150 Hz monitor gives you a competitive advantage over in this game. :rolleyes:
 
My desktop rig has no issues playing in 1080p, but for some reason my Lenovo Y50 is set to low automatically. What gives?!
 
WTH are you talking about? You sound like a child speaking of things you can't even understand. You think high FPS is just about having 2x Titan X's water cooled? You think input lag is just about FPS? LOLing really bad...

So you are concerned about V-Sync? Have you never played any Fallout before, or New Vegas, or Skyrim? They have always used the same engine with physics tied to the refresh rate; that's why they use that refresh-rate cap. It's not 60 fps as many say, the cap is the refresh rate of your monitor... Everything in those games is tied to the refresh rate: motion, movement speed, physics, particles, etc. That's why the game behaves crazy when you disable V-Sync (.ini file) and have FPS all over the place...

On the other hand, thanks Brent for the preview, great as always. Waiting for the full review.

No, he does have a point.

A part of the overall input latency is driven by the frame rate, simply due to the fact that mouse position and keyboard presses are sort of "frozen in time" when you start rendering a frame, until that frame is displayed on screen. The portion attributable to frame rendering can be measured as exactly 1/FPS, so for 60 FPS that's 16.67ms. All else being equal, the higher the framerate, the lower the input lag, and vice versa.

Problem is, all is not always equal.

Add in SLI (AFR), and at the same framerate you instantly get an increase in input lag. It's variable depending on when input was recorded. Worst case it could triple if input is recorded JUST after one GPU starts rendering, in which case it would be output three frames later (or 50ms at 60 FPS!!). Best case is 2 frames (or 33.33ms at 60 FPS), so it averages ~2.5 frames worth of input lag, vs. 1 frame of input lag for a single GPU.

What this means is you have to have 150FPS in SLI to match the input lag of a single GPU at 60FPS, all else being equal. And that's just the average. Some movement will have higher input lag, requiring about 180fps to make sure your max input lag in SLI is no worse than a single GPU at 60fps...
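If it helps, here's that arithmetic as a quick throwaway sketch, using the same simplified model as above (render-pipeline lag only: ~1 frame on a single GPU, ~2.5 frames average and ~3 worst case with 2-way AFR; real numbers vary by game and driver):

```python
# Simplified render-pipeline latency model only: single GPU ~1 frame of lag,
# 2-way AFR ~2.5 frames on average, ~3 frames worst case.

def pipeline_lag_ms(fps, frames_of_lag):
    """Render-pipeline portion of input lag, in milliseconds."""
    return 1000.0 * frames_of_lag / fps

print(f"Single GPU @ 60 fps : {pipeline_lag_ms(60, 1.0):.1f} ms")        # ~16.7 ms
print(f"2-way AFR  @ 60 fps : {pipeline_lag_ms(60, 2.5):.1f} ms avg")    # ~41.7 ms
print(f"2-way AFR  @ 150 fps: {pipeline_lag_ms(150, 2.5):.1f} ms avg")   # matches single GPU @ 60
print(f"2-way AFR  @ 180 fps: {pipeline_lag_ms(180, 3.0):.1f} ms worst") # matches single GPU @ 60
```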

Considering SLI rarely scales at 100% (and never above it), this means that adding SLI for the sole purpose of reducing input lag is a rather silly thing to do. You will never improve input lag by adding SLI, under any circumstance.

This didn't use to be the case with SFR. SFR didn't add input lag, but it also scaled horribly, so both AMD and Nvidia did away with it.

In some cases you can make SLI work for you though, but only in DX9 titles, as SLI still typically allows using SLI AA modes. With SLI AA, you get a small performance increase (and much better AA) compared to doing AA in one pass, and SLI AA does not add any input lag, so it may actually improve it, by raising the frame rate.


Now that we've established that multi-GPU AFR modes make input lag much worse, rather than better, let's also discuss the fact that the render pipeline is not the only source of input lag. Lag accumulates through the entire pipeline: your mouse has some input lag, the game engine/CPU adds a whole bunch, as do non-CRT displays.

Overall those other sources will typically add up to be a MUCH larger source of input lag than the render pipeline itself.

So then the question becomes: does it really matter? Should we be worried about input lag? This is a discussion that gets about as thorny as the one regarding whether high-end audiophile gear actually sounds better than run-of-the-mill decent hi-fi stuff.

If we refer to the research on the subject, we find that human beings generally visually interpret anything within 100ms of total delay as instant. Sound is very different: we are much more capable of detecting timing being off in sound than visually. This is why it is so easy to tell if a drummer is off, even by a few ms.

But sound is a different topic, we are concerned with display output right now.

So we know that the total input lag should ideally be below 100ms to be undetectable.

This means:

Monitor input lag + render pipeline input lag + CPU/Game Engine input lag + mouse/keyboard input lag all together must be 100ms or smaller.

From my perspective, you can certainly put together a system with atrocious input lag that I wouldn't want to play a game on, but it will be due to these sources adding up. I guess if you generally have a good setup, with a fast CPU, a decent mouse, and a screen with low input lag, you can probably get away with high render-pipeline input lag no problem, even with SLI.

If you don't, however, they can add up and get pretty bad.
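To make that concrete, here's a rough budget check against the ~100ms threshold. All the component numbers below are made-up examples, not measurements from any particular setup:

```python
# Rough check of a total input-lag budget against the ~100 ms perception
# threshold discussed above. All component values are hypothetical examples.

THRESHOLD_MS = 100.0

def total_input_lag_ms(monitor_ms, engine_ms, input_device_ms, fps, frames_of_lag):
    """Monitor + game engine/CPU + mouse/keyboard + render-pipeline lag."""
    render_ms = 1000.0 * frames_of_lag / fps
    return monitor_ms + engine_ms + input_device_ms + render_ms

# Example: decent LCD, typical engine latency, USB mouse, single GPU at 60 fps.
lag = total_input_lag_ms(monitor_ms=20.0, engine_ms=30.0, input_device_ms=8.0,
                         fps=60, frames_of_lag=1.0)
print(f"Total ~{lag:.1f} ms -> {'under' if lag < THRESHOLD_MS else 'over'} the 100 ms threshold")
# -> ~74.7 ms total, comfortably under 100 ms even before any tuning.
```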


So to sum all that up:
  • Input lag from the render pipeline can matter, but if everything else is good, it probably won't.
  • 2x SLI AFR INCREASES render-pipeline input lag to between 2x and 3x at the same FPS compared to a single GPU, so NEVER add SLI to try to combat input lag. It will have the opposite effect, and significantly so. This only gets worse with 3-way (3x-4x) or 4-way (4x-5x).
  • Unless you are using a CRT, your biggest source of input lag is probably going to be your screen, not your render pipeline (unless you use SLI, in which case the pipeline's share goes up).
  • Most perceived differences in input lag up to a certain point are placebo effect and go away in a controlled study. 100ms tends to be the threshold below which input lag isn't discernible, but for some people it's higher than that, and the figure tends to go up with age :p
  • Input lag is mostly unimportant in single-player games. Fast-paced online multiplayer games are a different animal, but in something like Fallout, even if you did have high input lag above 100ms total, it probably wouldn't be a big deal.
 
Good work there, guys. I must say that in all the godray examples you showed except the last (lamppost) I preferred the high godrays over the ultra.

And I'll echo the comments about DSR.

With specific regard to godrays, I remember them from Far Cry 4 - perhaps you could compare and contrast to show the development of the technique.

I was, however, disappointed in two aspects: firstly there's no testing at 4K, and secondly there are no VRAM usage figures. I'm sure you'll cover these in your full article.

Something else you might like to consider is not graphical quality but visual quality with respect to higher-DPI monitors.

Anyway, thanks for the first look and I'm looking forward to what follows. There's lots for you to do, so get cracking! :D
 
Been playing at 4K on my R9 290 just fine. If you have a better card than that, then there shouldn't be an issue.
 