GeForce GTX 970: NVIDIA's Recommended GPU For Fallout 4

HardOCP News

According to the official GeForce website, if you are building a system today to play Fallout 4, the GeForce GTX 970 is the card to get.

The official Fallout 4 system requirements call for a GeForce GTX 550 Ti with 2GB of VRAM for 1280x960, minimum-setting, 30 FPS gameplay, and a GeForce GTX 780 with 3GB of VRAM for a higher level of detail. Neither of these previous-generation cards is available to purchase, however, so if you're looking to build a system for Fallout 4 today, or to upgrade an older rig, which GeForce GTX GPUs should you be targeting for High-setting, 60 FPS gameplay? According to our comprehensive benchmarking, the GeForce GTX 970.
 
Ars has a review online; it looks like a 2012 game as far as visuals go. I'm sure a 970 would run it, no problem.
 

The Ars review is for the PS4, but yes, it doesn't look like this game has top-notch graphics. That's par for the course for Fallout games, though.

The link below has a good guide on how to set up your GPU and what impact all the graphics settings have. Excited to fire up my Titan X for this; it's been sitting idle since I beat The Witcher 3 months ago.

http://www.geforce.com/whats-new/guides/fallout-4-graphics-performance-and-tweaking-guide
 
Glad I have a Titan X and, for the most part, don't have to adjust too much to play Fallout 4. I know I won't be able to max everything at 4k or 1440p but at least I'll be able to turn most things close to max.
 

I'm just guessing, but Titan X probably can max this out at 1440p.
 

Ha, totally missed the PS4 part.

But still, the series has never been a groundbreaker in the graphics department.
 
The inevitable texture mods, draw distance mods, and other miscellaneous graphics mods should be able to make even a Titan X choke if you try to use them all.
 
I'm just guessing, but Titan X probably can max this out at 1440p.
Not according to NVIDIA...

[Chart: fallout-4-nvidia-geforce-gtx-900-series-performance.png]
 
GTX 970, check, although I am a bit worried about my aging Core i5-750.

If you have it OC'd to at least 4.2GHz you will get decent performance; however, it may still present a bottleneck in some games. Don't know about Fallout 4 specifically.
 
Not according to NVIDIA...

That's with God Rays on Ultra. If you look at the comparison, it's VERY difficult to tell the difference between Ultra and Low for this GameWorks feature.

Honestly, I'm pretty sure that's the only thing affecting performance so heavily, and it's probably hitting the AMD cards hardest.


People are gonna play tonight, turn it to Low, and be playing at high frame rates without issue.
 
Oh I know it bottlenecks some games. I plan on building new in Feb next year so I may just have to turn some settings down until then.

BTW, it is running at 3.4GHz with a Corsair H80 cooler. Have to remember this is normally a multiplier-locked 2.66GHz part.
 
I think some of the quality settings have a pretty huge impact on frame rate with little if any benefit to the end user. Specifically, "God Rays":

From NVIDIA's website:
http://images.nvidia.com/geforce-co...-interactive-comparison-001-ultra-vs-low.html
http://images.nvidia.com/geforce-co...-interactive-comparison-002-ultra-vs-low.html
http://images.nvidia.com/geforce-co...-interactive-comparison-003-ultra-vs-low.html
http://images.nvidia.com/geforce-co...-interactive-comparison-004-ultra-vs-low.html
http://images.nvidia.com/geforce-co...-interactive-comparison-005-ultra-vs-low.html
http://images.nvidia.com/geforce-co...-interactive-comparison-006-ultra-vs-low.html

Only the 5th one shows any easily noticeable difference. The "Ultra" quality looks to give at least a 30% frame rate hit on a 980 Ti vs "Low" at 1080p.

So the vendor includes proprietary tech in a new game which adds little value, but at "Max" settings makes their older GPUs seem inferior, pushing their customers to upgrade to newer chips...

I think I'll just turn down the settings, have it look virtually the same, and keep using my old card.

Good try NV... :rolleyes:

Let me know when GameWorks gives me some actual value instead of just forcing a game to play like crap on a relatively new card.
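That ~30% figure is easy to sanity-check, by the way. A quick sketch (the FPS numbers here are hypothetical placeholders, not from NVIDIA's benchmark):

```python
def fps_hit_percent(fps_low: float, fps_ultra: float) -> float:
    """Percent frame-rate loss going from the Low preset to Ultra."""
    return (fps_low - fps_ultra) / fps_low * 100

# Hypothetical example: 100 FPS on Low dropping to 70 FPS on Ultra
print(f"God Rays on Ultra costs {fps_hit_percent(100, 70):.0f}% of your frame rate")
```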
 
Rinse & repeat Gameworks formula is starting to be predictable...
 

I don't see any difference. What the fuck?
 
Random question related to Fallout 4 performance: I currently have a Sandy Bridge i5 and a GTX 670. I'm budgeting for a system upgrade in about 2 years (long story, but no cash will be available for a while). Do you guys think getting a 4GB GTX 960 would give a noticeable performance boost over the GTX 670? My guess is probably, but I'm wondering if anyone has practical experience with upgrading between components of these classes.
 
I don't see any difference. What the fuck?

The differences are minimal, but they are there. A couple of the pictures look much blurrier on Low. However, I'd like to check first how different they are in motion. I remember Far Cry 4 was a similar case: in screenshots the settings looked almost identical, but in motion the difference was very noticeable.
 
Random question related to Fallout 4 performance: I currently have a Sandy Bridge i5 and a GTX 670. I'm budgeting for a system upgrade in about 2 years (long story, but no cash will be available for a while). Do you guys think getting a 4GB GTX 960 would give a noticeable performance boost over the GTX 670? My guess is probably, but I'm wondering if anyone has practical experience with upgrading between components of these classes.

If you can't play the game now, the 960 will allow it. Don't expect 2x the performance, but going from 2GB to 4GB of VRAM will help at 1080p, and it will draw roughly a tenth of the power at idle and a lot less when gaming. The 960 is kind of a dog, though; it's a $250 970 or bust right now.

Just look at the chart posted: a 670 is pretty damn close to a 770 (±10%).
 
Random question related to Fallout 4 performance: I currently have a Sandy Bridge i5 and a GTX 670. I'm budgeting for a system upgrade in about 2 years (long story, but no cash will be available for a while). Do you guys think getting a 4GB GTX 960 would give a noticeable performance boost over the GTX 670? My guess is probably, but I'm wondering if anyone has practical experience with upgrading between components of these classes.

I've seen 970s for ~$250 used with warranty in FS/FT. Don't hobble yourself; find another $50 somewhere and be happier.
 
Rinse & repeat Gameworks formula is starting to be predictable...

This isn't a BlameWorks title. This doesn't have any NVIDIA-exclusive graphics settings. This doesn't have any graphics settings that can't be adjusted or disabled to suit your particular GPU, regardless of brand.

Sorry, sparky. Back to the outrage drawing board.
 
If you can't play the game now, the 960 will allow it. Don't expect 2x the performance, but going from 2GB to 4GB of VRAM will help at 1080p, and it will draw roughly a tenth of the power at idle and a lot less when gaming. The 960 is kind of a dog, though; it's a $250 970 or bust right now.

Just look at the chart posted: a 670 is pretty damn close to a 770 (±10%).

I've seen 970s for ~$250 used with warranty in FS/FT. Don't hobble yourself; find another $50 somewhere and be happier.

Duly noted, thank you for the sanity check. Time to explore FS/FT.
 
This isn't a BlameWorks title. This doesn't have any NVIDIA-exclusive graphics settings. This doesn't have any graphics settings that can't be adjusted or disabled to suit your particular GPU, regardless of brand.

Sorry, sparky. Back to the outrage drawing board.


The God Rays are a GameWorks feature. It may not be NVIDIA-exclusive, but it's definitely GameWorks. And it definitely hits AMD cards harder.

Good thing the difference between Low and Ultra isn't noticeable... like, at all.
 
So vendor includes proprietary tech in new game which adds little value, but at "Max settings" makes their older GPUs seem inferior, causing their customers to upgrade to newer chips....

God Rays isn't NVIDIA-exclusive, nor did NVIDIA develop this game. And why would their "customers upgrade to newer chips" when they can simply adjust the setting downward? The cool thing about adjustable settings is that you can adjust them.

Not sure why people are acting like any of these graphic settings are mandatory, or can't be adjusted or disabled.
 

The implication is that Nvidia designs their GameWorks features to only work well on their current generation hardware and make their older hardware and AMD's look worse.

Good to know; God Rays on Low sounds like the ticket.
 
The God Rays are a GameWorks feature. It may not be NVIDIA-exclusive, but it's definitely GameWorks. And it definitely hits AMD cards harder.

Oh well. I'd rather have an extra graphic feature that can be turned down or off than developers omitting features because they don't run identically on both brands of cards and someone might get their feelings hurt. Do we really want the class moving at the speed of the slowest student?
 
The implication is that Nvidia designs their GameWorks features to only work well on their current generation hardware and make their older hardware and AMD's look worse.

I understand the conspiracy theory, but it seems a little silly when you factor in that God Rays obviously pre-date Maxwell. Meanwhile, nobody really seems to get upset at game developers, who are the ones actually making the choice to implement these features in their games.

I have no doubt that when Pascal releases, whatever leap forward it demonstrates over Maxwell/Kepler in tessellation or any other area will be deemed "a conspiracy to make AMD and older Nvidia cards look bad". You could set your watch to it.
 
Got a 980 Ti and it works on max at 1440p as expected, with occasional stutters, but the drivers hadn't come out yet when I first played it before bed. Struggling to believe you'd need a 970 for a decent experience given how well it runs for me.

Got out into the wasteland and it all still runs fine.


Heavily slanted toward playing with a controller rather than keyboard and mouse though, which is shitting me. Couldn't even use the keyboard and mouse until I unplugged my controller.

Specs for what it's worth

3770K @ 4.6GHz (summer overclock)
Gigabyte G1 Gaming 980Ti
16GB RAM @ 1800
Maximus Formula V
Creative ZxR Soundcard

This is what GeForce Experience reckons for my PC, but we all know what to think of that. Might need to take it down a bit when I get into more intense areas, I guess.

[Image: Fallout_4_Specs.png]
 
Rinse & repeat Gameworks formula is starting to be predictable...

All the reviews I have read point to the AMD driver and interaction with the CPU.

Gamersnexus said the following:

"That leads me to believe this is either a game optimization or, more likely, a driver optimization issue. We contacted AMD late last week in search of the seemingly-inevitable Fallout 4 day-one drivers, but were told the drivers weren't ready yet."

Digital Foundry found the same.

You can't just blame NV or GW when AMD screws up again. There is zero excuse for AMD not having a day one driver for a game as big as FO4.

NV had a day one driver ready and AMD didn't bother; that is the crux of dealing with AMD.
 

The behind-the-scenes politics are terrible, but let's take AMD out of the discussion entirely. Nvidia is messing up even their last-generation fan base with GameWorks, which is just awful, but pushing proprietary code is the logical thing for a hardware manufacturer to do to boost sales artificially.
 
Well, makes me glad I didn't upgrade the video card in my laptop just yet. I may just be able to play it and get away with lower settings. So YAY!
 

Am I the only one that thinks the low setting looks better than ultra? It smooths out the edges more. You're looking at the frickin' sun - I wouldn't expect the image to be sharp.
 
The behind-the-scenes politics are terrible, but let's take AMD out of the discussion entirely. Nvidia is messing up even their last-generation fan base with GameWorks, which is just awful, but pushing proprietary code is the logical thing for a hardware manufacturer to do to boost sales artificially.

Kepler is doing fantastic in this game. So how is NVIDIA's lighting to blame if Kepler does excellently in FO4? If that were the case, Kepler would do poorly here too. Try again with another b.s. theory; maybe something will stick someday.
 

You'll notice you never really hear Kepler users complaining. Instead it's AMD users, overly concerned on their behalf, who seem to do the complaining. Often they're still trying to beat the dead horse of early-days Witcher 3 performance from before Nvidia fixed the driver for certain older cards. Everything is a conspiracy nowadays, as frustration with Nvidia's performance lead has grown and AMD finds itself in the twilight hours as a company.
 
They max out with the 980 Ti, and I don't expect that is OC'd. So it might be possible to hit 60 FPS with an overclocked Titan X. Guess we'll know soon enough.

Check out the graph I posted: the Ti gets 39.2 FPS maxed out at 1440p; to get to 60 you would need a ~50% effective overclock...
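For what it's worth, the arithmetic on that claim checks out. A quick sketch:

```python
def required_speedup(current_fps: float, target_fps: float) -> float:
    """Percent performance increase needed to go from current_fps to target_fps."""
    return (target_fps / current_fps - 1) * 100

# 39.2 FPS maxed out at 1440p, aiming for a locked 60 FPS
print(f"{required_speedup(39.2, 60):.0f}% faster needed")  # → 53% faster needed
```

So "50% effective overclock" is, if anything, slightly optimistic.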
 