Fallout 4 Performance and Image Quality Preview @ [H]

It's basically Skyrim with a Fallout total conversion applied, plus all the console-port fun we've come to know in this industry.

Yes, everything is just a reskin of everything. :rolleyes: The overhauled gunplay, crafting, and base-building are just a hallucination.
 
The thing that gets me is the minimum system specs. Everyone says this could pass for a 2012 game, yet it definitely won't run on 2012 hardware. We tried a mobile FirePro card (roughly equivalent to a desktop HD 7770 in 3DMark) on low settings, and even at 720p the menu screen was very choppy. I can understand maybe needing a 7870 for, say, 1080p on low, but when an HD 5850 can't even run the game at 720p on low, there is no scaling taking place.

If I had the game I'd give a GTX 580 a shot, but chances are it won't run well, even though there's no obvious reason why it shouldn't.
 
It would have been nice to see comparisons with godrays on Off and Low. Remove them altogether and you have a DX9-looking title that would be dated even by 2012 standards. I can think of plenty of other aspects of the game I'd rather sacrifice half my performance improving than godrays. While this may be a bit subjective, the game looks better without them. Even with a 6950 the game is still playable with nearly all settings on Ultra.
 
No. It never was and never will be. At best that's misdirection attempting to explain why AMD cards perform so poorly. You wanna know what the real problem is?

nVidia: Drivers released on launch day
AMD: Nothing yet

After all these years of slow driver releases and driver issues, AMD still can't get it right.

On a different note, thanks [H] for the review. I'm running it at 2560x1440 with everything on Ultra on my 980 Ti and it's awesome. I went with TAA and didn't bother with FXAA - after reading the review I'm going to have a look at FXAA tonight.

There is one issue that you failed to mention: we have never seen a game where an AMD card is 30-40% slower than a similarly priced nVidia card except when it has nVidia Gimpworks in it, and even then the graphics at least kind of offset the performance impact. But a game that looks like it was released in 2012 and runs miserably on AMD hardware only shows one thing: shoddy coding done on purpose to harm AMD hardware. According to some, the godrays use heavy tessellation in a fashion that puts AMD's tessellator at a disadvantage; it might be a good idea to use AMD's CCC to dial the tessellation factor back to 16x and see how it performs. Something similar happened with Geralt's hair in The Witcher 3. It's frustrating that every time a game is released under the TWIMTBP banner it comes riddled with performance issues and bugs, while a game that uses none of that, or carries the AMD banner, runs just as well or better on GeForce hardware.
 
Zarathustra[H];1041965569 said:
...but then I took a nuke to the knee. :p

LOL! :D



Anyway, I applied SMAA with ReShade and it's a MUCH better option than the in-game FXAA (maybe NVIDIA CP forced FXAA is better, dunno; it was in Skyrim's case). Not as effective at removing aliasing as TAA, but at least it lacks the fugly vaseline-smeared look. Anyone who thinks the AA options are lacking and doesn't have enough horsepower to run DSR/VSR above 1080p should give it a try.
 
Not everyone is happy with ~60 FPS. Many appreciate a sharper and more fluid picture.

The issue is that it does not always work that way. If you pair two very powerful cards and play at very low resolutions, you run into CPU bottlenecks: the game isn't demanding enough to scale properly, so you get high FPS along with some stuttering. I have a single R9 290X (overclocked) and I can max everything at 2K, even The Witcher 3 and Far Cry 4. Now imagine a Titan X: the APIs aren't well coded for multithreading, let alone for scaling properly across two GPUs and balancing the load so you can actually use the hardware and reach sky-high FPS. It's just a waste of money, and it only shows he doesn't have much of a clue about computer hardware and how games work.
 
There is one issue that you failed to mention: we have never seen a game where an AMD card is 30-40% slower than a similarly priced nVidia card except when it has nVidia Gimpworks in it, and even then the graphics at least kind of offset the performance impact. But a game that looks like it was released in 2012 and runs miserably on AMD hardware only shows one thing: shoddy coding done on purpose to harm AMD hardware.

This is kinda delusional, no offense. You really think Bethesda set out to piss off 20% of their customers? Why would they do that? That's just bad business. Maybe instead you should be asking AMD why they can't be bothered to release a game driver.

The grasping at straws and conspiracy theory is really getting sad.
 
This is delusional. You really think Bethesda set out to piss off 20% of their customers? Why would they do that? That's just bad business. Maybe instead you should be asking AMD why they haven't bothered to release a game driver.

The grasping at straws and conspiracy theory is really getting ridiculous.

Wrong again. The DX API is too high-level to even explore and predict how it's going to behave on different hardware configurations. Just put down the green Kool-Aid and check this out.

List of games that use Lameworks and have issues:

1) AC Unity - Terrible performance, glitches and bugs

2) Batman AK - Never got fixed

3) The Witcher 3 - After 20GB of patches, it works.

4) Far Cry 4 - Performed badly on Kepler. After 8GB of patches and drivers, it now runs well.

5) GTAV - Getting better

6) Project Cars - Never worked right, as it falls back to the CPU for PhysX when it detects an AMD card

7) Dying Light - Performed badly on Kepler and AMD.

8) Fallout 4 - We're seeing a GTX 770 match a 390X.

There is NO REASON a lowly GTX 770 should match a 390X, no matter what. And all those games have something in common: Lameworks! Why does every game launched under that banner have issues? Why doesn't this happen with games that are more agnostic, like COD for example? Or even games that use the AMD banner, like BF4, Tomb Raider, or Ryse?
 
There is one issue that you failed to mention: we have never seen a game where an AMD card is 30-40% slower than a similarly priced nVidia card except when it has nVidia Gimpworks in it, and even then the graphics at least kind of offset the performance impact. But a game that looks like it was released in 2012 and runs miserably on AMD hardware only shows one thing: shoddy coding done on purpose to harm AMD hardware. According to some, the godrays use heavy tessellation in a fashion that puts AMD's tessellator at a disadvantage; it might be a good idea to use AMD's CCC to dial the tessellation factor back to 16x and see how it performs. Something similar happened with Geralt's hair in The Witcher 3. It's frustrating that every time a game is released under the TWIMTBP banner it comes riddled with performance issues and bugs, while a game that uses none of that, or carries the AMD banner, runs just as well or better on GeForce hardware.
Are you really fucking serious? I mean... seriously? Really?

Ok, so Bethesda deliberately set out to make their current flagship game play terribly on AMD cards, because fuck AMD and everyone who uses their cards. Yes, they might be paying customers of Bethesda and Bethesda would be shooting themselves in the face repeatedly by doing this, but fuck them all.

Seriously dude, lay off whatever it is you're taking that's causing the delusional paranoia. Reality is beckoning and it tastes good.
 
Are you really fucking serious? I mean... seriously? Really?

Ok, so Bethesda deliberately set out to make their current flagship game play terribly on AMD cards, because fuck AMD and everyone who uses their cards. Yes, they might be paying customers of Bethesda and Bethesda would be shooting themselves in the face repeatedly by doing this, but fuck them all.

Seriously dude, lay off whatever it is you're taking that's causing the delusional paranoia. Reality is beckoning and it tastes good.

Instead of sticking to the facts, you just deflect them with no evidence or data that could prove me wrong; typical of fanboys. Just check the list of games I posted and see for yourself, if you can handle the facts, which I certainly doubt.
 
Dumb question... to use FXAA, does it also need to be turned "on" under the NVIDIA control panel? Or is that just an override for programs that don't have built-in FXAA as an option?
 
PLEASE TAKE THE "GAMEWORKS IS THE DEVIL" ARGUMENTS OUT OF THIS THREAD.
 
TI, probably not. GTX 980 though, sounds like you might need that to get 1440p to run well.

Currently playing this at 1440p w/ a Phenom II X6 and a GTX 770 with no issues. Sub-60 in some areas but nothing too noticeable. Max settings besides low godrays.
 
Dumb question... to use FXAA, does it also need to be turned "on" under the NVIDIA control panel? Or is that just an override for programs that don't have built-in FXAA as an option?

That is an override to use FXAA in any game you wish. Don't turn both on at the same time. Unless you like FXAA so much that you turn FXAA on your FXAA so you can blur while you blur.

...couldn't resist, sorry.
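For what it's worth, the in-game choice is stored in Fallout4Prefs.ini under Documents\My Games\Fallout4, so you can check which one the game itself is applying. A rough sketch of the relevant lines as I understand them (I believe the accepted values are TAA, FXAA, or blank for off, but treat the exact key name as an assumption on my part):

[Display]
; "TAA" = temporal AA, "FXAA" = fast approximate AA, empty = in-game AA off
sAntiAliasing=TAA

If that line says TAA and you also force FXAA in the driver, you really are blurring your blur.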
 
Staying on topic: why not ask AMD for a refund if you can list 10-plus games you are consistently getting fucked in? Maybe it's time to realize there's a common problem with one company.
 
Staying on topic: why not ask AMD for a refund if you can list 10-plus games you are consistently getting fucked in? Maybe it's time to realize there's a common problem with one company.

Oh yeah, I forgot that AMD is a game developer and that the games released under their banner ran like crap on nVidia hardware, oh wait!
 
Your use of "Lameworks" and stuff like that tells me you have reached not only Jonestown levels of AMD Kool-Aid drinking, but with a dash of Westboro to boot.

You really think a company will purposely fuck their market share?

Eh, I'm not sure it's Bethesda who would purposely screw half of their customers out of malice with Dr. Evil laughter. It's probably just a slightly unfortunate side effect of using the little black box of special effects that GameWorks is. No, it's most likely Nvidia doing the Dr. Evil laugh, if the accusations are true.
Whether it's all paranoia or not, that is still an impressive list of Nvidia-sponsored titles with initially mysterious performance issues on AMD and older Nvidia cards, all released in the last couple of years. The possibility of coincidence is running thin.


Anyway, better stick on topic before banhammers start falling.
 
Wth are you talking about? You sound like a child speaking of things he can't even understand. You think high FPS rates are just about having 2x Titan Xs water cooled? You think input lag is just about FPS? LOLing really bad...

So you are concerned about Vsync? Have you never played any Fallout before, or New Vegas, or Skyrim? They have always used the same engine with physics tied to refresh rate; that's why they use that refresh-rate cap. It's not a 60fps cap as many say, the cap is the refresh rate of your monitor... Everything in those games is tied to refresh rate: motion, movement speed, physics, particles, etc. That's why the game behaves crazy when you disable vsync (via the .ini file) and the FPS is all over the place...

On the other hand, thanks Brent for the preview, great as always... waiting for the full review.

This... I've always had to have vsync disabled in Skyrim and New Vegas to avoid objects jumping around for no reason.

:)
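For anyone who wants to poke at it, the vsync toggle people are referring to lives in Fallout4Prefs.ini under Documents\My Games\Fallout4. A rough sketch of the commonly reported setting (treat the key name as my assumption, and remember the physics/speed weirdness described above if you set it to 0):

[Display]
; 1 = vsync on (the default; framerate capped at your monitor's refresh rate)
; 0 = vsync off; expect objects, movement speed and physics to misbehave, since the engine ties them to framerate
iPresentInterval=1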
 
I'm sure that just when testing is done, a new AMD driver will come out and Kyle and Brent will have to start over. All will be right in the (AMD) world.
 
I could tell when the first pictures came out that this was going to look like a game from 5 years ago... I hope the story is as good as it's alluded to be.

I loved the Fallout RPGs; we even used to play a tabletop game similar to them because they were so darn fun. Then they released the FPS and things kind of went downhill for me; I sort of stayed away. I like the franchise, but man, it's been diluted. And seeing these issues makes me want to stay away even more. It's sad: this started out PC-first, and now they have moved to consoles.

Why do Nvidia cards do so well when AMD hardware runs the consoles? If it was indeed a cheap port, wouldn't you think the AMD cards would be better optimized?
 
Currently playing this at 1440p w/ a Phenom II X6 and a GTX 770 with no issues. Sub-60 in some areas but nothing too noticeable. Max settings besides low godrays.

That's good to hear; I was getting the impression it would take a bit more than that to really run smoothly. I'm guessing memory must be the real holdup keeping some of the older cards from running the game properly.
 
Wth are you talking about? You sound like a child speaking of things he can't even understand. You think high FPS rates are just about having 2x Titan Xs water cooled? You think input lag is just about FPS? LOLing really bad...

So you are concerned about Vsync? Have you never played any Fallout before, or New Vegas, or Skyrim? They have always used the same engine with physics tied to refresh rate; that's why they use that refresh-rate cap. It's not a 60fps cap as many say, the cap is the refresh rate of your monitor... Everything in those games is tied to refresh rate: motion, movement speed, physics, particles, etc. That's why the game behaves crazy when you disable vsync (via the .ini file) and the FPS is all over the place...

On the other hand, thanks Brent for the preview, great as always... waiting for the full review.



It's OK, his idea of sponsorship for his OGL/CAL-O team was a free sandwich and a pop.
 
Really disappointed that it only supports single-monitor display at launch. FFS, I really wanted Fallout awesomeness at 3240x1920 (or bigger) without jumping through the config hoops.
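Until official surround support shows up, the usual config hoops are manual edits to Fallout4Prefs.ini in Documents\My Games\Fallout4. A rough sketch of the kind of edits people report for a spanned 3240x1920 setup; the key names are what I remember from other Bethesda titles, so treat them as assumptions, and FOV/HUD stretching is a separate battle:

[Display]
; custom resolution the launcher won't offer
iSize W=3240
iSize H=1920
; borderless windowed, so the custom resolution isn't rejected in exclusive fullscreen
bFull Screen=0
bBorderless=1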
 
Great quick read. I don't agree with the AA choice, but I love those sliders.

17 hours in and it's a blast. The settlement management/design and equipment upgrades are a lot of fun. I spent 3-4 hours last night building a damn fence to funnel the bad guys into my lines of fire. And that's only one settlement :eek:.
 
I thought FXAA always added blur as well... is it something unique about the Creation engine that makes FXAA act differently? Or does FXAA just add less blur than TAA?
 
I realize it's all subjective...but I still find TXAA better than FXAA for this particular game.

With TXAA, the grass, bushes and faraway objects look MUCH smoother especially with any sort of motion. Maybe my eyes just aren't that sensitive to it, but I'm not seeing that "vaseline" filter with TXAA, either.

Again, all subjective I realize, but with TXAA the game just looks smoother to me.
 
^ BTW it isn't TXAA; TXAA is an NVIDIA GameWorks feature that only works on NVIDIA GPUs.

This game uses something called TAA, which works on both AMD and NVIDIA GPUs; it is shader-based. It might stand for Temporal AA.
 
Speaking of, what version of DX is FO4 using?
That's a surprisingly difficult question to answer. I haven't seen anything official that even mentions DirectX. The original engine would have been DX9, but they'd need at least DX10 for geometry shaders and/or tessellation. My guess would be DX11, as that's the most recent and widely supported version. However, the minimum requirements listed on Steam should all support DX12/Vulkan to the best of my knowledge. The game seems to run fine with mostly Ultra settings (a few rough areas aside) on a card even below the minimum recommended hardware.
 
Okay, back on subject. Kyle and Brent, I am impatiently waiting on the full review so I can figure out which GPU I am going to buy. I got my current card (ASUS Direct CU HD 6870) here:

[Attached image: P1000807.jpg]


That is Kyle holding up either my card or one of its sisters furnished by ASUS. I'm the guy in the green t-shirt on the far right, just above the guy with the tan hat, right before I won it at the AMD [H]ard|OCP GamExperience on 7/16/11. That card does not meet the minimum spec for FO4, so I need to really, really upgrade my GPU, LOL. Except for upgrading my system memory to 16GB and getting a new GPU, I think the rest of my system is good for now (I hope :eek:).
 
That's a surprisingly difficult question to answer. I haven't seen anything official that even mentions DirectX. The original engine would have been DX9, but they'd need at least DX10 for geometry shaders and/or tessellation. My guess would be DX11, as that's the most recent and widely supported version. However, the minimum requirements listed on Steam should all support DX12/Vulkan to the best of my knowledge. The game seems to run fine with mostly Ultra settings (a few rough areas aside) on a card even below the minimum recommended hardware.

FWIW, MSI Afterburner shows DX11 in the OSD.
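If anyone wants a second opinion without an OSD, one quick sanity check (standard Windows, nothing Fallout-specific) is to see whether Fallout4.exe has the DX11 runtime loaded while the game is running. From a command prompt:

tasklist /m d3d11.dll
rem lists every running process with d3d11.dll loaded; Fallout4.exe showing up here,
rem and not under "tasklist /m d3d9.dll", lines up with the Afterburner DX11 reading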
 
As of this writing, SLI seems to be broken in Fallout 4. There is an interim fix, though it works for some but not all: simply open the NVIDIA Control Panel and change the SLI rendering mode to "Alternate Frame Rendering 2."

On the GeForce forums, Nvidia has confirmed it's a problem with the SLI profile and they are working on a new one.

I'm running two GTX 980s in SLI and AFR2 doesn't seem to make a difference. With the default Nvidia profile, one GPU is at 99%, the other is idle, and my FPS bounces between 42 and 60 according to Afterburner. With AFR2 set, I get the same framerates but both GPUs are at around 99%. This is at 3440x1440.

Changing the SLI compatibility bits to Fallout 3/NV doesn't work either, as the game appears to be running DX11. That setting is for DX9.
 
This is a typical Fallout game. It's a hilariously buggy but fun hot mess with graphics ranging from mediocre to poor. It runs fine maxed out on my GTX 980 at 1920x1200: a constant 60 FPS with vsync on, and pretty consistently around 100 FPS with it off. I don't think the Fallout games have ever been known for their graphics. Fallout 2 came out around the same time as Half-Life 1, yet parts of it looked like a Super Nintendo game.
 
Meh, glad I didn't preorder. I will wait for a sale on steam and optimized drivers.
 
The thing that gets me is the minimum system specs. Everyone says this could pass for a 2012 game, yet it definitely won't run on 2012 hardware. We tried a mobile FirePro card (roughly equivalent to a desktop HD 7770 in 3DMark) on low settings, and even at 720p the menu screen was very choppy. I can understand maybe needing a 7870 for, say, 1080p on low, but when an HD 5850 can't even run the game at 720p on low, there is no scaling taking place.

If I had the game I'd give a GTX 580 a shot, but chances are it won't run well, even though there's no obvious reason why it shouldn't.

Well, on my 7970 it runs just fine on Ultra (35-60 fps). I haven't messed with the godray settings yet, but that should be interesting. Maybe on lower-end cards it won't play, but that's to be expected.
 
That is Kyle holding up either my card or one of its sisters furnished by ASUS. I'm the guy in the green t-shirt on the far right, just above the guy with the tan hat, right before I won it at the AMD [H]ard|OCP GamExperience on 7/16/11. That card does not meet the minimum spec for FO4, so I need to really, really upgrade my GPU, LOL. Except for upgrading my system memory to 16GB and getting a new GPU, I think the rest of my system is good for now (I hope :eek:).


Offtopic: Kyle could've easily been one of the supporting players on Sons of Anarchy.
 
Why doesn't this happen with games that are more agnostic, like COD for example? Or even games that use the AMD banner, like BF4, Tomb Raider, or Ryse?

Some CoD games used GameWorks, as did MGS V. Tomb Raider was broken at release for Nvidia users from a performance standpoint; some quick patches and drivers jumped performance up by 40% or so.
 