Fallout 4 to feature Nvidia GameWorks.

I have high hopes for this game. I'm certainly not feeling pessimistic at all. In fact, the more options a game has, the better, IMO. The game isn't even out yet, and nothing has been confirmed about which settings we're getting. I just can't understand all the doom-and-gloom posts. This is one of the biggest releases this year.
 
Sweet; the more games that can offer additional features to NVIDIA customers via GW, the better. The only people I can see being upset about this are the few customers AMD has left.
 
The "I must max everything at 60+ fps" mentality is the real problem.
People see a slider and they want to max it out. It's impossible for some people to understand that some sliders are meant for people running $2,000+ PCs. You'd think that would make more sense these days, with everyone whining about consoles holding back PC gaming, PC games being more customizable, etc. It makes even more sense for Nvidia to target that demographic, because they succeed the most at the enthusiast level.

If you go to Nexus and download every single graphics mod for Skyrim, it will run like shit. You just have to know where your limits are.

I had a 280X and 5870 for a long time. I turned graphical settings off. They are budget cards. I turned off TressFX. I turned off HBAO+. I turned off HairWorks. And I never whined about it. People who own 960s, 380s, 390s, 970s, etc, will have to make similar sacrifices at varying degrees of graphical quality. Those are just the sacrifices you make as technology moves forward, and the only people whining about it are those who are bitter about being left in the dust.

If you don't want to upgrade your PC, get a console. But if someone else wants to buy $1,000 worth of GPUs to render fancy hair then I think it's fair for AMD/Nvidia to ensure those PC gamers have access to those features. That is why they are selling expensive GPUs.

As long as the game isn't artificially crippled even with GameWorks disabled, then it's fine. If you don't like the features, then turn them off. But I don't like seeing people whine about free graphics options they aren't being forced to use.
 
If this were a badged BlameWorks title, Nvidia would already have been promoting the fact for months. So cool your tits, ladies.

Never mind that this all originated from Wccftech, which isn't even a rumor site anymore. These days they just make shit up for the clickz.
 
So what's the score on launch-time glitches for games that are GW-badged vs. games that are not? I remember Batman: Arkham Knight was a mess, and AC: Unity was quite a mess as well.
 
The "I must max everything at 60+ fps" mentality is the real problem.
People see a slider and they want to max it out. It's impossible for some people to understand that some sliders are meant for people running $2,000+ PCs. You think that would make more sense these days with everyone whining about consoles holding back PC gaming, PC games being more customizable, etc etc. It makes even more sense for Nvidia to target that demographic because they succeed the most at the enthusiast level.

If you go to Nexus and download every single graphics mod for Skyrim, it will run like shit. You just have to know where your limits are.

I had a 280X and 5870 for a long time. I turned graphical settings off. They are budget cards. I turned off TressFX. I turned off HBAO+. I turned off HairWorks. And I never whined about it. People who own 960s, 380s, 390s, 970s, etc, will have to make similar sacrifices at varying degrees of graphical quality. Those are just the sacrifices you make as technology moves forward, and the only people whining about it are those who are bitter about being left in the dust.

If you don't want to upgrade your PC, get a console. But if someone else wants to buy $1,000 worth of GPUs to render fancy hair then I think it's fair for AMD/Nvidia to ensure those PC gamers have access to those features. That is why they are selling expensive GPUs.

As long as the game isn't artificially crippled even with GameWorks disabled, then it's fine. If you don't like the features, then turn them off. But I don't like seeing people whine about free graphics options they aren't being forced to use.

I totally agree, but the problem is people will keep crying about it endlessly instead of simply switching GW features off.
 
Are AMD fans going to have another cryfest about GameWorks with Fallout 4 now too? When are they going to realize it's AMD's fault for refusing to fix their DX11 overhead and for not making a chip with modern tessellation capabilities?

Like how most of you cried when TressFX was released in Tomb Raider. Lucky for you, AMD isn't Nvidia, and they released a patch; by that time I was a 780ti user.

So it's Nvidia's fault that PASCAL isn't a modern chip because it doesn't support async compute, right?
 
The "I must max everything at 60+ fps" mentality is the real problem.
People see a slider and they want to max it out. It's impossible for some people to understand that some sliders are meant for people running $2,000+ PCs. You think that would make more sense these days with everyone whining about consoles holding back PC gaming, PC games being more customizable, etc etc. It makes even more sense for Nvidia to target that demographic because they succeed the most at the enthusiast level.

If you go to Nexus and download every single graphics mod for Skyrim, it will run like shit. You just have to know where your limits are.

I had a 280X and 5870 for a long time. I turned graphical settings off. They are budget cards. I turned off TressFX. I turned off HBAO+. I turned off HairWorks. And I never whined about it. People who own 960s, 380s, 390s, 970s, etc, will have to make similar sacrifices at varying degrees of graphical quality. Those are just the sacrifices you make as technology moves forward, and the only people whining about it are those who are bitter about being left in the dust.

If you don't want to upgrade your PC, get a console. But if someone else wants to buy $1,000 worth of GPUs to render fancy hair then I think it's fair for AMD/Nvidia to ensure those PC gamers have access to those features. That is why they are selling expensive GPUs.

As long as the game isn't artificially crippled even with GameWorks disabled, then it's fine. If you don't like the features, then turn them off. But I don't like seeing people whine about free graphics options they aren't being forced to use.

So you're telling people to ditch their PCs and buy a console? Really?
And about GimpWorks: how were Batman and AC? Did you play those without any issues?
This is about fairness. A lot of people play on PC, and most of them have weak GPUs... Nvidia is just making sure that every game runs like crap in order to sell "high end" GPUs.
 
So what's the score on launch-time glitches for games that are GW-badged vs. games that are not? I remember Batman: Arkham Knight was a mess, and AC: Unity was quite a mess as well.
I struggle to think of a single AAA PC title, period, that did not have glitches at launch. PC games have always had varying degrees of problems upon release for as long as I've been on the platform.

People are blaming GameWorks for issues that have been inherent to PC gaming for decades. How quickly people forget the many, many examples of PC launch-day fuck-ups once they find some bullshit issue they can latch on to and use to fuel Internet arguments. Nvidia using GameWorks to intentionally cripple performance on AMD cards and Kepler is the conspiracy du jour right now, and people parrot it endlessly to try and earn nerd points and Reddit karma.

There's no collusion between Nvidia and developers to try and force you to buy a 980ti (I'm still waiting for someone to explain to me how that particular profit-sharing conspiracy works). It's a simple fact of 3D rendering life: smoke effects kill framerate, cloth effects kill framerate, and hair/fur effects really, really, REALLY kill framerate.
 
If this game runs like shit, this might finally push me over the edge to buy a console. I won't upgrade my PC past the 280x / 8350 and I'll keep my PC gaming for Steam Summer Sales and the occasional Non-Indie Humble Bundle. I just feel like the consolers get better games, and the ones we get don't work worth shit.
 
I struggle to think of a single AAA PC title, period, that did not have glitches at launch.

I didn't ask for a game that had a glitch-free launch. Glitches are the norm nowadays, unfortunately. In the case of Arkham Knight and AC: Unity, the glitches were even worse than normal launch glitches; the publisher even had to pull one of them from the shelves. Maybe it's just a coincidence that both games have GW branding, but it does make one wonder.
 
If this game runs like shit, this might finally push me over the edge to buy a console.

I just bought a PS4 last week; there's definitely a lot of merit to it, and I wholeheartedly recommend it.
 
smoke effects kill framerate, cloth effects kill framerate, and hair/fur effects really, really, REALLY kill framerate.


Thing is, it doesn't have to. If standard DirectCompute physics were used in DX11, the impact would be much, much less, as we see with the Havok engine. And now standard use of ACE units in DX12 is being encouraged on consoles because of the low-to-no impact of physics (among other things) on graphics fps.
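(To make the DX12/ACE point concrete, here is a minimal sketch, not from any game in the thread, of how D3D12 exposes a dedicated compute queue so physics-style work can overlap graphics work on the GPU. The function name and structure are illustrative placeholders, and error handling and the rest of the engine boilerplate are omitted.)

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: schedule compute (e.g. physics) on its own queue so it can
// overlap graphics work instead of serializing with the render pipeline.
// Link against d3d12.lib; all HRESULT checks omitted for brevity.
void SetupAsyncCompute()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Graphics queue: handles draw calls (can also do compute/copy).
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Dedicated compute queue: this is the queue type that maps onto the
    // ACE units on GCN hardware, letting compute run alongside graphics.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // A fence synchronizes the two queues GPU-side: graphics consumes the
    // physics results only after the compute queue signals completion.
    ComPtr<ID3D12Fence> fence;
    UINT64 fenceValue = 0;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // ... record and execute compute command lists on computeQueue, then:
    computeQueue->Signal(fence.Get(), ++fenceValue);
    gfxQueue->Wait(fence.Get(), fenceValue); // GPU-side wait, CPU not blocked
}
```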
 
I sold my PS4. The grass isn't greener on the other side. Witcher 3 ran atrociously on it and stutters like a mother. Not only that, but the graphics are turned down by default. On PC you have an option, but you people bitch that your system can't run them at high, so you run to the PS4, which defaults to low. Oh yeah, have fun looking at the loading screen. I guess I'm spoiled with my SSD.
 
I sold my PS4. The grass isn't greener on the other side.

I'm not saying that there aren't a few rare gems that are amazingly better on PC. But the only things most of these PC versions offer are slapped-on GameWorks-style effects, massive amounts of anti-aliasing (I mean, really, guys?), and very inefficient tessellation. Yeah, I definitely wouldn't want to play a racing game at 30fps 720p, but Fallout 4? Probably just fine for my tastes. I have no fracking interest in the next DOTA or whatever BS the average PC gamers are playing now.

I spent an initial $1,000 on my rig in 2013. The monitor was DOA and I had to pay shipping both ways to get a working one, à la Newegg. Then my cheap graphics card died, so I shelled out 300 bucks for the 280x at the time. That is a hell of a lot of money to spend on a system that plays games like shit. DRM'd SHIT.

The game I have spent the most time on is Skyrim; it's freaking amazing on PC, and I love the mods. I got Crysis 3 after seeing how all these benchmarks were competing to see who could run it best... guess what... Crysis is a terrible game. Watch Dogs... yeah... fart... Arkham Knight... that was like getting blue balls... I am beta testing a game that I was very excited for... well, let's say it's for the DOTA crowd. I never got to play Destiny like I wanted to. Halo 5 is coming out. Steam In-Home Streaming breaks every other update.

Mainly, I won't have to worry if a game was gimped or optimized for my hardware configuration.
 
Maybe it's just a coincidence that both games have GW branding, but it does make one wonder.

Wonder about what, exactly? Nvidia wasn't the developer of those games. Both games were rushed, launched half-baked, and had fundamentally flawed development processes, each for unique reasons.

FUD more.
 
Hahaha, Nvidia lawyer. LOL.
 
I'm looking forward to seeing how Bethesda implements certain GameWorks features. Hopefully they do it in a way that adds some real spice to the game's graphics.
 
With all this shit about people expecting to run big AAA games maxed out on outdated GPUs, and the other side whining because they can't run GameWorks features and just like to talk shit, I hope, really, really hope, that this game runs like a pure mass of shit on every AMD card on the market... with or without GameWorks. All of you fanboys are really the cancer of the PC market. You can enable or disable graphics settings or features, you know that, right?

Your card runs a big AAA game like shit at ultra settings? Well, upgrade your card. That's what ultra settings mean; they aren't for everyone in the PC market. Once upon a time people were able to understand that, but now everyone thinks that because they have a mid-range GPU they have the right to max out every game at 60fps. My lord, that's just a stupid mindset. This market has been that way forever, and it has always been this way with the most important PC games.

I remember The Witcher 2, for example: people said, "You want decent performance in The Witcher 2? Disable these features; they're meant to be used with high-end GPUs or multi-card configurations," and everyone was happy playing The Witcher 2 as well as they could. But now we have people with mid-range or old cards wishing to play The Witcher 2 maxed out above 60 fps. What the hell, people? Really? Every guy out there wants to run ultra settings in every game on the market with mediocre cards. In the past everyone used to be happy with high settings, because PC ultra settings are a luxury for those who have higher-end gaming rigs... that's what ultra settings mean. If you can't run the game properly, you have to upgrade your GPU. Why is that so hard, actually? Running a game at ultra used to be something elitist; now people whine because they can run the game at ultra settings but have to disable certain features. Again, that's just stupid thinking.
 
I happen to like the fact that game development pushes the limits of what our hardware can do.
 
Mainly, I won't have to worry if a game was gimped or optimized for my hardware configuration.
Yep, agreed. Most games are poorly ported console games anyway, with overtessellation or other gimp tech from the video card makers. I'd rather have a simpler experience that isn't time-consuming with PC stuff and GimpWorks or dropped frames or whatever. I don't give a damn about whatever the graphics card companies have to serve up... just fed up with it already... I want to sit out the PC master race crap that the platform has made up to feel good about itself and watch the customer screw-over show from the sidelines.
 
Witcher 3 ran atrociously on it and stutters like a mother. Oh yeah, have fun looking at the loading screen. I guess I'm spoiled with my SSD.

Lol, it was actually fixed last week >.<, and, well, you can add an SSD to a PS4 (of course, you can't add better graphics, but "choose your poison" sounds about right here).

What I don't get is how MS hasn't fixed the Xbone's OS to just do a regular offline install (which is still slower than the PS4's) and then ask the user if he wants to download whichever patch is needed. Oh no, it tries to do the install while patching if it finds a connection, and that makes for some "interesting" install times.
 
So what's the score on launch-time glitches for games that are GW-badged vs. games that are not? I remember Batman: Arkham Knight was a mess, and AC: Unity was quite a mess as well.

It's not just GameWorks. Lots of games use stuff just like GameWorks all on their own. But here are some poor PC launches of recent memory, not including minor glitches:

Diablo III (network problems on an online only game)
Sim City (network problems on an online only game)
Battlefield 4 (network on a multiplayer based game)
Arkham Origins (game-crippling level design bugs) (GameWorks)
Arkham Knight (crippling performance, lack of video options) (GameWorks)
Watch Dogs (purposely downgraded graphics) (GameWorks)
GTA IV (talking eyeballs, crashes)

And I'm sure you can throw several of the Assassin's Creed games in there too. I haven't played any since the original.

Now some stuff that just doesn't make sense for a PC version.
Mass Effect 3, probably the other ones too (gamepad support removed)
Need For Speed (30fps...)
Dark Souls (720p 30fps)
Dead Space (30fps)
Dead Rising 3 (30fps)
 
Way I see it, as long as I can play Fallout 4 at Skyrim-esque fidelity with a frame rate that never dips below 30, I'll be happy.
 
I'm wondering how my single 980ti will do at 4k. Debating whether or not I should add another one for SLI.
 
I've never played a game featuring nVidia GameWorks. Fallout 4 will be an interesting case to see how far I can push my graphics card with the new features in the modified Creation Engine.
 
I'm wondering how my single 980ti will do at 4k. Debating whether or not I should add another one for SLI.

Wait till the game gets out, then make that decision.

980ti isn't going anywhere anytime soon.
 
I've never played a game featuring nVidia GameWorks.

That you know of. Most games probably use it to some degree. It's an easy way to optimize for the NVIDIA platform. Not every game may need a physics package or want extra AA options, so you just may not notice it.
 
That you know of. Most games probably use it to some degree. It's an easy way to optimize for the NVIDIA platform. Not every game may need a physics package or want extra AA options, so you just may not notice it.

And yet every game released with it has been mired in controversy. I don't think it's optimizing much.
 
From what I've seen, it's poorly implemented and murders performance (see Witcher 3 HairWorks). The mark of a good engine is being well optimized. GTA V is the perfect example of a superbly optimized engine that scales beautifully; CryEngine is the perfect example of a poorly optimized engine that scales like crap.
 
From what I've seen, it's poorly implemented and murders performance (see Witcher 3 HairWorks).

HairWorks is much better optimized now.
 
I'm a little indecisive on the whole Nvidia/AMD optimization debate. I am getting a little sick and tired of AMD whining about how they feel neglected. I'm starting to be of the opinion that if neither is going to set some sort of standard, someone needs to take the lead and move forward. It is an open free market, and the best ideas should win. If Nvidia can design a suite that developers like and support, is it not Nvidia's right to market it and seek out exclusivity? The same goes for AMD. It seems to me that Nvidia has made the right choices and AMD has made the wrong choices as of late (see Mantle). I hear the word competition used pretty freely in this argument, but competition means the strongest win, not everyone wins.
 
Brent:
Since when is middleware that is designed to hinder the competition something you don't care to see? It is not needed in the form it exists now, and there is no way to read it other than as anti-competitive. There is no need for 200 tessellation points in something as simple as a cape.
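(To put rough numbers on that tessellation point, here is a back-of-the-envelope sketch; the patch count is made up, since the thread gives none. Triangle output grows roughly with the square of the tessellation factor, so cranking a cape from factor 16 to the DX11 maximum of 64 multiplies the geometry work by about 16x for visually negligible gain on flat cloth.)

```cpp
#include <cstdio>

// Sketch: a triangle patch tessellated at factor N yields on the order of
// N*N micro-triangles, so uniform over-tessellation scales quadratically.
int main() {
    const long long patches = 10000;        // hypothetical cape mesh size
    const int factors[] = {8, 16, 32, 64};  // 64 is the D3D11 tessellator max
    for (int factor : factors) {
        long long tris = patches * factor * factor; // ~N^2 per patch
        std::printf("factor %2d -> ~%lld triangles\n", factor, tris);
    }
    return 0;
}
```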

Game development needs to be about games, not about how much money Nvidia is willing to give you to "cripple" your game for AMD customers.
 