Fallout 4 to feature Nvidia Gameworks.

If you're running a corporation and your accountant decides that 28% of your profit is worth pissing into the wind, then let me grab some popcorn and take bets on when you're going out of business. The way that you change things is to open your mouth and vote with your wallet.

I'm damn good at it.

Now you're irrationally assuming every single AMD user is going to boycott every developer that doesn't cater to their performance aspirations. That's absurd. If AMD is unable to pull out of their nosedive, they will eventually go out of business. Then what are you going to do? Quit gaming and start basket weaving? The inferior product will inevitably fail; that's the open market. Right now AMD is the inferior product, which is reflected in their shrinking market share. You're citing accounting and business decisions, but you ignore the fact that, for many developers, the cost of paying programmers for months to optimize a game, which would probably have a sub-10% impact on sales (realistically I'd guess around 5%, maybe), exceeds the sales they'd lose by not doing it. That's why you're seeing slower responses and less action on these patches. I have several acquaintances who have or had AMD cards; they either no longer do or are looking to switch to Nvidia for their next card. Eventually it's going to hit that mark and developers are going to write it off entirely. I can't tell you how many meetings I've personally been in where specific hardware was completely omitted from support because of business costs.

This is not a company that has a bright future:
[attached image: AVmmOwt.jpg]
 
Playing through Borderlands 2 again; the last time I played was a year or two ago with an AMD card. The PhysX effects are amazing. I had a similar reaction to PhysX in Warframe recently as well.

If it's a choice between having a feature that cripples performance and not having the feature at all, I will gladly enable those settings when I replay the game a few years down the line.


GameWorks is middleware. Presumably it's CHEAP and EASY for developers to implement. Perhaps if AMD had an entire suite of proprietary graphics effects (aside from just TressFX, which is in one AAA game), then we might have something to compare it to.
You want graphics features that do the same thing as GameWorks but without sacrificing performance? It might be possible, but who's going to make it and implement it in every individual game? It's expensive and time-consuming. That's why GameWorks exists in the first place.

Developers aren't using GW to replace other equally competitive graphics options; they're using it as an add-on to fill holes they can't develop themselves. I think that's the misinterpretation a lot of people make. So you have a choice between GameWorks and nothing at all.

Of course, that's ignoring any sabotage of AMD via tessellation and other similar things.

Just wanted to add that after the Nvidia contract (or whatever it was, I don't really want to speculate) ran out, the Warframe developers released a patch that added all of those effects for AMD users. Guess what: enabling them cost zero resources. None. Zero. Zilch.

A great example of AMD assisting with PhysX-style effects is in the game Alien: Isolation. You can read about them in the [H]ardOCP review. I didn't take the time to review what was written last year, but from what I remember it was something like, "We're not going to add this game to the testing suite because it runs too smoothly on all video cards."

Well, read the last section, entitled "The Bottom Line".


The Bottom Line

Alien: Isolation performs extremely well across a wide swath of GPUs, which many times means the graphics are subpar, but that is not the case here. This game is a showroom of DX11 efficiency; Watch Dogs eat your heart out. We haven't seen the likes of this much attention to detail using DX11 on the PC in a very long time. We hope to see more titles focus on this. Less console-itis please.

While we mostly have positive things to say about Alien: Isolation, there are a few negatives. The one "downside" to games that perform this well is that it doesn't push the need for a faster GPU. That said it is great to see more mainstream GPU owners getting to sample high end graphics quality gameplay. While the image quality is quite good, there are a couple of things we'd like to see added. More dynamic shadows please; the shadows in the game seem somewhat pre-baked. More dynamic shadows in Alien: Isolation would go a long ways to making the game even more dramatic and compelling. It would be great for at least high end GPU owners to turn on better levels of detail to even further push the game's immersion factor.

We very much need more AA options in Alien: Isolation. The performance is there for a SMAA "4TX" option. We know SMAA 4X exists, Crysis 3 has it and that's an aging title. Finally, we would like to see a high resolution texture pack for PC gamers, or an option to run uncompressed textures. The performance is there, the technology is there, with BC7 compression we would love to see a high resolution DLC texture pack for this game. Add these few things, and visually the game will be much more engrossing.


All of this should explain to you why we are not going to be adding Alien: Isolation to our gaming suite. $200 video cards are plenty powerful enough to play this game at 1440p. At 1080p resolution you can even game with $150 video cards and still play at maximum settings. There is no need to spend $350+ on a video card for Alien: Isolation, unless you are a 4K gamer. At 4K a high-end card will do just fine, but for the majority of people a $200-250 video card is sufficient. We think that Alien: Isolation is a template that other game designers should take notice of. We love the use of open technologies based on DirectCompute and the absence of proprietary technologies that make Alien: Isolation truly a great title. Team Green and game devs please take notice of this, as this is what pushes the gaming industry forward as a whole. And that is simply good for all of us.
 
the Warframe developers released a patch that added all of those effects for AMD users. Guess what: enabling them cost zero resources. None. Zero. Zilch.

So you're now saying the patch wrote itself? Can you program? I'd like to hire you.
 
Now you're irrationally assuming every single AMD user is going to boycott every developer that doesn't cater to their performance aspirations. That's absurd. If AMD is unable to pull out of their nosedive, they will eventually go out of business. Then what are you going to do? Quit gaming and start basket weaving? The inferior product will inevitably fail; that's the open market. Right now AMD is the inferior product, which is reflected in their shrinking market share. You're citing accounting and business decisions, but you ignore the fact that, for many developers, the cost of paying programmers for months to optimize a game, which would probably have a sub-10% impact on sales (realistically I'd guess around 5%, maybe), exceeds the sales they'd lose by not doing it. That's why you're seeing slower responses and less action on these patches. I have several acquaintances who have or had AMD cards; they either no longer do or are looking to switch to Nvidia for their next card. Eventually it's going to hit that mark and developers are going to write it off entirely. I can't tell you how many meetings I've personally been in where specific hardware was completely omitted from support because of business costs.

Well, I do car stereo, woodworking, pottery, painting, etc. I wish I had taken some music lessons as a child, as I love live music. I can't sing worth a F, so that ship sailed, but it's fun to torture my nieces and nephews. Do you really think I'm so shallow a person that all I do is play video games for entertainment? Really?

Why do you constantly accuse me of wanting AMD to stay alive? Honestly, all I care about as far as video games are concerned is that new features are added to gaming and that they are optimized to run across all the relevant hardware of the time.

Guess what really happens when gaming turns into a one-trick pony? People move on because they lose their passion for gaming. Everything starts to look the same. Nobody can discern Game A from Game B. So then we'll find a new hobby or return to our old hobbies.

Just the way the world works. Look at the movie industry. Tech prowess up, viewership down. Everything is a copy of something released earlier or carries the same themes over and over from film to film. You have no idea how many times I have to school sub-25-year-olds on the fact that Movie A or Song B is just an interpretation of something from the 1950s.
 
So you're now saying the patch wrote itself? Can you program? I'd like to hire you.

WTF are you saying? You know what, I'm going to leave the discussion. You're pulling straws out of your posterior now, and I don't have the time for that type of discussion. ;)
 
WTF are you saying? You know what, I'm going to leave the discussion. You're pulling straws out of your posterior now, and I don't have the time for that type of discussion. ;)


You used a really bad example of a game. Warframe is an MMO whose revenue comes from its user base through microtransactions; that money is what keeps its development efforts afloat and lets them keep their dev team, unlike standalone games. Most studios have a set of in-house developers and will also outsource a majority of the work. As a game goes gold, part of the dev team spins off to continued development of the product, which is pretty much a skeleton crew that trains new members as needed for future updates and patches. These are two very different business models.
 
Depending on where the changes need to be made, different approaches are needed. For anything within a shader, driver optimizations and shader replacement or instruction shuffling can be done. If it's something deeper that's part of the rendering solution, that can't be done by drivers.

The problem is, if I can't look at the source code, there is no way for me to know whether the problem is deeper and not "just in my drivers".


edit to add:

A similar case, but in the opposite direction, would be Nvidia and Ashes of the Singularity's AA in DX12: Nvidia had the source code and tried a smear campaign about the game being coded wrongly when it was a driver snafu on their part...
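
To make the quoted point about driver-side "shader replacement" concrete, here is a purely hypothetical sketch; none of these names come from any real driver. The idea is that the driver recognizes a shader the game submits by hashing its bytecode and silently swaps in a hand-tuned version, while anything deeper, such as how the engine structures its render passes, can't be patched this way.

Code:
// Hypothetical sketch of driver-side "shader replacement"; invented names,
// not any real driver. The driver recognizes a shader by hashing its bytecode
// and substitutes a hand-tuned variant shipped with a driver update.
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <utility>
#include <vector>

// Toy FNV-1a hash as a stand-in; real drivers use something far stronger.
static uint64_t HashBytecode(const std::vector<uint8_t>& bytecode) {
    uint64_t h = 0xcbf29ce484222325ull;
    for (uint8_t b : bytecode) { h ^= b; h *= 0x100000001b3ull; }
    return h;
}

class DriverShaderCache {
public:
    // Hand-optimized replacements, keyed by the hash of the game's shader.
    void RegisterReplacement(uint64_t gameShaderHash, std::vector<uint8_t> tuned) {
        replacements_[gameShaderHash] = std::move(tuned);
    }

    // Called when the game creates a shader: swap in the tuned version if one
    // exists, otherwise fall back to compiling the game's own bytecode.
    std::vector<uint8_t> Compile(const std::vector<uint8_t>& gameBytecode) const {
        auto it = replacements_.find(HashBytecode(gameBytecode));
        if (it != replacements_.end()) return it->second;   // replaced
        return gameBytecode;   // placeholder for the normal compile path
    }

private:
    std::unordered_map<uint64_t, std::vector<uint8_t>> replacements_;
};

int main() {
    DriverShaderCache cache;
    std::vector<uint8_t> gameShader = {0x44, 0x58, 0x42, 0x43};  // fake bytecode
    cache.RegisterReplacement(HashBytecode(gameShader), {0x01, 0x02});
    std::printf("replaced shader size: %zu\n", cache.Compile(gameShader).size());
}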
 
The problem is, if I can't look at the source code, there is no way for me to know whether the problem is deeper and not "just in my drivers".


Look at it this way: a developer that can't look at the GameWorks source (didn't pay for it) won't be able to change their render code much, or to a degree that would hurt AMD hardware. GameWorks libraries are standalone pieces, so the development team doesn't need to change much of their own code to get them to work. That's the whole purpose of any third-party library: simple integration that gives results quickly with as little money as possible.

GameWorks libraries are not optimized for AMD hardware, only for nV hardware. What is the benefit to nV of optimizing for AMD hardware, or any other hardware for that matter? None; it's counterproductive, especially with their current market share. AMD's dev libraries, on the other hand, have to be open because AMD doesn't have the market share to drive developers to use them.

I think the whole complaint about something not being open is kind of moot when you look at what each IHV is capable of and what they actually do with developers.
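
As a rough sketch of why that kind of drop-in integration is attractive, here is what pulling in a typical effects middleware tends to look like from the engine side. Everything below is hypothetical; the names are invented for illustration and are not the GameWorks API.

Code:
// Hypothetical middleware integration sketch: all names here are invented and
// are NOT any real SDK. The point is how little of the engine's own code a
// drop-in effects library touches; its internals stay the vendor's binary.
#include <cstdio>

// Opaque context the imaginary SDK hands back; the licensee never sees inside.
struct FxContext { int dummy; };

// Stub "SDK" entry points standing in for the vendor's prebuilt library.
FxContext* FxCreate(void* /*device*/)              { return new FxContext{}; }
void       FxSimulate(FxContext*, float /*dt*/)    { /* vendor code */ }
void       FxRender(FxContext*, void* /*cmdList*/) { /* vendor code */ }
void       FxDestroy(FxContext* ctx)               { delete ctx; }

// The entire integration burden on the engine side: one member, three calls.
class Engine {
public:
    explicit Engine(void* device) : fx_(FxCreate(device)) {}
    ~Engine() { FxDestroy(fx_); }
    void Frame(float dt, void* cmdList) {
        // ...the engine's own update and rendering work happens here...
        FxSimulate(fx_, dt);
        FxRender(fx_, cmdList);
    }
private:
    FxContext* fx_;
};

int main() {
    Engine engine(nullptr);
    engine.Frame(0.016f, nullptr);
    std::puts("frame submitted");
}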

A similar case, but in the opposite direction, would be Nvidia and Ashes of the Singularity's AA in DX12: Nvidia had the source code and tried a smear campaign about the game being coded wrongly when it was a driver snafu on their part...


It wasn't purely drivers; the Ashes of the Singularity developers didn't even know why their product wasn't working on nV hardware like it was supposed to. It wasn't an inability to do async that hurt nV; it was that certain code which runs well on AMD hardware causes major issues with concurrent graphics and compute execution. Async was actually working fine, but at a certain point it broke nV's drivers and caused major slowdowns because it bottlenecked the concurrent kernel execution. The opposite would be true if they had written the code with nV hardware in mind: there would be a slowdown on AMD hardware. Different hardware needs different code paths for optimal performance on each IHV's hardware, and we saw this with AOS on the same drivers the sites first tested with, where the Fury X is right around the 980 Ti in DX12, where it should be.

If you are specifically talking about AA, AA has a lot to do with how a renderer is coded, so nV might have been a bit lazy on their end for not telling the developer or helping them correct that error. It's also not like AOS's team has their hands clean either: they went around public forums and gave people basically unsubstantiated reasons for why AMD's hardware was performing "better" on their software in DX12 before ever asking why it wasn't running the way they thought it would. And AMD jumped on the bandwagon too. Well, that's business: if something looks good for you, use it, and if it's free, even better. For AMD that wasn't smart, though, because what they stated was actually wrong. There were two points they got wrong for this game; in reality, one point they stated was correct, the one about context switching, but that wasn't part of what was happening with AOS. They probably went through some dev docs from nV and reached that conclusion, which is valid in general but not in this program.
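
For anyone who hasn't followed the DX12 side of this, "concurrent graphics and compute execution" just means submitting work on separate direct and compute queues so the GPU can overlap it. Below is a minimal setup sketch in D3D12 terms; error handling is omitted, the fence usage is simplified, and how much actually overlaps is ultimately up to the driver and hardware, which is exactly where the AOS issue lived.

Code:
// Minimal D3D12 sketch of "concurrent graphics and compute execution":
// two hardware queues plus a fence for cross-queue ordering. Link d3d12.lib.
// Error handling omitted; this only illustrates the setup being discussed.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Direct (graphics) queue: the renderer's command lists are submitted here.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Compute queue: dispatches submitted here may overlap with graphics work,
    // depending entirely on how the driver and hardware schedule them.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

    // A fence orders work across the queues when results depend on each other;
    // heavy cross-queue synchronization is one way the overlap gets serialized.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    gfxQueue->Signal(fence.Get(), 1);    // graphics marks a point in its stream
    computeQueue->Wait(fence.Get(), 1);  // compute won't pass it until signaled
    return 0;
}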
 
I do not believe that improving visual quality and providing better-looking, forward-thinking 3D effects hinders anyone. This is what gamers should want: games that evolve, keep looking better, and utilize modern 3D graphical effects. But maybe Minecraft is all people are satisfied with these days on the PC. Quite sad; I remember a time when it was all about pushing GPU features.

Of course NVIDIA is going to do everything it can to be better than the competition; one should expect that from a competitor. I would hope AMD fights back.
That's up to the game developer to program the tessellation factors.
Again, let's put the blame where the blame belongs: voice your concerns, but point them toward the right group of people.

I do like, though, how many people think they know what's better for the game in question than the people who made it in the first place when it comes to the choices they make for their own game. Everyone becomes an expert on game design and 3D graphics choices all of a sudden.
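
To make the tessellation-factor point above concrete: the factor is something the game's own shaders compute and clamp, so both restraint and excess are developer choices. The following is a conceptual sketch in plain C++ rather than actual hull-shader code; the distance curve and cap values are invented for illustration.

Code:
// Conceptual sketch (plain C++, not real hull-shader code): the game, not the
// driver, decides how finely geometry is tessellated. The cap below is the
// developer's choice; the numbers are invented for illustration.
#include <algorithm>
#include <cstdio>

// Pick a tessellation factor based on how far the patch is from the camera.
// D3D11 hardware tops out at a factor of 64 no matter what is requested.
float ChooseTessFactor(float distanceToCamera, float developerCap) {
    const float kNear = 2.0f;   // full detail inside this range (made up)
    const float kFar  = 50.0f;  // minimum detail beyond this range (made up)
    float t = (kFar - distanceToCamera) / (kFar - kNear);
    t = std::clamp(t, 0.0f, 1.0f);
    float factor = 1.0f + t * (developerCap - 1.0f);
    return std::min(factor, 64.0f);  // hardware limit
}

int main() {
    // A cape two meters away: with a generous cap it gets heavily subdivided,
    // with a modest cap it does not -- same engine, same driver.
    std::printf("cap 64: %.1f\n", ChooseTessFactor(2.0f, 64.0f));
    std::printf("cap 16: %.1f\n", ChooseTessFactor(2.0f, 16.0f));
}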

Pushing features, maybe, but obfuscating them from competitors is another form of trying to monopolize the market.

No one can say that those features should only be done through middleware. That comes with a price, and the price is that people see the features described as gimmicks: they can be turned off, they aren't part of the gameplay, and they don't push 3D graphics; they are there to do damage. Last time, with The Witcher 3, it ended up killing Nvidia performance even on one of their own series of cards...

But find me a game developer with a valid argument that 200 tessellation points in a cape equals outstanding gameplay and pushes 3D graphics forward.

GameWorks has been proven to use an encrypted binary for the solution to work on AMD cards. If Nvidia removed that feature as a whole, I would not be opposed to GameWorks in the form it exists now, even if it is purely there to hinder AMD.

Nvidia's encrypted binary can be changed at any point in time, so you cannot optimize for it. This binary is the only way to make the middleware work.
 
Pushing features, maybe, but obfuscating them from competitors is another form of trying to monopolize the market.

No one can say that those features should only be done through middleware. That comes with a price, and the price is that people see the features described as gimmicks: they can be turned off, they aren't part of the gameplay, and they don't push 3D graphics; they are there to do damage. Last time, with The Witcher 3, it ended up killing Nvidia performance even on one of their own series of cards...

But find me a game developer with a valid argument that 200 tessellation points in a cape equals outstanding gameplay and pushes 3D graphics forward.

GameWorks has been proven to use an encrypted binary for the solution to work on AMD cards. If Nvidia removed that feature as a whole, I would not be opposed to GameWorks in the form it exists now, even if it is purely there to hinder AMD.

Nvidia's encrypted binary can be changed at any point in time, so you cannot optimize for it. This binary is the only way to make the middleware work.


What? Do you even know what an encrypted binary is? Just a hint: it's to obfuscate machine code. Compiling is not the same as encrypting a binary, lol.

And encrypting source code is not the same thing as compiling either; that's obfuscating the written code, adding extra lines to confuse others so they can't understand it.
 