Fallout 4 optimized & CF fixed drivers

We can add War Thunder to the list of games infected by GameWorks. I find this case especially interesting because these developers had a decently efficient game engine before.

And that's the worst part: this game was working relatively well and was quite optimized beforehand. But then... nvidia must've gotten to the devs and... now you can read the forums. To their credit the devs tried to keep it GPU agnostic, but... well... it has not gone well across the board, to say the least.

An efficient engine became a huge, laggy, resource-hog mess, even at the lowest settings. The graphics are arguably worse to boot; it looks like you are playing with plasticine tanks and planes with stickers now.

So as a test case: we have a game which worked well, GameWorks options get introduced, and VOILA! You need a hardware upgrade for practically no benefit.

I've read around and while I see complaints of performance issues, we don't know if it's because of GameWorks or other changes. A big change I see with the new patch is physically based rendering and a destructible environment. Both can be pretty big resource hogs if not tuned.
 
It's all promoted as GameWorks features by nvidia, and unfortunately you can't turn most of it off. If you could turn it off it would be great, but for most of it you can't.

The destructible environment so far doesn't add much, as far as I've play tested, compared to what it was before.
 
I thought only WaveWorks was used?
 
So how long are AMD GPU owners going to have to wait for said drivers? NVIDIA had Fallout 4 optimized drivers out on the 9th.

But you fail to mention there's no SLI, and that they likely had access to the game first? And gamesdontworks, and they still couldn't solve that? But yes, let's give nvidia a free pass. Jesus, the bias on this board is bullshit. Allahprime1bar! Allah nvidibar!!

AMD users can simply turn tessellation down for a huge performance bump. God rays too, for some. The Ti doesn't even run a solid 60fps on a crappy-looking game at 1440p; questions should really be directed to the developer.

But thanks everyone for alpha testing, for us patient folks.

And a destructible environment (ground in particular) has been achievable even on a damn PS2 running Red Faction...
 
Let's not forget 5150Joker :p
 
I thought only WaveWorks was used?

Well, per nvidia's video trailer there is also Gameworks Destruction, which IMO is the biggest problem.

Waveworks you can turn off (if I remember right), and really, many maps don't have much water so it's mostly not seen anyways.

This Destruction thing however seems to be killing performance, and can lag it Hard, all for some pretty primitive looking effects IMHO.

Their new graphics engine is not tagged as a GameWorks "insert effect name here" feature, but per some dev comments, nvidia worked with them on at least some parts of it.
 
1. Maybe a switch to decaf?

2. This thread is specific to AMD because that is the card I have.
 
http://wccftech.com/multi-gpu-nvidia-sli-amd-crossfire-performance-value-comparison/

Still trolling AMD threads, Joker? You're another person besides Wreckage/PRIME1 who needs to be permanently banned from the AMD Flavor forum. Why don't you go play with your Titan X SLI setup instead?

:D Poor Joker...

Him and sprayingmango used to hate Nvidia but are now their best lovers. I wonder what happened...
 
^^^ Wow that post is crazy haha. I would never have guessed that.

Anyways, I am running the game on a borrowed MSI 390 OC and it runs great at VSR 3200x1800 with godrays turned down a bit. I am one of the rare people that doesn't mind a game like this running at a steady 35fps. If I turn it down to 1440p with all effects on, it still runs great in my opinion.

Regardless of my thoughts on performance I do have two things to say.
I feel like this GameWorks fiasco has gotten so out of hand. Sure, at one time we saw some questionable overuse of features that hurt AMD; I agree with that. Still, all of you guys cannot act like any and every developer that uses GameWorks is gimping AMD on purpose, or that every fault in the game is a result of GameWorks.

I mean come the fuck on.

This is Bethesda releasing a game with no focus on the PC at all. Horrid textures, early-2000s-level physics interaction, horrid AA, terrible windowed mode (can't use VSR resolutions with windowed mode on), and the list goes on and on. I do feel the game is solid in terms of gameplay, but Bethesda really let us all down with how shitty this game looks all these years later. Place the blame with Bethesda, not Nvidia.

Also, AMD performance in this game is not shit; just because something is marginally slower doesn't mean it's shit. The game plays great on this card I've been borrowing and that's that for me.
 
I think the issue most have with it is the Fury X to 980 Ti performance gap. It in absolutely no way mirrored the expected results from even previous poorly performing games.
 
Hahaha I don't think he will come back for more... LOL


Why? That just shows he is capable of looking at both sides :), unlike others here calling for people to get banned. Is that better? That shows they are being sensitive about their purchases. Who are the real people that should be getting infractions or bans?
 
Him and sprayingmango used to hate Nvidia but are now their best lovers. I wonder what happened...

How old is that post? Like 10 years ago? Opinions and viewpoints change, that's what intelligent people are capable of doing. I was wrong to say those things about Kyle at the time and clearly my views about Hard|OCP were incorrect---I attribute it to being young and stupid. :D All this proves is that I'm capable of looking at AMD and NVIDIA objectively. In fact, I used to advocate for Team Red quite a bit back in the day as I believed they were technologically ahead of the curve and were on point (e.g. 9700 pro/9800) but then AMD took over and well, they hit a spiral of failure to which there's no pulling out.

Edit: Found the post, almost 13 years ago. Must have really dug deep to find that, pathetic.

I agree, Bethesda did a really poor job with the game engine and assets in this game. Imagine if NVIDIA hadn't gotten involved; the game would look even worse than it already does! I'm glad we at least get to have God Rays in this game, because clearly Bethesda wasn't willing to put in the work to code something better. I think because of how good the game content is (I'm really enjoying it despite a lot of shortcomings) and the notion of "well, it's Bethesda", they get a free pass for things a lot of other developers would get crucified for. They should have at the very least shipped the game with high-resolution textures that take advantage of modern GPUs, included the option to disable vsync, and worked closely with both GPU companies so that there would be SLI/Crossfire profiles available on release. But Bethesda knows the PC gaming community will eat up anything they throw at them, so they've gotten lazy and are exploiting that good will.

Look at Rockstar; they're behaving just as badly as Bethesda (maybe worse) despite the good job (albeit very late) they did with GTA V. They're shutting down multiplayer mods and even harassing the modders by sending people to their houses and threatening lawsuits, all so they can preserve their Shark Card money. That's why I find it funny when people try to blame the decrepit state of PC gaming on NVIDIA when the fault lies with these big publishers putting no effort towards the PC; at least NVIDIA is doing something to add value to the PC ports.
 
Well, per nvidia's video trailer there is also Gameworks Destruction, which IMO is the biggest problem.

Waveworks you can turn off (if I remember right), and really, many maps don't have much water so it's mostly not seen anyways.

This Destruction thing however seems to be killing performance, and can lag it Hard, all for some pretty primitive looking effects IMHO.

Their new graphics engine is not tagged as a GameWorks "insert effect name here" feature, but per some dev comments, nvidia worked with them on at least some parts of it.


Object destruction, if done the way I'm thinking it's done (not sure how the GameWorks destruction is done, haven't looked into it), is procedurally based, and it will hurt performance: you are making vertices and polygons on the fly; added to that, you then most likely have to create procedural textures and also calculate normals for them, and after all this there may be tessellation on those newly created objects as well...

All of these steps can't be done in parallel because there are dependencies that have to be taken into account.

Older-generation games have had pre-made destructible environments, which of course lessens the burden, as you just switch out the models being rendered, or use very simple destruction, like for glass.
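
To make the dependency-chain point concrete, here is a minimal C++ sketch of a runtime fracture pipeline. It is purely illustrative (the types and function are made up, not the actual GameWorks Destruction API): each stage consumes the previous stage's output, so the work serializes per object.

```cpp
// Illustrative only -- not the real GameWorks Destruction API.
#include <vector>

struct Vec3 { float x, y, z; };

struct FractureMesh {
    std::vector<Vec3> vertices;  // generated on the fly
    std::vector<int>  indices;   // new polygons along fracture planes
    std::vector<Vec3> normals;   // can only exist after the geometry does
};

// Hypothetical per-impact fracture step.
FractureMesh fractureChunk(const FractureMesh& src, Vec3 impact) {
    FractureMesh out;
    // 1. Generate new vertices/polygons along fracture planes
    //    (depends on the impact point and the source geometry).
    // 2. Only once that geometry exists can per-face normals be computed.
    out.normals.resize(out.indices.size() / 3);
    // 3. Only once normals/UVs exist can procedural textures be
    //    synthesized for the fresh interior surfaces.
    // 4. Optional tessellation of the new chunks comes last.
    // Each step reads the previous step's buffers: a dependency chain
    // that can't be parallelized for a single object.
    (void)src; (void)impact;
    return out;
}

// Contrast with old-school pre-made destruction, which is just a swap:
// renderer.setModel(building, destroyed ? damagedModel : intactModel);
```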
 
Troll? Which post of mine in this thread is trolling? The only ones trolling here are you, CoolVibrations, Creig, and the others who can't handle any outside opinions and inject 3rd-party forum posts into the thread that have nothing to do with the discussion.

Dude, you got owned pretty hard and you're still trying lollllll
 
The problem is the game dev not communicating what AMD needs for driver development. In extreme examples the game dev needs to patch the game.

I agree that the devs need to be transparent. Whether they aren't handing over the info, for whatever reason, or AMD is dragging their feet, it would be good to know.

Buying nVidia is the last thing that someone should do in this situation, though. If anyone thinks that only having an nVidia install base and no competition will improve anything, they don't understand how monopolies work. Or they work for nVidia. The second is far more likely, since the concept of monopolies is simple enough for a 10-year-old to understand.
 
FO4 is heavy on CPU, AMD users are going to suffer more thanks to the driver overhead issues. One of the few games these days with that problem.
 
Granted, most of the uproar will be moot after a patch/driver. But in no way is the reason for the discrepancy solely a DX11 issue. You just have to look at the games as a whole to see how poorly these releases look against the norm. No one here is going to debate the 980 Ti being in the lead; with GW I probably wouldn't feel too bad about 10-15% between the 980 Ti and Fury X. But a near 40-50% discrepancy isn't in any way reasonable, nor should it be condoned. Whatever the reason, it is an issue that should be getting far more concern. Funny how at the beginning of the year it was a bit downplayed, but with each release thus far the issue has grown to far larger proportions. And now watch so many scramble, still peddling the same wares they did before. It's like in The Naked Gun when Leslie Nielsen's character stands in front of the exploding fireworks factory exclaiming there is nothing to see here. lol

It isn't really GW that causes the biggest issue, although it doesn't help, closed source and all. It is the apparent lack of time given for drivers before release. TW3 showed that even Nvidia was caught with its pants down in regards to Kepler (if you believe there was no intent behind a Maxwell sales initiative). Anno 2205 looks dangerously the same with its release benchmarks. Granted, as I stated before, after the drivers and patches we will likely see the normal results we expect, but unfortunately it is this poor releasing that is the issue at hand.
 
How old is that post? Like 10 years ago? Opinions and viewpoints change, that's what intelligent people are capable of doing. I was wrong to say those things about Kyle at the time and clearly my views about Hard|OCP were incorrect---I attribute it to being young and stupid. :D All this proves is that I'm capable of looking at AMD and NVIDIA objectively. In fact, I used to advocate for Team Red quite a bit back in the day as I believed they were technologically ahead of the curve and were on point (e.g. 9700 pro/9800) but then AMD took over and well, they hit a spiral of failure to which there's no pulling out.

Edit: Found the post, almost 13 years ago. **Must have really dug deep to find that, pathetic.**

Actually no, I didn't try hard at all. I confused you with sprayingmango because I swear one of you had a huge boner for AMD's 7970, especially with Eyefinity... it wasn't you, obviously.

I remember seeing your name back in the day on Beyond3D/Rage3D so it was simple enough. You used to argue with Razor1, accusing him of being an Nvidiot (I'm pretty sure you don't remember that either, but he's in this same thread, ironically enough). Unlike you, though, despite being young, I was intelligent enough not to post drivel that can be traced back to me.

It seems you still haven't learned your lesson. Feel free to keep proclaiming Nvidia's greatness in AMD Flavor though.
 
I have yet to see any performance analysis comparing God Rays with low tessellation (4x or 8x) vs high (16x+), so I won't comment on how it's impacting AMD. There's no proof there whatsoever. I'm not saying it's untrue, though.

The game is a CPU hog, and whenever that happens, AMD GPUs are going to suffer. So yes, it's a DX11 problem. When modern Intel CPUs are bottlenecking Nvidia GPUs, that effect is going to be significantly worse for AMD owners.

I pity anyone running an AMD CPU + GPU combo in this game. Unless it scales really well on FX-8000 chips.
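
For a rough sense of why the 4x-vs-16x tessellation comparison matters, here is a back-of-the-envelope C++ sketch. It only assumes the standard property that uniformly tessellating a triangle patch with edge factor n yields about n*n sub-triangles; the patch count is an invented number.

```cpp
// Why capping the tessellation factor buys so much performance:
// triangle output grows quadratically with the tess factor.
#include <cstdio>

int main() {
    const int  factors[] = {4, 8, 16, 64};
    const long patches   = 100000;  // hypothetical patches in view
    for (int n : factors) {
        long tris = static_cast<long>(n) * n * patches;
        std::printf("tess factor %2d -> ~%ld sub-triangles\n", n, tris);
    }
    // Going from 8x to 64x multiplies geometry work by 64, which is
    // why AMD's driver-side tessellation override helps so much.
    return 0;
}
```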
 
Think you missed the point. I wasn't saying that AMD GPUs don't suffer under DX11, but rather that they have never suffered this much of a penalty. Not even Skyrim showed that much of an issue. Now, maybe if we compound that with tess usage and other GW features it adds up, but still not to a 40-50% difference.
 
How much suffering are we talking about?
I frequently drop to 25-30fps with this game maxed out. Anyone running an AMD GPU will be lower. I would not be surprised to see AMD users sub-20fps.

The performance issues AMD owners are suffering are not GameWorks related; they're a combination of FO4's poor optimization (particularly Shadow Distance) and AMD's poor DX11 drivers. Tessellation is the last thing you need to worry about.
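
For anyone who wants to try it, the shadow draw distance lives in Fallout4Prefs.ini under [Display]. The key names below are the ones the game uses; the values are just commonly suggested starting points, so treat the numbers as illustrative rather than official preset defaults.

```ini
[Display]
; Shadow draw distance is one of FO4's heaviest settings.
; Lowering it trades distant shadow pop-in for a large FPS gain.
fDirShadowDistance=3000
fShadowDistance=3000
```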
 
The problem is the game dev not communicating what AMD needs for driver development. In extreme examples the game dev needs to patch the game.

I agree that the devs need to be transparent. Whether they aren't handing over the info, for whatever reason, or AMD is dragging their feet, it would be good to know.

Buying nVidia is the last thing that someone should do in this situation, though. If anyone thinks that only having an nVidia install base and no competition will improve anything, they don't understand how monopolies work. Or they work for nVidia. The second is far more likely, since the concept of monopolies is simple enough for a 10-year-old to understand.

Apple had a monopoly on smartphones with the first iPhone, and yet in the current market there are many other players participating.

And in the current GPU/CPU market, nVidia/Intel have a virtual monopoly anyway. They get to dictate where the market heads next. Not their fault AMD flopped hard. If AMD fails, I'm sure someone else will pick up the slack, maybe by buying out AMD.

The CPU/GPU market is too lucrative for one dominant player. ARM showed you it can happen. Why won't it happen with GPUs? You don't think someone like Qualcomm/Samsung will step in? Or even Microsoft? MS's hardware division has been on fire lately; I'm sure they'd love to get their hands on a CPU+GPU manufacturer.
 
Ok, there are just too many people equating market share with ownership. Market share is just what is sold; it is not in any way an accurate representation of what percentage is being used. Keep in mind that for the first half of this year AMD sold nothing into the channel, so part of the market numbers were 0 because AMD released nothing to AIB partners; all sales were just depleting stock already in the channel. Also keep in mind this isn't solely new ownership, as in never owned before, nor can it be equated entirely to those that moved from AMD to Nvidia, albeit that will be part of the number. This is partly, and likely in great share, those that upgraded from an existing Nvidia product to a new Nvidia product. From my experience and what I have seen, Nvidia owners tend to upgrade far more often. Seriously, how many posters do you see talking about the 680 they still use? In contrast, look at the great number of 7970s posters still use (not talking about the 280X, but the vanilla 7970).

But that has precious little to do with what you are alluding to. Games aren't made to market share; they are far more likely made to ownership. Otherwise, would you ever see the 5xx series or 6xxx series listed in minimum requirements? And in another forum a great point was made: the sheer number of GCN cards GREATLY outweighs all Nvidia architectures. How? Consoles! Yes, those little boxes skew the numbers greatly in favor of AMD ownership as it pertains to game construction. Also, let's not forget that most of these titles have been in the works for more than a year, often up to 3 years. Therefore Maxwell was not even a consideration for some in the beginning. So we are left wondering at what point the decision was made to add these optimizations that show these results, and why, against the percentages of ownership, these are the results we are getting.
 
Do you have any data/proof to back up your conjecture? All I'm seeing is the same tired old tirade parroted over and over. The console advantage is a wash for AMD.

AMD chips were used with the 360, and now PS4 and Xbone, and where is the advantage for AMD? I don't see any performance/feature advantage, nor do I see better time to market/better stability. I just don't see any indication of that being an advantage to AMD. It's been 10 years since the 360 came out. Where's this vaperformance that you speak of? If anything, nVidia's doing better since they left the console market. Funny that.

Also, sales numbers indicate ownership trends. Sales channels request the amount of stock they think they can sell. Of course, it's not gonna be a 1-1 relationship between sales numbers and actual ownership, but it's great as an indication. If channels stop stocking your stuff, you get 0 sales. Simple as that.

Minimum requirements are not actually indicative of performance. You don't have the settings used, you don't have FPS numbers, you have nothing but part names. For all anybody can tell, they could've made the game run at 1, 10, 15, or 20 FPS on the part and thrown it out there.

As for your "point" about users with old cards not upgrading, please give me some data from somewhere reputable. Your back side is not one of them.
 
Currently, CrossFire is better than SLI.
Technology-wise, yes: XDMA offers much better scaling and consistency than the bridge used for SLI. The problem is that the software support is hit or miss. I had a very good experience with CrossFire in Battlefield and most other DICE/EA games, but not so good of an experience with Unreal Engine games.
 
Seriously, you must just like to argue. The Steam survey has all the numbers you need. Everything I stated is based on fact, real numbers. Market trends speak only to sales, never to ownership percentages. Another fact you do not seem to be aware of.
 
So you mean the majority of people buy GPUs to put on shelves? So Intel is the dominant player in the GPU industry? The Steam survey, laughable. Yes, that's one data point. However, if NVIDIA outsells AMD 4:1 over consecutive quarters, those data points are now suddenly worthless because of Steam surveys?

Let me pretend that Steam is an accurate representation of the GPU market. On Steam, NVIDIA ownership is twice that of ATI. 54% vs 28%.
Games aren't made to market share; they are far more likely made to ownership. Otherwise, would you ever see the 5xx series or 6xxx series listed in minimum requirements? And in another forum a great point was made: the sheer number of GCN cards GREATLY outweighs all Nvidia architectures

What I don't understand is your logic. On Steam you have 2 NVIDIA GPUs per 1 AMD GPU. On sales you have 4 NVIDIA sales per 1 AMD sale.

So please, can you enlighten me on how the hell you arrived at the conclusion that there are more GCN GPUs out there compared to NVIDIA, even though both data points you and I showed point to the opposite?

Also, can you name me 1 instance where AMD PC GPUs benefit from consoles? Mantle was not a benefit, it was an 8 million USD gamble that, from the looks of it now, is not paying off. So where is the supposed console benefit?

Also, Steam is very unreliable as a measure of GPU ownership. http://forums.anandtech.com/showthread.php?t=2134147
 
FO4 is heavy on CPU, AMD users are going to suffer more thanks to the driver overhead issues. One of the few games these days with that problem.

Bethesda is the class of company that needs to get off its ass and create a more modern engine that supports DX12.

2015 is their last pass; both they and Ubisoft NEED to have DX12 support for any games with higher-end graphics released in 2016 and beyond.
 
Try to take a step back, relax, and think logically. GCN is in all consoles now, and their sales are quite large. Hence why I state GCN's numbers are greater than Maxwell's by far, and still greater than Maxwell and Kepler combined. This speaks only to numbers, not to performance, and is not dictating a superior architecture, so no need to go that route.

The 4:1 and 2:1 figures go so far as to prove what I said: NVIDIA owners tend to upgrade far more frequently. Most of the numbers prove it, as do the owners that post in forums. It isn't a contest, nor is it a statement of anything more than observable trends.

As I mentioned from minute one, I don't care about who's in first, and I don't deny the 980 Ti is clearly the winner this release. GW in part doesn't bother me, but it does have some traits that all of us should be concerned about. Honestly, I think there should be tess sliders in game; thank God AMD has an override. The lack of one in TW3 shows a very negative trend developing. Again, this should never have been an AMD vs. NVIDIA debate but a dev/gamer one, which includes concern over GW.
 
Wait what? How can you infer from the 4:1 and 2:1 trends that nVidia owners tend to upgrade more frequently? It might be that the increase is mostly due to AMD users switching, it might be that ducks suddenly have an interest in gaming. What the hell are you on about? Most of what numbers prove that?

Owners that post in forums only serve to illustrate the enthusiast space, really. You have to care past a certain point to join a forum to post about it.

GCN is in all consoles, great. So then why does every single console port run like dog crap on AMD hardware, even more so than nVidia hardware? Step back, chill and think logically ;)
 
I have tried to explain this in somewhat easy-to-understand detail, yet, I gather intentionally, you choose the ignorant path, as if the obvious is far from obvious for you.

4:1 SALES RESULTING IN 2:1 OWNERSHIP: let's look at this; it's simple math. The only way to read the 4:1 figure the way you are trying to is if the original ownership ratio was well in favor of AMD, and we know this has not been the case in quite some time. That is assuming everyone bought new cards. Now, if we assume only some did, at an equal rate for both sides, then even that doesn't correlate to the rough ownership value. Maybe ownership levels are really high and the new buyers don't add much to the final ownership level. This would make sense, but it doesn't change the original argument that devs MUST consider everyone. Point of fact: Maxwell is still very small in terms of ownership against Kepler and GCN; therefore, despite market share, it would behoove devs not to favor one too greatly, as they would essentially be ignoring the bulk of consumer needs.

It is quite simple and not at all difficult to grasp. It doesn't require hurt feelings or the need to defend either side. Just facts, nothing more.
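
A toy model makes the point above concrete: quarterly sales shift a large installed base only slowly. All the numbers in this C++ sketch are invented for illustration (installed base at roughly 2:1, new sales at 4:1, a small retirement rate).

```cpp
// Toy installed-base model: 4:1 quarterly sales vs an existing 2:1 base.
#include <cstdio>

int main() {
    double nv = 66.0, amd = 33.0;   // assumed installed base (millions), ~2:1
    const double sold    = 4.0;     // assumed total cards sold per quarter
    const double retired = 0.03;    // fraction of old cards retired per quarter
    for (int q = 1; q <= 8; ++q) {
        nv  = nv  * (1.0 - retired) + sold * 0.8;  // 4:1 sales split
        amd = amd * (1.0 - retired) + sold * 0.2;
        std::printf("Q%d  nv:amd ratio = %.2f\n", q, nv / amd);
    }
    // Even two straight years of 4:1 sales leaves the installed base well
    // short of 4:1 -- the poster's point about devs targeting ownership
    // rather than quarterly market share.
    return 0;
}
```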
 
GCN is in all consoles, great. So then why does every single console port run like dog crap on AMD hardware, even more so than nVidia hardware? Step back, chill and think logically ;)
Highlighted the key words for you there. Consoles all use low-level programming and handle their own memory management. For a port, you end up with DirectX or OpenGL doing all the scheduling and memory management, which means most if not all of those optimizations get removed. Further, supporting async operations can be a big deal. AMD, for example, uses a discrete tessellation unit on their GPUs. On a console, devs can be rendering one frame while tessellating the next, maximizing utilization of the hardware. Port it over to a PC and it may turn synchronous, where you are either rendering or tessellating. It's my understanding there is a hack to get it to work under DirectX and OpenGL, but that may be difficult if the tessellation is occurring within some binary blob or code block you can't access.
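
A minimal CPU-side sketch of the overlap being described, in C++ (the tessellate/render functions are hypothetical stand-ins for the GPU work): frame N+1's tessellation is kicked off while frame N is still being drawn, instead of running the two steps back to back.

```cpp
// Pipelined frames: tessellate frame N+1 while rendering frame N.
#include <cstdio>
#include <future>

struct Frame { int id; };

Frame tessellate(int id) { /* expensive geometry work */ return Frame{id}; }
void  render(const Frame& f) { std::printf("rendered frame %d\n", f.id); }

int main() {
    const int kFrames = 4;
    auto pending = std::async(std::launch::async, tessellate, 0);
    for (int i = 0; i < kFrames; ++i) {
        Frame ready = pending.get();          // geometry for frame i is done
        if (i + 1 < kFrames)                  // overlap: start frame i+1 now
            pending = std::async(std::launch::async, tessellate, i + 1);
        render(ready);                        // draw frame i meanwhile
    }
    // A straight port that loses this overlap degenerates to:
    //   for each frame: tessellate(i); render(i);   // fully serialized
    return 0;
}
```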
 
You do realize Mantle did pay off: AMD finished DX12, and Vulkan, which is coming, is Mantle taken to the next level. It will be multi-platform and the new future standard. Mantle brought us asynchronous compute (many PS4 titles use it, some Xbox One ones, future Battlefield games, AotS, Deus Ex and other AMD Evolved titles, and some DX12 games). Mantle brought us split-frame rendering and the ability to "crossli" (mix CrossFire and SLI cards); e.g., AotS crossli'd a Fury X (master card, since it's superior) with an inferior 980 Ti and it worked; swap the cards and performance was worse. This and more is what Mantle gave us. Go back to the nvidia forums where you belong.
 
Highlighted the key words for you there. Consoles all use low-level programming and handle their own memory management. For a port, you end up with DirectX or OpenGL doing all the scheduling and memory management, which means most if not all of those optimizations get removed. Further, supporting async operations can be a big deal. AMD, for example, uses a discrete tessellation unit on their GPUs. On a console, devs can be rendering one frame while tessellating the next, maximizing utilization of the hardware. Port it over to a PC and it may turn synchronous, where you are either rendering or tessellating. It's my understanding there is a hack to get it to work under DirectX and OpenGL, but that may be difficult if the tessellation is occurring within some binary blob or code block you can't access.

So then AMD gains pretty much nothing because the programming model is different between console and DirectX. So then where is the advantage? DX12? People do realise nVidia also has console experience, right?

Alright, I'll bite the bait.

That AMD finished DX12 has never been proven; it's just speculated by AMD fans. Vulkan hasn't materialised yet, and the first serious foray into Linux gaming by Steam Machines is being held back by poor drivers. On Windows, Vulkan won't challenge DirectX any time soon.

As for DX12 performance and asynchronous compute, funny how, closer to when it matters, nVidia wins in AotS. Let's reserve judgment until release time, shall we? No point speculating on alphas/betas.

Split frame rendering is dubious, it's up to developers to support it. Seeing how many have proper support for decent CFX/SLI, I doubt that'll be meaningful. We'll see.

Fury X being superior to the 980 Ti, by what metrics?
 