Async compute yields a ~30% performance increase. Maxwell doesn't support async.

How is the word "asynchronous" a marketing term?

"Asynchronous Compute" is essentially the name of an AMD feature. And now people are running around saying that nvidia's architecture is inherently a "serial" architecture. And I've heard people say that Maxwell can do "parallel" compute but not "asynchronous" compute.

It's like we've lost our minds.
 
It has since been demonstrated that in this benchmark a 290X is the same as a Fury X which is the same as a 980 Ti. I think this demonstrates that likely one of two things is true:

1) The benchmark is CPU limited.
2) There is something broken somewhere and the brokenness is not specific to either nvidia or AMD.
Oxide has claimed they only used a "moderate" amount of async compute, and given that the 290X and Fury X have the same number of ACEs, a bottleneck there would make them perform identically.

I'd like to see more people asking about the Fury X's performance relative to the 980 Ti, given that all three cards perform equally in AotS. It seems everyone is focusing on the 290X.
 
Oxide has claimed they only used a "moderate" amount of async compute, and given that the 290X and Fury X have the same number of ACEs, a bottleneck there would make them perform identically.

I'd like to see more people asking about the Fury X's performance relative to the 980 Ti, given that all three cards perform equally in AotS. It seems everyone is focusing on the 290X.

Isn't it interesting that all of the initial benchmarks pitted a 290X against a 980 Ti?

Like why didn't they benchmark AMD's latest card, the one that actually competes with the 980 Ti?
 
Maxwell wasn't built for compute at all.
I'm not seeing how workstation features translate into gaming performance. They never have.

Test an actual DX12 game.

I think it's a bigger issue than simple compute and gaming performance.

If what Oxide says is true, Nvidia's own drivers are reporting a feature as supported when in fact they don't support that feature in hardware (hence the performance penalty and the vendor-specific pathway).

If Async Compute is a DX12 requirement, then Nvidia has simply put in a driver hack so they can say they support DX12 and their drivers don't break on a DX12 function.

And that really is the bottom line: if Nvidia hardware isn't DX12 feature-complete, should they be able to claim that the GTX 980 is DX12 hardware?

Essentially they pulled a page from Intel's book, which is to support many features in software rather than hardware for compatibility reasons.
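The claim in this post can be sketched with a toy model: a driver answers "yes" to a capability query while actually emulating the feature in software, so the difference only shows up as a runtime performance penalty. Everything here is hypothetical (`FakeDriver`, the feature names, and the 1.5x penalty are invented for illustration); this is not real driver or D3D12 code.

```python
# Hypothetical sketch of a driver that reports a feature as "supported"
# while actually running it on a slower software path. All names and
# numbers are invented for illustration.

class FakeDriver:
    HARDWARE = {"conservative_raster"}   # genuinely implemented in silicon
    EMULATED = {"async_compute"}         # reported as supported, but emulated

    def supports(self, feature: str) -> bool:
        # The API-level capability query can't distinguish the two cases.
        return feature in self.HARDWARE or feature in self.EMULATED

    def dispatch_cost_ms(self, feature: str, work_ms: float) -> float:
        # The difference only shows up as a performance penalty at runtime.
        if feature in self.EMULATED:
            return work_ms * 1.5         # e.g. context-switch overhead
        return work_ms

drv = FakeDriver()
print(drv.supports("async_compute"))                  # True: looks "feature complete"
print(drv.dispatch_cost_ms("async_compute", 10.0))    # 15.0: but it costs extra
print(drv.dispatch_cost_ms("conservative_raster", 10.0))  # 10.0: hardware path
```

This is the shape of the accusation being made: a checkbox query and actual hardware behavior are two different things, and only benchmarks expose the gap.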
 
Isn't it interesting that all of the initial benchmarks pitted a 290X against a 980 Ti?

Like why didn't they benchmark AMD's latest card, the one that actually competes with the 980 Ti?
Because they had to return their Fury X samples. Meaning they literally didn't even have a Fury X to test.
 
AMD weighs in..

AMD_Robert (AMD Employee), 11 points, 14 hours ago:

Oxide effectively summarized my thoughts on the matter. NVIDIA claims "full support" for DX12, but conveniently ignores that Maxwell is utterly incapable of performing asynchronous compute without heavy reliance on slow context switching.

GCN has supported async shading since its inception, and it did so because we hoped and expected that gaming would lean into these workloads heavily. Mantle, Vulkan and DX12 all do. The consoles do (with gusto). PC games are chock full of compute-driven effects.

If memory serves, GCN has higher FLOPS/mm2 than any other architecture, and GCN is once again showing its prowess when utilized with common-sense workloads that are appropriate for the design of the architecture.
https://www.reddit.com/r/AdvancedMi...ide_games_made_a_post_discussing_dx12/cul9auq

The "context switching" here is the out-of-order execution I was talking about in the other thread.
 
Well for me, it's a mission accomplished. NVIDIA will most undoubtedly need to respond and that was the point.

When people spend $650 on a GPU, they should know what they're getting. Anandtech concluding that "Maxwell 2 does this too" was a gross oversimplification.

Pascal will undoubtedly remedy this shortcoming. It is all but evident that it will. For the time being... people will know what they're buying, and that's what tech websites ought to be doing: informing the people, being disruptive if need be. They shouldn't simply quote PR from large firms. I know, they have bills to pay. But if they all do this, the big firms won't have a choice but to rely on them anyway, as well as innovate.

I'm happy with what I've been able to do. I wasn't alone... A ton of people argued and fought with me. You helped me a ton razor1 and I am very grateful for that.

In the end, all this will have had the effect of pushing innovation forward... Maybe even affecting the way tech journalists report on issues. I know, sounds idealistic and grandiose, I'm probably getting ahead of myself too. Truth is, I miss the PC Gaming community of old. I'm tired of useless flame wars and I lament the loss of intelligent conversations.

I'm sure you do too.
 
No, what I see is it brings parity to AMD and NVIDIA (for now...notice i said "for now")
Somehow the fact that AMD and NVIDIA are fighting neck and neck = "destroy NVIDIA"?

All aboard the hyperbole train!!!

The focus is in the wrong place.
The question people should ask themselves is: what the *bleeep* has AMD been doing with their DX11 driver all these years? (That is the real kicker and will probably be looked into in 6-12 months, when things have normalized.)



(I expect other DX12 titles to show a different performance-delta...people should take a breath...and think a little...things are never static in the GPU world...)

Factum, why do you post like you're questioning some science blog all the time? No one here can take you seriously, especially when you don't even fact-check your posts.
 
Well for me, it's a mission accomplished. NVIDIA will most undoubtedly need to respond and that was the point.

When people spend $650 on a GPU, they should know what they're getting. Anandtech concluding that "Maxwell 2 does this too" was a gross oversimplification.

Pascal will undoubtedly remedy this shortcoming. It is all but evident that it will. For the time being... people will know what they're buying, and that's what tech websites ought to be doing: informing the people, being disruptive if need be. They shouldn't simply quote PR from large firms. I know, they have bills to pay. But if they all do this, the big firms won't have a choice but to rely on them anyway, as well as innovate.

I'm happy with what I've been able to do. I wasn't alone... A ton of people argued and fought with me. You helped me a ton razor1 and I am very grateful for that.

In the end, all this will have had the effect of pushing innovation forward... Maybe even affecting the way tech journalists report on issues. I know, sounds idealistic and grandiose, I'm probably getting ahead of myself too. Truth is, I miss the PC Gaming community of old. I'm tired of useless flame wars and I lament the loss of intelligent conversations.

I'm sure you do too.

You seem to think very highly of yourself all over the internet. And to be blunt, I am not exactly sure what it is you think you've accomplished. Perhaps part of the problem here is that you are quick to draw conclusions that cannot be seriously held with such preliminary and shoddy evidence.

Now everybody knows that each GPU architecture has its strengths and its weaknesses. Oxide was able to come up with a benchmark that allowed GCN to demonstrate its strengths, and thus made the Fury X competitive with a 980 Ti. Does this not strike one as manufactured controversy?

BTW you never clarified what you meant when you said that Maxwell does not support "bindless". Depending on what you're referring to, that could be quite a hilarious accusation.
 
You seem to think very highly of yourself all over the internet. And to be blunt, I am not exactly sure what it is you think you've accomplished. Perhaps part of the problem here is that you are quick to draw conclusions that cannot be seriously held with such preliminary and shoddy evidence.

Now everybody knows that each GPU architecture has its strengths and its weaknesses. Oxide was able to come up with a benchmark that allowed GCN to demonstrate its strengths, and thus made the Fury X competitive with a 980 Ti. Does this not strike one as manufactured controversy?

BTW you never clarified what you meant when you said that Maxwell does not support "bindless". Depending on what you're referring to, that could be quite a hilarious accusation.

Bindless is just a word I used for relying on dependencies vs. working independently, referring to AWSs vs. ACEs.

I've posted slides on this topic in the other thread. You're right though, I conflated the talk of tier 1/2/3 with async compute. This led the Oxide developer to explain it all over at overclock.net. Prior to this, nobody had reported on these differences.

I don't think highly of myself because I don't care if I'm wrong or not entirely spot on. I wanted to compel a conversation on the topic and to compel the players involved to divulge more information.

It's not about taking me seriously. It's about starting a conversation which leads to more information being released. Think of it more like what Glenn Greenwald does. His reporting had holes in it; sure, he had Snowden as a source, but they both didn't know everything. They compelled the State Department to respond by stirring up a controversy.

You can think what you want of me. That's fine. You're entitled to your opinions. On my end, I wanted to know what was happening with Ashes of the Singularity. I wasn't content with the conversations about poor drivers or bias on the part of Oxide.

We lacked far too much information. Now information is being released. For me, I accomplished what I set out to do. Now I can sit back and wait as information flows out of AMD, Oxide and hopefully NVIDIA.

We should get a clearer picture of it all fairly soon.

Ps. There's little reason to be upset with me. The more information the better. I think we can all agree on that.

Here's what I did..

1. Placed threads throughout many forums.

2. Was met with opinions and information which countered what I posted.

3. Took this information into account and added it to my thesis.

4. Emailed Oxide, obtained CPU related information. Posted it over at overclock.net.

5. Emailed Oxide for GPU information on async compute, compelling Oxide to come and explain over at overclock.net.

6. This prompted a response from AMD.

And here we are.

That's not thinking highly of myself. I did those things. I simply acted. Now people have far more information than they used to.

Anyone can do that. It's just nobody bothered.
 
Or it could be viral marketing for a competing company, designed to influence buyers' decisions with an uncertainty factor.

Your account on OCN was registered in August; your account on [H] is 6 days old.

It might be very unfair on my part, but we live in an era of social media marketing, so I have a much higher trust level for people who have a few years of posting history.
 
You seem to think very highly of yourself all over the internet. And to be blunt, I am not exactly sure what it is you think you've accomplished. Perhaps part of the problem here is that you are quick to draw conclusions that cannot be seriously held with such preliminary and shoddy evidence.

Now everybody knows that each GPU architecture has its strengths and its weaknesses. Oxide was able to come up with a benchmark that allowed GCN to demonstrate its strengths, and thus made the Fury X competitive with a 980 Ti. Does this not strike one as manufactured controversy?

BTW you never clarified what you meant when you said that Maxwell does not support "bindless". Depending on what you're referring to, that could be quite a hilarious accusation.

QFT. This guy's sense of self-importance and patting himself on the back was cringeworthy.
 
Or it could be viral marketing for a competing company, designed to influence buyers' decisions with an uncertainty factor.

Your account on OCN was registered in August; your account on [H] is 6 days old.

It might be very unfair on my part, but we live in an era of social media marketing, so I have a much higher trust level for people who have a few years of posting history.
The whole situation seems like a perfectly orchestrated shitstorm.
The AotS benchmarks get the "It's just one game" treatment and people stop caring. Then someone who apparently has a very high interest in GPUs makes a brand-new account across multiple tech forums and posts very lengthy details about how Maxwell doesn't support async compute. Then Oxide responds and AMD responds. All of this happens within about 48 to 72 hours, maybe.

For the rumor mill's sake, I hope there's some truth to the allegations. Otherwise it's going to be really fucking embarrassing for everybody involved. PC building communities have effectively put a hold on Maxwell GPUs entirely. And all of this based on unconfirmed information sourcing back to a single AMD Mantle/DX12 tech demo.

If AotS had been a GameWorks game with Nvidia in the lead, everybody would have shit on the 'black boxes' again and that would have been the end of it. Attach AMD's name to the same situation, however, and suddenly it's a hardware-crippling issue.
 
The whole situation seems like a perfectly orchestrated shitstorm.
The AotS benchmarks get the "It's just one game" treatment and people stop caring. Then someone who apparently has a very high interest in GPUs makes a brand-new account across multiple tech forums and posts very lengthy details about how Maxwell doesn't support async compute. Then Oxide responds and AMD responds. All of this happens within about 48 to 72 hours, maybe.

For the rumor mill's sake, I hope there's some truth to the allegations. Otherwise it's going to be really fucking embarrassing for everybody involved. PC building communities have effectively put a hold on Maxwell GPUs entirely.


I haven't seen any communities putting Maxwell builds on hold based on some AMD sponsored alpha benchmark.
 
I have an older account here.

ElMoIsEviL. Look it up. Since I haven't used it in a while, and don't have access to my password or email associated with it, I had to create a new account.

I've been away from any tech discussion for almost 2 years. I've been in Morocco with my wife. I don't even have access to my gaming computers and haven't played a video game in two years. I haven't discussed GPUs for far longer than that as well.

Nothing is manufactured. If it were I'd have to admit it for legal reasons. I haven't received a dime from anyone. I haven't been in contact, secretly, with any organization involved either. Again, for legal reasons, I'd be in trouble if I wasn't honest and was in fact some marketing PR person with an interest in all of this.

I don't have an account over at Anandtech. So I created one. I've never been to overclock.net before either, so I created one.

As for Tom's Hardware, I have MANY posts there as ElMoIsEviL. That account I still have access to. My last posts were almost entirely about CPUs. Nothing to do with GPUs. I got bored of DX11. In fact DX10-11 has been relatively boring to me. DX12 sounds promising. Therefore my curiosity kicked into high gear.

If people have put a hold on their builds as a result of all of this information then that is their own prerogative.

I think we need a response from NVIDIA. It will finally explain everything.

No need for conspiracy theories about me. It would be funny if people started creating them though. I'm just a guy, sitting on a lawn chair at a friends house in Morocco LOL

Also, considering all the errors I've made researching this, you'd think I would have been more informed if I were working for some shady organization. :p
 
They'll be building engines for consoles. PCs will get the ports, which will run better on AMD hardware.

Keep dreaming. The new consoles have been out HOW LONG and this fabled "PC gaymes will run better on AMD cardz cuz AMD is in the cunsolez" hasn't materialized.
 
Wake me up when this shit of a game and its associated benchmark are relevant lol.
Also, who is Oxide Games? I hadn't even heard of them until now.

Looks like this was a last-ditch effort by AMD to sell their failure of a card.
 
Keep dreaming. The new consoles have been out HOW LONG and this fabled "PC gaymes will run better on AMD cardz cuz AMD is in the cunsolez" hasn't materialized.

Consoles weren't making use of the async hardware, except for a few PS4 titles. DX12 on the Xbox One exposes the ACEs.
There wasn't an API on the PC that could make use of async shading either.

Now you have the consoles, able to use them, and many titles on the horizon making use of them.

You have DX12, on the PC.

You have Microsoft pushing a strategy centered around cross platform gaming (see Fable legends).

And you have asynchronous shading capable hardware on the PC.

We're not in the same market anymore.
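The performance argument running through this thread reduces to a simple scheduling picture: with independent hardware queues, graphics and compute work can overlap; without them, they serialize; and emulating the overlap with slow context switches can erase (or invert) the gain. A minimal toy model, with made-up timings that are not measurements of any real GPU:

```python
# Toy scheduling model of async shading. All numbers are illustrative only.

def serial_ms(graphics_ms: float, compute_ms: float) -> float:
    # Single queue: compute work waits behind graphics work.
    return graphics_ms + compute_ms

def async_ms(graphics_ms: float, compute_ms: float,
             switch_overhead_ms: float = 0.0) -> float:
    # Independent queues: the workloads overlap, so frame time is bounded
    # by the longer of the two, plus any cost of switching between them.
    return max(graphics_ms, compute_ms) + switch_overhead_ms

g, c = 10.0, 4.0
print(serial_ms(g, c))      # 14.0  (no async)
print(async_ms(g, c))       # 10.0  (hardware queues, free overlap)
print(async_ms(g, c, 5.0))  # 15.0  (overlap emulated via slow context
                            #        switches: worse than not overlapping)
```

In this sketch, the benefit of async compute depends entirely on how cheap the overlap is; a high switching cost turns a win into a loss, which is the shape of the claim being made about Maxwell here.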
 
I buy a new video card every couple of years, as I suspect most of my fellow hardware enthusiasts here do.
By the time DX12 matters, Nvidia will have a card out that's optimized for the salient features.
This is a complete non issue.
 
Keep dreaming. The new consoles have been out HOW LONG and this fabled "PC gaymes will run better on AMD cardz cuz AMD is in the cunsolez" hasn't materialized.

Actually it already happened.

Which is reason for Nvidia Kepler getting left behind by Maxwell cards recently (since it improved weak points of Kepler against AMD GCN).
 
Bindless is just a word I used for relying on dependencies vs. working independently, referring to AWSs vs. ACEs.

You are not free to re-define tech terms to match some other meanings you desire to convey, at least not without losing credibility. It would be like me saying your car is fundamentally flawed because it doesn't have a drive train... and oh by the way by "drive train" I mean "solar-powered sunroof".

To clear the air, nvidia has supported bindless resources since the Kepler generation, and there is no architectural flaw or weakness with their implementation and in fact this feature has been successfully used in non-gaming applications for years now.

And for heaven's sake keep your wits about you and stop entertaining these fantasies that nvidia is made up of a group of drooling idiots that don't know how to design graphics cards.

It's about starting a conversation which leads to more information being released.

There is such a thing as starting conversation and then there is such a thing as generating so much noise that the signal becomes lost. And now when you canvas the graphics cards forums you see them littered with post after post from panda-faced people lamenting, "I just bought a 980 Ti... and now it's going to be obsolete in three months????"

And I've noticed your posts have contributed to this.

And do I recall that you practically took credit for the delay of ARK: Survival Evolved?

6. This prompted a response from AMD.

Now people have far more information than they used to.

We don't have new information. We have a short PR response from AMD saying their GPUs are the best. We also have a lot of wild speculation. Truth be told, if the initial benchmarks were Fury X vs. 980 Ti nobody would even be talking about anything.
 
Wake me up when this shit of a game and its associated benchmark are relevant lol.
Also, who is Oxide Games? I hadn't even heard of them until now.

Looks like this was a last-ditch effort by AMD to sell their failure of a card.

Don't worry man, your emotions are safe with us. Pascal will come out soon. Just keep your head up high... just follow Nvidia's planned obsolescence and everything will be okay.
 
You are not free to re-define tech terms to match some other meanings you desire to convey, at least not without losing credibility. It would be like me saying your car is fundamentally flawed because it doesn't have a drive train... and oh by the way by "drive train" I mean "solar-powered sunroof".

To clear the air, nvidia has supported bindless resources since the Kepler generation, and there is no architectural flaw or weakness with their implementation and in fact this feature has been successfully used in non-gaming applications for years now.

And for heaven's sake keep your wits about you and stop entertaining these fantasies that nvidia is made up of a group of drooling idiots that don't know how to design graphics cards.



There is such a thing as starting conversation and then there is such a thing as generating so much noise that the signal becomes lost. And now when you canvas the graphics cards forums you see them littered with post after post from panda-faced people lamenting, "I just bought a 980 Ti... and now it's going to be obsolete in three months????"

And I've noticed your posts have contributed to this.

And do I recall that you practically took credit for the delay of ARK: Survival Evolved?



We don't have new information. We have a short PR response from AMD saying their GPUs are the best. We also have a lot of wild speculation. Truth be told, if the initial benchmarks were Fury X vs. 980 Ti nobody would even be talking about anything.

Y so mad?

And no, I didn't take credit for the delay of ARK. I hinted that perhaps ARK's developers had run into a situation similar to Oxide's. Emphasis on perhaps.

You can quote me rather than misrepresent what I said.

We have info from Oxide and AMD. That's new information.

What did you do?

Yeah...

Nobody said the GTX 980 Ti will be obsolete in 3 months. Find a quote where I said this. Oh you can't... Yeah...

What did I say? Well, if the GameWorks fiasco with Kepler is an indication of anything, we might find ourselves in a similar situation when Pascal arrives (relative to Maxwell 2). Pascal will likely remedy these claimed shortcomings of the Maxwell 2 architecture. Just like Maxwell/2 remedied Kepler's shortcomings. That's not a controversial statement.

Attacking the messenger rather than tackling the message is a rather absurd way of dealing with a situation.

I'll wait on what NVIDIA has to say. Because talking to partisan folks, such as yourself, is a futile exercise. You don't care about the truth... Because you take the truth personally.
 
I would think Maxwell would have had this feature built in better if Nvidia had been working with Microsoft on DX12 for several years, as they claim. Regardless, I feel that DX12 is gonna be a shitshow for everybody once games support the Chinese menu that is DX12 feature levels.
 
Pascal will likely remedy these claimed shortcomings of the Maxwell 2 architecture. Just like Maxwell/2 remedied Kepler's shortcomings. That's not a controversial statement.

"...claimed shortcomings..."

It isn't much of a statement to say that a new architecture will have better features than the previous one. That's just common sense.

Attacking the messenger rather than tackling the message is a rather absurd way of dealing with a situation.

I have not attacked you. For me, there is no "situation" to deal with. I don't see any controversy with the hardware's performance, or the benchmark. What I do see is more people trying to make yet another mountain out of a molehill, which is the real poison in the PC gaming industry today.

I'll wait on what NVIDIA has to say. Because talking to partisan folks, such as yourself, is a futile exercise. You don't care about the truth... Because you take the truth personally.

First, you are partisan. Don't pretend that you aren't, or that you're somehow looking after the PC gaming ecosystem, etc. etc. Maybe you don't intend to come across as partisan, but when you say stuff like "nvidia might need pascal real soon", you damage your credibility as an objective observer.

Regarding NVIDIA... so far from what I have seen, their initial statement has ended up being the most accurate one to describe all of this: AotS is not the best representation of DX12 games. I think even you have to acknowledge that. Just as games that have "over-tessellation" will favor nvidia in benchmarks, games that have "over-compute" will show AMD's strengths. There is no controversy here.
 
Actually it's compute and graphics programmed in certain ways ;). Maxwell's architecture is well suited for compute performance and is actually better than GCN per unit in efficiency.
 
For everyone trying to say that Oxide games is AMD sponsored, please note that Nvidia (and Intel, Microsoft and AMD) have had source code for over a year and have submitted code to make the game better for everyone... and they didn't have to pay for access like GW titles force you to.

Specifically, that the application has a bug in it which precludes the validity of the test. We assure everyone that is absolutely not the case. Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months.

Being fair to all the graphics vendors

Often we get asked about fairness; that is, usually in regards to whether we are treating Nvidia and AMD equally. Are we working more closely with one vendor than another? The answer is that we have an open access policy. Our goal is to make our game run as fast as possible on everyone's machine, regardless of what hardware our players have.

To this end, we have made our source code available to Microsoft, Nvidia, AMD and Intel for over a year. We have received a huge amount of feedback. For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster which we integrated into our code.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/
 
"...claimed shortcomings..."

It isn't much of a statement to say that a new architecture will have better features than the previous one. That's just common sense.



I have not attacked you. For me, there is no "situation" to deal with. I don't see any controversy with the hardware's performance, or the benchmark. What I do see is more people trying to make yet another mountain out of a molehill, which is the real poison in the PC gaming industry today.



First, you are partisan. Don't pretend that you aren't, or that you're somehow looking after the PC gaming ecosystem, etc. etc. Maybe you don't intend to come across as partisan, but when you say stuff like "nvidia might need pascal real soon", you damage your credibility as an objective observer.

Regarding NVIDIA... so far from what I have seen, their initial statement has ended up being the most accurate one to describe all of this: AotS is not the best representation of DX12 games. I think even you have to acknowledge that. Just as games that have "over-tessellation" will favor nvidia in benchmarks, games that have "over-compute" will show AMD's strengths. There is no controversy here.

That statement doesn't make me partisan. There are DX12 titles arriving this fall.

There are titles arriving in Q1 2016.

If these titles use async shading, a 290X might compete with a GTX 980 Ti.

Therefore there's reason to claim..

"NVIDIA might need Pascal and soon".

You read anything negative towards NVIDIA, who arguably deserve it if this all pans out to be true, as partisanship.

Why? Because you're partisan.

Don't push your problems dealing with this information on me. Deal with it yourself. Take a deep breath and think about who deserves your anger, which you clearly have, me or NVIDIA?

Objectively speaking... It isn't me. I've worked to shed light on this issue. Spent a week glued to a laptop while my wife brought me sandwiches and coffee. (she's the best).

I can now disappear, until the next "wtf is happening here?" moment. Do you think I've enjoyed dealing with people like you? It has taken so much patience on my part. Death threats? Check! Insults and belittling? Check! The internet is filled with crazy folks.

Now I have to deal with you. Not exactly amusing, but worth every single insult thrown my way.

When I was a teenager, I acted like you. I was a partisan hack who defended my beloved hardware choices by any means necessary. I've grown up. Learned that this is absolutely pointless. It didn't benefit me (my ego and sense of self worth was tied to the size of my GPU).

Take a step back and look at what has transpired over the last few weeks. I think the entire PC gaming community will end up benefiting from all of this.

peace.
 
For everyone trying to say that Oxide games is AMD sponsored, please note that Nvidia (and Intel, Microsoft and AMD) have had source code for over a year and have submitted code to make the game better for everyone... and they didn't have to pay for access like GW titles force you to.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/

Also, Mantle was patched in.

It has always been a DX game, not a Mantle game.
 
That statement doesn't make me partisan. There are DX12 titles arriving this fall.

There are titles arriving in Q1 2016.

If these titles use async shading, a 290X might compete with a GTX 980 Ti.

Therefore there's reason to claim..

"NVIDIA might need Pascal and soon".

You read anything negative towards NVIDIA, who arguably deserve it if this all pans out to be true, as partisanship.

Why? Because you're partisan.

Don't push your problems dealing with this information on me. Deal with it yourself. Take a deep breath and think about who deserves your anger, which you clearly have, me or NVIDIA?

Objectively speaking... It isn't me. I've worked to shed light on this issue. Spent a week glued to a laptop while my wife brought me sandwiches and coffee. (she's the best).

I can now disappear, until the next "wtf is happening here?" moment. Do you think I've enjoyed dealing with people like you? It has taken so much patience on my part. Death threats? Check! Insults and belittling? Check! The internet is filled with crazy folks.

Now I have to deal with you. Not exactly amusing, but worth every single insult thrown my way.

When I was a teenager, I acted like you. I was a partisan hack who defended my beloved hardware choices by any means necessary. I've grown up. Learned that this is absolutely pointless. It didn't benefit me (my ego and sense of self worth was tied to the size of my GPU).

Take a step back and look at what has transpired over the last few weeks. I think the entire PC gaming community will end up benefiting from all of this.

peace.

A big straw man. I haven't attacked you, and just because I didn't drink what you were pouring doesn't make you a victim.

For the most part I couldn't care less about DX12 since I don't even game on that platform. But boy, some of you really love to track down that little bit of smoke from a cigarette and pump the bellows on it faster and faster to try to trigger a forest fire.

Let's wait until we can get more robust data before drawing any firm conclusions.
 
I buy a new video card every couple of years, as I suspect most of my fellow hardware enthusiasts here do.
By the time DX12 matters, Nvidia will have a card out that's optimized for the salient features.
This is a complete non issue.

This is true. I think internally AMD is justifying their ridiculous prices by saying, "Wow, the vanilla Fury destroys the Titan X in DX12; at $550 it's a bargain!!!"
 
Nvidia: The way it's meant to be gimped!

Have fun upgrading again for DX12 and 10-bit/ultra-wide-gamut support.

B-b-b-but AMD drivers... yeah that's old news. At least the hardware functions and gives you all the ram you paid for.

With a big shout out to Prime1.
 
Curious what your course of action will be TS haha.
The only reason I bought a 980 Ti was to get funds out of my paypal account (long story). If I can get a full refund in the form of a Newegg gift card, that is an ideal scenario.
Otherwise I will happily keep the card. It's just more performance than I need.

These DX12 issues don't concern me.
 
People are now talking about class action lawsuits against nvidia because their 980 Ti only beats the Fury X by a few frames in some game that nobody will play.

*facepalm*

Really beginning to wonder if 99% of forum accounts out there are fake, or if people are just this stupid. Really hoping it's the former.
 