DirectCompute vs. CUDA vs. OpenCL

Wow, I was not expecting the ATI cards to perform that much better than nVidia's with everything else equal. Pisses me off that the green team gets exclusive eye candy though! They need to quit that shit, and soon!
 
Wow, I was not expecting the ATI cards to perform that much better than nVidia's with everything else equal. Pisses me off that the green team gets exclusive eye candy though! They need to quit that shit, and soon!

Then let's get AMD and Microsoft pushing their tech. I don't like nVidia doing this any more than an AMD GPU gamer does. It makes the decision of which card to buy more complicated, based on artificial limitations.

But nVidia is under no obligation to push OpenCL or DirectCompute. So if we want this stuff, somebody needs to sell it to the developers, and nVidia is still pretty good at that.
 
You AMD users just got shafted on these cool effects that could have been easily developed for all PC gamers instead of just those that purchase from one company.

There is a simple question that is being ignored: would the game be better if the Nvidia-specific features were NOT included at all? If the answer is no, then I do not see much of an argument, since I imagine the choices are:

1. Added features that support CUDA, or
2. No added features.

Can anyone give me a good reason to believe that the devs would have gone the extra mile to include this stuff if it were not for Nvidia's help? I mean, the game is fun without the extras, so they likely would have been left out if not for Nvidia. I just do not see a cost-benefit analysis for such features winning out in the end. The MBAs calling the shots care about profit from the game. How much added profit do you think these features provide vs. their cost? My guess would be that it works out to a loss, but I could be wrong.

So tell me how adding something vs adding nothing is “hurting” the gaming industry? That claim seems completely fallacious and unsupported.

I would also like to talk a little about CUDA. It is a language, much like C# is for the MS .NET platform, that so many devs have adopted because of its ease of use and support. The analogy is nearly perfect, except one is for a proprietary OS and one is for proprietary hardware. And yet we love all the good programs that come from .NET. Also, both are free to implement on other systems: there is a Linux implementation of C# called Mono, or something like that, and there is also a hacked ATI driver that runs CUDA. In fact, it was an Nvidia engineer who helped make that happen, as I understand it.
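To make the analogy concrete, here is roughly what CUDA code looks like. This is a minimal, hypothetical sketch (the names and the simplified ripple math are mine, not anything from a shipping game) of a kernel that steps a grid of water heights on the GPU:

Code:
#include <cuda_runtime.h>

// One thread per grid cell; classic ripple rule: the new height is the
// average of the four neighbours minus the previous height, damped a little.
__global__ void stepWaterHeights(const float* prev, const float* curr,
                                 float* next, int width, int height,
                                 float damping)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x <= 0 || y <= 0 || x >= width - 1 || y >= height - 1) return;

    int i = y * width + x;
    float sum = curr[i - 1] + curr[i + 1] + curr[i - width] + curr[i + width];
    next[i] = (sum * 0.5f - prev[i]) * damping;
}

int main()
{
    const int w = 512, h = 512;
    const size_t bytes = w * h * sizeof(float);
    float *prev, *curr, *next;
    cudaMalloc(&prev, bytes);
    cudaMalloc(&curr, bytes);
    cudaMalloc(&next, bytes);
    cudaMemset(prev, 0, bytes);
    cudaMemset(curr, 0, bytes);

    // Launch one thread per cell, in 16x16 blocks.
    dim3 block(16, 16);
    dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
    stepWaterHeights<<<grid, block>>>(prev, curr, next, w, h, 0.99f);
    cudaDeviceSynchronize();

    cudaFree(prev); cudaFree(curr); cudaFree(next);
    return 0;
}

It reads like plain C, which is exactly why devs pick it up quickly, the same way C# pulled people onto .NET.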

The point I am trying to get at is that ATI can use CUDA if they want, so all these benefits are options for ATI card owners if ATI would be willing to support CUDA on its hardware. I would say that is not likely, but it is still more likely than game developers adding features like this without the support of companies like Nvidia, who likely did most of the work to add them and thus absorbed much of the development cost.

I freely admit that a lot of this is speculation, so there is no need to tell me that. Just sayin...

Fib (puts on flame suit and waits)
 
Then let's get AMD and Microsoft pushing their tech. I don't like nVidia doing this any more than an AMD GPU gamer does. It makes the decision of which card to buy more complicated, based on artificial limitations.

But nVidia is under no obligation to push OpenCL or DirectCompute. So if we want this stuff, somebody needs to sell it to the developers, and nVidia is still pretty good at that.

This point seems obvious, but it is being ignored. Not to mention that what seems like common sense (though it could be wrong) contradicts the idea that Nvidia is "hurting" the industry by pushing this tech when no one else is.
 
Then let's get AMD and Microsoft pushing their tech. I don't like nVidia doing this any more than an AMD GPU gamer does. It makes the decision of which card to buy more complicated, based on artificial limitations.

But nVidia is under no obligation to push OpenCL or DirectCompute. So if we want this stuff, somebody needs to sell it to the developers, and nVidia is still pretty good at that.

The whole industry needs to move in the direction of non-proprietary features, not only for eye candy like this but also for GPU physics acceleration and 3D gaming (i.e., 3D Vision)... Here's hoping, at least.
 
So, you need this special bokeh filter and GPU-assisted water simulation to do what Crysis already did. Amazing.
 
This point seems obvious, but it is being ignored. Not to mention that what seems like common sense (though it could be wrong) contradicts the idea that Nvidia is "hurting" the industry by pushing this tech when no one else is.

Zuh? The water in Crysis looked and reacted better than Just Cause 2's GPU water simulation, and it wasn't locked into CUDA. Common sense would say that, since CUDA is not needed to render these effects and it is locked into vendor-specific APIs anyway, it is hurting. CUDA in no way, shape, or form is pushing new technology here.

I'll second necrophile's post. This shit has been done before, and on AMD GPUs as well as Nvidia's.
 
Zuh? The water in Crysis looked and reacted better than Just Cause 2's GPU water simulation, and it wasn't locked into CUDA. Common sense would say that, since CUDA is not needed to render these effects and it is locked into vendor-specific APIs anyway, it is hurting. CUDA in no way, shape, or form is pushing new technology here.

I'll second necrophile's post. This shit has been done before, and on AMD GPUs as well as Nvidia's.

I guess you do not get that I am talking about using the GPU to calculate this stuff instead of just using it for graphics. That is the tech I am discussing...

If the results have been done before, and done better, so be it... I couldn't care less. I care about it being done on the GPU. I realize that to the gamer it might not matter, but it does matter.
 
The whole industry needs to move in the direction of non-proprietary features, not only for eye candy like this but also for GPU physics acceleration and 3D gaming (i.e., 3D Vision)... Here's hoping, at least.

I don't disagree, but then WHO'S going to market this stuff? OpenCL: "it's great because it works as well on my stuff as on my competitor's"? Not exactly a brilliant strategy.
 
Looks like I made a typo while commenting on a typo! I was referring to the text under the graph, not the actual graph. Sorry about that.

There is a simple question that is being ignored: would the game be better if the Nvidia-specific features were NOT included at all? If the answer is no, then I do not see much of an argument, since I imagine the choices are:

The game is worse off for ATi GPU users, because the effects could easily be implemented on their hardware. You can counter with "they wouldn't have been included anyway," but there's no proof to justify that. The development team might have already planned to add the effects to the game; nVidia could simply have persuaded them to take those out and use nVidia's own engine to produce the same results.

There's a reason it's a TWIMTBP title. In fact, had they lost their special proprietary features, it would be a The Way It's Meant To Have Lower Frame-Rates With No Other Features Compared To Lower Priced Cards From The Competitor. I don't think TWIMTHLFRWNOFCTLPCFTC rolls off the tongue quite as easily as TWIMTBP.
 
Looks like I made a typo while commenting on a typo! I was referring to the text under the graph, not the actual graph. Sorry about that.



The game is worse off for ATi GPU users, because the effects could easily be implemented on their hardware. You can counter with "they wouldn't have been included anyway," but there's no proof to justify that. The development team might have already planned to add the effects to the game; nVidia could simply have persuaded them to take those out and use nVidia's own engine to produce the same results.

There's a reason it's a TWIMTBP title. In fact, had they lost their special proprietary features, it would be a The Way It's Meant To Have Lower Frame-Rates With No Other Features Compared To Lower Priced Cards From The Competitor. I don't think TWIMTHLFRWNOFCTLPCFTC rolls off the tongue quite as easily as TWIMTBP.

He didn't counter by saying that without Nvidia's involvement they wouldn't have the features; his counter was asking a very valid question. Would they have had that stuff without Nvidia's involvement? Outside of Nvidia and the developers, no one knows. His statement is a lot more fair and balanced than you are making it out to be.

Great article, but this is the first case where I feel like the ire against team green is misplaced. I find it hard to believe that [H] considers a company pushing its own technology, which it feels is better, as "doing harm". If by "do no harm" they meant "enable ATi," I think you are completely wrong. After all, all ATi has to do is license CUDA, and all of these nice effects will be enabled on their GPUs as well. Why is there no push to get ATi on board with CUDA? I've never seen Nvidia say anywhere that they would not allow it. And all things considered, I think they probably would license it.

I think Nvidia should be applauded for pushing their technology so hard that companies are actually doing innovative things with it. GPU water of this quality and the bokeh effect (correct me if I am wrong) have not been seen before. And I have yet to see any kind of indication that OpenCL or Stream are capable of the same effects. PhysX demos are one thing, games are another, and until I see an actual playable game showing me that OCL and Stream can do the things PhysX/CUDA can, I will not feel bad for red.
Furthermore, it's not like OCL or Stream have gotten much press, and I think it's because they are not pushing their tech. Every once in a while we hear of a new tech that will be available or a new capability, but we never actually see it.

I'm no Nvidia fanboy, seeing as the two cards in my main system are the first Nvidia cards I have ever purchased, but if they are going to go out there and make sure that the games I want to play have special effects I can use, then that makes me happy as a customer.
If ATi would go out there and do the same thing, it would mean a lot for my next card purchase, but as it stands, only Nvidia seems to be going out of their way to enable real post-purchase additional features for their cards. (Sorry, Eyefinity is not a feature; it's a display mode, and one I have ZERO interest in after trying it out.)
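Going back to the bokeh effect for a second, the idea itself is simple enough to sketch. Here's a hypothetical single-channel CUDA version (the names and constants are mine, not the actual game's code): the blur radius grows with distance from the focal plane, and gathering over a disc is what makes bright spots bloom into circles.

Code:
// Hypothetical GPU depth-of-field pass; one thread per output pixel.
__global__ void bokehBlur(const float* color, const float* depth, float* out,
                          int width, int height,
                          float focalDepth, float maxRadius)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int idx = y * width + x;
    // Circle of confusion: the further from the focal plane, the bigger the blur.
    float coc = fminf(fabsf(depth[idx] - focalDepth) * maxRadius, maxRadius);
    int r = (int)coc;

    float sum = 0.0f;
    int taps = 0;
    for (int dy = -r; dy <= r; ++dy) {
        for (int dx = -r; dx <= r; ++dx) {
            if (dx * dx + dy * dy > r * r) continue; // disc-shaped kernel
            int sx = min(max(x + dx, 0), width - 1); // clamp at image edges
            int sy = min(max(y + dy, 0), height - 1);
            sum += color[sy * width + sx];
            ++taps;
        }
    }
    out[idx] = sum / taps; // centre tap is always included, so taps >= 1
}

A real implementation works on RGB and does fancier weighting, but the shape of the work is the same.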

ATI flat out refused to license CUDA. Nvidia has offered to license it to them in the past. ATI's stance is that they don't want to support proprietary technology like CUDA.
 
Where has ATI said they were offered it and refused?

I believe there was that bit-tech interview where they said nvidia has never allowed and will never allow them to use it.
 
Great article, but this is the first case where I feel like the ire against team green is misplaced. I find it hard to believe that [H] considers a company pushing its own technology, which it feels is better, as "doing harm". If by "do no harm" they meant "enable ATi," I think you are completely wrong. After all, all ATi has to do is license CUDA, and all of these nice effects will be enabled on their GPUs as well. Why is there no push to get ATi on board with CUDA? I've never seen Nvidia say anywhere that they would not allow it. And all things considered, I think they probably would license it.

You're speculating that nvidia would probably let AMD pay them to use their proprietary API that nvidia would have full control over? I don't see how that would benefit anyone but nvidia. AMD works with OpenCL because they pay no licensing fees.
 
He didn't counter by saying that without Nvidia's involvement they wouldn't have the features; his counter was asking a very valid question. Would they have had that stuff without Nvidia's involvement? Outside of Nvidia and the developers, no one knows. His statement is a lot more fair and balanced than you are making it out to be.


ATI flat out refused to license CUDA. Nvidia has offered to license it to them in the past. ATI's stance is that they don't want to support proprietary technology like CUDA.

But then it's just an enigma... Would they have had the capabilities to do both effects without nVidia's engine? [H] is saying that they would. Will we ever know? Not really.

If the former is true, ATi paying their competitor 20 or so dollars per card ends up being for naught. If both ATi and nVidia use the same engine, it forces the proprietary physics engine on all developers. It essentially raises a barrier to entry that is already high in an industry that is driven by innovation.

I also wonder why nVidia users want ATi to license CUDA and the like. nVidia would have little reason to add these effects to games, because doing so would no longer make the opposing company look worse off. If developers weren't being coerced into using PhysX, CUDA, and such, they'd probably turn to alternatives anyway. Licensing nVidia's software wouldn't do much of anything other than letting nVidia know that they've won... potentially.

That being said, I don't think it would hurt consumers much if nVidia and ATi worked together. I just don't see it happening, and I can understand why.
 
Where has ATI said they were offered it and refused?

I believe there was that bit-tech interview where they said nvidia has never allowed and will never allow them to use it.

Saw an interview a while back where ATI said they don't want to use proprietary technology. They were asked directly if they would ever use CUDA.
 
ATI flat out refused to license CUDA. Nvidia has offered to license it to them in the past. ATI's stance is that they don't want to support proprietary technology like CUDA.
If this is true (and I say that because I've never read it, not because I doubt you :D ), then it's ATi's own fault that their customers are missing out on these effects, not Nvidia's, and [H]'s frustration is completely misplaced.
 
ATI flat out refused to license CUDA. Nvidia has offered to license it to them in the past. ATI's stance is that they don't want to support proprietary technology like CUDA.

So the question is: is CUDA not being multi-platform at this time more AMD's fault or nVidia's?
 
But then it's just an enigma... Would they have had the capabilities to do both effects without nVidia's engine? [H] is saying that they would. Will we ever know? Not really.

If the former is true, ATi paying their competitor 20 or so dollars per card ends up being for naught. If both ATi and nVidia use the same engine, it forces the proprietary physics engine on all developers. It essentially raises a barrier to entry that is already high in an industry that is driven by innovation.

I also wonder why nVidia users want ATi to license CUDA and the like. nVidia would have little reason to add these effects to games, because doing so would no longer make the opposing company look worse off. If developers weren't being coerced into using PhysX, CUDA, and such, they'd probably turn to alternatives anyway. Licensing nVidia's software wouldn't do much of anything other than letting nVidia know that they've won... potentially.

That being said, I don't think it would hurt consumers much if nVidia and ATi worked together. I just don't see it happening, and I can understand why.

As of right now, PhysX is the only viable GPGPU physics API. Havok is likely to release an OpenCL version eventually, but no one really knows when, or whether being owned by Intel is going to hinder that.
 
So the question is: is CUDA not being multi-platform at this time more AMD's fault or nVidia's?

Both of them. Nvidia designed CUDA specifically to be theirs and theirs alone, making it so it cannot be an open standard, while ATI refuses to use anything that is not an open technology. So you can't lay the blame at the feet of either of them.
 
I don't disagree, but then WHO'S going to market this stuff? OpenCL: "it's great because it works as well on my stuff as on my competitor's"? Not exactly a brilliant strategy.

Since when can't you distinguish your product based on speed, efficiency, and price? Adding competing standards and features into the mix is just a needless way of segmenting the market. For an extreme example just look at PC vs Mac gaming... the PC gaming industry thrives on the more "open" playing field as opposed to the proprietary bullshit that is the Mac...
 
So the question is: is CUDA not being multi-platform at this time more AMD's fault or nVidia's?

What would stop nvidia from making it run better on their own cards while locking AMD out of the top end feature sets? AMD would have painted itself into a corner if they were to pay licensing fees to their competitor for a proprietary API.
 
The problem everyone is missing is that these effects could be done via DirectCompute, and then both AMD and NVIDIA cards would benefit. IMO, this all goes back to the developers and the choices they make. It is quite clear this developer chose to provide a different gaming experience based upon what brand of card you have installed, a rather shady practice to begin with, IMO. Either you enjoy all the effects on NV cards, or you lose a couple of effects on AMD cards, and it was the game developer that decided their game was going to provide a different experience. I guess they are OK with that, but as a gamer, I'm not. I would have preferred they chose a method that would benefit everyone, and these effects can be done with industry standards like DirectCompute, even in DX10.
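To illustrate the point, here is a hypothetical side-by-side: a trivial CUDA kernel with the stock Direct3D 11 / HLSL compute constructs it maps onto noted in the comments. The kernel itself is invented for illustration; the HLSL names are the standard DX11 ones.

Code:
// CUDA version                                  // DirectCompute (HLSL) equivalent
__global__ void brighten(float* pixels,          // [numthreads(256, 1, 1)]
                         int count, float gain)  // RWStructuredBuffer<float> pixels;
{
    int i = blockIdx.x * blockDim.x              // uint i = id.x; where id carries
          + threadIdx.x;                         // the SV_DispatchThreadID semantic
    if (i < count)
        pixels[i] *= gain;                       // pixels[i] *= gain;
}
// Host side: brighten<<<blocks, 256>>>(...) in CUDA corresponds to
// ID3D11DeviceContext::Dispatch(blocks, 1, 1) in Direct3D 11.

The arithmetic is identical either way; nothing in a kernel like this is inherently tied to one vendor's hardware.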
 
Take it a few steps forward. If a game were released solely for nVidia users, would that be considered fair? By the same reasoning, ATi users are not getting cheated out of an experience, because it wouldn't have been there without nVidia's help anyway. Even if only a small amount of content is missing, you're applying the same principle.

If nVidia were to continue this exaggerated trend, it would force ATi out of the market simply because nVidia has more power and sway in the game development world. Correct me if I'm wrong, but that's not legal.

I'm not arguing that this is the end of computing, or that this is even a big deal. I can live with seeing things sharply off in the distance and not having waves crash against my boat. However, the idea that it isn't objectionable is, in my opinion, a flawed one. The implications of gamers supporting developers and companies that thrive on this could be widespread. I use the word "thrive" in the purest sense of the word, considering:

There's a reason it's a TWIMTBP title. In fact, had they lost their special proprietary features, it would be a The Way It's Meant To Have Lower Frame-Rates With No Other Features Compared To Lower Priced Cards From The Competitor. I don't think TWIMTHLFRWNOFCTLPCFTC rolls off the tongue quite as easily as TWIMTBP.
 
The problem everyone is missing is that these effects could be done via DirectCompute and then both AMD and NVIDIA cards would benefit. IMO, this all goes back to the developers and their choices they make, it is quite clear this developer chose to provide a different gaming experience based upon what brand card you have installed, a rather shady practice to begin with IMO. Either you enjoy all the effects on NV cards, or you lose a couple of effects on AMD cards, and it is the game developer that decided that their game was going to provide a different experience, I guess they are OK with that, but IMO as a gamer, I'm not. i would have preferred they chose a method that would benefit all, and these effects can be done with industry standards like DirectCompute, even in DX10.

I think the big issue is that no one is pushing DirectCompute or OpenCL. Microsoft is quite simply acting like they don't care at all about the PC gaming market, so they're not going to push DirectCompute. There is nothing viable on the OpenCL side yet. I'm not a huge fan of CUDA either, but until AMD or someone else gets off their asses and starts pushing these other technologies, we're stuck with it.
 
I think the big issue is that no one is pushing DirectCompute or OpenCL. Microsoft is quite simply acting like they don't care at all about the PC gaming market, so they're not going to push DirectCompute. There is nothing viable on the OpenCL side yet. I'm not a huge fan of CUDA either, but until AMD or someone else gets off their asses and starts pushing these other technologies, we're stuck with it.

Push DirectCompute to whom? The developers know it's there and can use it at any time. I mean, AMD and Nvidia both made their cards work with it. Do you mean push it on the consumer level?
 
I think the big issue is that no one is pushing DirectCompute or OpenCL. Microsoft is quite simply acting like they don't care at all about the PC gaming market, so they're not going to push DirectCompute. There is nothing viable on the OpenCL side yet. I'm not a huge fan of CUDA either, but until AMD or someone else gets off their asses and starts pushing these other technologies, we're stuck with it.

They know it exists, they just chose not to use it. What nVidia is doing is "pushing", but not as softly as you're describing it.

Again, without these features, ATi would have won hands-down in a TWIMTBP title against more expensive cards from nVidia. I think nVidia realized that, and probably "encouraged" the developers to do something about it. It was a last-minute decision to include both features, after all.
 
Push DirectCompute to whom? The developers know it's there and can use it at any time. I mean, AMD and Nvidia both made their cards work with it. Do you mean push it on the consumer level?

I mean push it to developers like Nvidia is doing with CUDA. Sure, the cards support it, but there is no reason for developers to use it. Nvidia's program gives developers a reason to support CUDA; there is nothing like that for DirectCompute. All of us consumers who are in the know can yell all we want about supporting this stuff, but unless someone actively gets developers to start using it, we're sunk. Hell, look at how long it took for anyone to use tessellation. It's been around for years, but because ATI decided not to actively attract developer attention to it, it went nowhere until MS added it to the DX11 API.
 
They know it exists, they just chose not to use it. What nVidia is doing is "pushing", but not as softly as you're describing it.

Again, without these features, ATi would have won hands-down in a TWIMTBP title against more expensive cards from nVidia. I think nVidia realized that, and probably "encouraged" the developers to do something about it. It was a last-minute decision to include both features, after all.

I know Nvidia is pushing it heavily, and I want ATI, Intel, and MS to do the same with competing technology. Nvidia provides a ton of help and support to developers that are part of TWIMTBP, and they're really the only ones doing that.
 
I mean push it to developers like Nvidia is doing with CUDA. Sure, the cards support it, but there is no reason for developers to use it. Nvidia's program gives developers a reason to support CUDA; there is nothing like that for DirectCompute. All of us consumers who are in the know can yell all we want about supporting this stuff, but unless someone actively gets developers to start using it, we're sunk. Hell, look at how long it took for anyone to use tessellation. It's been around for years, but because ATI decided not to actively attract developer attention to it, it went nowhere until MS added it to the DX11 API.

ATi can't throw money at the situation; AMD would need to turn a profit for quite some time before it is in the same financial position nVidia is.
 
I think the big issue is that no one is pushing DirectCompute or OpenCL. Microsoft is quite simply acting like they don't care at all about the PC gaming market, so they're not going to push DirectCompute. There is nothing viable on the OpenCL side yet. I'm not a huge fan of CUDA either, but until AMD or someone else gets off their asses and starts pushing these other technologies, we're stuck with it.

This is all I'm trying to say.
 
ATi can't throw money at the situation; AMD would need to turn a profit for quite some time before it is in the same financial position nVidia is.

OK, so let's count ATI out of that; what about Intel and Microsoft? Intel owns Havok, the most popular physics engine out there. They are in a very good position to push technology, especially whenever Havok decides to support OpenCL. And Microsoft? DirectCompute is part of DirectX. They're in the best position to do something, but no, instead all of their energy is focused on the 360.
 
I know Nvidia is pushing it heavily, and I want ATI, Intel, and MS to do the same with competing technology. Nvidia provides a ton of help and support to developers that are part of TWIMTBP, and they're really the only ones doing that.

Perhaps if they focused more of their attention on hardware and beating the competition fairly, we wouldn't have had to endure the problems that the 4 series has had. I don't want ATi giving money to a developer behind closed doors; I want them making faster and more efficient video cards in a timely manner.
 
I mean push it to developers like Nvidia is doing with CUDA. Sure the cards support it, but there is no reason for developers to support it. Nvidia's program gives developers a reason to support CUDA. There is nothing like that for DirectCompute. All of use consumers who are in the know can yell all we want about supporting this stuff, but unless someone actively gets developers to start using it we're sunk. Hell look at how long it took for anyone to use tessellation. Its been around for years, but since ATI decided not to actively attract developer attention to it it went nowhere until MS added it to the DX11 API.

Here is a good reason to start using it: I'm not going to buy Just Cause 2, because they chose to use a proprietary API that won't let me enjoy the game as they designed it. DirectCompute has been out for a while; if they'd rather take money from NV to use CUDA, go ahead, but they won't get any from me 0.0, and they will cut their potential customers down a fair bit.
 
Here is a good reason to start using it: I'm not going to buy Just Cause 2, because they chose to use a proprietary API that won't let me enjoy the game as they designed it. DirectCompute has been out for a while; if they'd rather take money from NV to use CUDA, go ahead, but they won't get any from me 0.0, and they will cut their potential customers down a fair bit.

That's a nice stance to take, but let's be honest: most people simply don't care. They just want games and don't care about all of this stuff. People like us who care are a niche of an already niche market.
 
OK, so let's count ATI out of that; what about Intel and Microsoft? Intel owns Havok, the most popular physics engine out there. They are in a very good position to push technology, especially whenever Havok decides to support OpenCL. And Microsoft? DirectCompute is part of DirectX. They're in the best position to do something, but no, instead all of their energy is focused on the 360.

Whatever happened to just making the best engine/API and letting the developer choose? I'm tired of ATi-branded Dirt 2 and CoP, and I'm tired of nVidia-branded everything else. If you have an innovative product, you shouldn't have to pay someone to use it.
If Nvidia needs to pay developers to use CUDA, then CUDA is a failure, pure and simple.



nVidia isn't helping developers to be a pretty cool guy; to them it's just an advertisement for their video cards. Once development gets treated like a giant advertising campaign up for grabs, who really wins?
 
And Microsoft? DirectCompute is part of DirectX. They're in the best position to do something, but no, instead all of their energy is focused on the 360.

Nobody should have to push the devs; that is ridiculous. Microsoft never pushed DX9, they never pushed DX10, they never pushed DX11, they never pushed shaders, they never pushed anything. They simply provided a product that was more appealing to developers than the competition (OpenGL).

*DEVELOPERS* are the ones who are supposed to be doing the pushing. *DEVELOPERS* are supposed to push the limits of the system.

If the developers want to create an engine like the JC2 devs did, then the developers need to create the damn engine, not let Nvidia help write it to specifically target features of Nvidia's cards. The tech needs to stand on its own; it should not need to be pushed by its creator with underhanded tactics to force its adoption. If Nvidia needs to pay developers to use CUDA, then CUDA is a failure, pure and simple.
 
That's a nice stance to take, but let's be honest: most people simply don't care. They just want games and don't care about all of this stuff. People like us who care are a niche of an already niche market.

I'm not entirely disagreeing with you, but it gives that much more justification for pirating if you are an ATi owner. In this day and age, it's not about simply not buying the game; it's about whether or not you want to steal (or "copy") it.

I know four people who flat out refused to buy this game because of the affiliation with nVidia and CUDA. Three of them pirated it.

The inclusion of DLC put the nail in the coffin for me. If a dev team has no objections to DLC, I don't have a whole lot of faith in their relations with nVidia.
 
Whatever happened to just making the best engine/API and letting the developer choose? I'm tired of ATi-branded Dirt 2 and CoP, and I'm tired of nVidia-branded everything else. If you have an innovative product, you shouldn't have to pay someone to use it.

nVidia isn't helping developers to be a pretty cool guy; to them it's just an advertisement for their video cards. Once development gets treated like a giant advertising campaign up for grabs, who really wins?

That's a nice thought, but it's an idealistic one. Most developers look at the PC as a secondary revenue source, not one to put a lot of attention into. They're going to design their games for the console and its limitations. When it comes to PC versions, most aren't going to go that extra mile. They need something to push them that way, whether it's consumer demand or an outside company convincing them to add features by offering to help implement them and optimize the game.

Nobody should have to push the devs; that is ridiculous. Microsoft never pushed DX9, they never pushed DX10, they never pushed DX11, they never pushed shaders, they never pushed anything.

*DEVELOPERS* are the ones who are supposed to be doing the pushing. *DEVELOPERS* are supposed to push the limits of the system.

If the developers want to create an engine like the JC2 devs did, then the developers need to create the damn engine, not let Nvidia help write it to specifically target features of Nvidia's cards. The tech needs to stand on its own; it should not need to be pushed by its creator with underhanded tactics to force its adoption. If Nvidia needs to pay developers to use CUDA, then CUDA is a failure, pure and simple.

This isn't the early 2000s anymore. The gaming market has changed significantly. We don't have developers that are always looking to push boundaries and take risks on new technology anymore.
 
I'm not entirely disagreeing with you, but it gives that much more justification for pirating if you are an ATi owner. In this day and age, it's not about simply not buying the game; it's about whether or not you want to steal (or "copy") it.

I know four people who flat out refused to buy this game because of the affiliation with nVidia and CUDA. Three of them pirated it.

The inclusion of DLC put the nail in the coffin for me. If a dev team has no objections to DLC, I don't have a whole lot of faith in their relations with nVidia.

There is nothing wrong with DLC. Most of it is crap, sure, but there is some good stuff out there. They're charging a couple of bucks for a few neat things that add some extra fun to a game that is highly moddable. I don't see a problem with it. It's not like they went the MW2 route with it.
 