Wow, I was not expecting the ATI cards to perform that much better than nVidia's with everything else equal. Pisses me off that the green team gets exclusive eye candy though! They need to quit that shit, and soon!
You AMD users just got shafted on these cool effects that could have been easily developed for all PC gamers instead of just those that purchase from one company.
Then let's get AMD and Microsoft pushing their tech. I don't like nVidia doing this any more than an AMD GPU gamer does. It makes the decision of which card to buy more complicated, based on artificial limitations.
But nVidia is under no obligation to push OpenCL or DirectCompute. So if we want this stuff somebody needs to sell it to the developers. nVidia is still pretty good at that.
This point seems obvious, but it is being ignored. Not to mention that what seems like common sense (albeit it could be wrong) contradicts the idea that Nvidia is "hurting" the industry by pushing this tech when no one else is.
Zuh? The water in Crysis looked and reacted better than Just Cause 2's GPU water simulation, and it wasn't locked into CUDA. Common sense would say that since CUDA is not needed to render these effects, and it locks them into a vendor-specific API anyway, it is hurting. CUDA in no way, shape or form is pushing new technology here.
I'll second necrophile's post. This shit has been done before, and on AMD GPUs as well as nVidia's.
The whole industry needs to move in the direction of non-proprietary features for not only eye-candy like this but also GPU physics acceleration and 3D gaming (i.e. 3D-Vision) as well... Here's hoping at least.
There is a simple question that is being ignored: would the game be better if Nvidia-specific features were NOT included at all? If the answer is no, then I do not see much of an argument, since I imagine the choices were to include the effects for Nvidia cards only or not to include them at all.
Looks like I made a typo while commenting on a typo! I was referring to the text under the graph, not the actual graph. Sorry about that.
The game is worse off for ATi GPU users, because the effects could easily be implemented on their hardware. You can counter with "they wouldn't have been included anyway", but there's no proof to justify that. The development team might have already planned to add the effects to the game; nVidia could have just persuaded them to take those out and use its own engine to produce the same results.
There's a reason it's a TWIMTBP title. In fact, had they lost their special proprietary features, it would be a The Way It's Meant To Have Lower Frame-Rates With No Other Features Compared To Lower Priced Cards From The Competitor. I don't think TWIMTHLFRWNOFCTLPCFTC rolls off the tongue quite as easily as TWIMTBP.
Great article, but this is the first case where I feel like the ire against team green is misplaced. I find it hard to believe that [H] considers a company pushing their own technology, which they feel is better, as "doing harm". If by "do no harm" they meant "enable ATi" I think you are completely wrong. After all, all ATi has to do is license CUDA, and all of these nice effects will be enabled on their GPUs as well. Why is there no push to get ATi on board with CUDA? I've never seen anywhere Nvidia saying that they would not. And all things considered, I think they probably would license it.
I think Nvidia should be applauded for pushing their technology so hard that companies are actually doing innovative things with it. GPU water of this quality and the bokeh effect (correct me if I am wrong) have not been seen before. And I have yet to see any kind of indication that OpenCL or Stream are capable of the same effects. PhysX demos are one thing, games are another, and until I see an actual playable game showing me that OCL and Stream can do the things PhysX/CUDA can, I will not feel bad for red.
Furthermore, it's not like OCL or Stream have gotten much press, and I think it's because they are not pushing their tech. Every once in a while we hear of a new tech that will be available or a new capability, but we never actually see it.
I'm no Nvidia fanboy, seeing as these two cards in my main system are the first Nvidia cards I have ever purchased, but if they are going to go out there and make sure that the games I want to play have special effects I can use, then that makes me happy as a customer.
If ATi would go out there and do the same thing, it would mean a lot for my next card purchase, but as it stands, only Nvidia seems to be going out of their way to enable real post-purchase additional features for their cards. (Sorry, Eyefinity is not a feature; it's a display mode, and one I have ZERO interest in after trying it out.)
He didn't counter by saying that without Nvidia's involvement they wouldn't have the features; his counter was asking a very valid question: would they have had that stuff without Nvidia's involvement? Outside of Nvidia and the developers, no one knows. His statement is a lot fairer and more balanced than you are making it out to be.
ATI flat out refused to license CUDA. Nvidia has offered to license it to them in the past. ATI's stance is that they don't want to support proprietary technology like CUDA.
Where has ATI said they were offered it and refused?
I believe there was that bit-tech interview where they said Nvidia has never and will never allow them to use it.
If this is true (and I say that because I've never read it, not because I doubt you), then it's ATi's own fault for their customers missing out on these effects, not Nvidia's, and [H]'s frustration is completely misplaced.
But then it's just an enigma... Would they have had the capabilities to do both effects without nVidia's engine? [H] is saying that they would. Will we ever know? Not really.
If the former is true, ATi paying their competitor 20 or so dollars per card ends up being for naught. If both ATi and nVidia use the same engine, it forces the proprietary physics engine on all developers. It essentially raises the barrier to entry, which is already high, in an industry that is driven by innovation.
I also wonder why nVidia users want ATi to license CUDA and the like. nVidia would have little reason to add these effects into games, because doing so would no longer make the opposing company look worse off. If developers weren't being coerced into using PhysX, CUDA, and such, they'd probably turn to alternatives anyway. Licensing nVidia's software wouldn't do much of anything other than letting nVidia know that they've won... potentially.
That being said, I don't think it would hurt consumers significantly if nVidia and ATi worked together. I just don't see it happening, and I can understand why.
So the question is: is CUDA not being multi-platform at this time more AMD's fault or nVidia's?
I don't disagree, but then who's going to market this stuff? OpenCL? "It's great because it works as well on my stuff as on my competitor's"? Not exactly a brilliant strategy.
DirectCompute
The problem everyone is missing is that these effects could be done via DirectCompute, and then both AMD and NVIDIA cards would benefit. IMO this all goes back to the developers and the choices they make. It is quite clear this developer chose to provide a different gaming experience based on what brand of card you have installed, a rather shady practice to begin with. Either you enjoy all the effects on NV cards, or you lose a couple of effects on AMD cards. It was the game developer who decided that their game was going to provide a different experience. I guess they are OK with that, but as a gamer, I'm not. I would have preferred they chose a method that would benefit all, and these effects can be done with industry standards like DirectCompute, even in DX10.
I think the big issue is that no one is pushing DirectCompute or OpenCL. Microsoft quite simply is acting like they don't care at all about the PC gaming market so they're not going to push DirectCompute. There is nothing viable on the OpenCL side yet. I'm not a huge fan of CUDA either, but until AMD or someone gets off their asses and starts pushing these other technologies we're stuck with it.
Push DirectCompute to whom? The developers know it's there and can use it at any time. I mean, AMD and Nvidia both made their cards work with it. Do you mean push it at the consumer level?
They know it exists; they just chose not to use it. What nVidia is doing is "pushing", but not as softly as you're describing it.
Again, without these features, ATi would have won hands-down in a TWIMTBP title against more expensive cards from nVidia. I think nVidia realized that, and probably "encouraged" the developers to do something about it. It was a last-minute decision to include both features, after all.
I mean push it to developers like Nvidia is doing with CUDA. Sure the cards support it, but there is no reason for developers to support it. Nvidia's program gives developers a reason to support CUDA. There is nothing like that for DirectCompute. All of us consumers who are in the know can yell all we want about supporting this stuff, but unless someone actively gets developers to start using it, we're sunk. Hell, look at how long it took for anyone to use tessellation. It's been around for years, but since ATI decided not to actively attract developer attention to it, it went nowhere until MS added it to the DX11 API.
ATi can't throw money at the situation, because AMD would need to make a profit for quite some time to be in the same financial position nVidia is in.
I know Nvidia is pushing it heavy and I want ATI, Intel, and MS to do the same with competing technology. Nvidia provides a ton of help and support to developers that are part of TWIMTBP and they're really the only ones doing that.
Here is a good reason to start using it: I'm not going to buy Just Cause 2, because they chose to use a proprietary API that won't let me enjoy the game as they designed it. DirectCompute has been out for a while. If they would rather take money from NV to use CUDA, go ahead, but they won't get any from me, and they will cut their potential customer base down a fair bit.
OK, so let's count ATI out of that; what about Intel and Microsoft? Intel owns Havok, the most popular physics engine out there. They are in a very good position to push the technology, especially whenever Havok decides to support OpenCL. And Microsoft? DirectCompute is part of DirectX. They're in the best position to do something. But no, instead all of their energy is focused on the 360.
If Nvidia needs to pay developers to use CUDA, then CUDA is a failure, pure and simple.
And Microsoft? DirectCompute is part of DirectX. They're in the best position to do something. But no instead all of their energy is focused on the 360.
That's a nice stance to take, but let's be honest: most people simply don't care. They just want games and don't care about all of this stuff. People like us who care are a niche of an already niche market.
Whatever happened to just making the best engine/API and letting the developer choose? I'm tired of ATi-branded Dirt 2 and CoP, and I'm tired of nVidia-branded everything else. If you have an innovative product, you shouldn't have to pay someone to use it.
nVidia isn't helping developers to be a pretty cool guy; to them it's just an advertisement for their video cards. Once game development gets treated like a giant advertising campaign up for grabs, who really wins?
Nobody should have to push the devs, that is ridiculous. Microsoft never pushed DX9, they never pushed DX10, they never pushed DX11, they never pushed shaders, they never pushed anything.
*DEVELOPERS* are the ones who are supposed to be doing the pushing. *DEVELOPERS* are supposed to push the limits of the system.
If the developers want to create an engine like the JC2 devs did, then the developers need to create the damn engine themselves, not let Nvidia help write it to specifically target features of Nvidia's cards. The tech needs to stand on its own; it should not need to be pushed by its creator with underhanded tactics to force its adoption. If Nvidia needs to pay developers to use CUDA, then CUDA is a failure, pure and simple.
I'm not entirely disagreeing with you, but it gives that much more justification for pirating if you are an ATi owner. In this day and age, it's not about simply not buying the game; it's about whether or not you want to steal (or "copy") it.
I know four people who flat out refused to buy this game because of its affiliation with nVidia and CUDA. Three of them pirated it.
The inclusion of DLC put the nail in the coffin for me. If a dev team has no objections to DLC, I don't have a whole lot of faith in their relations with nVidia.