NVIDIA works with devs to add CUDA ocean sim and graphics effects to Just Cause 2

If it were standard, then no extra work/code would have been necessary. The AA in Batman: AA is not standard (UE3 does not come with it). This water effect is not standard. Also, if you are bitching about the results, why do you give a flying fuck if your card won't run it?

The AA code Nvidia gave the Batman people *WAS* standard. It was standard DirectX that both ATI and Nvidia recommend. There was *nothing* unique about the AA code Nvidia gave them, except for the vendor id lockout.

I give a flying fuck because I care about PC gaming, what kind of question is that?

Believe me, you're not one to lecture me on the ins and outs of GPU compute APIs ;) I never said they don't support OpenCL or DirectCompute. Nerd rage is really becoming an epidemic, it seems, and it obviously affects reading comprehension.

Funny, coming from you.

I asked WHY they would use either of those in lieu of CUDA. Now calm down, read my post again and try to come up with an intelligent answer this time.

No you didn't, that isn't what you asked *at all*. Since you seem to write a post and then immediately forget about it, here, let me refresh your memory:

Give this guy a medal. Nobody in this thread is saying otherwise, my friend. However, it would require implementing these effects in a compute shader. Now, instead of nerd raging so hard, sit back and ask yourself why Nvidia would implement something to benefit a handful of ATi DX11 cards instead of the millions of CUDA-capable cards on the market.

Nvidia already has a DirectCompute approach. So here is my question to you: Why would you support Nvidia choosing a proprietary solution (CUDA) over a standard one that will work with every DX10 card?


Why doesn't CUDA work on ATI cards?

http://www.extremetech.com/article2/0,2845,2324555,00.asp

There is NOTHING to keep ATI from running CUDA at all. Nothing!!!! ATI can run this stuff too...


So if it is not supported by ATI, then bitch at ATI!!!!

CUDA is, IMO, a much better environment to code for. It works, and it does not take a Ph.D. in ATI-specific hardware to produce some usable code. I have looked at Brook and found it more difficult, with much less support, especially in Linux.

Nvidia has a lot of problems, sure. But this is not one of them.

The company's work in GPGPU is second to none as far as I can tell. ATI is way behind in this, and even if the hardware can do it....it does not frackin matter!!!! If ATI is not going to spend the tons of time and money to support their cards' GPU-computing features, then it is not worth it.

DirectCompute and OpenCL are great ideas, but ATI has had a lot of seemingly good ideas, and they all seem to be "works on my machine" ideas. I want to see how much CONTINUED support these APIs get.

Nvidia has constantly been improving CUDA in Windows and, most importantly, in Linux. I have seen little if any continued development and support from the ATI side.

If someone knows better and can offer real-world examples of where ATI has gone out and supported GPGPU, with an API that has continuing support, then let me know; I would love to be wrong about this.

In the meantime, ATI owners should be asking: why is it that ATI will not support CUDA? It is a good API that is well supported. ATI Stream....good luck.

DirectCompute and OpenCL aren't owned or maintained by ATI. One is maintained by Microsoft, and the other by Khronos (the OpenGL guys). Why should ATI support CUDA when Nvidia also supports DirectCompute and OpenCL? Why should ATI support a standard maintained by their competition when both parties already support the open standards?
 
DirectCompute and OpenCL aren't owned or maintained by ATI. One is maintained by Microsoft, and the other by Khronos (the OpenGL guys). Why should ATI support CUDA when Nvidia also supports DirectCompute and OpenCL? Why should ATI support a standard maintained by their competition when both parties already support the open standards?

Fair enough. But then why should Nvidia do the extra legwork to make these game improvements at all? After all, it would be exactly the same as supporting ATI without ATI having to bear any of the expense. You asked, "Why should ATI support Nvidia's standards?" Well, why should Nvidia support ATI by making a game better for everyone? Same reasoning, right?

So, in the end it comes down to either Nvidia does this for its own benefit, or it does nothing. Would you agree?

Well in that case, I choose the company doing the work for its benefit since that is adding something versus adding nothing.

[rant]
On a different note, Nvidia made a good API with CUDA and a lot of people have adopted it. It would also be kind of crappy for Nvidia to suddenly drop support for something that a lot of people have taken the time to learn just so an open standard will be supported. I honestly do not believe ATI is pushing these open standards for the benefit of the industry, however it may seem.

From what I have seen, ATI does not do nearly enough in terms of GPU computing for me to take them seriously in this matter. I get the feeling that ATI is only now clamoring for OpenCL and DirectCompute so as to nullify the existing code base that companies and schools have established with CUDA. After all, if you have tested, working algorithms set up with CUDA, it would make sense to keep buying Nvidia Tesla cards, wouldn't it? It seems to me that ATI does not want to compete with CUDA....or they have tried, and no one really liked Brook or Stream. ATI has to be wondering how to get those Nvidia customers...

If Nvidia were to drop support for CUDA in favor of these other "open" standards that the competition is pushing, then a LOT of people would be pissed and would lose a lot of money spent on development. ATI does not have much of a GPGPU customer base because they have NOT invested much in the way of GPU computing. But Nvidia has spent a lot and already has a foothold in this area. So how then does ATI compete with Nvidia? They tried writing the Brook and Stream software....it did not work well in Linux, if at all. CUDA is not easy, but it is workable and well supported.....

The solution...

The solution is exactly what we are seeing: an all-out marketing campaign for "open standards" that costs ATI little in terms of support and zero in terms of development. A good business strategy, but pretty crappy for the people who have been pushing for this technology for the last couple of years.

Anyhow, if ATI really cared to compete on something other than hardware, then how about they show their collective teeth and work with the game devs to get the same thing with DirectCompute or OpenCL? I would not hold my breath, as such an approach would be helping Nvidia card owners too....I doubt ATI would do that since it is a for-profit company.
[/rant]
 
Question:
I've read several times that Nvidia added features instead of removing them. I might have missed it, but where can I read up on that?

If that's the case I don't see anything wrong with it. If they hadn't done that there wouldn't be anything to cry about in the first place.

Still, a proprietary standard isn't the way to go as far as I am concerned. Imagine if ATI would start doing the same. You can bet that as the competition gets tougher, both would castrate features for their competitor. In the long run we as customers would get fucked badly.
 
No you didn't, that isn't what you asked *at all*. Since you seem to write a post and then immediately forget about it, here, let me refresh your memory:

So you claim I didn't ask a particular question and then quote my post with the exact question highlighted? Lol, I can't keep up with this level of hilarity.

Nvidia already has a DirectCompute approach. So here is my question to you: Why would you support Nvidia choosing a proprietary solution (CUDA) over a standard one that will work with every DX10 card?

First of all, DX10 cards are limited to CS4.0, which is less mature and has fewer features than even CUDA 1.0. Secondly, have you seen the state of ATi's or Nvidia's CS4.0 support? Of course not, because you're just blabbing about stuff you know nothing about. And finally, as an Nvidia card owner, why in the world should I care which API they use, since my card supports all of them? Let's assume CS4.0 was actually stable and ready for prime time. What benefit is that to me over CUDA?
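
For what it's worth, here's a minimal sketch of why CS4.0 support is even a question: on DX10-class hardware, compute shader 4.x is an optional driver capability that an engine has to query before relying on it. The helper name below is made up and it assumes an already-created D3D11 device, but the feature query itself is the real D3D11 API.

[code]
// Minimal sketch, assuming an already-created ID3D11Device. The helper name
// is hypothetical; the feature query is the actual D3D11 API for asking
// whether 10.x-class hardware exposes compute shader 4.x at all.
#include <d3d11.h>

bool supportsCs4x(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
                                           &opts, sizeof(opts))))
        return false;
    return opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x != 0;
}
[/code]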

Imagine if ATI would start doing the same.

Yep, imagine if ATi had actually made use of the tessellation hardware they've had since 2006 instead of letting it sit idle doing nothing all these years. You're telling me ATi card owners would have revolted because they were getting more features that took advantage of their hardware? Awaiting the hypocritical responses.
 
So, in the end it comes down to either Nvidia does this for its own benefit, or it does nothing. Would you agree?

Well in that case, I choose the company doing the work for its benefit since that is adding something versus adding nothing.

Nice summary. That's exactly what we're looking at in this scenario.

Btw DontBeEvil, my comment wasn't directed at you. Your post just reminded me of ATi's dormant tessellation hardware.
 
I doubt ATI would do that since it is a for-profit company.

What does this have to do with anything? nVidia is pushing CUDA for the same reason. SLI, PhysX and CUDA are all designed to create vendor lock-in and the associated profits, nothing more. They aren't adding this stuff to Just Cause 2 out of the goodness of their hearts, but to keep from bleeding more customers to ATi. I don't have a problem with this on the whole; they want to sell more hardware. But the TWIMTBP program has gone from vendor outreach to attempted vendor lock-in, and that should not be tolerated.

I'm not sure if what is being done in Just Cause 2 is clearly a case of this or not; we'll have to see how the game compares on assorted GPUs once it is released. But if it's another Batman: AA debacle then there won't be much to debate. I'm honestly not biased toward one company or the other; I buy the best GPU available at the time in the price range I can afford, and I've probably owned 70% nVidia GPUs over the years. But this behavior, the constant re-branding schemes, a CEO who says more dumb stuff than Sony, and shenanigans like the system specs being flouted for Metro 2033 make me doubt almost any action by nV these days.
 
My, I thought this was something new that would be interesting to me; I didn't think this whole thread could turn into an Nvidia vs. ATI fight. :eek:

I'm glad there are games out there that push for more unique features in their PC versions, rather than just being a copy of the console version.

Someone brought up FEAR's soft shadows. Back then, I remember reading an article saying we wouldn't even be able to get a good frame rate with this feature on unless we had something like two GF6800s in SLI. The PC gaming scene was certainly more interesting back then, with new games introducing new features that pushed the latest hardware's capabilities. We don't see much of that these days, so I'm glad there are still some developers doing stuff like this. (Btw, I think the shadows in this game are different from FEAR's shadows, where they just make the shadow's edges appear blurry. In FEAR, the shadow intensity remains the same everywhere, while, if I understood this correctly, JC2's shadow intensity should change according to the surrounding lighting.)
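
To put the distinction being described into concrete terms, here is a purely illustrative sketch (not code from either game, and based only on my "if I understood this correctly" reading above): FEAR-style soft shadows only soften the edge of a fixed-strength shadow, while the behaviour described for JC2 would scale how dark the shadow gets by the surrounding ambient light.

[code]
// Illustrative only. 'shadowFactor' is 0 in full shadow and 1 in full light,
// as produced by whatever shadow-map filtering provides the "soft" edge.

// FEAR-style: shadow strength is constant; filtering only blurs the edge.
float shadeConstant(float shadowFactor, float lightIntensity)
{
    const float kShadowStrength = 0.6f;  // same darkening everywhere
    return lightIntensity * (1.0f - kShadowStrength * (1.0f - shadowFactor));
}

// Ambient-aware style: shadow darkness depends on the surrounding light,
// so shadows wash out in brightly lit areas and deepen in dim ones.
float shadeAmbientAware(float shadowFactor, float lightIntensity, float ambient)
{
    float shadowStrength = 1.0f - ambient;  // less ambient => darker shadow
    return lightIntensity * (1.0f - shadowStrength * (1.0f - shadowFactor));
}
[/code]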

Anyway, I'm also a fan of physics on the GPU. I hope ATI will someday jump aboard too. I'm all for using extra GPU power to add more special effects to PC games.
 
The AA code Nvidia gave the Batman people *WAS* standard. It was standard DirectX

If it was standard then it would have been included with DirectX or at the very least the Unreal Engine. If ATI recommended it why did they not simply email them this standard code that's just lying around?
 
I'm not sure if what is being done in Just Cause 2 is clearly a case of this or not; we'll have to see how the game compares on assorted GPUs once it is released. But if it's another Batman: AA debacle then there won't be much to debate. (Not sure what you mean by the previous sentence.) I'm honestly not biased toward one company or the other; I buy the best GPU available at the time in the price range I can afford, and I've probably owned 70% nVidia GPUs over the years. But this behavior, the constant re-branding schemes, a CEO who says more dumb stuff than Sony, and shenanigans like the system specs being flouted for Metro 2033 make me doubt almost any action by nV these days.

Yeah....a lot of stuff Nvidia does is crap and it pisses me off too. All I am saying is that this is not one of those things that pisses me off. Granted, I can understand why this would suck if you own an ATI card and do not get these features. But that is business....ATI has its crappy points too, although as of late ATI seems less crappy than Nvidia.

Of course if I were an editor here at [H] it might piss me off to have to do the extra work that B.J. pointed out (funny initials, and yes I am 2 years old.) But of course, that kind of attention to detail is why I like the [H].
 
The whole point of [H] reviews is to compare the gaming experience. Different cards give different experiences; you review them and point out the differences. That's why we read the reviews and why [H] is so popular - it doesn't just give a straight timedemo frame rate count.

That experience is defined by performance, stability and feature support. In the past, ATI and Nvidia had different AA and AF algorithms, and pages were spent comparing little details. They are now essentially identical, so if anything, from a reviewer's standpoint I would have thought other differentiators are a good thing, as they give you more to review.

As to Nvidia adding extra stuff - as far as I can see, the only problem is that it shows up ATI's lack of proactive work in this area. ATI compete well when it comes to hardware, but really aren't in the same league as Nvidia when it comes to the software. That's ATI's fault - until they fix it, sales will be lost, same as they are lost by the company with the slower hardware or the flakier drivers or whatever.
 
Ehm, what about DX11?

Whilst available to both camps, who do you think has been liaising with developers on adding it to games? Nvidia?

Nope, they ain't had any cards.
 
Ehm, what about DX11?

Whilst available to both camps, who do you think has been liaising with developers on adding it to games? Nvidia?

Nope, they ain't had any cards.

Actually, they worked with the developers of Metro 2033 to help them add DX11 effects to their game.
 
If it was standard then it would have been included with DirectX or at the very least the Unreal Engine. If ATI recommended it why did they not simply email them this standard code that's just lying around?

Get it through your head: it IS in DirectX. The UE3 engine isn't the gold standard of engines and what's possible - hell, it sucks. It has boatloads of problems. An ATI developer was even working with the devs. Nvidia decided to mark it as IP because they tossed in a vendor ID lock that actually forced ATI cards to do the first half of the AA process and then ignore the results, which hurt performance on ATI cards. Where were you during that whole saga?
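
For anyone who missed that saga, the mechanism being described is about this simple. The following is a hypothetical illustration only (the shipped Batman: AA check was never published), assuming the game reads the adapter's PCI vendor ID, e.g. via IDXGIAdapter::GetDesc():

[code]
// Hypothetical sketch -- not the actual Batman: AA code. The MSAA work itself
// is ordinary DirectX; the only vendor-specific part is this gate on the
// adapter's PCI vendor ID.
static const unsigned int kVendorNvidia = 0x10DE; // NVIDIA's PCI vendor ID
static const unsigned int kVendorAti    = 0x1002; // ATI/AMD's PCI vendor ID

bool allowInGameMsaa(unsigned int adapterVendorId)
{
    return adapterVendorId == kVendorNvidia; // everyone else: option disabled
}
[/code]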

That said, it isn't ATI's job to write code for developers.

On a different note, Nvidia made a good API with CUDA and a lot of people have adopted it. It would also be kind of crappy for Nvidia to suddenly drop support for something that a lot of people have taken the time to learn just so an open standard will be supported. I honestly do not believe ATI is pushing these open standards for the benefit of the industry, however it may seem.

Nvidia isn't dropping support for CUDA; they are going to support all three. And you seem to fail to understand that DirectCompute and OpenCL aren't ATI's. Hell, Nvidia helped to create BOTH of them.
 
Man, I hate it when games don't just use Direct3D. Yet another lost sale from me, because I'm not going to buy it if the eyecandy only works on Nvidia systems.

It looks 99% the same across ATi/Nvidia, so w/e, miss out.

Nvidia gets some blurrier blurs and framerate-halving water, woop.
 
That said, it isn't ATI's job to write code for developers.

Maybe it should be. Maybe all of us ATI owners should start demanding that ATI get off their ass and actually support developers instead of sitting idly by and letting Nvidia be the only one to do this. If Nvidia offers to help a developer write some specialized code to improve performance and effects on their cards and ATI twiddles its thumbs and does nothing, acting like they don't give a damn, guess what the developer is going to do?
 
So what you want is a visual cue that they do it. ATI do, I believe, offer devs support if they need it. They do not plaster something like TWIMTBP across the screen. I would want an example of specialised code that Nvidia has provided for a game recently. Also, this topic seems to have gotten sidetracked. Why all this talk of OpenCL and CUDA when surely it should be about why GPGPU code is needed at all?

Also, what was Metro 2033 tested on? You cannot implement something and not test it.

Edit: before the obvious trolls get in here, I will say I am aware that ATI used to have a program where they put a screen with their logo on it but dropped it. Do not bring that up in an attempt to derail the thread.
 
Also, what was Metro 2033 tested on? You cannot implement something and not test it.

A handful of developers (as of January) have Fermi cards. Surely the number has increased by now. Metro 2033 is one game NVIDIA is demoing to show off new features. We'll certainly be taking a look at this game.
 
So what you want is a visual cue that they do it. ATI do, I believe, offer devs support if they need it. They do not plaster something like TWIMTBP across the screen. I would want an example of specialised code that Nvidia has provided for a game recently. Also, this topic seems to have gotten sidetracked. Why all this talk of OpenCL and CUDA when surely it should be about why GPGPU code is needed at all?

Also, what was Metro 2033 tested on? You cannot implement something and not test it.

Edit: before the obvious trolls get in here, I will say I am aware that ATI used to have a program where they put a screen with their logo on it but dropped it. Do not bring that up in an attempt to derail the thread.

ATI has worked with a few studios, but it is rare that they do it. This is supported by the fact that it's rather surprising when you hear about it. And why not use GPGPU code? Can this stuff work on the CPU? Sure, it probably could, but I'm not seeing AMD or Intel jumping at the chance to help developers do that. And, on the GPGPU side, I'm not really seeing MS pushing DC or anyone pushing OpenCL. Though I suppose once the OpenCL version of Havok is out we'll see some push there from Intel.

Umm...Nvidia has had working cards for a while now. They showed them off, working, at CES in January. They obviously had functioning cards before then.
 
Nvidia isn't dropping support for CUDA; they are going to support all three. And you seem to fail to understand that DirectCompute and OpenCL aren't ATI's. Hell, Nvidia helped to create BOTH of them.

I understand the whole situation perfectly well, which I thought was perfectly clear from what I said. You seem to lack understanding of what I am saying. For example, I never said Nvidia was dropping CUDA; I said "If Nvidia were to drop support..." (edit: the other case you quoted me on was an obvious implied "if" too). Do you understand this, and how your response makes no sense in this context?

This is the reason I hate forums. Why is it so frickin hard to communicate effectively? Of course, if people lack understanding, it can be hard to communicate in any format.
 
Okay, so some devs have got some early hardware.

Derangel, you seem, again, to want some sort of visual PR/marketing show for most things. I do not think either you or I really know who does what in the gaming industry unless someone mentions it. Of course, why they would mention every little thing I don't know.

Also, again, it is not up to anyone but the people making the game to do the work. IMO companies should be neutral and keep their noses out. Microsoft does not need to do any promotion at all, as it is Microsoft.
 
Also, this topic seems to have gotten sidetracked. Why all this talk of OpenCL and CUDA when surely it should be about why GPGPU code is needed at all?

I do not get what you mean. Surely this could have been done without GPGPU code, I agree. How is this thread about that?
 
Okay, so some devs have got some early hardware.

Derangel, you seem, again, to want some sort of visual PR/marketing show for most things. I do not think either you or I really know who does what in the gaming industry unless someone mentions it. Of course, why they would mention every little thing I don't know.

Also, again, it is not up to anyone but the people making the game to do the work. IMO companies should be neutral and keep their noses out. Microsoft does not need to do any promotion at all, as it is Microsoft.

Actually, in almost every known case where ATI has worked with a developer there is an ATI logo on the game's website and this is how people have figured it out. They don't have a "Plays best on ATI" logo in the games and I'm fine with that. I'd rather not see more splash screen logos when loading a game.

Developers have proven that they're not likely to do this stuff by themselves, so if it's going to improve my gaming experience I want companies involved. If it's going to lead to pushing better effects and better physics, then damn it, I want all the hardware manufacturers to get involved.
 
IMO companies should be neutral and keep their noses out. Microsoft does not need to do any promotion at all, as it is Microsoft.

I disagree. The future of gaming, the GPU, and the whole computer is changing, and Nvidia is trying to take control of the situation and offer some direction. I like that, and I wish ATI were working on this too - a may-the-best-team-win kind of thing.

Of course ATI has AMD so no matter what direction things go they are fine.
 
The thing about DirectX is that it is not locked down to any single hardware vendor

You are either against proprietary standards or you are not. Saying that it's OK for Microsoft to do it, but not OK for NVIDIA to do it, is pure hypocrisy.

Incidentally, PhysX in one form or another runs on far more hardware than DirectX. In fact, the only hardware that won't run it is ATI's. It is worth noting that they were offered it at one point and declined.
 
You are either against proprietary standards or you are not. Saying that it's OK for Microsoft to do it, but not OK for NVIDIA to do it, is pure hypocrisy.

Incidentally, PhysX in one form or another runs on far more hardware than DirectX. In fact, the only hardware that won't run it is ATI's. It is worth noting that they were offered it at one point and declined.

Implying you can run the advanced PhysX settings on the CPU.
 
I disagree. The future of gaming, the GPU, and the whole computer is changing, and Nvidia is trying to take control of the situation and offer some direction.

But the only direction they are offering is vendor lock-in. If all they wanted to do was offer direction, then PhysX, the Batman: AA debacle and some of this other hinky stuff would not be happening. nVidia might just be making some nice gestures, but based on past behavior they are probably looking for a shot at vendor lock-in. nVidia needs to compete where it counts, with hardware, and (maybe) we'll see that pretty soon.
 
Small fry in the larger picture when Nvidia is going to be coming to the DX11 table a year late and several million dollars short. Yes, the GTX 470/480 cards are launching at the end of the month, but it won't be till this fall (if then) before they are available in quantity.

Meanwhile the devs will have been developing with ATI DX11 cards for a year and a half, and it won't be hard for them to see which way the wind is blowing: it's a bad move to piss off and damage their reputation with ATI DX11 card owners for some short-term monetary gain from the graphics card company that is in the process of an epic faceplant. In general, it's going to be a bad time to be joined at the marketing hip with Nvidia.

Nvidia has seemingly prepared a grand, far-reaching, multi-pronged Fermi, CUDA, PhysX and TWIMTBP marketing campaign intended to leverage their proprietary technologies into much wider use, but, man, did it ever have an ACHILLES HEEL. The foundation of the strategy was a successful, competitive, widely available suite of Fermi-based chips/cards, without which all the rest also falls on its face.

Now devs and publishers that are contractually locked into advertising Nvidia technologies and tie-ins that don't exist or are ineffectual can anticipate a rash of mocking articles, blogs and YouTube videos.
 
I do not get what you mean. Surely this could have been done without GPGPU code, I agree. How is this thread about that?

How could you so naively state that? You don't know the algorithm that is used, so you don't know how it would work on the CPU. Chances are, if it's good for the GPU (and it is, or NVIDIA wouldn't have bothered), then it's probably something that would've run at 1 fps on the CPU. That's typically the type of stuff NVIDIA is pursuing: not moving crap over to the GPU for the hell of it, even at the same perf.
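
To make the "good for the GPU" point concrete, here is a minimal, illustrative CUDA kernel (a toy stand-in, not Just Cause 2's actual water code) that evaluates a sum-of-sines height field. Every grid point is independent, so each one maps to its own GPU thread; on a CPU the same work becomes a serial loop over every point, which is exactly the kind of gap being described.

[code]
// Toy example only -- a real ocean sim would sum far more spectral
// components, typically via an FFT, but the parallel structure is the same.
__global__ void waveHeight(float* height, int n, float t)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= n || y >= n) return;

    // Two arbitrary directional waves evaluated per grid point.
    float h = 0.35f * sinf(0.12f * x + 1.3f * t)
            + 0.20f * sinf(0.07f * x + 0.09f * y + 0.8f * t);
    height[y * n + x] = h;
}

// Host-side launch (error handling omitted):
//   dim3 block(16, 16);
//   dim3 grid((n + 15) / 16, (n + 15) / 16);
//   waveHeight<<<grid, block>>>(d_height, n, time);
[/code]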
 
But the only direction they are offering is vendor lock-in. If all they wanted to do was offer direction, then PhysX, the Batman: AA debacle and some of this other hinky stuff would not be happening. nVidia might just be making some nice gestures, but based on past behavior they are probably looking for a shot at vendor lock-in. nVidia needs to compete where it counts, with hardware, and (maybe) we'll see that pretty soon.

AMD had a driver bug running MSAA on the surface format type that Batman: AA used for motion blur (which is all over the game).

If NVIDIA had not locked AA to themselves in Batman, you'd have seen corruption on AMD. AMD added the MSAA format support the month before the game shipped, but it was well beyond gold/Games For Windows submission.

Sure, it helped NVIDIA to have AA only, but it wasn't some bizarre plot. If they hadn't locked AMD out, AMD would've accused them of intentionally introducing corruption on their hardware. And keep in mind the game wasn't even on AMD's radar. If it hadn't been so successful, AMD wouldn't have given a crap, and may very well have never even spoken to the developers. Meanwhile NVIDIA saw the game's potential and was there supporting it with 10 man-years of work from early on.

Say what you want but there really wasn't anything else they could've done besides leave the corruption on AMD and give them AA for free (they spent money adding it in), or just remove it altogether.

Basically what I'm hearing in threads like this is that unless a company is willing to do work for free for the other company, they shouldn't do anything at all.. and we should just stick with console ports. But then I hear people complain about console ports being the only thing we get nowadays... make up your mind guys.
 
Small fry in the larger picture when Nvidia is going to be coming to the DX11 table a year late and several million dollars short. Yes, the GTX 470/480 cards are launching at the end of the month, but it won't be till this fall (if then) before they are available in quantity.

Meanwhile the devs will have been developing with ATI DX11 cards for a year and a half, and it won't be hard for them to see which way the wind is blowing: it's a bad move to piss off and damage their reputation with ATI DX11 card owners for some short-term monetary gain from the graphics card company that is in the process of an epic faceplant. In general, it's going to be a bad time to be joined at the marketing hip with Nvidia.

Nvidia has seemingly prepared a grand, far-reaching, multi-pronged Fermi, CUDA, PhysX and TWIMTBP marketing campaign intended to leverage their proprietary technologies into much wider use, but, man, did it ever have an ACHILLES HEEL. The foundation of the strategy was a successful, competitive, widely available suite of Fermi-based chips/cards, without which all the rest also falls on its face.

Now devs and publishers that are contractually locked into advertising Nvidia technologies and tie-ins that don't exist or are ineffectual can anticipate a rash of mocking articles, blogs and YouTube videos.

Disagree.. arguably TWIMTBP, PhysX, and CUDA have all been things that have *sustained* their sales despite not having Fermi ready.

And where are you getting these bogus timelines? AMD has had the hardware since summer... That's about 6 months, not a year and a half :rolleyes:
 
Disagree.. arguably TWIMTBP, PhysX, and CUDA have all been things that have *sustained* their sales despite not having Fermi ready.
Heck they even increased their sales.

And where are you getting these bogus timelines? AMD has had the hardware since summer... That's about 6 months, not a year and a half :rolleyes:

More like the fall and they did not have any stock available until very recently. Not that DX11 is any sort of big deal right now. Maybe in 2012.
 
More like the fall and they did not have any stock available until very recently. Not that DX11 is any sort of big deal right now. Maybe in 2012.

Well to be fair, it was definitely given to developers earlier than that, which is why I said summer.
 
AMD had a driver bug running MSAA on the surface format type that Batman: AA used for motion blur (which is all over the game).

If NVIDIA had not locked AA to themselves in Batman, you'd have seen corruption on AMD. AMD added the MSAA format support the month before the game shipped, but it was well beyond gold/Games For Windows submission.

Sure, it helped NVIDIA to have AA only, but it wasn't some bizarre plot. If they hadn't locked AMD out, AMD would've accused them of intentionally introducing corruption on their hardware. And keep in mind the game wasn't even on AMD's radar. If it hadn't been so successful, AMD wouldn't have given a crap, and may very well have never even spoken to the developers. Meanwhile NVIDIA saw the game's potential and was there supporting it with 10 man-years of work from early on.

Say what you want but there really wasn't anything else they could've done besides leave the corruption on AMD and give them AA for free (they spent money adding it in), or just remove it altogether.

Basically what I'm hearing in threads like this is that unless a company is willing to do work for free for the other company, they shouldn't do anything at all.. and we should just stick with console ports. But then I hear people complain about console ports being the only thing we get nowadays... make up your mind guys.

Source? Because ATI claimed that they were working with the Batman guys and I sure as hell believe them over you.

You are either against proprietary standards or you are not. Saying that it's OK for Microsoft to do it, but not OK for NVIDIA to do it, is pure hypocrisy.

No, it isn't, not at all. DirectX is a proprietary API, yes, but anyone can write a driver for it. You cannot write your own CUDA driver, at least not legally.
 
Basically what I'm hearing in threads like this is that unless a company is willing to do work for free for the other company, they shouldn't do anything at all.. and we should just stick with console ports. But then I hear people complain about console ports being the only thing we get nowadays... make up your mind guys.

Wow, you don't get to see such a perfect straw man very often.
 
Source? Because ATI claimed that they were working with the Batman guys and I sure as hell believe them over you.



No, it isn't, not at all. DirectX is a proprietary API, yes, but anyone can write a driver for it. You cannot write your own CUDA driver, at least not legally.

I am my own source. ATI was engaged with the devs, but they never even visited the studio; they started emailing as soon as they saw public interest in the demo. They weren't committed from the start at all.
 
No, it isn't, not at all. DirectX is a proprietary API, yes, but anyone can write a driver for it. You cannot write your own CUDA driver, at least not legally.


YES YOU CAN....ATI is free to support this if they want to. :rolleyes:
 
Wow, you don't get to see such a perfect straw man very often.

It's not a straw man. What exactly would you say I am leaving out here? This is exactly what has been said here. Are you suggesting that there is some third alternative where NVIDIA uses DirectCompute for the benefit of ATI? Do you really think that's good from a business perspective? They want to enable and reward their loyal customer base; I don't see anything wrong with that.

Not to mention they wouldn't have been able to make this work with the gimped version of compute shader for older hardware (CS4.0), meaning if they'd used DirectCompute, they would've limited the install base that could use this to AMD and NV DX11 cards. There are tens of millions more CUDA-capable GPUs out there than there are DX11 cards, so it makes sense. Plus it's cool that it's being enabled back to G80 via CUDA; it's not locking anyone into the latest generation to get the new stuff.
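
The "back to G80" part is also easy to check for at runtime. Here's a minimal sketch using the CUDA runtime API: any device the runtime enumerates is CUDA-capable, and G80 parts report compute capability 1.0.

[code]
// Minimal sketch: enumerate CUDA devices and report their compute capability.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable GPU found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("GPU %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
[/code]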
 
But the only direction they are offering is vendor lock-in. If all they wanted to do was offer direction, then PhysX, the Batman: AA debacle and some of this other hinky stuff would not be happening. nVidia might just be making some nice gestures, but based on past behavior they are probably looking for a shot at vendor lock-in. nVidia needs to compete where it counts, with hardware, and (maybe) we'll see that pretty soon.

I do not see them being capable of staying competitive if all they do is hardware, and I think they agree; hence the present business strategy. No one knows the future, so if you disagree, that is fine. We will see how it all plays out over the next 5 years or so.
 
How could you so naively state that? You don't know the algorithm that is used, so you don't know how it would work on the CPU. Chances are, if it's good for the GPU (and it is, or NVIDIA wouldn't have bothered), then it's probably something that would've run at 1 fps on the CPU. That's typically the type of stuff NVIDIA is pursuing: not moving crap over to the GPU for the hell of it, even at the same perf.

I think a lot of this could have been done within the DirectX API. You are right that I do not know this for sure, but I wanted to entertain the possibility (for the sake of argument, mainly).
 