nVidia spends $2 Million on Crysis 2

Are there any links to this story where KitGuru isn't the source? IMO, the validity of this website seems very suspect. It just started up this year, and it was the site that claimed Nvidia would be dropping the 470 to make dual-GPU 490s and that the 480 would be replaced with 485s. Neither of which ever happened. :rolleyes:
 
Are there any links to this story where KitGuru isn't the source? IMO, the validity of this website seems very suspect. It just started up this year, and it was the site that claimed Nvidia would be dropping the 470 to make dual-GPU 490s and that the 480 would be replaced with 485s. Neither of which ever happened. :rolleyes:

I was thinking the same thing. ShitGuru, oops, KitGuru is the most aggressive and story-fabricating of the news sites; being the newest, it tries to hijack as much traffic as it can get so it can sell more ads.
 
I've been extremely disappointed with some developers and their willingness to jump into bed with Nvidia. The PhysX thing is especially jarring: I'd love to appreciate PhysX in games on my quad-core CPU, but the optimisation of the code is completely and totally shit; it can't even make use of 50% of the available resources.

It has already been shown that the CPU implementation of PhysX uses obsolete code, so it's very slow no matter how fast your CPU is.
This kind of crap is typical of Nvidia.
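
For context on the "obsolete coding" complaint: the criticism at the time was that the CPU path of PhysX leaned on scalar x87-style math and made little use of SIMD or extra cores. Here's a minimal, purely illustrative sketch in C (not PhysX's actual code) of the same particle update written as plain scalar code versus SSE, which handles four floats per iteration:

[code]
/* Illustrative only -- not PhysX source. The same particle update two ways.
 * Build e.g.: gcc -O2 -msse -c particles.c */
#include <stddef.h>
#include <xmmintrin.h>

/* Plain scalar loop: one float per iteration; old toolchains/flags would
 * happily compile this to x87 instructions. */
void integrate_scalar(float *pos, const float *vel, float dt, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

/* SSE loop: four floats per iteration. Assumes n is a multiple of 4 and the
 * arrays are 16-byte aligned, purely to keep the sketch short. */
void integrate_sse(float *pos, const float *vel, float dt, size_t n)
{
    __m128 vdt = _mm_set1_ps(dt);
    for (size_t i = 0; i < n; i += 4) {
        __m128 p = _mm_load_ps(pos + i);
        __m128 v = _mm_load_ps(vel + i);
        _mm_store_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
}
[/code]

The point isn't that SSE is magic; it's that a CPU physics path written against decade-old assumptions leaves most of a modern quad-core sitting idle.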
 
The price of the game will remain the same to customers, so why would anybody care if the game is going to be improved at someone else's expense?

Two million dollars' worth of extra polish is a good thing.

If you're an ATI card owner, then it just means you're not going to see most of the extra shine, since presumably the extra shine will take advantage of Nvidia hardware. But the game will still end up looking as good as was originally promised.

I agree though that PhysX has been more problematic than anything. The only thing is, you kind of do miss it when it's not there. Like in that Mafia 2 benchmark when you turn PhysX off, and there are no longer little bits and pieces of shrapnel flying everywhere - it just seems kind of stale when you no longer see that.
 
The price of the game will remain the same to customers, so why would anybody care if the game is going to be improved at someone else's expense?

Two million dollars' worth of extra polish is a good thing.

If you're an ATI card owner, then it just means you're not going to see most of the extra shine, since presumably the extra shine will take advantage of Nvidia hardware. But the game will still end up looking as good as was originally promised.

I agree though that PhysX has been more problematic than anything. The only thing is, you kind of do miss it when it's not there. Like in that Mafia 2 benchmark when you turn PhysX off, and there are no longer little bits and pieces of shrapnel flying everywhere - it just seems kind of stale when you no longer see that.

You have a more optimistic outlook. I think many PC gamers feel that when a company gets "support" of this form it often means features removed for ATI rather than features added for nVidia. Similar to how early DX10 games, like the original Crysis, removed features from DX9 rather than adding anything much special in DX10.
 
You have a more optimistic outlook. I think many PC gamers feel that when a company gets "support" of this form it often means features removed for ATI rather than features added for nVidia. Similar to how early DX10 games, like the original Crysis, removed features from DX9 rather than adding anything much special in DX10.

They didn't really remove it.

It's just that some effects cannot be done under DX9. One major example is object motion blur; the DX9 path is simply incompatible with the DX10 version.

Warhead does add some motion blur for DX9, but it's horrible compared to DX10.
 
The price of the game will remain the same to customers, so why would anybody care if the game is going to be improved at someone else's expense?

Two million dollars' worth of extra polish is a good thing.

If you're an ATI card owner, then it just means you're not going to see most of the extra shine, since presumably the extra shine will take advantage of Nvidia hardware. But the game will still end up looking as good as was originally promised.

I agree though that PhysX has been more problematic than anything. The only thing is, you kind of do miss it when it's not there. Like in that Mafia 2 benchmark when you turn PhysX off, and there are no longer little bits and pieces of shrapnel flying everywhere - it just seems kind of stale when you no longer see that.

Two million will not even pay for the marketing of the game. It's nowhere near enough for Crytek to scrap a custom-built physics engine that, when run on the CPU, is far superior to PhysX, and one that works for every customer instead of a portion of them. The mention of PhysX seems to be a ploy by the reporting website to exploit the extreme differences in people's opinions and generate page hits. I believe the only things Nvidia received for the money are their logo on all marketing materials and in the game's intro sequence, plus an early build so they could get drivers ready before release.
 
Haha, what a waste. The only way I'll be playing this game is if it's bundled with my next ATI card.

All Crysis ever will be is a false advertising tool. Remember the days before Crysis hit the shelves? All you heard from Crytek was "Oh, you definitely need a quad core and Windows Vista to see this game in all its glory." Never mind the fact that the game ran better on higher-clocked dual cores, and that it was a DX9 game engine with the highest settings reserved for those that bought Vista (or those that changed a few numbers in the config file).
 
Two million will not even pay for the marketing of the game. It's nowhere near enough for Crytek to scrap a custom-built physics engine that, when run on the CPU, is far superior to PhysX, and one that works for every customer instead of a portion of them. The mention of PhysX seems to be a ploy by the reporting website to exploit the extreme differences in people's opinions and generate page hits. I believe the only things Nvidia received for the money are their logo on all marketing materials and in the game's intro sequence, plus an early build so they could get drivers ready before release.

The problem with this post is that you're presuming too much without any evidence to back up your claims - a reputable print publication, if you were a reporter, would never print your 'story', since you're presenting opinions rather than facts.
 
The problem with this post is that you're presuming too much without any evidence to back up your claims - a reputable print publication, if you were a reporter, would never print your 'story', since you're presenting opinions rather than facts.

I never made the claim that I knew exactly what was happening. I just pointed out that the article was also making wild claims. There is no evidence that Crytek is reworking CryEngine 3 to take advantage of Nvidia hardware/software.
 
You have a more optimistic outlook. I think many PC gamers feel that when a company gets "support" of this form it often means features removed for ATI rather than features added for nVidia. Similar to how early DX10 games, like the original Crysis, removed features from DX9 rather than adding anything much special in DX10.

This is typically what happens in situations like this. Let's take a look at what happened with Batman.
 
If you're an ATI card owner, then it just means you're not going to see most of the extra shine, since presumably the extra shine will take advantage of Nvidia hardware. But the game will still end up looking as good as was originally promised.

This is, IMO, the worst implication of a deal like this: that there will be a bunch of features which could run on AMD hardware but which are locked out on non-Nvidia hardware.

Although "as good as was originally promised" doesn't mean much, given Crytek's "if it can't be done on a console, it doesn't go in" attitude.
 
If Crysis 2 runs better on a 460 than on my 5850, I will not buy it until it hits the bargain bin, and nVidia will drop yet another peg in my book.

Paying a developer to hobble their game on a competitor's hardware is the worst kind of anti-competitive business practice. It's just as deplorable for the developer to accept the money/terms.

nVidia and Crytek can both suck it if this article speaks the truth, although you'd think given the franchise history of running below expectations, Crytek would want to maximize performance across the board. Cevat is a delusional jackass though so I won't put it past him.
 
I'd rather see Nvidia prop this up than see the game turn into complete trash because of the consoles...
 
The problem with this post is that you're presuming too much without any evidence to back up your claims - a reputable print publication, if you were a reporter, would never print your 'story', since you're presenting opinions rather than facts.

Same goes for the source of this story. Most online journalism these days treats reputable sources and evidence as almost irrelevant.
 
If it's as shitty as the first game, then who really cares? It's kind of like the whole Metro 2033 thing: "OMG it's Metro, I get 32.8 fps!" Yeah, and it's also a pretty shallow game when you actually play it.
 
Tessellation isn't so much of a problem. You need to push it quite a lot to slow cards down to the point where you can no longer appreciate the additional detail being added; for example, tessellation on a typical NPC can only go so high before doing more is pointless, so if they stress it to try and fake some kind of advantage, it's not actually going to benefit gamers.

I would like to see tessellation used more on the environment in a game. I can appreciate NPCs looking better, as we've gotten with Metro 2033 and the aliens in Aliens vs. Predator, but it looks so weird having greatly detailed characters and lackluster environments. I want to see tessellation used on rocks, trees, buildings, objects in the game, etc. I want game environments to be pushed with tessellation, not just characters. Just a little rant I have.
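
For a rough sense of why there's a ceiling on useful tessellation: geometry cost grows much faster than visible detail. A toy C example (uniform midpoint subdivision, not the actual D3D11 tessellator, which partitions differently) showing how triangle counts explode with each extra level of detail:

[code]
/* Toy illustration: uniform midpoint subdivision turns every triangle into
 * four, so triangle count grows as 4^level while the visual payoff flattens. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t tris = 1; /* start from a single patch */
    for (int level = 0; level <= 6; ++level) {
        printf("subdivision level %d -> %llu triangles\n",
               level, (unsigned long long)tris);
        tris *= 4; /* each triangle splits into four at the next level */
    }
    return 0;
}
[/code]

By level 6 a single patch is already 4,096 triangles; well before that point, the extra vertices stop being visible on an NPC-sized model.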
 
I like the idea. Games with PhysX and tessellation look good, so why all the hate? Skip it if you want, but don't hate the idea of making games look better. For the longest time (like the last half decade) graphics cards haven't brought any new features to the table, just increases in speed. The feature and graphics set has finally been upped, and those who don't own the cards are the ones whining across the entire first page of this thread. I'm looking forward to it!
 
You guys are ridiculous.

"Damnit NVIDIA quit trying to make money!!!!111!!"

Those of you who claim TWIMTBP exists so NVIDIA can make sure the game is "hobbled" on other hardware need to take off the tinfoil hat.

Also, maybe AMD/ATI should spend a little more time interacting with developers..........
 
Jesus, some of you guys take shit like this way too far. Take off your tinfoil hats. I hate Nvidia as much as the next guy, but it's probably just some performance enhancements and getting their logo on the box saying they're the best.

It's probably things like Nvidia purposely disabling AA for competitors' cards under the guise of "we programmed it," when in reality it was standard DX code, lol...

If Nvidia wants to survive, they need to pull their head out of their ass, actually start making decent hardware, and, most importantly, learn from their mistakes...
 
For you guys saying Crytek is cutting out half of their potential customers...

:)

not quite half - more like 1/3

http://store.steampowered.com/hwsurvey

Nvidia making a game graphically better for potentially two-thirds of the customers by optimizing drivers and adding features specific to their cards, while not intentionally sabotaging the remaining one-third of the customer base, and still attempting to retain their brand's lead sounds like a good plan for the majority of... everyone!
 
Those numbers are off for a few reasons. First, not every Nvidia card has either the feature set (DX11/PhysX) or the power to make use of it (sub-$150 cards). Add to this that Steam doesn't completely represent the market; there are more than a few people who have Steam just to play Counter-Strike and other less demanding games.
 
Games with PhysX look good, so why all the hate?
That's like Internet Explorer in the late 90s: "sites which use <blink></blink> look good, so why all the hate?" You don't just start throwing proprietary stuff that only works on your own hardware into an application and not expect any backlash.
 
For you guys saying Crytek is cutting out half of their potential customers...

:)

not quite half - more like 1/3

http://store.steampowered.com/hwsurvey

Nvidia making a game graphically better for potentially two-thirds of the customers by optimizing drivers and adding features specific to their cards, while not intentionally sabotaging the remaining one-third of the customer base, and still attempting to retain their brand's lead sounds like a good plan for the majority of... everyone!

You are aware that a large majority of the Nvidia GPUs on that survey are 8800/9800 variants, right?
 
Yes, and probably some before that. 9800 GTs have PhysX, so they shouldn't be included in your comment. Note, however, that Nvidia uses a pretty standardized driver for their cards that, in my experience, often benefits even the older cards when a new driver comes out. So optimizing a game for the newer cards may help the older cards out as well. At worst, an older Nvidia card would be in the same boat as ATI: not able to take advantage of the enhancements that the newer Nvidia cards get to enjoy.

This has been going on since the first generations of graphics cards. You long-time techies remember S3TC on the S2000 video cards? It made Unreal Tournament look fantastic! It wasn't available to 3dfx owners at the time, because Diamond was innovative, just like Nvidia in this thread's controversy. I still enjoyed employing S3TC at the time, and the 3dfx guys enjoyed faster framerates, just like I'll enjoy employing PhysX and tessellation on my GTX 460 while you ATI 5800-series guys enjoy faster framerates. If ATI comes out with a graphics-improving standard, I'll jump into their camp if it warrants my following. Why should progress be hampered by such pathetic, envious enthusiasts? There is more to innovation than faster framerates!

This thread is akin to saying: Intel has SSE4 instructions; I have an AMD processor, so I think video encoding programs shouldn't make use of SSE4 because my AMD can't use it. The ATI guys need to quit whining. You won't experience anything worse, while Nvidia owners will have a better experience. There is no "lose" here.
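
To be fair to that analogy, this is exactly how software already handles vendor- or generation-specific CPU features: detect at runtime and pick a path, so nobody loses anything, some just gain. A minimal GCC/Clang-style sketch (the encode_* functions are hypothetical stand-ins, not a real encoder API):

[code]
/* Minimal runtime dispatch sketch (GCC/Clang builtins). The encode_*()
 * functions are hypothetical placeholders for a fast vs. generic code path. */
#include <stdio.h>

static void encode_sse41(void)   { puts("using the SSE4.1 path"); }
static void encode_generic(void) { puts("using the generic path"); }

int main(void)
{
    __builtin_cpu_init();                 /* populate CPU feature info */
    if (__builtin_cpu_supports("sse4.1"))
        encode_sse41();                   /* CPUs with SSE4.1 get the fast path */
    else
        encode_generic();                 /* everything else still works */
    return 0;
}
[/code]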

For those that don't recall S3TC, take a look at these screenshots! I took them back in the day and have kept copies ever since. They looked amazing at the time.

[Screenshots: S3TC cement, S3TC block and wood, S3TC floor, S3TC walls, S3TC box, S3TC diamond-link steel]

Now compare those images to one I took without S3TC enabled, just plain old OpenGL: [Screenshot: OpenGL box]

Or how about a couple of screenshots I took in Direct3D without S3TC? [Screenshots: Direct3D box, Direct3D diamond steel floor]

Look how much sharper the textures are on the S3TC-enabled S2000 card; compare the diamond-plate steel and the box against the alternative standards. Outside of this game the S2000 was a pretty miserable card, plagued with bad drivers and false promises, but this technology was breathtaking at the time.

I think this is a perfect example for this thread. S3TC eventually was implemented into DirectX 6, and S3/Diamond, the makers of the S2000, went kaput. Looking back, who could fault Diamond for innovating? The same sort of discussion could have taken place back then: why doesn't Diamond/S3 share this technology so all can benefit? This pattern continues; remember when Matrox had bump mapping and the other big cards of the day did not? Water effects looked spectacular on Matrox. Nvidia now wants to add extra optional enhancing features, so be it. I for one am glad! In the end, everyone benefits from such innovation.
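
For anyone curious what S3TC actually did under the hood: it's block compression. Each 4x4 group of texels is stored as two RGB565 endpoint colors plus sixteen 2-bit palette indices, i.e. 8 bytes instead of 48-64 bytes uncompressed, which is why much higher-resolution textures fit in the video memory of the day. A rough C sketch of decoding one DXT1/S3TC block, based on the publicly documented format (four-color mode only; the 1-bit-alpha mode is skipped for brevity):

[code]
/* Rough sketch of DXT1/S3TC block decoding: one 8-byte block encodes a 4x4
 * group of texels. Four-color mode only (c0 > c1); the c0 <= c1 mode with
 * 1-bit alpha is omitted for brevity. */
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;

/* Expand a 16-bit RGB565 color to 8-bit-per-channel RGB. */
static rgb_t rgb565(uint16_t c)
{
    rgb_t o = { (uint8_t)(((c >> 11) & 31) * 255 / 31),
                (uint8_t)(((c >>  5) & 63) * 255 / 63),
                (uint8_t)(( c        & 31) * 255 / 31) };
    return o;
}

/* Decode one 8-byte block into out[16], row-major 4x4. */
void decode_dxt1_block(const uint8_t blk[8], rgb_t out[16])
{
    uint16_t c0 = (uint16_t)(blk[0] | (blk[1] << 8));    /* little-endian endpoints */
    uint16_t c1 = (uint16_t)(blk[2] | (blk[3] << 8));
    rgb_t pal[4];
    pal[0] = rgb565(c0);
    pal[1] = rgb565(c1);
    pal[2].r = (uint8_t)((2 * pal[0].r + pal[1].r) / 3); /* 2/3 c0 + 1/3 c1 */
    pal[2].g = (uint8_t)((2 * pal[0].g + pal[1].g) / 3);
    pal[2].b = (uint8_t)((2 * pal[0].b + pal[1].b) / 3);
    pal[3].r = (uint8_t)((pal[0].r + 2 * pal[1].r) / 3); /* 1/3 c0 + 2/3 c1 */
    pal[3].g = (uint8_t)((pal[0].g + 2 * pal[1].g) / 3);
    pal[3].b = (uint8_t)((pal[0].b + 2 * pal[1].b) / 3);

    for (int i = 0; i < 16; ++i) {
        int idx = (blk[4 + i / 4] >> ((i % 4) * 2)) & 3; /* 2 bits per texel */
        out[i] = pal[idx];
    }
}
[/code]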

I googled S3TC to see if I could find any info so many years past its prime, and found the following little article:
http://web.archive.org/web/20030618083605/www.hardwarecentral.com/hardwarecentral/reports/140/1/
 
Nvidia has spent millions flying their developers around to help dev teams optimize their code and reduce bugs. When this happens, the devs are not out there trying to break shit for ATI users. Now that ATI (arguably) has the better product in some segments, suddenly Nvidia is demonized for spending money so that their customers have a better experience.

People should be pissed ATI never bothered with such a thing.
 
Developers should be developing games, not ATI and Nvidia. Nvidia gets in bed with developers to give themselves a competitive edge, and that only makes life better for SOME Nvidia users, whereas the rest of us get the shaft. PhysX is a perfect example of that.

Being an Nvidia user is no excuse; you might not always be using their hardware, so really it's in all gamers' best interests to voice their opinion on this matter.
 
Developers should be developing games, not ATI and Nvidia. Nvidia gets in bed with developers to give themselves a competitive edge, and that only makes life better for SOME Nvidia users, whereas the rest of us get the shaft. PhysX is a perfect example of that.

Being an Nvidia user is no excuse; you might not always be using their hardware, so really it's in all gamers' best interests to voice their opinion on this matter.

This is not the Special Olympics, where everyone wins a medal. People that spend the money on high-end gear should expect a better experience, even if a GPU manufacturer needs to help that along. PhysX can make a good game even better (Batman: AA), and the people with Nvidia cards and the game appreciated it.
 
This is not the Special Olympics, where everyone wins a medal. People that spend the money on high-end gear should expect a better experience, even if a GPU manufacturer needs to help that along. PhysX can make a good game even better (Batman: AA), and the people with Nvidia cards and the game appreciated it.

It is doubtful this is about PhysX. CryEngine already has a physics engine built in that is superior to CPU PhysX.

Though I would not be surprised if AA does not work on AMD cards at launch.
 
It is doubtful this is about PhysX. CryEngine already has a physics engine built in that is superior to CPU PhysX.

Though I would not be surprised if AA does not work on AMD cards at launch.

That would make no sense. CryEngine 3 is built off of CryEngine 2, and AMD already has mature drivers for it. Add to that the fact that Crytek has always built AA into their engine, so I can't see this being an issue; AMD won't have to code a workaround like they did for Unreal Engine 3 games and StarCraft II.
 
That would make no sense. CryEngine 3 is built off of CryEngine 2, and AMD already has mature drivers for it. Add to that the fact that Crytek has always built AA into their engine, so I can't see this being an issue; AMD won't have to code a workaround like they did for Unreal Engine 3 games and StarCraft II.


Batman: AA kept coming up.
It was a joke referring back to it. I should have used a ;).
 
STOP SUPPORTING PHYSX
Crytek has always used their own physics engine, CryPhysics. It's the same with CryENGINE 3: no PhysX, Havok, etc.

That KitGuru site does not know what it's talking about...
 
Yes, and probably some before that. 9800 GTs have PhysX, so they shouldn't be included in your comment...

This has been going on since the first generations of graphics cards. You long-time techies remember S3TC on the S2000 video cards? It made Unreal Tournament look fantastic! It wasn't available to 3dfx owners at the time, because Diamond was innovative, just like Nvidia in this thread's controversy.

...

Hate to burst your bubble, but S3TC was available to anyone that wanted to use it... not just S3...
 
People have successfully made wrappers for Nvidia's own tech demos, so you can bet your ass they'll get it done if any significant features are missing on ATI hardware in Crysis 2.
 
blah blah blah

I almost shot myself after reading that; I had the gun in my mouth and everything. First off, tessellation is part of DX11, not Nvidia's own tech, so ATI users will be employing it too, and I recall ATI having tessellation options on their cards many, many years ago. Secondly, you can't compare S3TC to PhysX, because there are alternatives to PhysX and there have been for years.
 
This is a very silly argument. There is no such thing as hardware feature entitlement. If you want PhysX, buy Nvidia; or don't, and stop crying. It belongs to Nvidia. They bought it and have put time, effort, and investment into it. Why should they give it away? This era of Obama-style "I am entitled to it because I want it" needs to stop, ffs. It's that same mentality and it makes me laugh.

Not to mention Crytek doesn't even use PhysX and never has.

One more thing: anyone here remember DiRT 2 and how it took Codemasters two months to code SLI support back into the retail game, when in the demo it worked fine? That was an AMD-sponsored game, and SLI was intentionally taken out. So please stfu.
 
This era of Obama-style "I am entitled to it because I want it" needs to stop, ffs. It's that same mentality and it makes me laugh.

You have issues. I see most people saying that no dev should support PhysX, and should instead use CPU-based or open alternatives such as Bullet.
 
Hate to burst your bubble, but S3TC was available to anyone that wanted to use it... not just S3...

Not at that time. I didn't know of a single "hack" to use S3TC textures on other cards back then. I looked extensively and couldn't find one, which is why I bought the S2000: I liked UT and was willing to pay the premium to have my games look quite a bit better. Where's your reference?

It's the same thing we have today. If you want the best graphics, you've gotta pay to play. If you don't want the best, nobody is forcing you to upgrade your card or buy an Nvidia card.


I almost shot myself after reading that; I had the gun in my mouth and everything. First off, tessellation is part of DX11, not Nvidia's own tech, so ATI users will be employing it too, and I recall ATI having tessellation options on their cards many, many years ago. Secondly, you can't compare S3TC to PhysX, because there are alternatives to PhysX and there have been for years.

a real man would have carried through...
 