Has Nvidia Seceded to ATI in the high end gaming segment?

Crysis does suck considering that even 2 generations later you still cannot run it at acceptable frames.


You sir are the reason why PC gaming is stuck with crap console ports...thanks for proving the point.
 
Crysis does suck considering that even 2 generations later you still cannot run it at acceptable frames.
I wonder how we have determined that Crysis is not properly "optimized," when by nearly all accounts, we have never seen a game like it. To what do we compare its optimization? I don't deny that Warhead seemed to show performance improvements over the original Crysis, but both games still seem to give "average" systems a lot of trouble.

Maybe these games were just ahead of their time?

Regarding frame rates, it has been my experience that only minimum frame rates really affect playability. 60+ fps makes no difference to me if I am holding a 30 fps minimum. As such, two 5870s in CF seem to meet the standard requirements (Enthusiast, 1920x1200, 8X AA, 30 fps minimum). Is this not acceptable?
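
For what it's worth, the average/minimum distinction is easy to see from raw frame times. Here is a minimal Python sketch, assuming you have logged per-frame times in milliseconds (the numbers below are made up for illustration, not a real benchmark log):

[code]
# Derive average and minimum FPS from per-frame times in milliseconds.
# The frame times here are invented for illustration only.
frame_times_ms = [16.7, 16.9, 18.2, 45.0, 17.1, 16.8, 33.3, 17.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)  # the slowest frame sets the minimum

print("average: %.1f fps, minimum: %.1f fps" % (avg_fps, min_fps))
[/code]

That run averages about 44 fps but dips to 22 fps, which is exactly the kind of gap an average-only benchmark hides.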
 
I wonder how we have determined that Crysis is not properly "optimized," when by nearly all accounts, we have never seen a game like it. To what do we compare its optimization? I don't deny that Warhead seemed to show performance improvements over the original Crysis, but both games still seem to give systems a lot of trouble.

Crysis has better graphics than Crysis Warhead...hence the "added" performance in the latter game.
Regarding frame rates, it has been my experience that only minimum frame rates really affect playability. 60+ fps makes no difference to me if I am holding a 30 fps minimum. As such, two 5870s in CF seem to meet the standard requirements (Enthusiast, 1920x1200, 8X AA, 30 fps minimum). Is this not acceptable?

No, because HIS rig can't run it at 60+ FPS... so the game must be badly coded :rolleyes:
 
Crysis has better graphics than Crysis Warhead...hence the "added" performance in the latter game.
Thanks. Yeah, that is pretty much in line with what I was driving at, I think. I believe the perceived lack of optimization was largely just graphically-intensive software capability far exceeding current (at the time) hardware capability. Enthusiasts should stand up and cheer, in my opinion, because why else should hardware companies continue to advance? Why else should we get excited about new tech, if there is nothing out there that will ever need it?

For what, benchmarks? Yuck.
 
WTF are you smoking? Limited user base? lol lol lol

Well, Vista is still a small percentage in comparison to the number of gamers still using XP. I would say 7 will probably outpace Vista in usage within a year or so, as not only XP users but many Vista users as well migrate to 7.
 
You sir are the reason why PC gaming is stuck with crap console ports...thanks for proving the point.

lol I play most of my games on a PC, but if the game cannot run properly on a mainstream rig, why would I waste my money on it?
 
Being ahead of its time makes it "suck"? Ok... :rolleyes:

That's precisely what makes PC gaming much more appealing. New games would always be ahead of the hardware, which would make hardware manufacturers much more committed to providing better and faster hardware to beat the games that "ridiculed" the last generation. Games like FEAR, Oblivion, Supreme Commander, Lost Planet... (among other very intensive games when they came out). Now, with mostly console ports, only Crysis remains to be "beaten", while the majority of other games run great on current hardware.

Gotta call BS on that one, man... very few people upgrade to top-of-the-line hardware just to play a game. At least when Doom 3 came out, the next-gen hardware could play it at acceptable frames.
 
Being ahead of its time makes it "suck"? Ok... :rolleyes:

That's precisely what makes PC gaming much more appealing. New games would always be ahead of the hardware, which would make hardware manufacturers much more committed to providing better and faster hardware to beat the games that "ridiculed" the last generation. Games like FEAR, Oblivion, Supreme Commander, Lost Planet... (among other very intensive games when they came out). Now, with mostly console ports, only Crysis remains to be "beaten", while the majority of other games run great on current hardware.
Heh, sorry for basically quoting this entire post in a later post of mine. Didn't see this one! :D
 
Thanks again for proving the point.

QFT...

PC gaming was never about what you could run with what your "weekly allowance" could afford. It was going broke while trying to have the most intense gaming experience available, and if your money couldn't provide you with that, then you would step up your game and start OCing your shit...

Seriously, when did such a [H]ard place become home to so many softies?
 
QFT...

PC gaming was never about what you could run with what your "weekly allowance" could afford. It was going broke while trying to have the most intense gaming experience available, and if your money couldn't provide you with that, then you would step up your game and start OCing your shit...

Seriously, when did such a [H]ard place become home to so many softies?


Agreed. I'm not sure when we changed from [H]ard|OCP to [F]rugal|OCP. Attitudes on this site shifted dramatically in the last few years. It used to be that if someone had purchased some super high-end hardware, many people would have said "Nice, I wish I could afford that." Now they say "why did you waste your money on that?" :rolleyes:
 
Gotta call BS on that one, man... very few people upgrade to top-of-the-line hardware just to play a game. At least when Doom 3 came out, the next-gen hardware could play it at acceptable frames.

Actually, it was not just Doom 3. If you go back in time you can see a similar pattern with Quake 3, Unreal, Quake 2, etc. Each game, when it first came out, would bring your system to its knees, but within one or two new generations of video cards you could max it out. I remember those famous flame wars over the 3DFX vs. NV battle for Quake 3 at 1600x1200 in 32-bit color. These AMD vs. NV flame wars don't match up to the good ol' days! :p :)

However, I wanted to point out that times are a-changing. It was acceptable back then to have to spend over 600 bucks to get your two Voodoo2s in SLI to run those games. Nowadays? I am not so sure. We have come to expect faster/better at a lower price point. Factor in the rise of the consoles and the economy, and it's a tough sell. It seems to be a chicken-vs-egg issue.

As far as Crysis being a poorly coded game or just ahead of its time, I guess we will not know for sure unless someone has the source code to share? :)


BTW, I am pretty sure there is no chance in heck that NV is leaving any part of the gaming segment like the OP said. I think they are going in a different direction and hope they can keep the CUDA users and gamers happy.
 
Agreed. I'm not sure when we changed from [H]ard|OCP to [F]rugal|OCP. Attitudes on this site shifted dramatically in the last few years. It used to be that if someone had purchased some super high-end hardware, many people would have said "Nice, I wish I could afford that." Now they say "why did you waste your money on that?" :rolleyes:

I think once the cost of exceptionally performing components fell so drastically and so quickly in comparison to previous generations, these bottom-of-the-barrel-scraping users began feeling entitled to that level of performance. Users bitched about how much 6GB of DDR3 cost at the launch of the X58 platform, conveniently forgetting that fast 2GB kits of DDR400 once cost the same and remained so for nearly the entire lifespan of the Socket 939 platform.
 
lol I play most of my games on a PC, but if the game cannot run properly on a mainstream rig, why would I waste my money on it?

Hmm... You may have a hardware problem. Crysis ran fine on my old 8800GTS 320 and I played it on a PC with an 8400GS. Make sure your drivers are up to date, nothing's underclocked and perhaps...

turn down the settings a bit

Crysis will RUN on damn near anything. Whether your ego permits you to enjoy it is another matter.
 
I think a few members have had their brains secede from their bodies, at least while posting on this thread.

Anyway, I'll cede this topic back to those who care deeply about which multibillion-dollar company provides a slight performance advantage in games.
 
Hmm... You may have a hardware problem. Crysis ran fine on my old 8800GTS 320 and I played it on a PC with an 8400GS. Make sure your drivers are up to date, nothing's underclocked and perhaps...

turn down the settings a bit

Crysis will RUN on damn near anything. Whether your ego permits you to enjoy it is another matter.

But will it run at a constant 60 fps at 1080p with 4xAA/16xAF and all Enthusiast settings? If not, then I don't think you'll get the full experience as intended.
 
I say that, but I'm having an issue with it. They have been selling at a near loss (or outright loss, depending on the source) for a while now. Why quit 4 months from your next gen?

If this is true, though, Charlie must have creamed his jeans over it.

Well, if they have been selling at a loss for a while, and the new gen from ATI means they would have to lower prices even more, it may make more sense just to EOL the chips and wait until Fermi is ready after New Year's, although it will leave a huge hole in the lineup between Fermi and the low end.
 
Well, if they have been selling at a loss for a while, and the new gen from ATI means they would have to lower prices even more, it may make more sense just to EOL the chips and wait until Fermi is ready after New Year's, although it will leave a huge hole in the lineup between Fermi and the low end.

It would basically equate to leaving the GPU market. I just don't see it. Also, getting rid of the mid-range would be a death knell for their GPGPU strategy. With all their effort put into this, why now? They knew this was coming months ago. They just had their first real success with PhysX and are just now getting some decent CUDA projects going (an antivirus scan that doesn't kill your boot time or loading time would be awesome).

Also, I don't understand why they don't just do a die shrink of the GTX 285, and how hard would it be to drop it down to a 256-bit bus with GDDR5? At 40nm it would be cheaper to produce, and so would the AIB cards. Maybe I am missing something, but it seems that would have been a card that would still have been competitive with ATI and fit their GPGPU strategy.
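
On paper the bandwidth for that swap roughly works out. A quick back-of-the-envelope sketch in Python (the GTX 285 figures are the published specs; the 256-bit GDDR5 data rate is an assumed figure, not any shipping card):

[code]
# Memory bandwidth = (bus width in bytes) x (effective data rate in GT/s).
def bandwidth_gbs(bus_bits, effective_gtps):
    return (bus_bits / 8.0) * effective_gtps

print(bandwidth_gbs(512, 2.484))  # GTX 285: 512-bit GDDR3 @ 2484 MT/s -> ~159 GB/s
print(bandwidth_gbs(256, 5.0))    # assumed 256-bit GDDR5 @ 5 GT/s -> 160 GB/s
[/code]

So a 256-bit GDDR5 part could match the GTX 285's ~159 GB/s, assuming the memory controller could actually reach those GDDR5 speeds.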
 
It would basically equate to leaving the GPU market. I just don't see it. Also, getting rid of the mid-range would be a death knell for their GPGPU strategy. With all their effort put into this, why now? They knew this was coming months ago. They just had their first real success with PhysX and are just now getting some decent CUDA projects going (an antivirus scan that doesn't kill your boot time or loading time would be awesome).

Also, I don't understand why they don't just do a die shrink of the GTX 285, and how hard would it be to drop it down to a 256-bit bus with GDDR5? At 40nm it would be cheaper to produce, and so would the AIB cards. Maybe I am missing something, but it seems that would have been a card that would still have been competitive with ATI and fit their GPGPU strategy.

There was a planned die shrink of the GT200 chip, I believe it was the G212, but it was canned. That was their plan, but not much came of it. They are making some 40nm chips on the low end, the G210/G220, but those mostly only show up in OEM machines.

If there had been a successful shrink of the GT200, it would have been released a few months ago. Of course, TSMC's 40nm process has also been sucking, so that hasn't helped things.
 
There was a planned die shrink of the GT200 chip, I believe it was the G212, but it was canned. That was their plan, but not much came of it. They are making some 40nm chips on the low end, the G210/G220, but those mostly only show up in OEM machines.

If there had been a successful shrink of the GT200, it would have been released a few months ago. Of course, TSMC's 40nm process has also been sucking, so that hasn't helped things.

I never did hear what happened exactly. What was the issue (if you have heard, please share)? Especially with the 40nm issues now corrected.
 
I never did hear what happened exactly. What was the issue (if you have heard, please share)? Especially with the 40nm issues now corrected.

Production problems were the issue. I don't know the specifics of why, but I guess it was an issue with a bad batch of fabrication? There's an interview posted at the top of the video card section atm that talks a bit about 5850/5870 supply issues; it states there were some production issues but once again doesn't really give specifics. I think most 'specifics' come from sources like Charlie, where you can only take them with a grain of salt.

Really, I think it's the kind of internal company thing they won't want to share, the same way Microsoft probably doesn't really want to share their RROD figures. "There were hardware issues that we have been working hard to correct" is probably what they'll always reiterate until, say, we're on the Xbox '5' or equivalent, when it's such old news it won't matter anymore.
 