Doesn't Crysis make you wonder...

zamardii

2[H]4U | Joined: Nov 22, 2004 | Messages: 3,106
...whether even the developers can play their own game at the highest graphical settings and get no lag? They made it, so wouldn't they be able to play and/or showcase their game to people with all the settings cranked up? Maybe they have Nvidia hardware that we haven't even heard of yet. Conspiracy!!! Just kidding...


Thoughts?

P.S. I know there are enough Crysis threads, but this one is at least different...:cool:
 
They could always render gameplay demos at 30 fps or whatever after they've finished playing it on low settings.
 
I've listened to a few interviews with the devs. Those Crytek guys purposely make their game ridiculously high-spec because they feel it will keep the game relevant years after it has been released. I guess they have a point - how long was Far Cry a benchmark?
 
When Doom 3 was being showcased, back when the top dog was still the GeForce 4 Ti 4600 (or w/e), they showcased D3 on a 9700 Pro, which was only known as the 9700 at the time, and no benchies were out.

So it wouldn't surprise me if they run this game on unannounced tech.

Or a pair of 8800 Ultras and beta drivers that enable SLI should do the trick.
 
It's possible they were running it on the next-gen Nvidia stuff. The 8800 series is over a year old now; they must have something on the back burner waiting for the market to get competitive again.
 
No, they've been running it on single 8800 cards. I think it's just that there are a few driver issues that need to be ironed out with Vista.
 
Traditionally, when developing and/or testing, you use systems that best reproduce the customer experience.

While I imagine they have tested this product with quad-core chips and dual Ultras in SLI, I would fathom that most of the product was run out on a pretty average 8800 system.

They have no access to special equipment. Think about it: what good would it do for them to create and test their product on equipment the consumer can't even buy?
 
This is how the Hardware companies get us to shell out top dollar for a slight improvement in performance:

8800s in SLI won't play this game with all the visual details turned to max, but it's OK because we have this new card called XXXX Ultra that will play the game with all the eye candy you desire. The new card costs $800, but if you mail us your 6-month-old 8800s, we will give you a $50 rebate on the new card.



I think the coding is done the way it is for a reason, and that reason is to push high-end consumers to spend their money on the latest and greatest computer hardware.
 
Traditionally, when developing and/or testing, you use systems that best reproduce the customer experience.

While I imagine they have tested this product with quad-core chips and dual Ultras in SLI, I would fathom that most of the product was run out on a pretty average 8800 system.

They have no access to special equipment. Think about it: what good would it do for them to create and test their product on equipment the consumer can't even buy?

The customer can buy it though, just a few months later. Although a few specs would change, the customer would still be buying the base architecture that the pre-release engineering samples run on. So they would be testing to see how well their game performs on next-generation cards.

I'm not saying it's the case here, but it's not dumb to think that they have access to what Nvidia will be running next year.
 
The customer can buy it though, just a few months later. Although a few specs would change, the customer would still be buying the base architecture that the pre-release engineering samples run on. So they would be testing to see how well their game performs on next-generation cards.

I'm not saying it's the case here, but it's not dumb to think that they have access to what Nvidia will be running next year.

I never said it was dumb to think that, I'm just saying they don't.

Trust me.

I work inside the business; I know how this stuff works.

They have nothing to gain from testing their product on architecture that isn't shipping. Things change, and there is no promise that a product Nvidia gives them today will be anything at all like what Nvidia ships down the line.

All they have to do is test on what's out there now and make sure the code works with existing DirectX code, that's it.

As I said, they have access to top-of-the-line machines (quad core, SLI Ultras, etc.), but the bulk is done with the consumer in mind, which means basic DX10 video cards and dual-core CPUs. That is still, by current standards, quite advanced as it is.

Let me repeat: I do not claim it's a dumb thought to have, it sounds logical, but, knowing what I know, they do not work this way.
 
Video games are not made to run at resolutions over 1280x1024. I am sure Crysis runs fine at the highest possible settings on an 8800 GTX at 1280x1024. With SLI 8800 GTXs I am sure you can enable AA too. Hell, I remember being ecstatic to run Quake at 1024x768.
 
Traditionally, when developing and or testing, you use systems that best reproduce the customer experience.

While I imagine they have tested this product with quad core chips and dual Ultra's in SLI, I would fathom that most of the product was run out on a pretty average 8800 system.

They have no access to special equipment, think about it, what good would it do for them to create and test their product on equipment the consumer can't even buy?

What are you talking about...??? How would they know their game will run on next-gen hardware without testing it on it first? I think it's definitely plausible that they have run it on hardware nobody knows about yet, like the X3800s and 8900s or whatever they will be called... You can bet your ass that NV and AMD want to get their newest hardware into the hands of cutting-edge software devs so they can make sure well in advance that the game will run well on it... anything less is suicide.
 
Video games are not made to run at resolutions over 1280x1024. I am sure Crysis runs fine at the highest possible settings on an 8800 GTX at 1280x1024. With SLI 8800 GTXs I am sure you can enable AA too. Hell, I remember being ecstatic to run Quake at 1024x768.

It doesn't; I tried, and I have an 8800 GTX, slightly overclocked too, to 612/1022.
 
What are you talking about...??? How would they know their game will run on next-gen hardware without testing it on it first? I think it's definitely plausible that they have run it on hardware nobody knows about yet, like the X3800s and 8900s or whatever they will be called... You can bet your ass that NV and AMD want to get their newest hardware into the hands of cutting-edge software devs so they can make sure well in advance that the game will run well on it... anything less is suicide.

Nope.

Trust me.

This is my job.

You are wrong.

Sorry.
 
It doesn't; I tried, and I have an 8800 GTX, slightly overclocked too, to 612/1022.
That's odd. I'm using an 8800 GTS 640 MB and it runs fine for me on those settings. What drivers are you using? I'm using the 169.01 beta.

It feels like it's running at around 40 fps at 1280x1024 with everything on High. I haven't FRAP'd it yet, but it feels pretty smooth. The motion blur might be adding to the illusion of smoothness, but at those settings it's absolutely playable.
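For anyone who wants to put a number on "feels like 40 fps" rather than eyeball it, the math is just the inverse of the average frame time. Here is a minimal Python sketch, assuming a hypothetical log file with one frame duration in milliseconds per line (adapt it to whatever your capture tool actually writes out):

# Minimal sketch: turn a list of frame durations into average / worst-case fps.
# "frametimes.txt" is a hypothetical file with one frame time in ms per line.

def fps_stats(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))  # mean frame time -> fps
    min_fps = 1000.0 / max(frame_ms)                     # slowest single frame
    return avg_fps, min_fps

avg, worst = fps_stats("frametimes.txt")
print(f"average: {avg:.1f} fps, worst frame: {worst:.1f} fps")

The worst-frame number matters because an "average 40 fps" run can still stutter if a few frames take far longer than the rest.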
 
I remember when they had a tech video for the new UT3 engine maybe 3 years ago? It ran like crap on the machine they were viewing it on.
 
Video games are not made to run at resolutions over 1280x1024.

I guess you haven't heard of the Xbox 360 or the PS3 and their silly 1080p (1920x1080) games. Games are meant to run at whatever resolution gamers want to run them at, and if your hardware can't handle it, it's time for an upgrade. Some games handle high resolutions better than others, and it seems Crysis doesn't have any major performance issues with very high resolutions. I was able to run the demo quite nicely on all High settings at 1920x1200.
 
I think they have the hardware to push it; the only thing is they're not selling it to the public yet because of the price. They obviously give it to the military and other places first, then worry about consumer products.
 
I guess you haven't heard of the Xbox 360 or the PS3 and their silly 1080p (1920x1080) games. Games are meant to run at whatever resolution gamers want to run them at, and if your hardware can't handle it, it's time for an upgrade. Some games handle high resolutions better than others, and it seems Crysis doesn't have any major performance issues with very high resolutions. I was able to run the demo quite nicely on all High settings at 1920x1200.

Every single Xbox 360 game runs at 720p, 1280x720, which is a lower resolution than 1280x1024. Most of them don't even use AA and AF.

90% of PS3 games are also 720p.
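For reference, the comparison above is just pixel arithmetic; a throwaway Python snippet (not taken from anyone's benchmarks) makes the numbers concrete:

# Pixel counts of the resolutions being compared in this thread.
resolutions = {
    "720p (1280x720)":   1280 * 720,    #   921,600 pixels
    "1280x1024":         1280 * 1024,   # 1,310,720 pixels
    "1080p (1920x1080)": 1920 * 1080,   # 2,073,600 pixels
    "1920x1200":         1920 * 1200,   # 2,304,000 pixels
}
for name, pixels in sorted(resolutions.items(), key=lambda kv: kv[1]):
    print(f"{name:20s} {pixels:>9,} pixels")

So 720p pushes roughly 30% fewer pixels than 1280x1024, which is why the "lower resolution than 1280x1024" claim holds even though 1280x720 is wider.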
 
I remember in the manual for Mechwarrior 2 they recommended a Cray supercomputer to run the game with the highest settings :eek:
 
I never said it was dumb to think that, I'm just saying they don't.

Trust me.

I work inside the business; I know how this stuff works.

They have nothing to gain from testing their product on architecture that isn't shipping. Things change, and there is no promise that a product Nvidia gives them today will be anything at all like what Nvidia ships down the line.

All they have to do is test on what's out there now and make sure the code works with existing DirectX code, that's it.

As I said, they have access to top-of-the-line machines (quad core, SLI Ultras, etc.), but the bulk is done with the consumer in mind, which means basic DX10 video cards and dual-core CPUs. That is still, by current standards, quite advanced as it is.

Let me repeat: I do not claim it's a dumb thought to have, it sounds logical, but, knowing what I know, they do not work this way.


Yeah, I'd have to agree. I'm a game developer as well, and we only use single 8800 cards as our standard. And our title is set to release in 2-3 years, so by the time we release, there will already be next-generation cards available.
 
That's odd. I'm using an 8800 GTS 640 MB and it runs fine for me on those settings. What drivers are you using? I'm using the 169.01 beta.

It feels like it's running at around 40 fps at 1280x1024 with everything on High. I haven't FRAP'd it yet, but it feels pretty smooth. The motion blur might be adding to the illusion of smoothness, but at those settings it's absolutely playable.

I'm not using beta drivers. You said you have everything on "High," but in your previous post you said you should be able to play "everything at the highest settings with a 8800GTX," and I was talking about the settings on Very High at 1280x1024 in Vista.
 
I remember in the manual for Mechwarrior 2 they recommended a Cray supercomputer to run the game with the highest settings :eek:
Ahh, the memories.

If you take a look at the readme you'll see the recommended detail settings for each class of processor (P1s, P2s, etc.). The funny thing is that at the end of the list you'll see the specs for a Cray supercomputer!!! The recommended settings for it are high detail on everything and 1024x768 resolution (which is impossible for the game to achieve)... guess that was "supercomputer"-class detail in 1996...

From: http://www.mobygames.com/game/mechwarrior-2-mercenaries

I have that game, and I will snag the readme when I get home from work. :)
 
Another developer ditto on running next-gen cards.

While ATI or Nvidia *may* send us an engineering sample or two for compatibility testing on a few machines, those samples are of very soon to be released hardware, just weeks away from release, not a year down the line. Partly because samples are in short supply prior to launch, and partly to keep NDA leaks down.

About half the time in the studios I've worked at, those sample cards sit in the box for months before a busy coder decides to take the time to install one, if they get installed at all before the retail version comes out.
 
I never said it was dumb to think that, I'm just saying they don't.

Trust me.

I work inside the business, I know how this stuff works.

They have nothing to gain from testing their product on architecture that isn't shipping, things change, there is no promise that a product NVidia gives them today will be anything at all like what NVidia ships down the line.

All they have to do is test on what's out there now and make sure the code works with exsisting DirectX code, that's it.

As I said, they have access to top of the line machines, quad core, SLI Ultra's, etc, but the bulk is done with the consumer in mind, that would be, basic DX10 video cards and dual core CPU's. Which, still, within the terms of current technology is quite advanced as it is.

Let me repeat, I do not claim it's a dumb thought to have, it sounds logical, but, knowing what I know, they do not work this way.

I would have to disagree. I too have worked in the industry (recently, though now I'm in IT), and we tested on engineering samples for all manner of upcoming products. However, we also tested on hardware that was several years old. To think that testing of a PC game is limited in scope to only what is currently available is silly. We even tested on obscure/unpopular hardware that most people were not likely to have (mid/high-end workstation 3D chipsets, etc.). We weren't pushy about bugs on this sort of hardware, but we at least documented them.

Anyway, lots of high-end devs get engineering samples long before release, and while they don't test exclusively on them, they do test on them. They also provide feedback to the hardware manufacturers for driver issues, etc.
 
Another developer ditto on running next-gen cards.

While ATI or Nvidia *may* send us an engineering sample or two for compatibility testing on a few machines, those samples are of very soon to be released hardware, just weeks away from release, not a year down the line. Partly because samples are in short supply prior to launch, and partly to keep NDA leaks down.

About half the time in the studios I've worked at, those sample cards sit in the box for months before a busy coder decides to take the time to install one, if they get installed at all before the retail version comes out.

I would, however, agree with your point about them not being "year off" samples, only the next iteration. Still, I would expect top-tier devs to have G9X samples right now.
 
Every single Xbox 360 game runs at 720p, 1280x720, which is a lower resolution than 1280x1024. Most of them don't even use AA and AF.

90% of PS3 games are also 720p.

Maybe (I don't follow consoles closely anymore), but that doesn't change the fact that many of the screenshots released for most PC games (including Crysis) are much larger than 1280x1024. Never mind the fact that 1280x1024 is an unusual aspect ratio (5:4). 24" 1920x1200 LCDs are very popular right now, so naturally developers can expect people to be gaming on them.

Saying that games aren't meant to be run at resolutions higher than 1280x1024 is just silly. Anyone who believes that needs to acknowledge the present and stop living in the past.
 
What's strange about Crysis for me is that lowering the res to 720p gave me a performance boost, but not THAT much. Details made a much larger impact.
I play at 1920x1080 with the details at medium and everything looks great.
I guess the one thing I'm curious about is what they were running those early demos on. The ones with the snakes seemingly moving at 100 fps and the jungle looking like a photograph. That footage looks better than anything in the demo, and it's running at an insane framerate, although vsync is clearly off.
I'm curious what kind of hardware/settings that was taken from.

Are most people still at 1280x1024? I'd guess so. New monitors with higher resolutions are getting more and more common, but as you all know, the monitor is usually the LAST thing people upgrade.

As for Crytek having an 8900, it wouldn't shock me. I don't think it has anything to do with Crysis' current performance, but you have to think Nvidia probably hooks up one of the most prominent developers around... at least for a little early testing.
 
I would, however, agree with your point about them not being "year off" samples, only the next iteration. Still, I would expect top-tier devs to have G9X samples right now.

Maybe Valve, id, Epic, and Crytek have them... we don't, and we're a top-tier studio with a shipped, strong-selling DX10 product.

Granted, we're an RTS studio, not FPS.
 
Yep same here, but the boss said we will all get them as soon as we finish the beta for our game.

We are skipping DX6, 7, 8, 9 and 10 and going straight for DX11 for DN4.

:rolleyes:
 
What's strange about Crysis for me is that lowering the res to 720p gave me a performance boost, but not THAT much. Details made a much larger impact.
I play at 1920x1080 with the details at medium and everything looks great.
I guess the one thing I'm curious about is what they were running those early demos on. The ones with the snakes seemingly moving at 100 fps and the jungle looking like a photograph. That footage looks better than anything in the demo, and it's running at an insane framerate, although vsync is clearly off.
I'm curious what kind of hardware/settings that was taken from.

Are most people still at 1280x1024? I'd guess so. New monitors with higher resolutions are getting more and more common, but as you all know, the monitor is usually the LAST thing people upgrade.

As for Crytek having an 8900, it wouldn't shock me. I don't think it has anything to do with Crysis' current performance, but you have to think Nvidia probably hooks up one of the most prominent developers around... at least for a little early testing.

Especially when it's in the "The Way It's Meant to Be Played" program :)

You get all kinds of lil perks and help with testing when you're buds w/ nVidia or AMD/ATi.
 
...whether even the developers can play their own game at the highest graphical settings and get no lag?

Lag should not be caused by graphical settings; it is merely an issue of internet connectivity.
 
The usage of the term "lag" has been blurred over the years to cover both internet connectivity and framerate issues. I think he's speaking of the latter ;)
 
So I am reading this thread and a recurring thought keeps coming to mind: after I have set up the graphics so it doesn't lag and is playable, do I ever bother to go back and check my settings, tweak, or benchmark, or do I just play?

The answer is I just go ahead and play. Sometimes we can get a little too wrapped up in the details and not enjoy the bigger picture.

For the record, I was a little disappointed my single 8800GTX could not run uber high at 1920x1200 :rolleyes:. But if it could, then I might have been disappointed with the game. So I guess I have SLI to look forward to (I hope they have it fixed now [they just went gold]).
 
Every single Xbox 360 game runs at 720p, 1280x720, which is a lower resolution than 1280x1024. Most of them don't even use AA and AF.

90% of PS3 games are also 720p.

Actually some of them are only 640p, not even high definition.
 
The usage of the term "lag" has been blurred over the years to cover both internet connectivity and framerate issues. I think he's speaking of the latter ;)

Not a very thoughtful use of the term. If you're thinking of lag, think back to when you were playing Quake and you got the phone-jack icon popping up, or high ping, whatever.
 
My big complaint about making a game that cannot be run smoothly at max on current high-end gear is this: I'm going to play the game now! In a year, when there is a video card/CPU setup available that will play it at max, I will not be playing the game; I will be on to bigger, better, newer titles. So really, I will never see Crysis for what it is capable of.
 