Will there ever be a day....

TheGooch69

Gawd
Joined
Dec 7, 2007
Messages
633
...when video cards are released that can play the newest games at max settings and high resolutions without breaking a sweat? My biggest gripe with PC gaming is that games are released and the hardware available can barely run them at their maximum potential. The best example of this was Doom 3. Today it's Crysis. It won't be until the 9 series or later comes out that you can run Crysis on max settings smoothly.

I'll step off the soap box now.
 
IMO, no. Developers and NVIDIA/AMD are constantly pushing the boundaries and improving/advancing their technology.
 
Resolutions keep getting higher, and graphics keep getting better. I don't think the graphics card market has enough competition to keep up with the pace right now.
 
It's a chicken and egg scenario... we want photo realistic gaming and we want hardware that makes it possible... we wouldn't have one if the other didn't keep pushing boundaries. I'm happy Crysis slows most systems down to a crawl. Push the hardware guys to make better cards... and with better hardware... better software comes out.

+1 for us.
 
I'm pretty sure the 8800GTX accomplished this on release. Also, my 6800GT played Doom 3 easily at high settings and I bought them on the same day.


 
It's a chicken and egg scenario... we want photo realistic gaming and we want hardware that makes it possible... we wouldn't have one if the other didn't keep pushing boundaries. I'm happy Crysis slows most systems down to a crawl. Push the hardware guys to make better cards... and with better hardware... better software comes out.

+1 for us.

Couldn't have said it better meself. :D
 
I'm happy Crysis slows most systems down to a crawl. Push the hardware guys to make better cards... and with better hardware... better software comes out.
I personally don't think too much emphasis is placed on how new games perform with respect to decisions made on newer architectures. Both AMD and NVIDIA follow a fairly rigid release schedule, with parts being designed generations in advance. It's very likely at this point that NVIDIA is in the final design stages of D10E/G100, or whatever the codename for the next high-end part after D9E will be, and in the early design stages of D11E/G110/whatever. We already know the D9x is quite imminent, and no major architectural changes are in store that wouldn't have already been in store if Crysis didn't exist. I think NVIDIA is more concerned with the ratio of pixel shading operations to other operations in a number of games, and not at all concerned with how far game X overshoots hardware targets.

GPU advancements are primarily governed by API advancements (predominantly D3D), and I don't really believe that how well Crysis performs at 1920x1200 has that much of an effect on what's 18-36 months down the road. It might be a gentle nudge, but the hardware manufacturers aren't getting their knickers in a twist about it.

I think the big problem is that games like Crysis set a very high bar for production quality (with respect to graphics, anyway) that we now expect other future titles within the same genre to start hitting.
 
Gooch, the horse goes in front of the carriage. It is a matter of fact that software can always outpace hardware, especially when it comes to games. These guys are trying to sell engines, not just a game, and that engine needs to have a life of its own for at least a few years.

Hell, I remember when Quake came out... if you had a 3dfx card, you were sitting pretty, but it still taxed the system. Resolution and tweaking were required to make it run as you liked, but it took a while for the hardware to catch up.

Get used to it, and don't let it bug you because it will never change.
 
It would be dumb of the designers to do so, because the game would be obsolete and forgotten within a year of release.
 
Well, if you were around a year ago, you would have witnessed this phenomenon in the form of the 8800 GTX.
 
Well, as I recall, for about 9 months or so up until the release of Crysis, my Ultra kicked the crap out of every game thrown at it. It cost a lot, of course, but I was very pleased. We're talking 1920x1200 with 16xAA and 16xAF. The Ultra still plays Crysis fairly well, but now I have to turn it down to 1680x1050 and 4xAA with everything on high settings. Still looks pretty damn nice.
 
You know, it's easy to forget that our idea of "Maximum settings" is a moving target as well.

It wasn't too long ago that 1024x768 was the maximum resolution you'd even consider running a game at. I recall the first video card that allowed me to run Deus Ex at 32-bit color instead of 16-bit. There once was a time when 2xAA and trilinear texture filtering was "max settings". You've got all sorts of options for "volumetric effects" and "shader quality" and "shadow detail" in games these days that you didn't use to have.

So when you talk about running at the maximum and the best, what are you really talking about? I believe that the 8800GT I'm buying will run Crysis very playable at 1440x900 with 4xAA and other settings at or around medium. Is that maxing out the best and most expensive hardware out there? No, but it's doing a damn good job with what's on my desk.
 
Well, you also have to factor in all the other components in your system as well, not just the video card; although it plays the lead role, it needs support from the CPU. But in answer to your question, no, it's all about the money.
 
I would much rather the software come out and push hardware than the other way. Also, it has been nice to replay my older games "the way they were meant to be played."

Call of Duty 2 looks sick with my 8800GT @ 1680 x 1050 max everything

as do all of the other "older" games
 
Some good points. I actually just got SLI working on my computer a few minutes ago: two BFG Tech 8800 GTS 640MB cards. Not the cream of the crop, but it's up there.

And Crysis STILL only runs decently! You don't want to know how low a resolution I'm playing at, either.

UT3 runs perfectly, however, but the real test comes when I get my new monitor and can run at higher resolutions.
 
I'm pretty sure the 8800GTX accomplished this on release. Also, my 6800GT played Doom 3 easily at high settings and I bought them on the same day.



Exactly. On the month of release of a new generation, those cards can play everything without breaking a sweat.
 