Crysis 2 Graphics: PC Vs. Xbox 360

How 'bout we compare them after the PC version is completely finished (i.e., with DirectX 11)?
 
As much as I adore my Xbox, I think consoles are going to fall pretty far behind in graphics over the next three years.

I mean, the PC version in the video looks similar to the Xbox's, but it's the little details the PC provides that completely change the entire look of the game.

Of course I know that graphics don't make the game, but they do enhance the experience when the mechanics are well thought out.
 
Maybe the game should have been "finished" before it was launched? This is another game I won't be buying (on Xbox or PC) until it hits the $5 bin. I suggest more people start doing the same.
 
How 'bout we compare them after the PC version is completely finished (i.e., with DirectX 11)?

Agreed. That would be ideal, but this is what we have right now, as an example of a title that is both system-intensive on a PC and runs on a console.

There are subtle differences in the PC version: a few more details and items in some places, and textures seem to be higher quality, so things look clearer.

That being said, the biggest issue I have with this comparison is the scaled nature of each side. Not seeing a pixel-for-pixel comparison is misleading.

The PC is rendering this natively at 1280x720. The Xbox is likely rendering this at some lower resolution, and then upscaling it to 720p.
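
For what it's worth, the pixel math behind that point is easy to sketch. The 1280x720 figure comes from the post above; the 1152x640 console render target is a hypothetical sub-HD resolution used purely for illustration, not a confirmed number for Crysis 2:

```python
# Rough pixel-count comparison: why a sub-HD render upscaled to 720p
# looks softer than a native 720p image. 1152x640 is an assumed,
# illustrative console render target, not a confirmed figure.
native = 1280 * 720   # native 720p render
sub_hd = 1152 * 640   # hypothetical sub-HD console render

print(native)                    # 921600
print(sub_hd)                    # 737280
print(f"{sub_hd / native:.0%}")  # 80% -- the upscaler has to invent the rest
```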

Doing some sort of split screen instead of a squished full version on each side likely would have made this difference rather striking. In this video it isn't.
 
As much as I adore my Xbox, I think consoles are going to fall pretty far behind in graphics over the next three years.

I mean, the PC version in the video looks similar to the Xbox's, but it's the little details the PC provides that completely change the entire look of the game.

Of course I know that graphics don't make the game, but they do enhance the experience when the mechanics are well thought out.

They were already "pretty far behind" three years AGO. Three years from now?
 
I honestly doubt Crytek has any intention of patching in DX11, and if they do, it will be something thrown together to shut people up.
 
As much as I adore my Xbox, I think consoles are going to fall pretty far behind in graphics over the next three years.

I mean, the PC version in the video looks similar to the Xbox's, but it's the little details the PC provides that completely change the entire look of the game.

Of course I know that graphics don't make the game, but they do enhance the experience when the mechanics are well thought out.

I don't think you have anything to worry about until Microsoft/Sony start unveiling their next generation.

Developers and publishers are tied to console technology now, as they view it as their most lucrative market, and they see the PC market as a liability when it comes to producing exclusives or pushing hardware limits beyond what they think mass consumption can sustain.

CryEngine 3 was developed for consoles, and even though it has some scalable goodies for the PC iteration of Crysis 2, it does not represent any real generational leap in technology, even though such a leap is certainly possible given how far our hardware has advanced.
 
Maybe the game should have been "finished" before it was launched? This is another game I won't be buying (on Xbox or PC) until it hits the $5 bin. I suggest more people start doing the same.

I didn't even know Crysis 2 had been released yet. I think I might have to go console (rental) on this one. I want to play it in 3D on my TV, but I have a 5870 card, so 3D is pretty iffy on my computer.
 
Maybe the game should have been "finished" before it was launched? This is another game I won't be buying (on Xbox or PC) until it hits the $5 bin. I suggest more people start doing the same.

Those are pretty strong words for a game that hasn't been released yet. This is a beta release, which doesn't necessarily reflect the final version -- and the PC version says something to that effect when you start the game.
 
Zarathustra[H];1036947296 said:
The PC is rendering this natively at 1280x720. The Xbox is likely rendering this at some lower resolution, and then upscaling it to 720p.

Doing some sort of split screen instead of a squished full version on each side likely would have made this difference rather striking. In this video it isn't.

Either way I don't understand why this comparison is even necessary.

Comparing a 2005 era GPU to a modern one is kind of a silly exercise. The XBOX 360 is completely outclassed by a PC with a GTX580 :p

It's tough to tell from this video (again, due to the post-processing to squeeze the frames together), but if this isn't the case, the only reason would be inefficiencies due to the cross-platform engine.
 
IIRC, people were digging through the leaked beta, and the Xbox and PS3 developer files inside it, and concluded that the consoles are basically just running the game at the equivalent of medium or low settings with a few tweaks here and there.
 
It looks like I'm alone on this, but I'm actually looking forward to this game. It looks great, and from the sounds of it, they fixed a lot of the issues I had with the suit and other gameplay mechanics from the first one/Warhead.

Hopefully it performs better than the original as well.
 
Zarathustra[H];1036947337 said:
Either way I don't understand why this comparison is even necessary.

Comparing a 2005 era GPU to a modern one is kind of a silly exercise. The XBOX 360 is completely outclassed by a PC with a GTX580 :p

It's tough to tell from this video (again, due to the post-processing to squeeze the frames together), but if this isn't the case, the only reason would be inefficiencies due to the cross-platform engine.

The Xbox may be completely outclassed by a modern GPU, but what if the software can't properly take advantage of all the features and power of that GPU because the engine was designed from the ground up for the lower-powered consoles, with only minor scalability for PCs?

We really don't know anything at this point. We'll have to wait until the real release and when modders begin digging into the game.
 
Tons of shader differences and texture-resolution differences. But wow, are they subtle in most cases.

However, turn the PC resolution up to 1920x1080 from that upconverted 720p crap, throw in mouse and keyboard controls, and it's an instant win.

Haven't seen any real gameplay, but from what I can tell they did an amazing job optimizing it for a console.

One thing that sticks out is the HDR brightness. I've often wondered if that's even a good thing: sometimes an enemy runs into some bleachingly bright spot and can still see me, and I'm like, that's cool looking, but it got me killed. From a competitive standpoint I'm not sure it's worth having enabled.
 
Meh! I'll pass on the console version at this point, and I'm not sure about the PC version yet. My son downloaded it on Xbox 360 and it looked and played like crap.
 
Hard to tell the actual quality with those squashed, extremely contrasty videos. Obviously, the PC version looks quite a bit better, but how much better will it be at release?
 
I'm hoping we can make technically impressive levels with the Sandbox editor in the PC version, at least. I have low expectations for this game; I'm sure I'll find it a major letdown from Crysis and Crysis: Warhead, as I did FEAR 2 from FEAR 1. :(
 
Yeah, after playing the demo, it confirms that, like the previous two games, all I'm going to care about with Crysis 2 is the single-player, so I'll wait till I can get it dirt cheap.
 
Zarathustra[H];1036947296 said:
Agreed. That would be ideal, but this is what we have right now, as an example of a title that is both system-intensive on a PC and runs on a console.

Crysis 2 is poorly optimized for the PC; that's the only reason it's "system intensive".

The PC is rendering this natively at 1280x720. The Xbox is likely rendering this at some lower resolution, and then upscaling it to 720p.

Thus, the PC version has the sharper image.
 
I really like how the HUD moved with your sway in the PC version, especially underwater. It's minor, but I like it.
 
Say what you want about the Xbox 360, but it is really impressive what they can do with its existing graphics power. Kind of makes you wonder if we really do need GPU advancements every 6-12 months for the PC. Is it really necessary, or is it just lazy programming and optimization?
 
Say what you want about the Xbox 360, but it is really impressive what they can do with its existing graphics power. Kind of makes you wonder if we really do need GPU advancements every 6-12 months for the PC. Is it really necessary, or is it just lazy programming and optimization?

Not very difficult when all the hardware is the same. With PCs, everyone's machine is different, so it's harder to code for it to work on every single PC -- although I think they're just being lazy. Consoles look great on screen most of the time, but when you look at an object further away, it looks like pure garbage.
 
Say what you want about the Xbox 360, but it is really impressive what they can do with its existing graphics power. Kind of makes you wonder if we really do need GPU advancements every 6-12 months for the PC. Is it really necessary, or is it just lazy programming and optimization?

Play Crysis 1 on PC on Very High with 4xAA at a high resolution, and see if the Xbox can even remotely compare.
 
Maybe the game should have been "finished" before it was launched? This is another game I won't be buying (on Xbox or PC) until it hits the $5 bin. I suggest more people start doing the same.

What are you going on about? It has not even been launched yet.

Apparently we should all prematurely judge a game based on a video of a guy standing in place?
 
Say what you want about the Xbox 360, but it is really impressive what they can do with its existing graphics power. Kind of makes you wonder if we really do need GPU advancements every 6-12 months for the PC. Is it really necessary, or is it just lazy programming and optimization?
Please don't mistake the AVAILABILITY of enhanced hardware for a requirement.

My laptop from 2006 still performs at the same graphics level in games as my sister's Xbox 360, and it should: the laptop has a slightly faster graphics processor than the 360.

Remember that the 360 is usually playing at a lowly 720p with apparently zero AA (from what I can tell from all the jaggies), textures are usually pretty poor, object pop-in is EXTREME in some titles, lighting is usually mediocre, and framerates are rarely all that great.

What's nice about the 360 is that it's a turn-key solution, and it's cheaper than buying an entire computer with a good GPU (although more expensive than just adding a GPU that outperforms the 360's to an otherwise halfway decent computer). But the fact is, the 360 is obsolete technology, and it has been for a while. The fact that no replacement is available doesn't change that. It's been very unreliable hardware to boot, with failure rates around 50%, which is not seen with computers.
 
Hard to tell the actual quality with those squashed, extremely contrasty videos. Obviously, the PC version looks quite a bit better, but how much better will it be at release?
And how can you tell from low-bitrate, compressed 720p YouTube footage?

On my 30", I'm running 1600p, whereas my PS3, although in theory it should be able to do at least 1080p, is always running at 720p, as it just doesn't have the performance to run at higher resolutions without choking more than it already does.
 
I am not really comparing the Xbox to the PC. I am just asking whether programmers really took advantage of a currently available GPU, or abandoned it because a faster, better one came out 6 months later. From my memory, isn't the Xbox 360 using a chip similar to the ATI X1900? I was just wondering if 360 games take advantage of what that chip offers way more than PC games did when the X1900 was out. I am not trying to start a PC vs. console war.
 
Kind of makes you wonder if we really do need GPU advancements every 6-12 months for the PC. Is it really necessary? or is it just lazy programming and optimization?


A little of both. We're at the point where the hardware is outpacing the software, which only a few short years ago was the complete opposite. I wouldn't say it's a lack of optimization or laziness so much as economics. When the majority of their profits come from PCs with mid-level hardware and from console sales, catering to the uber high end isn't a smart business decision. When the 360 was released, I was lucky to get anywhere near the kind of performance from my PC that the 360 was capable of. Now, 5 years later, the 360 looks like junk, and even an entry-level GPU could give it a run for its money.

When new consoles are released, we'll start seeing titles that eat our machines for dinner again; eventually the hardware will once again outpace them, and we'll be reading threads on [H] about how much consoles are ruining PC gaming, lol. It's a vicious cycle.
 
Anyone notice the PC side's contrast was jacked and the gamma was screwed up? I mean, look at it at 0:51. The boat is glowing white, and even the ammo counter at the bottom right is so blown out you can't see how much there is.

Let's do a 16xAA/4xAF 1920x1080 vs. 1080p comparison, not a 480p blowup of 1024x768 with screwed-up settings.
 
Anyone notice the PC side's contrast was jacked and the gamma was screwed up? I mean, look at it at 0:51. The boat is glowing white, and even the ammo counter at the bottom right is so blown out you can't see how much there is.

Let's do a 16xAA/4xAF 1920x1080 vs. 1080p comparison, not a 480p blowup of 1024x768 with screwed-up settings.

Crysis 2 uses temporal AA, which is not adjustable by the user. What you see is what you get.
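
For anyone curious, the basic idea behind temporal AA can be sketched in a few lines: rather than taking extra samples per pixel within one frame (like MSAA), each new frame is blended into an accumulated history, so jittered edge samples average out over time. This is a toy, single-pixel illustration of the general technique, not Crytek's actual implementation:

```python
# Toy temporal AA: blend each new sample into a history buffer
# (exponential moving average). A pixel on a jagged edge flips between
# covered (1.0) and uncovered (0.0) as sub-pixel jitter shifts each
# frame; the accumulated value settles near the true ~50% coverage.
def temporal_blend(history, sample, alpha=0.1):
    """Fold the current frame's sample into the accumulated history."""
    return history * (1.0 - alpha) + sample * alpha

pixel = 0.0
for frame in range(200):
    sample = 1.0 if frame % 2 == 0 else 0.0  # edge flickers every frame
    pixel = temporal_blend(pixel, sample)
print(round(pixel, 2))  # ~0.47, close to the true 50% edge coverage
```

The flip side, and why some people dislike it, is that blending against history smears anything that moves, which is exactly the ghosting temporal AA is often criticized for.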
 
I am not really comparing the Xbox to the PC. I am just asking whether programmers really took advantage of a currently available GPU, or abandoned it because a faster, better one came out 6 months later. From my memory, isn't the Xbox 360 using a chip similar to the ATI X1900? I was just wondering if 360 games take advantage of what that chip offers way more than PC games did when the X1900 was out. I am not trying to start a PC vs. console war.
What they can do is tweak the settings in the game to eke out the best graphics while maintaining an acceptable (usually) minimum frame rate.

What confuses people in general is that they try to play new game titles with all settings maximized at 1200p, and their six-month-old higher-end video card can't do it. SHENANIGANS!!!

Well, the Xbox 360 can't do it either; the difference is that it's running the game essentially on low settings in some areas and medium settings in others, with limited draw distance and a true resolution of 720p upscaled to 1080p. If you reduced an old computer to those limited settings, it wouldn't have an issue either.

So it is far less about "optimization" and far more about limiting graphics settings on new games.
Let's do a 16xAA/4xAF 1920x1080 vs. 1080p comparison
I don't think you can. There are only a handful of 360 games that are true 1080p output, and those are the less graphically intense games.

There's only so much you can do with hardware from 2005. What we need is an Xbox 720 with backwards compatibility with Xbox 360 games, which would be easy to do. New games could then be configured with two simple graphics presets to easily scale between the two... heck, PCs do it with thousands of possible hardware configurations, so with just two it's a piece of cake. :)
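
The two-preset idea is basically what PC games already do with quality levels, just pinned to two known hardware targets. A sketch, with every setting name and value invented purely for illustration:

```python
# Hypothetical per-platform graphics presets. All names and values here
# are made up to illustrate the "two simple presets" idea, not taken
# from any real engine.
PRESETS = {
    "xbox360": {"render_res": (1152, 640),  "textures": "medium", "draw_distance": 0.5},
    "xbox720": {"render_res": (1920, 1080), "textures": "high",   "draw_distance": 1.0},
}

def settings_for(platform: str) -> dict:
    """Pick the settings table for the detected platform at boot."""
    return PRESETS[platform]

print(settings_for("xbox360")["render_res"])  # (1152, 640)
print(settings_for("xbox720")["textures"])    # high
```

With fixed hardware per platform, the developer only has to validate two configurations instead of the open-ended matrix a PC title faces.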
 
Zarathustra[H];1036947337 said:
Either way I don't understand why this comparison is even necessary.

Comparing a 2005 era GPU to a modern one is kind of a silly exercise. The XBOX 360 is completely outclassed by a PC with a GTX580 :p

Its tough to tell from this video (again due to the post processing to squeeze the frames together) but if this isn't the case, the only reason why would be inefficiencies due to the cross-platform engine.

Yeah... an X1900XT vs. a GTX 580. You might as well compare Crysis 1 to Crysis 2. LOL
 
LOL... Dearly beloved, we are gathered here today to pay our final respects to the PC as a gaming platform. With consoles becoming the next "big thing" for the corporate whores, we PC gamers are a dying breed. A truly sad time is coming... Are there ANY gaming software companies out there that actually still believe in the PC?
 
When new consoles are released, we'll start seeing titles that eat our machines for dinner again; eventually the hardware will once again outpace them, and we'll be reading threads on [H] about how much consoles are ruining PC gaming, lol. It's a vicious cycle.
That didn't happen with the Xbox 360 or PS3 release, so why would it happen this time around?

This is from my 2006 Dell Laptop @ 1080p:
http://farm1.static.flickr.com/153/436936698_953d771fe4_o.jpg

And Xbox 360 @ 720p:
http://xbox360media.ign.com/xbox360...der-scrolls-iv-oblivion-20060324070902088.jpg

Ground textures on the 360 are horrible, the draw distance is ridiculous, there's no AF enabled at all, low-res LOD textures, etc. Now you could say that perhaps Oblivion was horribly ported to the Xbox, but try Mass Effect side by side, for example; it's like night and day.
 