Hellgate: London DX10 vs. DX9 Gaming Experience @ [H]

Why is this motion blur abomination becoming popular with video game makers? Motion blur only occurs with film cameras operating at fairly low frame rates (24 fps). I'm playing a video game that has no film, no grain, and no "exposure" time to get "blurred", so why are they blurring my gaming experience?

Your eyes don't see motion blur because your eyes follow the action quite quickly AND your brain accounts for any missing or obscured details. Eyes jump, they don't pan. Try it: look from left to right and try to force yourself to "pan". It won't happen. Your eyes jump from one key focal point to the next. There is no motion blur at all. Motion blur is only a result of an 80+ year old technology.

Motion blur is not natural and is a bit disconcerting while playing a modern video game. I get the feeling that it's just another bit of challenging, but useless, "eye candy" that game makers use just to say their game (or engine) does something the others don't.

Don't ask me about "depth of field" effects either. A similar argument can be made against that abomination too. Hint: cameras have a smaller depth of field than your eyes (for someone with 20/20 vision), which means the blurring of details you're not focused on is quite minimal compared to a typical camera. In a movie, the director would not allow such a distraction from the action at hand.
You're just wrong on so many levels... I'll try to point out a few things.

Now, yes, your eyes generally lock onto objects - they don't pan in a smooth way like a camera would - which is why the image motion blur you get when you move the camera really fast IS unnatural, BUT... if you look at anything and an object moves through your field of view, you'll see it blurred. There's no maybe about it. Anything moving fast will look like a blur to human eyes, so object motion blur is indeed realistic, and if it weren't used in movies, anything 3D that moves would look fake right away.

Now, depth of field works too, and directors use it all the time. When a camera is behind a person but focuses on someone 10 feet away because they're the one talking, the camera will blur whatever's close to it. So yes, directors allow it and use it all the time. Games also tend to use it during cinematics (the best time to use it) and when you zoom in with weapons. Obviously, if you ALWAYS blurred the background (like in Hellgate), it would look unrealistic, since the game can't know what your eyes want to focus on in the first place. That doesn't mean DOF effects are always bad and unrealistic. Screenshots from games with proper post-effects (DOF, motion blur, glows, etc.) will always look more realistic than screenshots without them. In motion, the game simply can't know what your eyes are trying to look at, so DOF effects won't always be perfect.

Per-object motion blur, though, is definitely always good, although it's slow to process, so it hasn't been used properly so far (Crysis and Hellgate are the games closest to using it well). The way GoW and TF2 use it is unrealistic and pointless, since fast-moving objects stay crystal clear, and they're never like that in real life unless your eyes follow them.
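
For anyone wondering what per-object motion blur actually means in practice, here's a rough sketch of the usual velocity-buffer idea, written as CPU-side Python/NumPy purely for illustration (real engines do this in a pixel shader; nothing here is Crysis' or Hellgate's actual code). Each pixel carries how far its object moved since the last frame, and the post-process averages samples along that vector, so static background pixels stay untouched:

```python
import numpy as np

def object_motion_blur(frame, velocity, samples=8):
    """Blur each pixel along its per-object screen-space motion vector.

    frame:    HxWx3 float array, the sharp rendered image
    velocity: HxWx2 float array, per-pixel motion in pixels since the last frame
              (static background pixels hold (0, 0), so only moving objects smear)
    """
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    blurred = np.zeros_like(frame)

    for i in range(samples):
        t = i / (samples - 1) - 0.5               # offsets from -0.5 to +0.5 along the vector
        sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
        sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
        blurred += frame[sy, sx]                  # gather a sample shifted along the motion

    return blurred / samples
```

That zero-velocity case is exactly why object blur reads as natural while full-screen camera blur doesn't: only the things that are actually moving get smeared.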
 
Something is really starting to bug me about these game reviews.

Where is the CPU part of it?

We all know Supreme Commander runs like crap on any dual core, and some other games are happy on a 2 GHz single core. If [H] wants to stay on top of the consumer review food chain, I want to see some CPU usage statistics.

The test setup includes a near-3 GHz Intel dual core; what about the E6300 guys? It isn't hard to run a couple of extra tests with any given video card to see what kind of impact a single core, a slower dual, or a quad has on a game. I'm sure everyone is interested.

These aren't necessarily whole-system game reviews, but rather 'video card gaming experience' game evaluations. We seek to find out how these games perform on various video cards, what DX10 brings to the table over DX9 and whether it is worth it, and to look at IQ. Whole CPU-scaling game evaluations take a tremendous amount of time and money to generate; we have done a few in the past, notably Doom 3 and SupCom, and we will do more in the future where necessary to show off multi-core utilization.
 
These aren't necessarily whole-system game reviews, but rather 'video card gaming experience' game evaluations. We seek to find out how these games perform on various video cards, what DX10 brings to the table over DX9 and whether it is worth it, and to look at IQ. Whole CPU-scaling game evaluations take a tremendous amount of time and money to generate; we have done a few in the past, notably Doom 3 and SupCom, and we will do more in the future where necessary to show off multi-core utilization.

Maybe what he wants is an update to the [H] CPU scaling article. I think it could probably use a refresher (new CPUs), in people's minds at least.

What I was / am really hoping for is a Crysis review / video card evaluation thingy. Sadly there is no new hardware out that can do that title justice. That article should also cover some of the dual-core "promises".

Come on NV, give us a Merry Christmas. I want a $500-$600 9800 GTX, with Crysis on "VERY HIGH" across the board @ 60 fps...

Oh, and a Pony. :D
 
Uggghhh.. Damn bugs..
This game is angering me greatly, so I'm just going to wait a couple of patches.
Yes, it looks a little better in DX10, but I really want to see a fully D3D10 game, because it should look better and run better.
 
This is another weak attempt to try to show there's a difference between DX10 and DX9. All the differences were artificially made, and DX9 would probably have run them faster had they been implemented there.

And I agree with the above posters: motion blur in games sucks because it's overused. An object moving 1 inch per second should not get blurred the way it does in games now. All it does is lag the system and make the view harder to read.

Same goes for HDR. Every implementation seen to date has been horrible. If HDR means the whole game scene will be nearly black-and-white and washed out, don't use it at all. It's horrible when idiots apply new technology without even understanding how it's supposed to be used.
 
I'm 50/50 on the DX10 effects. The shadow, weather, and smoke effects do look great. However, there is no way I'd play a game without any AA enabled (yep, call me a snob). Not to mention... what the hell is this motion blurring crap? I don't care how much 'realism' it adds; it will never be an effect I 'want' enabled in any game I play. HDR (I know I'm not blinded whenever I walk outside, yet it happens in games... come on), now motion blurring... what will devs think of next? Still no reason for me to downgrade to Vista for DX10 (or even buy a card solely for DX10).
 
Something is really starting to bug me about these game reviews.

Where is the CPU part of it?

We all know Supreme Commander runs like crap on any dual core, and some other games are happy on a 2 GHz single core. If [H] wants to stay on top of the consumer review food chain, I want to see some CPU usage statistics.

The test setup includes a near-3 GHz Intel dual core; what about the E6300 guys? It isn't hard to run a couple of extra tests with any given video card to see what kind of impact a single core, a slower dual, or a quad has on a game. I'm sure everyone is interested.
I think the ultimate reason is that the games are, largely, not CPU-limited.
YES, Supreme Commander is, but it is an exception. As a result, for the sake of simplicity, and to keep their results readable, comparable, and understandable, they vary the component of the PC that is most easily controlled: the video card.
I mean, consider this: at 1600x1200-ish resolutions, with AA, AF, and other features enabled, the CPU will be loaded almost the same amount as at 1280x1024. The graphics card, on the other hand....

That's not to say I wouldn't like to see some performance comparisons between some of the new CPUs, but realistically, it would be hard to do both CPU and VGA comparisons for each game...
 
Great article. I find graphics can be buggy in either DX9 or DX10 in HGL...

The game needs an SP patch yesterday, as well as standalone patches made available to all, but FSS have all but abandoned SP support (check the official forums). My wife and I have played it off and on for weeks now (in for TWO Founder's Clubs), with varying success online.

With the 169.01s, AA makes for semi-invisible characters. Haven't tried the 169.09s yet in XP Pro SP2. Anyone else?

Everything after the 01s gives me crap OpenGL performance. I do agree that the DX10 effects look nice, though.
 
Something is really starting to bug me about these game reviews.

Where is the CPU part of it?

We all know Supreme Commander runs like crap on any dual core, and some other games are happy on a 2 GHz single core. If [H] wants to stay on top of the consumer review food chain, I want to see some CPU usage statistics.

The test setup includes a near-3 GHz Intel dual core; what about the E6300 guys? It isn't hard to run a couple of extra tests with any given video card to see what kind of impact a single core, a slower dual, or a quad has on a game. I'm sure everyone is interested.

CPU performance is somewhat irrelevant for the majority of today's graphically intensive games, as most of the load has been handed off to the graphics processor (as it should be). It's all about the GPU now, and this article attempts to illustrate some comparisons between different cards and formats.
 
Great article, guys! I have been playing the game since beta and have been going back and forth between DX9 and DX10. The DX10 version of the game just tends to cause memory issues a little more often than the DX9 version :(

About your gameplay evaluations: I actually have a setup that's better than the one you guys used, but I found that in some levels, the FPS just drops too much in DX10 mode. Have you guys tried levels with a lot of fire, and Act 3-4 levels where there are 10+ other NPCs fighting alongside you?

Also, does your mouse feel more "floaty" in DX10 mode? For some reason, even when my FPS is around 30, the mouse just doesn't feel very responsive.

Finally, have you guys noticed any difference in the physics between DX9 and DX10 mode? I was playing last night in DX10 mode (inspired by this article) and I noticed that when you break things (esp. those wooden racks that break apart into individual planks) in DX10 mode, the motion of the resulting pieces from my destructive habit is much more realistic: the pieces sort of fall together in a general direction, unlike in DX9 mode, where there seems to be no regard for the "weight" of the broken material and the pieces just fly everywhere.

Thanks for the insight guys!!!


My spec:
Q6600 @ 3.53 GHz
Asus P5K Deluxe/WiFi
2x1GB Crucial PC-1066
EVGA 8800GTX @ 660/1060 w/ Forceware 169.09 beta (I didn't go for the .12 since it doesn't look like it had anything for HG:L)
RaptorX 150GB
Logitech G9 with Setpoint V5.0 Pro
 
Regarding motion blur: if developers just minimize the intensity of the effect, it can be used more in games. Same thing with the bloom. There really is no need to blind people if in real life you don't get blinded, especially since this effect was made to increase realism. Since I haven't played HG:L personally, I can't comment directly on that game. But in other games like Crysis and GoW, if they turned the intensity down by like 50%, it'd be much nicer to experience. It should also free up some precious fps.

I dunno, what do you guys think?
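
To put the "turn it down 50%" idea in concrete terms, here's a minimal sketch of what an intensity slider amounts to, assuming a generic post-effect that produces a processed frame alongside the sharp one (illustrative only, not any particular game's settings code):

```python
import numpy as np

def apply_post_effect(sharp_frame, effect_frame, intensity):
    """Blend the processed frame (bloom, blur, etc.) back toward the original.

    intensity = 1.0 -> full-strength effect as the developer shipped it
    intensity = 0.5 -> the '50% less' many of us are asking for
    intensity = 0.0 -> effect disabled entirely
    """
    intensity = float(np.clip(intensity, 0.0, 1.0))
    return (1.0 - intensity) * sharp_frame + intensity * effect_frame
```

Blending back toward the unprocessed frame like this is essentially free on top of an effect you're already paying for, which is why it's frustrating that so few games expose the knob.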
 
The massive performance hit of DX10 is not worth the small image improvement. DX10 is embarrassed yet again.
 
Regarding motion blur: if developers just minimize the intensity of the effect, it can be used more in games. Same thing with the bloom. There really is no need to blind people if in real life you don't get blinded, especially since this effect was made to increase realism. Since I haven't played HG:L personally, I can't comment directly on that game. But in other games like Crysis and GoW, if they turned the intensity down by like 50%, it'd be much nicer to experience. It should also free up some precious fps.

I dunno, what do you guys think?

Couldn't have said it better myself. The reason I have such a strong dislike for these effects is that nobody has implemented them correctly yet, IMO. Maybe if there were an intensity slider bar... 'Til then, I refuse to wear sunglasses while playing a game with HDR enabled, so I'll stick with leaving it disabled.
 
Couldn't have said it better myself. The reason I have such a strong dislike for these effects is that nobody has implemented them correctly yet, IMO. Maybe if there were an intensity slider bar... 'Til then, I refuse to wear sunglasses while playing a game with HDR enabled, so I'll stick with leaving it disabled.

UT3 has a Post-Processing Intensity selector.
 
Yeah, good article.

I feel sorry for anyone who subscribed to this game, as right now the only perk you get is a larger stash. They gave the 24-character-slot perk to everyone now.

This game would be a 9.5/10 in my book if they could fix the stupid memory leak problem. :mad: I love alt-tabbing to see the game using 100% of my memory and 2 GB worth of the page file.
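
If you'd rather log the leak than alt-tab every few minutes, a small script using the third-party psutil package can record the game's memory use over time. The process name below is an assumption; check Task Manager for whatever the Hellgate executable is actually called on your install:

```python
import time
import psutil  # third-party: pip install psutil

PROCESS_NAME = "Hellgate.exe"  # assumed name; check Task Manager for the real executable

def watch_memory(interval=30):
    """Print the game's working set every `interval` seconds until it exits."""
    game = next((p for p in psutil.process_iter(["name"])
                 if p.info["name"] == PROCESS_NAME), None)
    if game is None:
        print(f"{PROCESS_NAME} not found")
        return
    while game.is_running():
        rss_mb = game.memory_info().rss / (1024 * 1024)
        print(f"{time.strftime('%H:%M:%S')}  {rss_mb:.0f} MB")
        time.sleep(interval)

if __name__ == "__main__":
    watch_memory()
```

If the working set climbs steadily while you just stand around in town, that's the leak showing itself.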
 
I'm really liking the whole "Diablo" feel of the game, and it (the DX9 portion) runs well enough on my system at 1920x1200 - I think I may try 1440x900 and crank everything up.

The DX10 issues in multiplayer are maddening: I have made it to the character creation screen and entered a name, only to have it become unresponsive and need to pop out and kill the process.

The DX9 experience is good, but I'm really looking for all the effects possible to add to the immersion.
 
CPU performance is somewhat irrelevant for the majority of today's graphically intensive games, as most of the load has been handed off to the graphics processor (as it should be). It's all about the GPU now, and this article attempts to illustrate some comparisons between different cards and formats.



I have to agree. On my X2 system (3800 X2 @ 2.5 GHz), HGL uses about 99% of one core and 20 to 40% of the second.

On my sig system it's barely 90% of core 0 at 1920x1200 maxed out.

Many games are the same. I won't be needing a CPU upgrade for a while :)
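
For anyone who wants to check per-core load on their own box, it's easy to sample with the third-party psutil package while the game is running; this is just a quick sketch, not anything from the article's methodology:

```python
import psutil  # third-party: pip install psutil

def sample_per_core(duration=60, interval=1.0):
    """Print per-core CPU usage once per interval; run it while the game is playing."""
    for _ in range(int(duration / interval)):
        # percpu=True returns one percentage per logical core
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        print("  ".join(f"core{idx}: {pct:4.0f}%" for idx, pct in enumerate(per_core)))

if __name__ == "__main__":
    sample_per_core()
```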
 
In the apples-to-apples comparison chart, the 3870 seems to be much smoother. Is it like that in other games?

I'm trying to decide between the 3870 and the 8800 GT. If the 3870 is generally smoother overall but slower, I would still lean heavily in its direction.
 
It's not hard to run the DX9 path under Vista64, correct? I am planning to get my new PC with Vista64 but might settle on XP64 instead.

All these blurry effects are the opposite of what I want. I could just as well play in 640x480 to get the same effect.
 
It's not hard to run the DX9 path under Vista64, correct? I am planning to get my new PC with Vista64 but might settle on XP64 instead.

All these blurry effects are the opposite of what I want. I could just as well play in 640x480 to get the same effect.
Yes, you can run the DX9 path quite easily, or for that matter turn off the effects in the DX10 path.

And, no, playing at low res is a completely different effect.
 
It's not hard to run the DX9 path under Vista64, correct? I am planning to get my new PC with Vista64 but might settle on XP64 instead.
All these blurry effects are the opposite of what I want. I could just as well play in 640x480 to get the same effect.

XP64 would be a pointless move.
It's slower than 32-bit XP and doesn't include DX10 support.

Development-wise, it's even more of a dead-end street than 32-bit, since MS has focused on Vista. You'll end up with more frustration and no benefit.

XP64 is to this generation of OSes what Win9x was to the last generation.
Last generation, MS got scared by OS/2 Warp 3 and couldn't understand why no one wanted to move from Win/DOS to Windows NT Workstation.

This time around, MS got scared by 64-bit Linux and had to put out something since Vista wasn't (and still isn't) ready for distribution.
 
Riddle me this, Batman:

How does DX10 performance in 32-bit Vista compare to 64-bit Vista at this point in time?

Has anyone compared the two in any games?
 
Riddle me this, Batman:

How does DX10 performance in 32-bit Vista compare to 64-bit Vista at this point in time?

Has anyone compared the two in any games?
Tough to say, since many newer games have 64-bit modes (Half-Life 2, Hellgate, etc...).
In my experience, it's about the same, but there are enough bugs with Hellgate that they might be masking any benefit.
 
In the apples-to-apples comparison chart, the 3870 seems to be much smoother. Is it like that in other games?

Did you also notice the other interesting thing in the Apples to Apples section?

The GF 8800 GT is the only one whose green 'above 60' line actually runs through the 55 fps mark. :confused:

http://enthusiast.hardocp.com/image.html?image=MTE5NjA1MDI5MWhLbUdnOXZLdG1fNF8yX2wuZ2lm

It seems like a strange anomaly that lessens the appearance of those dips below 60 by lowering the bar.

[attached screenshot: herrordx9.jpg]
 
Just keep in mind that those FPS graphs are just what they are...FPS graphs. Hellgate's levels and combat action are not repeatable enough to make nice graphs like we see with some other games, like WoW and Oblivion. The FPS we measure is a secondary consideration. Our primary goal is to make sure the game is playable and show what settings we used. The A2A stuff is there to show the relative differences between the performance levels of the video cards at a glance.

Don't get all scientific about it, because there isn't that much significance other than "this card's average fps at this setting is higher than this card's".

Take them just for what they are.

On a closer look, it does appear that green line was misplaced.
 
Just keep in mind that those FPS graphs are just what they are...FPS graphs.

Yeah, but why would the 60 fps line flow through 55 fps? If it's a graph, that should be a function of your graphing software. The strange thing is it didn't occur on one of the tests where it wouldn't have mattered, like the GTX or the 3870.

Edit: Ok, well you noticed it now.

Don't get all scientific about it, because there isn't that much significance other than "this card's average fps at this setting is higher than this card's".

The point of using a histogram instead of a min/avg/max number is so people can SEE the smoothness of the playability you write about. If you were to show just the average fps like you suggest, a card that fluctuates all over the map from an unplayable 5 fps to 300 fps could have a much better average fps than one that's consistently at 90 fps, never below and never above (quick sketch of this at the end of the post). The histogram gives us a feel for the frequency and the depth of any performance dips.

But I wouldn't want to get 'all scientific' about mathematical representations of computer hardware. ;)

Take them just for what they are.

Well, in an apples-to-apples test they're the only non-subjective tests in these reviews, so having anomalies in them is a little disappointing.
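
To make that point about averages concrete, here's a toy sketch with made-up frame-rate traces: the erratic card posts the higher average even though it spends a chunk of its time below 60 fps, which is exactly what a time graph or histogram exposes and a single number hides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example traces: one fps sample per second over a 5-minute run
steady = np.full(300, 90.0)                    # never above or below 90 fps
erratic = rng.uniform(5.0, 300.0, size=300)    # swings from 5 to 300 fps

for name, fps in (("steady", steady), ("erratic", erratic)):
    print(f"{name:8s} min={fps.min():5.1f}  avg={fps.mean():6.1f}  max={fps.max():6.1f}  "
          f"time below 60 fps: {(fps < 60).mean() * 100:4.1f}%")
```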
 
Yeah, but why would the 60 fps line flow through 55 fps? If it's a graph, that should be a function of your graphing software. The strange thing is it didn't occur on one of the tests where it wouldn't have mattered, like the GTX or the 3870.


Well, in an apples-to-apples test they're the only non-subjective tests in these reviews, so having anomalies in them is a little disappointing.


We use Excel for graphing, and it does not seem to have an option to force those lines to sit on a particular value (e.g., to stay at 60, wherever 60 happens to fall), so we have to move the red 30 line and the green 60 line by hand every time.
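
For what it's worth, most plotting libraries anchor reference lines to data values rather than to gridlines, so they can't drift; here's a minimal matplotlib sketch with placeholder data (not a claim about [H]'s toolchain):

```python
import numpy as np
import matplotlib.pyplot as plt

seconds = np.arange(600)
fps = 55 + 20 * np.sin(seconds / 30.0)   # placeholder frame-rate trace

plt.plot(seconds, fps, linewidth=0.8)
plt.axhline(30, color="red", linestyle="--", label="30 fps")    # anchored to y=30
plt.axhline(60, color="green", linestyle="--", label="60 fps")  # anchored to y=60
plt.xlabel("Time (s)")
plt.ylabel("Frames per second")
plt.legend()
plt.show()
```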
 
Even though there was a big hit in the average fps for DirectX 10, the minimum fps did not drop at all.
 
Great article. I find graphics can be buggy in either DX9 or DX10 in HGL...

The game needs an SP patch yesterday, as well as standalone patches made available to all, but FSS have all but abandoned SP support (check the official forums). My wife and I have played it off and on for weeks now (in for TWO Founder's Clubs), with varying success online.

With the 169.01s, AA makes for semi-invisible characters. Haven't tried the 169.09s yet in XP Pro SP2. Anyone else?

Everything after the 01s gives me crap OpenGL performance. I do agree that the DX10 effects look nice, though.

Semi-transparent characters are caused by MSAA transparency AA. Turn it off and they will be fixed. The game will still be a steaming pile, however. For the last day and a half you haven't even been able to play in a party, because everyone either gets sent to their own instance or you can't party-portal to them, period. There are so many gameplay bugs right now that it just isn't worth messing around with.

Maybe in six months.
 
Nice review. I have XP with DX9, and the graphics in Hellgate are nothing special. It's good to know that when I upgrade to Vista I can expect a significant improvement in graphics quality.
 
Anyone try the DirectX 10 hacks in XP yet? I don't feel like upgrading to Vista and paying all that money. Do they work?
 