Hellgate: London DX10 vs. DX9 Gaming Experience @ [H]

Brent_Justice

Hellgate: London DX10 vs. DX9 Gaming Experience - In 2038, London lies in ruins, but will Hellgate: London also lay your video card to waste? We'll put this new hybrid RPG/FPS through its paces and show you whether your video card is up to the task of DX10 in Hellgate!

Like few games before it, Hellgate: London gives us a real reason to play in DirectX 10. It looks better and more realistic in DirectX 10, and though the performance penalty is stiff, it is not without merit.

Please Digg to share! Thanks
 
first :p

Nice article. I really like these DX10 comparisons; they show what I'm missing, if anything.
 
Good article. I'm kind of surprised Hellgate crashed when you turned on AA, but the description sounds like a bug I was having. I had been running Fraps, and closing it fixed things right up and let the game play. Something you might want to try. It won't help you capture framerate stats, but it will clear up the bug.
 
During your evaluations, did you find any way to turn off certain DirectX 10 features? For example, I don't like motion blur and depth of field blurring. I'd like to turn those off without affecting the other DX10 features.
 
The difference in Hellgate between DX9 and DX10 is very noticeable for me. I have played both, and I cannot go back to DX9.
 
Great comparison! Thanks for taking the time to test everything and write it up, Mark and Brent.
 
"During your evaluations, did you find any way to turn off certain DirectX 10 features? For example, I don't like motion blur and depth of field blurring. I'd like to turn those off without affecting the other DX10 features."

Turn shaders down from Extreme to Very High; Extreme is what adds those effects.

If you turn those off, there is literally no reason to even run DX10. Pretty smoke is not worthy.
 
"Turn shaders down from Extreme to Very High; Extreme is what adds those effects. If you turn those off, there is literally no reason to even run DX10. Pretty smoke is not worthy."

Aye.

Dugg. Interesting read.

Is this game worth checking out?

I had no interest at all in this game before I played it. Perhaps that was because it was constantly compared to Diablo, and I don't like Diablo. But as it turned out, I really like this game a lot.
 
"Pretty smoke is not worthy."
Smooth volumetric smoke effects that don't have sprite clipping are a HUGE graphical improvement to me. Sprite clipping drives me insane.

But... true, I suppose; maybe not worth $130 for Vista plus a $230-270 DX10 video card if you don't already have them.
 
"Smooth volumetric smoke effects that don't have sprite clipping are a HUGE graphical improvement to me. Sprite clipping drives me insane."
Yeah, the smoke effects are definitely a huge improvement. We're not talking about just "soft particles" done so they don't "cut" into other objects; this actually uses volumetric smoke.

Here's a video showing it (NVIDIA demo)

This is what's typically used in movies, so having it in games is pretty sweet, although to my knowledge it's only used in certain places, and on a fairly small scale... the larger (or more detailed) it is, the more your system will crawl trying to process it. :p

Still, the highlight is that it's actual 3D smoke that looks good and BEHAVES like smoke... instead of like large particles that can't bend.
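
For anyone curious what "volumetric" buys you over sprites in practice, here's a rough, hypothetical C++ sketch of the general idea - it just illustrates raymarched density accumulation, and is not taken from Hellgate's or NVIDIA's renderer; the density function, step counts, and extinction value are made up for the example.

Code:
// Rough sketch, NOT Hellgate's or NVIDIA's actual code: why raymarched
// volumetric smoke behaves differently from a flat camera-facing sprite.
// Opacity is accumulated through a 3D density field instead of being read
// once from a 2D alpha texture, so the smoke has real thickness and fades
// naturally against nearby geometry instead of clipping through it.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Hypothetical analytic density: one soft spherical puff centered at the origin.
float smokeDensity(float x, float y, float z) {
    return std::max(0.0f, 1.0f - (x * x + y * y + z * z)); // zero beyond radius 1
}

// March along a ray, applying Beer-Lambert absorption at each step.
float raymarchOpacity(float ox, float oy, float oz,   // ray origin
                      float dx, float dy, float dz,   // ray direction (unit)
                      int steps, float stepSize, float extinction) {
    float transmittance = 1.0f;
    for (int i = 0; i < steps; ++i) {
        float t = i * stepSize;
        float d = smokeDensity(ox + dx * t, oy + dy * t, oz + dz * t);
        transmittance *= std::exp(-d * extinction * stepSize);
    }
    return 1.0f - transmittance; // how much smoke this ray actually "sees"
}

int main() {
    // A ray through the middle of the puff vs. one grazing its edge:
    float center = raymarchOpacity(0.0f, 0.0f, -2.0f, 0, 0, 1, 64, 0.0625f, 2.0f);
    float edge   = raymarchOpacity(0.9f, 0.0f, -2.0f, 0, 0, 1, 64, 0.0625f, 2.0f);
    std::printf("opacity through center: %.2f, near the edge: %.2f\n", center, edge);
    // A billboard sprite would return one hand-painted alpha value no matter
    // how far the ray travels through the smoke, which is why sprites clip
    // hard against walls and floors.
    return 0;
}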

Anyway, going past that though, it's really pathetic to see the 3850 run 50% slower in DX10 while using LOWER settings. I'm not talking about the card itself, but the DX10 mode. Honestly, the same graphics and HALF the performance? So much for using Vista and DX10. I'm starting to think it's not even the programmers' fault, but simply that DX10 is slow as hell instead of "faster" like they claimed it would be. Imagine when all games are DX10-only. Sure, they'll look good, but if they run at half the speed they should, we'll waste 50% of the new video cards' performance for nothing. That'll be like skipping a generation of video cards just because DX10 is slow.
 
Great article, guys. Having just installed an 8800 GT and Hellgate in the past week, this article was exactly what I wanted to read this morning. I would like to echo the thoughts of others, though, in that I wish the DX10 effects could be individually controlled, as Distance Blur is one "feature" I would rather not use. It seems like a waste of resources to me...
 
Looks cool. I'll have to check it out. :)

But the thing I don't understand is what's with the near-halving of framerates in most DX10 titles. Wasn't Microsoft bragging about how much more efficient DX10 was? I guess there's still more work to be done in the driver department (or Vista's to blame).

BTW Brent, did you guys get a chance to see how this game runs on 8800GTS 320MB cards? (you just mentioned the 640...)

Great article as always, so keep up the good work. :)
 
"BTW Brent, did you guys get a chance to see how this game runs on 8800GTS 320MB cards? (you just mentioned the 640...)"

The decision was made to exclude the GTS cards before I had a chance to plug the GTS 320 MB in. I would expect it to perform somewhere just below the HD3870.
 
Thanks for the review. I have played this game since beta, and I'm glad the DX10 mode is enjoyable. Of course, now I have to get new hardware. :D

~CN
 
"I had been running Fraps, and closing it fixed things right up and let the game play."

I had problems with Fraps and this game, too. Fraps seems to cause random crashing when opening HG:L in DX9 as well - sometimes it will, sometimes it won't. You can check your framerate by typing /fps in the console, though, and take screenshots with Print Screen (probably obvious, but trying to be helpful).

The game is very buggy, but there is fun to be had. It isn't, IMHO, anywhere near as good a game as Diablo 2 (not even pre-LOD D2), but if you keep your expectations in check, it is fun for casual gaming.
 
A nice article on the DirectX 10 effects, but for me the gameplay is horrible. There are too many game-breaking bugs at the moment (memory leaks that eventually crash the game, a network error that knocks people off randomly, not being able to see your group mates in zones, etc.) for me to enjoy the game. Coupled with a HORRIBLE chat interface and a non-configurable, non-scalable UI, it really takes the fun out of the game. The worst part is that all these issues were raised in beta, yet they still released with these problems (the memory leak is inexcusable).

Graphics are one thing, but solid code is a more important priority to me - who cares how nice the game looks if it keeps crashing? And before someone says "they are working on a fix," releasing a game riddled with problems is inexcusable, especially if they were well aware of them. It really makes me wonder if these guys left Blizzard or were fired...

I'll get off my soapbox now.
 
"The worst part is that all these issues were raised in beta, yet they still released with these problems (the memory leak is inexcusable)."

I don't disagree with you about the beta - many of the major problems in the game were brought up long before the game ever went gold - the memory leak has been around for months. It is kind of worrying that the game was released anyway despite the fact that they knew about them. Subscribing right now is a very bad idea - I would definitely wait several months to see how the content updates go.
 
Nice review.
Once again, I suspect the developers have simply spent more time enhancing features under DX10. There is next to nothing in there that we haven't seen DX9 do in other games, with a few exceptions, like the smoke not looking as good.
 
Why is this motion blur abomination becoming popular with video game makers? Motion blur only occurs with film cameras operating at fairly low frame rates (24 fps). I'm playing a video game that has no film, no grain, and no "exposure" time to get blurred, so why are they blurring my gaming experience?

Your eyes don't see motion blur, because your eyes follow the action quite quickly AND your brain accounts for any missing or obscured details. Eyes jump, they don't pan. Try it: look from left to right and try to force yourself to "pan". It won't happen. Your eyes jump from one key focal point to the next. There is no motion blur at all. Motion blur is only a result of an 80-plus-year-old technology.

Motion blur is not natural and is a bit disconcerting while playing a modern video game. I get the feeling it's just another bit of challenging but useless "eye candy" that game makers use just to say their game (or engine) does something the other doesn't.

Don't ask me about "depth of field" effects, either; a similar argument can be made against that abomination too. Hint: cameras have a smaller depth of field than your eyes (assuming 20/20 vision), which means the blurring of out-of-focus details you actually see is quite minimal compared to a typical camera. In a movie, the director would not allow such a distraction from the action at hand.
 
Man, perfect timing - I was playing around with this over the weekend on the 3850. I'd say DX10 @ 1600x1200 is not playable for me; DX9, though, runs amazingly. I couldn't really see much difference in the smoke - I tried a few different areas with smoke and they both looked good - so for double the FPS, DX9 for sure. ^^

Dugg.
 
Yet another game that artificially cripples DX9 to make DX10 look better. Guaranteed, all the DX10 "enhancements" could be done in DX9 with very little performance hit. On the plus side, there's bound to be a way to enable the extra settings a la Crysis.
 
"There are too many game-breaking bugs at the moment (memory leaks that eventually crash the game, a network error that knocks people off randomly, not being able to see your group mates in zones, etc.) for me to enjoy the game."

Hmh, I really never experienced any of these issues more than maybe once or twice a day, and hardly ever since the last patch.

Too bad - the game is fun.

~CN
 
" you will see the difference, the gameplay experience is improved. "

Until you run out of memory and crash to the desktop. Memory leaks FTL.
 
I can say that with 4 GB and a 64-bit OS, Hellgate uses A LOT of memory, but it never crashed, and load times were fast.
 
For me, bugs aside, DX10 is still unplayable. The performance hit is too great. DX10 has been a complete bust, in my opinion.
 
I just want to say that quite a few of those "DX10 renderings" are available now in DX9. Distant-object blurring is available in DX9 Crysis; motion blur is available in Team Fortress 2. Not to mention I've seen some really impressive smoke in DX9 games, like Serious Sam 2.
 
Can't we enable all these things (motion blur, etc.) in DX9? We sure could with Crysis... all of them are possible in DX9 at a much better FPS. It would be nice to see more developers put a little more effort into DX9 coding until the DX10 libraries/Vista drivers get more optimized. Case in point: BioShock, where the game looked nicer and delivered better frame rates under DX10 (as I recall). A 45% frame rate loss is unacceptable in my book, even if the game is still playable.
 
"I just want to say that quite a few of those 'DX10 renderings' are available now in DX9."

Hell, Crysis has motion blur in DX9 mode as well. You just have to turn Shader Quality (I think) up to maximum to get it. And I agree with the statements above about hating motion blur - get rid of it; it's stupid and completely unrealistic.
 
More artificial DX9 crippling as they try to force us to buy Vista and play in the almighty DXslowmode.
 
The gaming experience is easy to sum up with this game: bad.
It matters little how good the graphics are, too.
 
Good read, but could I ask why the 8800 GTX clock speeds are always listed as something random in these reviews? In the WIC DX10 review, the LOTRO DX10 review, and now the Hellgate DX10 review, the GTX is listed at three different clock speeds. Wouldn't it make sense to use a standard GTX, or are you just listing it at the wrong speeds? It's a little confusing. :p
 
"Wouldn't it make sense to use a standard GTX, or are you just listing it at the wrong speeds?"

Heh, interesting - I just noticed that. That would be a typo, because we are indeed using a standard-clocked GTX.
 
"Hell, Crysis has motion blur in DX9 mode as well."

There's motion blur in Need For Speed: Most Wanted and newer installments of the game, as well.

Hell, motion blur was introduced and available back in DX8, and 3dfx chased it down big time back in the Voodoo 5 development days.
 
"Image Quality Summary

There are some pretty big differences between the DX10 versions of Hellgate: London and the DX9 version. In general, the DX10 code produces a more realistic image, at the expense of some pretty considerable chunks of raw performance. The important thing here is that these changes are persistent and highly visible from the second you enable the DX10-specific options, you will see the difference, the gameplay experience is improved." (article)

I'm not seeing any evidence of this in the screenshots - but then, they're only screenshots.

Actually, I was thinking how good the graphics looked and how well the game performed in DX9...
 
I think the DX10 version looks MUCH better than the DX9 version. It seems like a good upgrade, though not one I would have expected from a whole version number - maybe DX9.5 - and I don't think it should require new hardware, either.
 
Something is really starting to bug me about these game reviews.

Where is the CPU part of it?

We all know Supreme Commander runs like crap on any dual core, and some other games are happy on a 2 GHz single core. If [H] wants to stay on top of the consumer review food chain, I want to see some CPU usage statistics.

The test setup includes a nearly 3 GHz Intel dual-core - what about the E6300 guys? It isn't hard to run a couple of extra tests with any given video card to see what kind of impact a single core, a slower dual, or a quad has on a game. I'm sure everyone is interested.
 