Lord of the Rings Online & DX10 @ [H]

Crysis was supposed to be THE DX10 game according to the hype... and it turns out you can enable almost all the DX10 graphics in DX9 under XP with some config tweaks.


No, you can't. Why do people keep saying this? Most of those Very High settings have nothing to do with DX10. Crytek simply lumped everything under "Very High" rather than separating the DX9 Very High settings from the DX10-only ones.


Aye, I don't know why people are so gullible. They even try to cough up comparative screenshots, which are always hand-picked. I suppose it doesn't occur to them that DX10 isn't just about visual quality; in large part, it's actually not. It's a different API that allows graphics to be rendered differently, and yes, it does it better.

The problems are mainly that A) developers aren't as comfortable with it yet, and B) the things DX10 does aren't always going to be "OMGWTFBBQSAUCEAWESOME!" noticeable.


Admittedly, Vista, like any other OS, can have issues, and likewise DX10. But the average consumer's computer is a bloody nightmare: poorly managed, pseudo-maintained, and the end-users themselves are often technically behind the curve, despite what they spout off in forums.

I have zero problems with Vista, DX10 and performance. But then again, I know what I'm doing, and what I may have to do to compensate for anything the developers couldn't.

Unfortunately, in the case of the DX10 code for LOTRO, there's nothing I can do.

The developers made it very clear that it was essentially in BETA, that it would take some time to improve, and that they would continue to work at it.

Rather than just accept this as GOOD news, or at least SOME news, unappreciative, computing-backwards trolls like to jump on the anti-MS bandwagon. Yet they love to cling to their precious XP, and at the end of the day, the only reason they can come up with is 'well, it works for me!'

Yes, it works for you, because YOU are clueless...

I'm sure such feedback really gives developers an insight to your computing woes.

;thumbs up;
 
Well, I don't see how pointing out that running the same game with the same settings under XP and Vista results in worse performance on Vista is anti-MS/fanboy crap. Those are just facts.

When Vista and DX10 are as fast as or faster than XP/DX9, I'll gladly switch over and not complain about it. Also, I don't really know of any proof showing that Crysis or any other game is fully DX10... all I've seen so far are DX9 games with extra effects thrown in. I doubt that either Crysis or LOTRO is an actual DX10 game.
 
Crysis was supposed to be THE DX10 game according to the hype... and it turns out you can enable almost all the DX10 graphics in DX9 under XP with some config tweaks.

Aye, I don't know why people are so gullible. They even try to cough up comparative screenshots, which are always hand-picked. I suppose it doesn't occur to them that DX10 isn't just about visual quality; in large part, it's actually not. It's a different API that allows graphics to be rendered differently, and yes, it does it better.

...

I have zero problems with Vista, DX10 and performance. But then again, I know what I'm doing, and what I may have to do to compensate for anything the developers couldn't.

...

Rather than just accept this as GOOD news, or at least SOME news, unappreciative, computing-backwards trolls like to jump on the anti-MS bandwagon. Yet they love to cling to their precious XP, and at the end of the day, the only reason they can come up with is 'well, it works for me!'

Yes, it works for you, because YOU are clueless...
a) The tweaks are not JUST the Very High settings tweak; there are others as well, all of which bring the majority of the Crysis DX10 graphical experience to XP and DX9. And I'm talking about just one game. I'm convinced there will be DX10 games in the future where this won't be possible, and that's probably when I'll start dual-booting Vista, but that future isn't here yet. So far, none of the "DX10" games have added anything that significant compared to DX9. Bioshock added some extra particle processing and some minor water stuff. Wow! There are other similar examples.
b) Why would I want to cough up screenshots? I don't have Vista; I'll let others do that.
c) You are imagining things: Vista in general has lower framerates in games than XP, ceteris paribus. Feel free to argue with the [H] editors about this point; I recall at least one article (maybe two?) about the gaming performance differences. Better drivers will likely improve this, but I doubt they will entirely cure it.
d) Yeah, that's me, computing-backwards with a 37" monitor and an 8800 video card.
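For context, the XP tweak referred to in (a) was usually an autoexec.cfg dropped into the Crysis folder that forces the system-spec CVar groups to their "Very High" value under DX9. This is a sketch from memory; the exact CVar names and the 1-4 scale are as I recall them from community tweak guides of the time and may not match every patch:

```
; autoexec.cfg -- force the "Very High" spec groups under DX9/XP (illustrative)
; In Crysis' sys_spec scale, 1 = Low ... 4 = Very High
con_restricted = 0
sys_spec_ObjectDetail = 4
sys_spec_Particles = 4
sys_spec_PostProcessing = 4
sys_spec_Shading = 4
sys_spec_Shadows = 4
sys_spec_Texture = 4
sys_spec_VolumetricEffects = 4
sys_spec_Water = 4
```

Treat it as illustrative only; a few Very High effects still required the DX10 path regardless of these settings.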

Let me mention some of the reasons why I have not upgraded to Vista.
MS removed many of the planned features and/or Vista-exclusive items. For instance, they dropped the new file system (WinFS). That alone probably would have gotten me onto Vista already; the HDD is one of the larger bottlenecks in a computer system. Things like IE7 were changed to work on XP (which is fine; it's just that I got the impression it was going to be a Vista-only feature at first).
MS added in nutty things like UAC. That thing is just plain stupid. Any knowledgeable person is going to disable it because it's too annoying, and any clueless person is going to panic the first few times a prompt comes up and forever after just ignore them by always clicking YES, which defeats the whole purpose. And how are they expected to know if it's something to worry about anyway? I mean, come on: is behavior like deleting an icon from the desktop really something to be alerted about?
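For reference, the "any knowledgeable person is going to disable it" step was a single Vista registry value. A sketch as a .reg fragment (the EnableLUA value name is from memory; disabling UAC genuinely reduces security, and a reboot is required for the change to take effect):

```
Windows Registry Editor Version 5.00

; Set EnableLUA to 0 to turn off UAC prompts system-wide on Vista.
; Set it back to 1 to re-enable; reboot required either way.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System]
"EnableLUA"=dword:00000000
```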
MS removed features from things like the Vista equivalent of the Scanner and Camera Wizard. You can no longer selectively pick the items on your camera card to upload (like you could in XP), so it uploads everything to the computer. Many people I know who use cameras and Windows leave pictures from multiple vacations on a card (probably for backup reasons), and they will probably end up with a number of extra copies of those pictures. I know that when/if my grandparents upgrade to Vista, I'm going to get calls about this "feature".
This may be a little extreme, but Windows' default power-button behavior is now standby rather than shutdown (or even hibernate). This will increase national power usage (which increases pollution and foreign-oil dependence) as people and businesses upgrade to Vista. Plus, is it really a good idea to get rid of reboots? Reboots tend to fix many, many problems in Windows, especially if you don't know how to fix something yourself. For example, I have a USB optical drive that very occasionally (like once every couple of weeks, usually when I'm playing games) Windows will think is disconnected when it's not. It's easy to fix, though: I just unplug the drive and plug it back in. But I know the problem and why it's happening. If I didn't know, and instead followed the standard advice of a computer guy and rebooted, the problem would be fixed that way too.
Note that I have NOT touched on gaming at all yet.
But I'll mention it a bit. Average FPS tends to be lower in Vista, probably mostly because of immature drivers, but probably also because of Vista's additional overhead. Plus, driver writers now have to program in mandatory things like checking tilt bits in graphics drivers. Then you have the whole thing with MS making titles like Halo 2, a years-old game built on Xbox 1 tech, Vista-exclusive, supposedly to enable a streaming install (Steam has had streaming installs for years; not that I'm a fan of Steam, as there are a number of things I don't like about it either).

So yeah, I don't like Vista. And I'm not alone in that: last time I checked (last week, I think), only 2.5% of surveyed gamers had Vista with a DX10-capable card.

Oh, and thanks for calling me clueless. I guess I must be, because you called me that. Notice I didn't do any name-calling of you above.

Side note: I went back and re-quoted my quote rather than using the part of your message where you "quoted" me (and in my quote I removed the previous guy's "quote" that you bolded, since it wasn't particularly relevant to this post). Taking my quote and merely italicizing it makes things confusing when you are quoted, because quoting automatically italicizes text. I know it can get difficult with the forum not allowing nested quotes, but it tends to work out better if you go to the original post and quote that as well when a quote inside a quote is relevant to the discussion. Also, without proper attribution it becomes more difficult to distinguish who said what. The first line of text in your post was something I said, the second was something SlimyTadpole said, and the rest of your post was you, yet nowhere does your post mention who said what.


Yes it is. At 1:1 pixel mapping, 1600x1200 is the native 4:3 res for a 1920x1200 widescreen,
just like 1400x1050 is the native 4:3 for a 1680x1050 monitor when run with 1:1 pixel mapping.
But then it's not my fault that you got a monitor with no 1:1 pixel mapping support.

Ah, but I didn't say I was talking about MY monitor. My monitor is a 37" 1920x1080 (16:9), which you would have noticed if you had looked at my sig :) I was talking about a hypothetical situation, and I mentioned 1920x1200 (16:10) because that's the resolution the monitor [H] tested on can hit.

Theoretically, my monitor could only do a max of 1440x1080 if I wanted to maintain a 4:3 AR. And my monitor is configurable as to whether you want to stretch, zoom, or leave the image alone. In many cases this is somewhat configurable in the NVIDIA drivers too.

Anyway, unless I run at a widescreen 16:9 resolution, my maximum resolution is limited if I want to maintain a proper AR.
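The 4:3-inside-widescreen arithmetic in these posts can be sketched as a small helper (the function name is made up for illustration; it just encodes the math the posters are doing):

```python
def max_4_3_resolution(native_w, native_h):
    """Largest 4:3 region that fits a panel at 1:1 pixel mapping.

    With 1:1 mapping no scaling occurs, so the biggest undistorted
    4:3 image is bounded by whichever native dimension runs out first.
    """
    # Try using the full panel height first.
    w = native_h * 4 // 3
    if w <= native_w:
        return (w, native_h)
    # Otherwise the panel width is the limit.
    return (native_w, native_w * 3 // 4)

# The cases from the thread:
print(max_4_3_resolution(1920, 1200))  # -> (1600, 1200), 16:10 panel
print(max_4_3_resolution(1680, 1050))  # -> (1400, 1050)
print(max_4_3_resolution(1920, 1080))  # -> (1440, 1080), the 37" 16:9 TV
```

On all common widescreen panels the height is the limiting dimension, which is why each answer keeps the full native height.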




Aren't we getting a little off topic in the LotR thread? I mean maybe we should discuss crysis stuff in a crysis thread?
 
When Vista and DX10 are as fast as or faster than XP/DX9, I'll gladly switch over and not complain about it. Also, I don't really know of any proof showing that Crysis or any other game is fully DX10... all I've seen so far are DX9 games with extra effects thrown in. I doubt that either Crysis or LOTRO is an actual DX10 game.

Depends on how you look at it.
DX9 and DX10 are mutually exclusive APIs that aren't source-level compatible.
So technically, anything that uses any kind of DX10 feature must be completely written for DX10.

On the other hand, a lot of engines are just DX9 engines that have been ported over to DX10, rather than taking DX10 as the basis of the engine design and starting from scratch. With all of the hype surrounding Crysis, I would have expected it to be a DX10 engine written from scratch, but I'm not so sure now.
We'll have to look out for 3DMark Next. That will surely be a complete DX10 engine built from scratch; we'll have to see how it performs. Sadly, there will probably not be a direct comparison to DX9.
 
Ah, if only this review had waited two days.

Yesterday (11/14) a patch to the DX10 functionality was rolled out, intended to improve performance in the very areas this review cited as most FPS-expensive. A follow-up article would be warranted.

(Subnote: the ContentLink ad generator is flawed... someone's post above includes the word 'Raptor' (referring to their HDD), and the resulting ad references a cat hospital. Raptors are birds...)
 
LOTRO DX10 after the 15/11 patch

I use an E6600 @ 3GHz, an 8800GT, 3GB RAM, Vista 32-bit, and an Abit IP35-E motherboard.
Watercooling.

I played the DX10 beta; it was maybe playable. I switched to DX9 and had maxed-out settings at 60-100fps.
However,
the 15/11 patch made DX10 playable, and it now runs around 30fps.
I still get the following, though:
a quota error (some virtual RAM issue), although less frequently now, and
some FPS drops when a lot is going on in cities, but it's much better since the patch.

DX10 now works well on my machine, and it isn't just playable, it's enjoyable.
I find the shadows in DX10 add the atmosphere I like in an MMO.
Wandering around in the forests now, they actually seem like forests.

I am sure the developers will continue to tweak and improve DX10 in LOTRO, but even as it is now, it is simply an enjoyable experience watching the shadows dance in the sun.
 
I tried DX10 two days ago and saw the big performance hit. I saw the update last night, so I may go back and try it again. Flopper, I have nearly the same machine as you; my C2D is not overclocked, but I have 8 gigs of RAM, Vista 64, and an 8800GT SSC. Do you have a screenshot of the options you have selected besides DX10, so I can try your settings tonight?
 
Hmm, no screenshots.

I blend them a little: out in the open I go a little higher since the hit is less, and in cities I might lower them.
If there is an instance and I get sluggish behaviour, I just turn off post-processing.
That is a huge hit. (180fps while swimming is fun to see without post-processing.)
I have, however, only had one such event so far: one badly tuned, hard quest in Evendim.
It wasn't balanced properly, even though we had a good group where people knew how to play their classes.
Not so common in LOTRO ;)
I wish they had different settings profiles you could save, so you could just choose the one you want:
city, forest, raid, etc.
You could then tweak each as you wanted.

I like the game though; the shadows truly add a lot.
I use Ultra on models, often High to Very High on shadows; DX10 shadows I set to max in forests.
I wish they could fix the memory leak though.
I still crash and get sluggishness due to some residue not being cleaned up properly.
 
Thanks for the options you used. I haven't had a chance to try them out yet, but I will this week. Good to know the tips :)
 
Heaven forbid Microsoft come out with a new version of DirectX. I think people just like to trash Microsoft for no reason. We might as well be running DirectX 1.0, if that's the case.
 
Heaven forbid Microsoft come out with a new version of DirectX. I think people just like to trash Microsoft for no reason. We might as well be running DirectX 1.0, if that's the case.

Why would I downgrade my performance with DX10?
So that one or two games may look better... but the rest run slower for no apparent reason?
 
Why would I downgrade my performance with DX10?
So that one or two games may look better... but the rest run slower for no apparent reason?

Professional FPS gamers always turn down all settings for maximum FPS and to have as little distraction as possible on the screen.
That is why CS looks like shit.
That is why Quake looks like shit.
I've seen pro games.
Ugly but deadly.

I play MMOGs a lot, and I want as much visual aid as I can get.
LOTRO benefits from DX10.

Running down the Rivendell bridges, seeing Frodo looking better, Gandalf and Aragorn and that weak Gondorian Boromir, with the shadows lurking around.
It's a treat.
The game for me is so much better.
I draw around 60fps atm,
using a 3800MHz dual-core and an 8800GT in DX10,
with most settings on High or almost maxed out.
I have 2GB RAM atm, and I'm gonna get 2GB more for overall Vista performance.

We are in a transition period where game makers are going to move to DX10 and will build better games as they learn the tech. Most games, if not all, are DX9-based with some DX10 slapped on.
We see today with Crysis a hint of what DX10 will bring us in the future.

It needs better video cards than what we have today.
I don't want today's visuals or tomorrow's; I want the new tech and the visuals that come with it.
The R700 can't come along soon enough.
 
All the shaders have to be rewritten for DX10. Technically you can just recompile your DX9 ones, but the difference between that generated code and rewriting them by hand is the difference between a tweaked computer and using the defaults set in the BIOS. The shaders are what tell the engine how light is supposed to bounce off an object, so they impact any scene you are walking through. Those shadows are far more impressive considering they are real-time; in most graphics packages that would be the difference between ray-traced and depth-map shadows. Soft shadows are very expensive to render.

That said, it should be interesting to see if MS can ever justify making every call to the hardware go through four people when under DX9 it only needed to go through one (one person = the simple version; for the long version, MSDN has a great section on DX10). Several devs saw this coming when they found out that hardware calls have more hoops to jump through and that less optimization can be done in hardware, as software is simply not as fast as hard-coded instructions.

Most of the changes are there to cut down on MS having to write a new version of DX every time NVIDIA or ATI came up with some really cool feature they wanted in the API. Unified hardware requirements mean less coding for MS, and in turn it is supposed to make devs' lives easier by not having to code multiple code paths. Yet what are DX9 and DX10? Two APIs which require even more coding and testing.
 
There was another update to LotRO yesterday (December 10) that promised further improvements in DX10 performance.

So far, the feedback has been overwhelmingly positive. The improvement is apparently so noticeable that I'm having a hard time believing the numbers in some of these posts.
 
I can run the game at maximum settings @ 1680x1050 with the PC in my sig. I don't recall what my AA setting was; at least 4x.
 