UT3 DX9 Performance

People are going to see the 2900 beating the GTSs and then complain that the tests don't have any AA in them. But let's face it: outside of [H]-style users, not many people run AA. A lot of hardcore gamers just want the best FPS possible, and you need it in this game!
 
This game is unique in my experience so far in that the texture quality is good enough to sort of hide the aliasing. I stress "so far" because I haven't played the full version, but it looks great without antialiasing.

There is hope that the DX10 version may support AA, this demo is DX9.
 
As I said before, ATI cards are great in games that use the new Unreal engine.
 
WTF, if you look at the Min/Max/Avg figures, those points for the ATI card don't match up with the points plotted on the graph :S

Looks fiddled to me...
 
PCPER has it up. The ATI cards seem to be doing pretty well, beating out the 640MB GTS in almost every test, and even beating the GTX in one or two (one of them SLI vs. CrossFire).
The 2600XT even seems to be going toe to toe with the 8600GTS here :O

http://www.pcper.com/article.php?aid=464

One thing I noticed was that the reviewer didn't mention when the 8800GTS beat the 2900XT (he just said all cards did well), but whenever the 2900XT was up, even by a few fps, they mentioned it.

Also, without AA and capped at 60 FPS, the full story isn't being told.

And the 2900 in CF didn't beat the 8800gts/gtx in SLI. Did you look at the graph?

Shangri La - 2560x1600 - 0xAA/8xAF (not 16x?)

Code:
SLI/CF
      Min / Max / Avg
GTX:  35  / 62  / 53.5
GTS:  36  / 63  / 52.3
2900: 29  / 63  / 51

So you can see, the 8800GTS beats the 2900XT when two cards are used, and the 2900 does not beat the GTX. Also, the minimum is the real killer, and it's much higher on the NV-based cards.

You have to look at the avg and min, as the max is capped @ 60ish.
 
What's the big surprise? In realistic terms, the 640 GTS and 2900 XT are matched, and so are the 8600 GTS and 2600 XT (no folks, 5fps doesn't really make a difference). Nothing to see here.

Bah, the screenshots are disappointing. There's something about their lighting model that I really don't like, and the textures they chose kinda feel...washed out.
 
One thing I noticed was that the reviewer didn't mention when the 8800GTS beat the 2900XT (he just said all cards did well), but whenever the 2900XT was up, even by a few fps, they mentioned it.

Also, without AA and capped at 60 FPS, the full story isn't being told.

And the 2900 in CF didn't beat the 8800gts/gtx in SLI. Did you look at the graph?

Shangri La - 2560x1600 - 0xAA/8xAF (not 16x?)

Code:
          Min / Max / Avg
GTX:      36  / 59  / 46.6
GTX SLI:  35  / 62  / 53.5
GTS:      26  / 43  / 34.6
GTS SLI:  36  / 63  / 52.3
2900:     25  / 53  / 39.3
2900 CF:  29  / 63  / 51.1

So you can see, the 8800GTS beats the 2900XT when two cards are used, and the 2900 does not beat the GTX.

You have to look at the avg and min, as the max is capped @ 60ish.

Defensive, aren't you? I can hardly consider 1.2 FPS a decisive win; I'm sure if the test were run again, the frame rates would be slightly different. Also, the other maps were not tested in CF/SLI like the single-card benchmarks were, so you are really grasping at straws until all the maps are benchmarked. As far as the 60 FPS cap is concerned, it's pretty moot: as long as the game is running at or near 60 FPS, you are splitting hairs (unless there are consistent frame rate drops into the mid 20s or something like that).
:D

Also, let's discuss your observation:
the reviewer didn't mention when the 8800gts beat the 2900xt (said all cards did well), but whenever the 2900xt was up (even a few fps) they mentioned it

Did you read his comments regarding the 51% scaling of the GTS vs. the 30% scaling of the HD 2900? Or his comments regarding the GTS 320 when it beat the 8600 and 2600? What about his conclusion that the GTX was the best card in this review? I am sure if you go through it again, you will find it.
 
Defensive, aren't you? I can hardly consider 1.2 FPS a decisive win; I'm sure if the test were run again, the frame rates would be slightly different. Also, the other maps were not tested in CF/SLI like the single-card benchmarks were, so you are really grasping at straws until all the maps are benchmarked. As far as the 60 FPS cap is concerned, it's pretty moot: as long as the game is running at or near 60 FPS, you are splitting hairs (unless there are consistent frame rate drops into the mid 20s or something like that).
:D

No, just proving the OP was wrong.

PCPER has it up. The ATI cards seem to be doing pretty well, beating out the 640MB GTS in almost every test, and even beating the GTX in one or two (one of them SLI vs. CrossFire).

That's completely false if you look at the numbers.

Also, the max DOES matter, because the average falls between the min and the max, and if the max is capped at 60, that affects your average.
 
Also, the max DOES matter, because the average falls between the min and the max, and if the max is capped at 60, that affects your average.

It doesn't matter for this review. You're just complaining and arguing.
Does increasing the frame rate cap from 60 to, say, 100 increase the min frame rates?
 
Also, for my example of the reviewer not saying anything when the 8800GTS is faster:

It's slower by 4ish fps in one test, and is "a bit slower than the 2900xt".
It's faster by 10ish fps in another test, and nothing is mentioned, just that all cards performed well.
It's slower by 5ish fps and "And once again, AMD's HD 2900 XT takes the win over NVIDIA's 640MB GTS card."

When each card is performing very well (similar to the 8800GTS's 10 fps lead above):
"All three of these cards have no problems producing great results at 1600x1200 though the 8800 GTS 640MB card is at the back of pack".

It's clear bias.
 
Also, for my example of the reviewer not saying anything when the 8800GTS is faster:

It's slower by 4ish fps in one test, and is "a bit slower than the 2900xt".
It's faster by 10ish fps in another test, and nothing is mentioned, just that all cards performed well.
It's slower by 5ish fps and "And once again, AMD's HD 2900 XT takes the win over NVIDIA's 640MB GTS card."

When each card is performing very well (similar to the 8800GTS's 10 fps lead above):
"All three of these cards have no problems producing great results at 1600x1200 though the 8800 GTS 640MB card is at the back of pack".

It's clear bias.
You are just complaining and arguing... I've already proven that he paid the G80 several compliments in my previous post.
 
One thing I noticed was that the reviewer didn't mention when the 8800GTS beat the 2900XT (he just said all cards did well), but whenever the 2900XT was up, even by a few fps, they mentioned it.

Also, without AA and capped at 60 FPS, the full story isn't being told.

And the 2900 in CF didn't beat the 8800gts/gtx in SLI. Did you look at the graph?

Shangri La - 2560x1600 - 0xAA/8xAF (not 16x?)

Code:
SLI/CF
      Min / Max / Avg
GTX:  35  / 62  / 53.5
GTS:  36  / 63  / 52.3
2900: 29  / 63  / 51

So you can see, the 8800GTS beats the 2900XT when two cards are used, and the 2900 does not beat the GTX. Also, the minimum is the real killer, and it's much higher on the NV-based cards.

You have to look at the avg and min, as the max is capped @ 60ish.

http://www.pcper.com/article.php?aid=464&type=expert&pid=10

The 2900XT does beat the GTX there 0.0 I don't look at min/max; I look at the average.

I said 1 or 2 because I wasn't sure :p I kinda read the CF/SLI part in a hurry since I don't really care about it!

By the way, the average is most likely not calculated as (min + max) / 2; it's calculated as total frames rendered / total time, I believe :p So 6000 frames in 1 minute = 100fps average, regardless of the min and max.

Also, what's the point in looking at min fps? It's bloody useless considering all of the dips cards take; it's a bad indicator of performance. The average works best, IMO.
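The frames-over-time point can be sketched with made-up numbers (a hypothetical frame-time trace, not data from the review) to show that the true average need not sit anywhere near (min + max) / 2:

```python
# Hypothetical per-frame render times: 5900 fast frames plus 100 slow ones.
frame_times = [1 / 100] * 5900 + [1 / 20] * 100  # seconds per frame

total_time = sum(frame_times)            # ~64 seconds in total
avg_fps = len(frame_times) / total_time  # total frames / total time
min_fps = 1 / max(frame_times)           # slowest frame -> lowest fps
max_fps = 1 / min(frame_times)           # fastest frame -> highest fps

print(min_fps, max_fps, avg_fps)  # avg is ~93.75, nowhere near (20 + 100) / 2
```

Because most of the frames are fast ones, the average lands near the top of the range rather than at the midpoint.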
 
You are just complaining, I've already proven that he made several compliments to the G80 in my previous post.

The reviewer never mentioned when the 8800gts beat the 2900xt, only the other way around. I even pulled quotes.

Please answer my previous question. Does increasing the frame rate cap from 60 to 100 increase min. frame rates for UT3?

I never said it did. I said it affects the average. And since you can't answer my question, I'll answer it myself:

Min: 30
Max: 60
Avg: 45

Min: 30
Max: 90
Avg: 60

Having the max capped makes a huge difference in your results.
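The cap argument can be illustrated with a made-up fps trace (hypothetical numbers, not from the review): clamping every sample to 60 leaves the minimum untouched but drags the average down.

```python
# Hypothetical per-second fps samples from an uncapped run.
uncapped = [30, 45, 70, 90, 85, 40]
capped = [min(fps, 60) for fps in uncapped]  # the same run under a 60 fps cap

avg_uncapped = sum(uncapped) / len(uncapped)  # 60.0
avg_capped = sum(capped) / len(capped)        # ~49.2

print(min(uncapped), min(capped))  # both 30: the cap never touches the min
print(avg_uncapped, avg_capped)    # the cap lowers only the average
```

So two cards with identical minimums can show different averages purely because one spent more time pinned at the cap.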

http://www.pcper.com/article.php?aid=464&type=expert&pid=10

The 2900XT does beat the GTX there 0.0 I don't look at min/max; I look at the average.

I said 1 or 2 because I wasn't sure :p I kinda read the CF/SLI part in a hurry since I don't really care about it!

:rolleyes:

57.8 vs. 59.6 is a win? I swore to Eastcoast it wasn't...
Notice the 2900's min is 7 fps lower? That will make much more of a difference, especially as the fps is capped at 60, skewing these results.
 
The reviewer never mentioned when the 8800gts beat the 2900xt, only the other way around. I even pulled quotes.



I never said it did. I said it affects the average. And since you can't answer my question, I'll answer it myself:

Min: 30
Max: 60
Avg: 45

Min: 30
Max: 90
Avg: 60

Having the max capped makes a huge difference in your results.



:rolleyes:

57.8 vs. 59.6 is a win? I swore to Eastcoast it wasn't...
Notice the 2900's min is 7 fps lower? That will make much more of a difference, especially as the fps is capped at 60, skewing these results.

The point was that the 2900XT has some decent performance in UT3; don't get your panties in a knot =p And look at my previous reply: minimum fps is as useless as max, since none of the cards spend most of their time at min or max.
 
The point was that the 2900XT has some decent performance in UT3; don't get your panties in a knot =p And look at my previous reply: minimum fps is as useless as max, since none of the cards spend most of their time at min or max.

But the min is where the stuttering occurs, so it matters a lot more than the max, where everything is smooth (or tearing :p)
 
I wonder how the 7900GTX/X1950 cards run the game. Seems pretty damn well optimized in its beta stages though, gj Epic Games.
 
The point was that the 2900XT has some decent performance in UT3; don't get your panties in a knot =p And look at my previous reply: minimum fps is as useless as max, since none of the cards spend most of their time at min or max.

Min is not useless: when the frames start to drop at a specific point, it gets annoying, because it's usually a point where there's a lot of stuff going on that you need to react to.
 
But the min is where the stuttering occurs, so it matters a lot more than the max, where everything is smooth (or tearing :p)

The only time a min makes a game stutter is when it drops drastically, e.g. you're playing at 40fps and all of a sudden you hit 0fps, or 10fps, for a split second. These are not minimums like that; usually that's a result of the game, not the cards (look at FEAR when it came out). The minimums in this game look completely different: they look like part of a regular map when you turn around, or when a few characters make their way to your screen. Having the game dip once from 55 to 35 won't cause a stutter :p
 
... i look at average


...also whats the point in looking at min fps?...


min fps matters much more when dealing with higher-end cards at high resolutions. you might not notice it when games dip to 30fps, but when they drop below that threshold it's nogoodnik and CAN be the difference between a lost frag and a killing spree.

i'd say min fps matters much, much more than avg or max, personally.


edit: i just played a few rounds of the beta demo and i've gotta say that it's very underwhelming. i think Enemy Territory: Quake Wars looks a whole bunch better than UT3 at this point. i've got the settings cranked as much as possible in-game.

edit 2: it is hella fun tho :D
 
min fps matters much more when dealing with higher end cards at high resolutions. you might not notice dips in fps when games dip to 30fps but when they drop below that threshold it's nogoodnik.

i'd say min fps matters much much more than avg or max, personally.
yea, except in this case if you look at the charts/graphs, the min fps stays above 30 except at 2560x1600, so again it's not a drastic drop to shitty fps, and I don't think you'd notice it.
 
Min is not useless: when the frames start to drop at a specific point, it gets annoying, because it's usually a point where there's a lot of stuff going on that you need to react to.

Exactly, mins happen when the big stuff is happening, aka when you need to be on top of your game, not stuck lagging behind.

Also, I am just sick of the double standards from some of the fanboys like EastCoastHandle. I show that the 2900 is slower than the 8800GTS/8800GTX, not faster, and it's "1-2 fps doesn't matter", yet the only test where the 2900 is faster than the 8800GTX is by 1-2 fps. :rolleyes:

Also, please feel free to provide a quote where the reviewer points out the 8800GTS being faster than the 2900, because I couldn't find any, even when it was a 10ish fps difference vs. some of the 2-7 fps losses that were pointed out.
 

...having the game dip once from 55-35 won't cause a stutter :p



i'm gonna have to go ahead and disagree with you on this point too :eek:


edit: you're right tho, i don't notice the min ratings at all playing at 1680x1050. at a higher resolution i think i would, examining that chart and the good two-second stretches when the action dips to the low 20s.
 
i'm gonna have to go ahead and disagree with you on this point too :eek:
edit: you're right tho, i don't notice the min ratings at all playing at 1680x1050. at a higher resolution i think i would, examining that chart and the good 2 second periods when the action dips to the low 20's.

well, everyone has an opinion, and that's mine. I turn my fps counter off now cuz I don't care as long as the game runs smooth :) and for me 35fps is still good, especially when it only hits once in a while

and like I said, only at 2560x1600 does it dip below 30fps :p
 
well, everyone has an opinion, and that's mine. I turn my fps counter off now cuz I don't care as long as the game runs smooth :) and for me 35fps is still good, especially when it only hits once in a while



there's no doubt, we're both using cards that can handle this engine exceptionally well.

edit: boy, i'm sure doin a lot of this tonight :p my barton rig pretty much chokes on this game @ 800x600. :(
 
I just played the demo.

Smooth with maxed settings at 1680x1050, no AA/AF. x1950 XTX.

And it looks HAWT!
 
i really can't understand how so many are claiming it's a graphical masterpiece. i can't discount how much fun it is, but the graphics are meh, in my opinion.
 
Kind of a pointless bench atm, with no real filtering turned on. I mean, who buys top-end cards and doesn't turn all the goodies on at high resolutions?

With all the hype and crazy power suggestions, I was really surprised to play at 1920x1200 with forced 4xAA/16xAF and not have any graphics lag at all. Now I wish I could find a server with <100ms ping on the west coast that ain't full, hehe.
 
i really can't understand how so many are claiming it's a graphical masterpiece. i can't discount how much fun it is, but the graphics are meh, in my opinion.

Yeah, the graphics don't seem that killer; they are pretty dark, maybe to hide something, IDK.
 
Kind of a pointless bench atm, with no real filtering turned on. I mean, who buys top-end cards and doesn't turn all the goodies on at high resolutions?

With all the hype and crazy power suggestions, I was really surprised to play at 1920x1200 with forced 4xAA/16xAF and not have any graphics lag at all. Now I wish I could find a server with <100ms ping on the west coast that ain't full, hehe.

Really, you played with AA? AFAIK AA is not currently working under DX9 with UT3 =p
Even if you've forced it in the CP, it won't affect the game, methinks!

And I haven't played it yet! Just got home from work :)
I usually wait till I have a card that can max a game out before playing it. I'm still waiting to get QC before playing Sup-Commander and a decent DX10 card for WiC. Almost got an 8800GTX last night :p
 
Really, you played with AA? AFAIK AA is not currently working under DX9 with UT3 =p
Even if you've forced it in the CP, it won't affect the game, methinks!

And I haven't played it yet! Just got home from work :)
I usually wait till I have a card that can max a game out before playing it. I'm still waiting to get QC before playing Sup-Commander and a decent DX10 card for WiC. Almost got an 8800GTX last night :p

go for WIC! it's tons of fun even at resolutions lower than your native.

...but i digress. Whooooooo UT3! :p
 
OH MY GOD SOMEDAY WE WILL ALL DIE AND REMEMBER WE SPENT HALF OUR LIVES ARGUING ON THE INTERNET ABOUT WHICH NUMBER IS HIGHER.
 
Really, you played with AA? AFAIK AA is not currently working under DX9 with UT3 =p
Even if you've forced it in the CP, it won't affect the game, methinks!

And I haven't played it yet! Just got home from work :)
I usually wait till I have a card that can max a game out before playing it. I'm still waiting to get QC before playing Sup-Commander and a decent DX10 card for WiC. Almost got an 8800GTX last night :p

This will probably work, as BioShock runs on the Unreal engine as well:

http://www.hardforum.com/showthread.php?t=1216510
 
OH MY GOD SOMEDAY WE WILL ALL DIE AND REMEMBER WE SPENT HALF OUR LIVES ARGUING ON THE INTERNET ABOUT WHICH NUMBER IS HIGHER.

Actually, it's easy to tell which number is higher; it's the commentary around those numbers that is biased ;)
 
I'm getting ~30fps with all options maxed on an X1800XT. The demo must be pretty stripped down graphically. Impressive nonetheless.
 