oblivion benchmarks - something is not right?

CaiNaM

even tho this review showed the GTX and XTX providing comparable image quality and performance, a few people made a big deal about how the x1900xt should in fact be considered a lot faster than the 7900GTX since it was doing "more work" - grass shadows were enabled, as well as HQAF. didn't seem right to me that these settings were enabled on the XTX yet made no noticeable improvement to image quality.

i reasoned that if there's no increase in IQ, there shouldn't be more work being done.. so i did some testing, and the results are in this post over at rage3d.

could someone with an x1800/x1900 test this for me? enabling "grass shadows" or HQAF has no impact on IQ or performance in my testing. are these features broken in oblivion, or am i doing something wrong?
 
I think people are misunderstanding grass shadows. With grass shadows enabled, character shadows will "spill over" onto the grass. With grass shadows disabled, shadows are not cast onto the grass sprites.

Unless I'm misunderstanding? I haven't seen any screenshots where the grass itself actually casts shadows - the performance hit would be outrageous.

I have an XTX and will take a look at this later. I haven't bothered with HQAF - I don't see any appreciable difference in image quality versus angle dependent filtering.

EDIT: Changed "imagine quality" to "image quality", though I do admit I quite like the idea of referring to it as "imagine quality".
 
Shadows on Grass does have an impact: it allows shadows cast in the scene to fall on the grass blades. Zoom out to third-person view and you'll see your character casting a shadow on the grass blades with this feature enabled in the game. If you disable it, there are no shadows on the grass blades, but you still get shadows on the ground. Note there is a bug on current ATI cards with Cat 6.3 that causes the shadows on grass to flash; disabling the feature stops the flashing.
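
For what it's worth, I believe the same toggle lives in Oblivion.ini under [Display] if you'd rather flip it there instead of in the video options menu - I'm going from memory, so double-check the exact entry name in your own .ini:

[Display]
bShadowsOnGrass=1

Setting it to 0 should disable shadows falling on grass and 1 should enable them, same as the in-game "Grass Shadows" option.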

HQ AF also helps image quality in this game; remember it is angle-independent, so look for improvements on angled surfaces in the terrain.
 
And keep in mind, you can't just jump out of the game, flip on HQ AF, fire it up again and have it work (at least, with 'Oblivion' and AF - I haven't really messed with the setting enough in other games to know if this is a widespread issue). The ATI display driver has to reload after changing settings for them to take effect. Tinkering with a few of the other settings back and forth - v-sync or mipmaps, for example - will do it. Or, if using ATI Tray Tools, you can always re-init the display driver from in there, too. Either way, you'll see the screen flash to black for a moment before coming back.

Or you could just reboot outright, of course.
 
I don't trust the Bit-Tech review...it's saying that the X1900XTX/7900GTX are limited to 1280x1024, even though I'm playing Oblivion completely fine on my X1800XT at 1680x1050, HDR, no AA, max in-game settings.

30+fps outdoors
60+fps indoors


 
title of the rage3d thread:

Oblivion Benchmarks: X1900 XTX stomps 7900 GTX

lol what a website

But i did notice a performance difference with my 6800, and it certainly isn't high-end
 
chinesepiratefood said:
title of the rage3d thread:

Oblivion Benchmarks: X1900 XTX stomps 7900 GTX

lol what a website

But i did notice a performance difference with my 6800, and it certainly isn't high-end

Lol yeah...I hung out there when I had an 8500/9800 Pro. Not so much recently hahah.
 
Bona Fide said:
even though I'm playing Oblivion completely fine on my X1800XT at 1680x1050, HDR, no AA, max in-game settings.

30+fps outdoors
60+fps indoors


You NEVER drop below 30 outside at those settings?
Impressive.
My 1800XL, running at close to XT speeds, drops below 30 and even 20 quite frequently
at only 1280x1024 with HDR, not even maxed (no grass/self shadows)
 
if they don't see a difference in image quality, why did they:

- enable grass shadows on the x1900xtx and disable them on the 7900gtx oc?

- use hqaf on the x1900xtx when the stock balanced setting is already better than nvidia's HQ setting?

- use 16x hqaf, knowing it's a performance hog, when no hqaf or 4x hqaf would still be better than the nvidia hq settings?

- measure the performance of an overclocked 7900gtx, and not a stock gtx, against a stock x1900xtx?

- not test the cards at the same detail level?

- not say what .ini tweaks were done, and for which card?

- advertise QUAD SLI at the end of the page?

i know why: because the 7900gtx would get owned like you wouldn't believe,


nice nvidia commercial tho
 
Stereophile said:
You NEVER drop below 30 outside at those settings?
Impressive.
My 1800XL, running at close to XT speeds, drops below 30 and even 20 quite frequently
at only 1280x1024 with HDR, not even maxed (no grass/self shadows)

Well I have no grass/self shadows and no HQAF, but apart from that, I'm pretty sure the settings are at max. Definitely no AA either...I never saw the need at such a high resolution. HDR is on though, as you can see in the screencap. Yeah, I enabled the setting in the INI that shows your framerate while in-game. The only time it's dropped into the 20s is when I'm outside and there's a lot of lighting (a night fight in a town or a big fire usually does it). Apart from that, it usually hovers in the low-mid 30s.
 
Bona Fide said:
I don't trust the Bit-Tech review...it's saying that the X1900XTX/7900GTX are limited to 1280x1024, even though I'm playing Oblivion completely fine on my X1800XT at 1680x1050, HDR, no AA, max in-game settings.

30+fps outdoors
60+fps indoors



I call shenanigans on your claims. Prove it to me with a FRAPS benchmark.
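
If you do run one, post the frametimes log rather than just a screenshot. Here's a rough Python sketch of how I'd crunch a FRAPS frametimes CSV into min/avg/max fps - this assumes the usual two-column layout (frame index, cumulative time in milliseconds), so check your own log's header before trusting the numbers:

# rough sketch: turn a FRAPS frametimes CSV into min/avg/max fps
# assumes two columns per row: frame index, cumulative time in ms
import csv
import sys

def fps_stats(path):
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header row
        times = [float(r[1]) for r in rows if len(r) > 1]
    # per-frame durations come from differencing the cumulative timestamps
    deltas = [b - a for a, b in zip(times, times[1:])]
    per_frame_fps = [1000.0 / d for d in deltas if d > 0]
    avg = len(deltas) * 1000.0 / (times[-1] - times[0])
    return min(per_frame_fps), avg, max(per_frame_fps)

if __name__ == "__main__":
    lo, avg, hi = fps_stats(sys.argv[1])
    print("min %.1f / avg %.1f / max %.1f fps" % (lo, avg, hi))

Point it at the log (e.g. python fps_stats.py "Oblivion frametimes.csv" - that filename is just an example) and the numbers should line up with the summary FRAPS itself reports, which makes runs from different cards easy to compare on equal footing.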
 
bobrownik said:
if they dont see a difference in image quality, why did they:

- enable the grass shadows on x1900xtx and disable it on the 7900gtxoc?

- use hqaf on x1900xtx when the stock balanced settings is better then the nvidias HQ setting?

-why did they use 16x hqaf ,knowing its a performance hog , when no hqaf or 4x hqaf would work better then the hq nvidia settings?

-measure the performance of overclocked 7900gtx and not a stock gtx against a stock x1900xtx?

-not test the cards at the same detail level?

-not saying what .ini tweaks were done and for which card?

-at the end of the page advertise a QUAD SLI?

i know why , because 7900gtx would get owned like you wouldnt believe ,


nice nvidia commercial tho

What are you talking about? They enabled all those options because it's a "best playable" review. The X1900XTX was able to play at 1280x1024 with higher settings than the 7900GTX, so they just kept enabling options; what did you expect?

The XTX was able to run max details, 16x HQ AF, and grass shadows and maintain playability at 1280x1024. The 7900GTX did get "owned".
 
As far as people being able to play at higher resolutions at "max in-game settings", I really believe there are large differences in what people consider "max settings".
 
How much does this game cost? Is it pay once and that's it, with the servers free to play on, or do you pay for the game and then pay monthly?

~Adam
 
ryankenn said:
What are you talking about? They enabled all those options because it's a "best playable" review. The X1900XTX was able to play at 1280x1024 with higher settings than the 7900GTX, so they just kept enabling options; what did you expect?

The XTX was able to run max details, 16x HQ AF, and grass shadows and maintain playability at 1280x1024. The 7900GTX did get "owned".

yeah, i know it's the best playable settings review,

that's why they shouldn't stick the benchmark numbers right next to each other for comparison,
 
bobrownik said:
yeah, i know it's the best playable settings review,

then they shouldn't stick the benchmark numbers right next to each other for comparison,

I guess so; I just don't even look at the min/avg/max numbers anymore. If HardOCP or Bit-Tech tells me the game is playable at 1280x1024 with XXX settings, I just take their word for it. I'm sure pulling the details down to similar levels would see the minimum framerate go up, but the average was already higher than the 7900's, so I think the overall performance was still better. I was just focusing on the playable-settings stats; I didn't really look at the FPS numbers they put up.
 
CleanSlate said:
How much does this game cost? Is it pay once and that's it, with the servers free to play on, or do you pay for the game and then pay monthly?

~Adam


Single-player only, 40 bucks.
 
CleanSlate said:
How much does this game cost? Is it pay once and that's it, with the servers free to play on, or do you pay for the game and then pay monthly?

~Adam
......it is a single player only game......Elder Scrolls has ALWAYS been single player only.
 
Bona Fide said:
I don't trust the Bit-Tech review...it's saying that the X1900XTX/7900GTX are limited to 1280x1024, even though I'm playing Oblivion completely fine on my X1800XT at 1680x1050, HDR, no AA, max in-game settings.

30+fps outdoors
60+fps indoors

i get that as well on an x1800xt (700mhz/1.6ghz), but it does dip below 30fps outdoors (and i'm only running 1440x900):


(fps counter in top right)

oh.. and hdr is nice, but the jaggies drive me crazy! people don't pay $400-500+ for a video card to have jaggies! don't they know that? :mad:
 
Erasmus354 said:
......it is a single player only game......Elder Scrolls has ALWAYS been single player only.

I'm not a big RPGer; I've never really looked at any Elder Scrolls. Sorry if I offended you at all ;)

~Adam
 
CaiNaM said:
i get that as well on an x1800xt (700mhz/1.6ghz), but it does dip below 30fps outdoors (and i'm only running 1440x900):


(fps counter in top right)

oh.. and hdr is nice, but the jaggies drive me crazy! people don't pay $400-500+ for a video card to have jaggies! don't they know that? :mad:
Tell that to Bethesda for using an HDR format that is incompatible with AA on ALL current video cards. :D
 
Cainam/Maniac, guess you missed what I posted at R3D in that thread:


16x AF (click the thumbnail):


16x HQ AF (click the thumbnail):



Just like I said in the R3D thread and at AT, the X1900 cards are clearly doing more work with HQ AF enabled as well as grass shadows. I'll be putting up some FRAPS results later tonight or tomorrow, depending on when I finish. So far you've been wrong on all counts.
 
You should upload your save from that same spot so I can load it and take a screenshot at the same place, comparing 7-series 16x AF. Let me know the res etc. and we can see a comparo :D
 
Xenozx said:
You should upload your save from that same spot so I can load it and take a screenshot at the same place, comparing 7-series 16x AF. Let me know the res etc. and we can see a comparo :D

What would be the point? Unless you have a special edition 7900 that has hardware supported angle independent AF.
 
5150Joker said:
Cainam/Maniac, guess you missed what I posted at R3D in that thread:


Just like I said in the R3D thread and at AT, the X1900 cards are clearly doing more work with HQ AF enabled as well as grass shadows. I'll be putting up some FRAPS results later tonight or tomorrow, depending on when I finish. So far you've been wrong on all counts.

umm.. no i haven't. you said yourself in that thread HQAF doesn't work w/o a reboot (as someone pointed out earlier in this thread, btw)... and you've yet to show me anything regarding performance differences.

also, the grass shadows test has to be run in third person btw (no clue how bt did theirs), as in first person the char does not cast a shadow (which is why earlier there was no perf diff with it on/off).

and what are you doing here anyway? in that same thread you stated brent (and/or hocp in general) is just an "nvidia mouthpiece".
 
5150Joker said:
What would be the point? Unless you have a special edition 7900 that has hardware supported angle independent AF.

heh.. he doesn't want anything that might in any way show anything detrimental to the ati part. it's not about discussing stuff.. it's about getting an opportunity to talk smack about nvidia.

check the title he gave the thread he referenced: Oblivion Benchmarks: X1900 XTX stomps 7900 GTX
 
Wasn't the framerate difference between AF and HQ AF on ATI cards shown to be relatively insignificant (<1%)? So if you have an X1900 it's pretty much a guaranteed "leave it on" scenario, because it'll look better and run just as fast. On older cards there's a slightly larger difference, but for the X1900s it's minimal.

And for grass shadows, I think trees, people, and buildings cast shadows, so you'll have to check the back side of some of those to really see the effect. I'm sure it's a little more work, but not a huge deal.
 
CaiNaM said:
umm.. no i haven't. you said yourself in that thread HQAF doesn't work w/o a reboot (as someone pointed out earlier in this thread, btw)... and you've yet to show me anything regarding performance differences.

Yet you claimed HQ AF didn't work at all and that the XTX wasn't doing any extra work, until you were proven wrong by me and others in that thread. So much for your credibility. I'll have the benchmarks up soon; don't worry though, they're just going to prove you wrong yet again.

CaiNaM said:
heh.. he doesn't want anything that might in any way show anything detrimental to the ati part. it's not about discussing stuff.. it's about getting an opportunity to talk smack about nvidia.

check the title he gave the thread he referenced: Oblivion Benchmarks: X1900 XTX stomps 7900 GTX

So are you now fantasizing that the 7900 will magically have angle independent AF? :rolleyes: BTW the other poster at R3D confirmed HQ AF works fine for him in CCC so it's only ATi Tray Tools that has the problem of needing a reboot so far.
 
5150Joker said:
Cainam/Maniac, guess you missed what I posted at R3D in that thread:


16x AF (click the thumbnail):


16x HQ AF (click the thumbnail):



Just like I said in the R3D thread and at AT, the X1900 cards are clearly doing more work with HQ AF enabled as well as grass shadows. I'll be putting up some FRAPS results later tonight or tomorrow, depending on when I finish. So far you've been wrong on all counts.


The HQ aniso shot looks worse than the 16x AF shot. There are some pink flowers to the left that aren't rendered, not to mention many blades of grass at the very far right. I'm not saying ATI has bad IQ in this game, but you're not doing it any justice with these shots.

I ask anyone to open both pics, maximize them, then click back and forth between them on the taskbar and tell me if you see any difference. Let me know :D
 
5150Joker said:
Yet you claimed HQ AF didn't work at all and that the XTX wasn't doing any extra work, until you were proven wrong by me and others in that thread. So much for your credibility. I'll have the benchmarks up soon; don't worry though, they're just going to prove you wrong yet again.

i claimed it wasn't working, and asked others to check.

i also stated the review said there was no difference in image quality. my reasoning was that if there was no difference in the image, then it wasn't doing "more work". perhaps they thought they turned on hqaf but in essence they didn't? i don't know, as they really don't explain HOW the tests were done.

all you've done so far is taken a review which concluded they were close and turned it into your own "nv gets its ass kicked" by extrapolating your own conclusions with little knowledge of how they conducted the tests.

now, i think the methodology of that "test" was flawed (was it done in 1st person or third? how many runs? what areas? how were they duplicated to ensure they were the same? we have no clue whatsoever), but it stated what it stated:

The BFG Tech GeForce 7900 GTX OC wasn't quite as fast as the Radeon X1900XTX, as we were unable to run the game at its maximum settings. We were able to use almost maximum settings and high quality drivers too, as there were some areas where texture filtering could have been a little better - this was removed when optimisations were removed.

The difference between the image quality on the Radeon X1900XTX and GeForce 7900 GTX wasn't noticeable

you change that by taking how one architecture is affected and applying it to a different architecture. that doesn't work. ati loses most non-aa/af tests, yet wins many of the aa/af tests. why is that? because the architecture is different and it's more efficient when rendering with aa/af. that's a known fact.

you've had what.. 5 days to actually prove your theory (which you state as fact - that's what i have a problem with) and you haven't done anything but toot ati's horn.

this isn't about which card is faster. i've stated all along that imo the XT is the faster card (tho frankly i couldn't care less which one edges out the other). what it's about is your ati crusade and your taking someone else's "tests" and adding your own conclusions to them based on faulty logic.

if you want to give your conclusion, then the proper way to do it is to set it up, run the tests, and present the conclusion and the logic used to reach it. you haven't done that. all you've done is taken someone else's work and changed their conclusion to suit your agenda.

So are you now fantasizing that the 7900 will magically have angle independent AF?

has nothing to do with that. read what i said.

BTW the other poster at R3D confirmed HQ AF works fine for him in CCC so it's only ATi Tray Tools that has the problem of needing a reboot so far.

you were using tray tools then when it wouldn't work for you? i'm using CCC and it's not switching back and forth...
 
Lord_Exodia said:
The HQ aniso shot looks worse than the 16x AF shot. There are some pink flowers to the left that aren't rendered, not to mention many blades of grass at the very far right. I'm not saying ATI has bad IQ in this game, but you're not doing it any justice with these shots.

I ask anyone to open both pics, maximize them, then click back and forth between them on the taskbar and tell me if you see any difference. Let me know :D

that's the way oblivion renders things dynamically... even at the exact same save spot, when you come back in it will be different...
 
^^ yeah you're right about the pink flowers, not to mention quite a bit of grass/stalks not being rendered in the bottom-right corner. Maybe it sacrifices close-range detail for that "slight" improvement on the grass "horizon" (the farthest distance you can see grass)?

Meh, I don't even have the game yet. When I'm finished upgrading and actually getting the game, I may check this out myself.
 
Lord_Exodia said:
The HQ aniso shot looks worse than the 16x AF shot. There are some pink flowers to the left that aren't rendered, not to mention many blades of grass at the very far right. I'm not saying ATI has bad IQ in this game, but you're not doing it any justice with these shots.

I ask anyone to open both pics, maximize them, then click back and forth between them on the taskbar and tell me if you see any difference. Let me know :D

That's what I first noticed when he posted those. You might want to choose a scene that better represents any improvements that HQ offers, because that one is worse with it on. Look at the top of the hill dead center: the jaggies are identical, but of the trees in the middle (the three big ones in the middle-left-ish area), the center one's lowest cluster of leaves is more defined in terms of leaf shape and detail at the "lower" quality setting.

This thread is getting derailed. The point is that some guy with an X1800XT is telling us he can run at 1600x with all HQ settings, and others (myself included) are saying they can't. I tend to lean towards Bit-Tech's review until I see a more detailed evaluation from another site or member that proves otherwise.
 
Cainam:

edit: nm. I'll put up the results at R3D. Arguing about it right now is pointless.
 
Lord_Exodia said:
The HQ aniso shot looks worse than the 16x AF shot. There are some pink flowers to the left that aren't rendered, not to mention many blades of grass at the very far right. I'm not saying ATI has bad IQ in this game, but you're not doing it any justice with these shots.

I ask anyone to open both pics, maximize them, then click back and forth between them on the taskbar and tell me if you see any difference. Let me know :D


Wow, guess the point of HQ AF vs standard AF went right over your head. The game renders grass/rocks/tree branches randomly each time you load it; that wasn't what the picture was highlighting. Look at the angle in the hill where the cursor is and you'll see the difference between HQ AF and AF right away. The entire purpose of those shots was to show Cainam/Maniac that he was wrong to assume HQ AF isn't working in this game or isn't making any visual impact.
 