oblivion benchmarks - something is not right?

ryankenn said:
That's what I first noticed when he posted those. You might want to choose a scene that better represents any improvements that HQ offers, because that one is worse with it on. Look at the top of the hill dead center, the jaggies are identical, but among the middle trees (the three big ones in the middle-left-ish area), the center one's lowest cluster of leaves is more defined in terms of leaf shape and detail in the "lower" quality setting.
That's only because the trees aren't identical: they're swaying in the wind and the screenshots didn't capture them at the exact same moment. Additionally, all of the foliage is randomly generated and slightly different during each run. It isn't the HQAF causing the visual differences in the trees/flowers/grass in those shots, it's just how the engine works.

The only difference you can see in those shots where HQAF is taking effect is in the grass texture near the bottom right of the cursor. The HQAF shot has a nicer grass texture in that spot.
 
NoDamage said:
That's only because the trees aren't identical: they're swaying in the wind and the screenshots didn't capture them at the exact same moment. Additionally, all of the foliage is randomly generated and slightly different during each run. It isn't the HQAF causing the visual differences in the trees/flowers/grass in those shots, it's just how the engine works.

The only difference you can see in those shots where HQAF is taking effect is in the grass texture near the bottom right of the cursor. The HQAF shot has a nicer grass texture in that spot.

Now that you've pointed that out, I can see a very slight difference. The fact that you have to freeze-frame and isolate the area to even notice a hairline difference amplifies the uselessness of the HQ function. Until a magical mushroom suddenly becomes visible with the HQ mode, it looks like a waste of performance in this game. If you were walking through this section there is no way you'd spot that difference.
:D
 
5150Joker said:
Wow, guess the point of HQ AF vs standard AF went right over your head. The game renders grass/rocks/tree branches randomly each time you load it; that wasn't what the picture was highlighting. Look at the angle in the hill where the cursor is and you'll see the difference between HQ AF and standard AF right away. The entire purpose of those shots is to show Cainam/Maniac that he was wrong in assuming HQ AF wasn't working in this game or making any visual impact.

that's not the point at all.

it's apparent there is something messed up with enabling/disabling it (the same thing happened to you, when originally you could not get HQAF to work).

here is what i found:

1. set std AF in CCC
2. start oblivion - the game correctly renders std af
3. exit oblivion
4. enable HQAF in CCC
5. start oblivion - the game still renders std AF

if you reboot after enabling HQAF, oblivion will render in HQAF

i found that there is a workaround using ati traytools.
after changing the AF settings, click on att and go to display > reset display driver
oblivion will now render with the correct settings.

in the following shots the difference is easily seen:

[screenshot comparison: std AF vs HQAF]

this leads me to believe that in the review BT was not actually running HQAF, as the difference would have been easily noticeable - therefore in the benchmarks the XT was likely not doing "more work" as joker claims.

also notice the XT is able to render the scene with no impact in framerate (it actually is 1fps faster). this is certainly a fantastic feature, but shows that even if BT was running HQAF and didn't notice it (which is hard to believe) it had no impact on the benchmark numbers.
 
Man, I stared at those two pictures forever...and I can't see a difference between them. Must be me...
 
must be, i see a huge difference, and I notice a difference in-game in Oblivion with HQ AF. I like the feature; in all games with outdoor settings it really helps
 
AppaYipYip said:
I call shenanigans on your claims. Prove it to me with a FRAPS benchmark.

Sounds about right to me. Around 30 outdoors and 60-90fps indoors is the same for me with a 7900GT at the same exact settings (don't know if he has pretty water mods and reflections, though).
 
I found those benchmarks to be surprising also.. I find that the texture size makes NO difference in my case.. Everything just looks uglier.

I currently run the game in 1280x768 - and it's running playably (probably between 10-30 FPS outdoors).

I wonder how SLI machines work.
 
CaiNaM said:
this leads me to believe that in the review BT was not actually running HQAF, as the difference would have been easily noticeable - therefore in the benchmarks the XT was likely not doing "more work" as joker claims.

The XTX also had a higher in-game setting, which does mean it is doing more work, no matter if the HQ AF was working right or not. FS's article backs up bit-tech in that the XTX is faster. Heck, the X1800XT is faster than the 7900GTX at times.
 
SPARTAN VI said:
Sounds about right to me. Around 30 outdoors and 60-90fps indoors is the same for me with a 7900GT at the same exact settings (don't know if he has pretty water mods and reflections, though).

I went back and checked the settings, and it turns out I didn't max out the water reflections. I did that and my framerates fell to the mid 20s outside. No real change indoors though.
 
Bona Fide said:
I went back and checked the settings, and it turns out I didn't max out the water reflections. I did that and my framerates fell to the mid 20s outside. No real change indoors though.

Mhmm. Looks like the two cards are dead even. If I had a CrossFire motherboard, however, I would've gone with the X1800 in a heartbeat. Manufacturer loyalty is ridiculous.
 
fallguy said:
The XTX also had a higher in-game setting, which does mean it is doing more work, no matter if the HQ AF was working right or not. FS's article backs up bit-tech in that the XTX is faster. Heck, the X1800XT is faster than the 7900GTX at times.

ugh.. need to go back to the earlier posts. it's not at all about which is faster in oblivion. the ati x1k cards (and i include the 1800xt in that, as you stated) are.

to clarify, joker drew his own (erroneous) conclusion that ati dominates because hqaf brings down the framerate since it's doing "more work". BT stated there was no difference in IQ; to me that says it wasn't running at all (once it's running, it's easy to see the difference).

further, the architectures are different, so doing "more work" does not equate to higher benchmark scores. the "work" could be done by another part of the architecture which doesn't really impact the framerate.

also, in fairness grass shadows were discussed, however if you're benchmarking in first person (the default camera), it makes no difference as the player doesn't cast a shadow. BT never stated how the benchmarks were done, so it was faulty logic to simply assume, again, that framerates would be higher had this been turned off.

i'm running an XT, and i'm glad i chose it over the 7900GT (to me it's currently a better value, or "bang for the buck"), and have no problem with the fact it's faster. heck, i even get the advantage of HQAF! the point is joker was/is just plain wrong.
 
Yes, he said that HQ AF would bring the frames down. How was he to know bit-tech screwed up? He also said that the in-game settings were higher, and that the XTX was doing more work. Turns out he was right, and wrong. But none of that really matters now; ATi is clearly the better card for Oblivion. For multi-card, NV is, for now.
 
CaiNaM said:
that's not the point at all.

it's apparent there is something messed up with enabling/disabling it (the same thing happened to you, when originally you could not get HQAF to work).

here is what i found:

1. set std AF in CCC
2. start oblivion - the game correctly renders std af
3. exit oblivion
4. enable HQAF in CCC
5. start oblivion - the game still renders std AF

if you reboot after enabling HQAF, oblivion will render in HQAF

i found that there is a workaround using ati traytools.
after changing the AF settings, click on att and go to display > reset display driver
oblivion will now render with the correct settings.

in the following shots the difference is easily seen:

[screenshot comparison: std AF vs HQAF]

this leads me to believe that in the review BT was not actually running HQAF, as the difference would have been easily noticeable - therefore in the benchmarks the XT was likely not doing "more work" as joker claims.

also notice the XT is able to render the scene with no impact in framerate (it actually is 1fps faster). this is certainly a fantastic feature, but shows that even if BT was running HQAF and didn't notice it (which is hard to believe) it had no impact on the benchmark numbers.

Now that is a difference. You can tell the better iq easily. It does work then. Just a bit of leg work to make it happen. Nothing a new driver can't fix. :D
 
fallguy said:
Yes, he said that HQ AF would bring the frames down. How was he to know bit-tech screwed up? He also said that the in-game settings were higher, and that the XTX was doing more work. Turns out he was right, and wrong.

not in the context he was stating it. the reasoning was that ati would be even faster because it was being penalized for doing more work. sure, in a literal sense it is doing more work, but it wasn't being penalized because of it (you can't apply the same "rules" or logic to a different architecture in this case), therefore his logic was flawed from the beginning.


ATi is clearly the better card for Oblivion. For multi card, NV is, for now.

overall i agree with you there :)

now i'm gonna actually play it some more rather than "testing" it :D
 
Lord_Exodia said:
Now that is a difference. You can tell the better iq easily. It does work then. Just a bit of leg work to make it happen. Nothing a new driver can't fix. :D

It doesn't look as if the 16X angle dependent mode is doing any filtering at all. I've been playing with 8X AF, and I'm quite certain that, dealing with these sorts of angles, my filtering quality is much more acceptable.

Perhaps I'm living in a bizarro-world.

also, in fairness grass shadows were discussed, however if you're benchmarking in first person (the default camera), it makes no difference as the player doesn't cast a shadow

Other actors do cast shadows, and enabling grass shadows allows other characters' shadows to be cast properly on the grass.

At the moment, I'm wrestling with a serious issue where the display drops and the drivers pull a VPU recover. Sometimes I wish I had gone with nVidia again.
 
phide said:
It doesn't look as if the 16X angle dependent mode is doing any filtering at all. I've been playing with 8X AF, and I'm quite certain that, dealing with these sorts of angles, my filtering quality is much more acceptable.

Perhaps I'm living in a bizarro-world.

Thank you! I thought it was just me. I think 8xAF looks far better than 16x, but I'm running Nvidia. I tried both 8x and 16x and the latter looked like there wasn't anything on, which is the way those comparison screenies look to me. 8x is super-sharp and looks very much like the lower ATI HQAF shot. *shrugs*
 
Bona Fide said:
I don't trust the Bit-Tech review...it's saying that the X1900XTX/7900GTX are limited to 1280x1024, even though I'm playing Oblivion completely fine on my X1800XT at 1680x1050, HDR, no AA, max in-game settings.

30+fps outdoors
60+fps indoors

[screenshots]

y did j00 kill that deer.......
 
CaiNaM said:
here is what i found:

1. set std AF in CCC
2. start oblivion - the game correctly renders std af
3. exit oblivion
4. enable HQAF in CCC
5. start oblivion - the game still renders std AF

if you reboot after enabling HQAF, oblivion will render in HQAF

Couldn't it be a "FRAPS" problem? Some time ago i noticed that if you have FRAPS running in the background you can't switch between ADAA performance and quality modes. You'll have to turn off FRAPS first and then make the change in the CCC, else it won't take effect.
Maybe (haven't tried it) this is also related to switching HQAF on/off.
 
y'all know that u don't have to run FRAPS to see ur FPS, right?

just go into the console and type TDT and that will display ur FPS in the top right corner
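
in case anyone hasn't used it, that's just the standard oblivion console toggle (assuming the default tilde key opens the console; if i remember right, tdt is short for ToggleDebugText):

~      open the console (default tilde key)
tdt    toggle the debug text overlay, which includes the FPS counter
~      close the console again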
 
Apple740 said:
Couldn't it be a "FRAPS" problem? Some time ago i noticed that if you have FRAPS running in the background you can't switch between ADAA performance and quality modes. You'll have to turn off FRAPS first and then make the change in the CCC, else it won't take effect.
Maybe (haven't tried it) this is also related to switching HQAF on/off.

hmm.. could be; i haven't checked (altho i don't always run fraps)
 
Cainam/Maniac: You originally speculated the XTX was not doing more work than the OC'd GTX in bit-tech's review by having grass shadows + HQ AF enabled. I claimed the XTX was indeed doing more work and set out to prove that. Here are my results that prove you were wrong all along: http://clan786.com/modules.php?name=Content&pa=showpage&pid=1

Oh and here's what I wrote you at R3D in case you missed it:

http://www.rage3d.com/board/showpost.php?p=1334271642&postcount=145
http://www.rage3d.com/board/showpost.php?p=1334271656&postcount=146
 
5150Joker said:
Cainam/Maniac: You originally speculated the XTX was not doing more work than the OC'd GTX in bit-tech's review by having grass shadows + HQ AF enabled. I claimed the XTX was indeed doing more work and set out to prove that. Here are my results that prove you were wrong all along: http://clan786.com/modules.php?name=Content&pa=showpage&pid=1

Oh and here's what I wrote you at R3D in case you missed it:

http://www.rage3d.com/board/showpost.php?p=1334271642&postcount=145
http://www.rage3d.com/board/showpost.php?p=1334271656&postcount=146

you've shown nothing other than your ability to distort the truth. i'll simply cut & paste my reply from r3d:

5150 Joker said:
My tests were done in the 3rd person and I put up my save files + methods in my review on how to reproduce exactly what I did. Can you say FS was nearly as thorough? You just need to admit that you were wrong. First you claimed HQ AF was broken until I showed you that it wasn't, so then you tried to save face by putting up HQ AF pictures and workarounds. That just shows that you jump to conclusions without fully testing your claims first. I promised I'd put up results that showed the XTX does more work with HQ AF + grass shadows, thus proving you wrong yet again, and your only retort is that you don't trust my results even though I offer the save files + methods on my page. :nag:

ugh..

your tests don't matter (in the context of our discussion). i'll explain.

my entire issue since the beginning was you changing someone else's conclusion by stating one "stomped" the other (which it did not) - that was !!!!!! rhetoric.

you stated as fact one card was "doing more work" and therefore the margin of victory was much greater than they concluded.

i said that was crap for a few reasons:

1. there was no proof one was doing substantially more work than the other. it was entirely speculative on your part, but you ran with it and made claims of it being factual when it was not. since they didn't see any IQ differences, i said i doubt one was doing "more work" during the tests. my reasoning was if it was doing those things, the difference in IQ would be easily noticeable (as pics of HQAF have shown).

2. you can't simply state as fact that because one architecture takes a 5fps "hit" from turning on a feature, a different architecture would be impacted in the same manner. again, my reasoning was that ATI certainly implemented AA much more efficiently, as benchmarks all over have often shown nv winning non-AA tests while ATI took the lead when AA/AF was applied. ATI doesn't take the "hit" from AA that nv obviously does.

3. due to the "dynamic" nature of the rendering and the lack of a consistent way to compare, oblivion is not the best game to use as a performance measure. add to that the lack of info on how the BT test was done, and there were many unanswered questions. also, regarding your tests being done in third person: cool.. but that wasn't the issue. the issue was, did bit-tech run their tests in third person? something i cannot answer, and neither can you....

i feel no need to retract any of that.

regarding the inability to get AF working: yes, i questioned that, as it would not work for me. i even asked others to check (and not only on this forum). when i posted comments regarding my not being able to get it working, i stated this: "anyone is welcomed to try the same thing to see if somehow i'm missing something, or that my methodology and/or setup is flawed in some way... (are there other setting which are required to be changed, perhaps?)"

how is that "jumping to conclusions"? there was no need to "save face" as you claim. sorry.. no such thing happened. i couldn't get it to work (neither could you, initially). i was shown how. cool, thanks, here are some af shots! and that was that. your comments are yet another example of you distorting reality to suit your own agenda.

and as to the "thoroughness" of it all... it doesn't matter, and the reason is simple: FS and BT both showed the same conclusions (and FS even left the settings the same). they are credible, and you are not. in fact, I feel that's a vindication of what my stance has been all along:

both cards offer similar performance in oblivion, while the XT/XTX is faster. i will say tho that i feel the ati parts enhance the IQ a bit more due to HQAF, however that's not always apparent as it only helps in specific situations. still, it's a definite advantage, especially when it comes with little or no performance impact.

in the end, i maintain my position that your title was deceptive and misleading, and firingsquad's review only reinforces that fact.
 
ryankenn said:
What are you talking about? They enabled all those options because it's a "best playable" review.
"Best playable" needs more quotes around it. Am I the only one who thinks that a minimum fps of 15 or 17 is hardly playable?? That's about 17 fps outside and 60 fps inside if you play 50/50 inside/outside on an average of 38 fps.
 
ya i consider sub-20fps unplayable.... hell, 20fps is BARELY playable
 