My Catalyst AI benchmarks

jbz7890

Since people were wondering about the performance impact of Catalyst AI, I decided to run some Crysis benchmarks. (I'm using the XP Very High tweak, so the game is running at Very High settings even though it says High.)

No AI:

noai.png


Standard AI:

standard.png


Advanced AI:

advanced.png
 
Interesting... good to know before I switch back to ATI in a few months.
 
The third picture isn't working here; you could have just posted the numbers...


All the pictures worked fine for me... there's no difference frame-rate-wise between Standard and Advanced, so you aren't missing much. But I'm bored tonight, so I put his third image on another hosting site even though it worked for me anyway.


 
.100, quite an improvement. A few extra FPS from on/off is pretty good though (when dealing with Crysis, at least). Any noticeable IQ difference? I always got the vibe that using AI was taboo.
 
With the settings he used there would be zero IQ difference... all it did was raise the average frame rate.
 
Note that for people using dual-GPU cards or Crossfire solutions, disabling Catalyst AI will disable all except the primary GPU.
 
So, is anyone else wondering what hardware the OP is running?
 
I don't see Catalyst A.I. being of value with a single GPU. I'll do the exact same tests tomorrow with my 4870X2 in Win7 and Vista 64 and post my results.
Posted via [H] Mobile Device
 
NEXT BENCH RUN- 8/6/2009 10:12:13 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=2x, Vsync=Disabled, 64 bit test, FullScreen
Demo Loops=1, Time Of Day= 9
Global Game Quality: High
==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
Play Time: 69.26s, Average FPS: 28.88
Min FPS: 22.31 at frame 1970, Max FPS: 35.98 at frame 995
Average Tri/Sec: -19203028, Tri/Frame: -665016
Recorded/Played Tris ratio: -1.38
TimeDemo Play Ended, (1 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

8/6/2009 10:03:53 PM - Vista 64

Run #1- DX10 1900x1200 AA=2x, 32 bit test, Quality: High
NEXT BENCH RUN- 8/6/2009 10:04:29 PM - Vista 64


Run #1- DX10 1900x1200 AA=2x, 64 bit test, Quality: High ~~ Last Average FPS: 28.80

NEXT BENCH RUN- 8/6/2009 10:12:13 PM - Vista 64


Run #1- DX10 1900x1200 AA=2x, 64 bit test, Quality: High ~~ Last Average FPS: 28.88
 
8/6/2009 10:16:29 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=2x, Vsync=Disabled, 64 bit test, FullScreen
Demo Loops=1, Time Of Day= 9
Global Game Quality: High
==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
Play Time: 48.34s, Average FPS: 41.37
Min FPS: 27.32 at frame 1986, Max FPS: 60.74 at frame 866
Average Tri/Sec: -27486922, Tri/Frame: -664352
Recorded/Played Tris ratio: -1.38
TimeDemo Play Ended, (1 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

8/6/2009 10:16:29 PM - Vista 64

Run #1- DX10 1900x1200 AA=2x, 64 bit test, Quality: High ~~ Last Average FPS: 41.37
 
8/6/2009 10:19:34 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=2x, Vsync=Disabled, 64 bit test, FullScreen
Demo Loops=1, Time Of Day= 9
Global Game Quality: High
==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
Play Time: 48.67s, Average FPS: 41.09
Min FPS: 26.61 at frame 1951, Max FPS: 60.49 at frame 864
Average Tri/Sec: -27307242, Tri/Frame: -664542
Recorded/Played Tris ratio: -1.38
TimeDemo Play Ended, (1 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

8/6/2009 10:19:34 PM - Vista 64

Run #1- DX10 1900x1200 AA=2x, 64 bit test, Quality: High ~~ Last Average FPS: 41.09
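
If anyone wants to crunch deltas from logs like these without eyeballing them, here's a quick Python sketch that pulls the per-run Average FPS values out of a pasted log and compares the first and last runs. The file name is made up, and my logs above don't label which runs had Catalyst A.I. enabled, so that part is on you.

Code:
import re

# Pull the per-run "Average FPS" values out of a Crysis benchmark log
# and compare the first and last runs. "crysis_bench.log" is just a
# placeholder name for wherever you saved the pasted output.
log_text = open("crysis_bench.log").read()
fps_values = [float(x) for x in
              re.findall(r"Play Time:.*Average FPS:\s*([\d.]+)", log_text)]
print(fps_values)                                   # e.g. [28.88, 41.37, 41.09]
if len(fps_values) >= 2:
    delta = fps_values[-1] / fps_values[0] - 1
    print(f"Last run vs. first run: {delta:+.1%}")  # about +42% for the runs above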
 
I don't see Catalyst A.I. being of value with a single GPU. I'll do the exact same tests tomorrow with my 4870X2 in Win7 and Vista 64 and post my results.
Posted via [H] Mobile Device

7% increase in FPS isn't of value? Half the people here OC their video cards and get about the same.
 
7% increase in FPS isn't of value? Half the people here OC their video cards and get about the same.

Maybe 7 percent is of value when you're working with 100 fps to start, not with less than 35, especially at that resolution.
Posted via [H] Mobile Device
 
DX9 testing?

Moving right along... Oh, never mind.

Thanks RADEoN; that's a lot more useful to those of us who are running games properly on a current OS. I'll try the same thing when I slide my 4890 into my P5NE-SLi... waiting on my X48 and Q6600. I hope my 650i chipset doesn't reject AMD with extreme prejudice. :p
 
Maybe 7 percent is of value when you're working with 100 fps to start, not with less than 35, especially at that resolution.
Posted via [H] Mobile Device

I would argue that the reverse is true. If you are at 100 fps, there really isn't any point to more fps, as you won't be able to see it anyway. At lower fps, every little bit helps, and 7% can easily mean the difference between playable and choppy.
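
To put rough numbers on it (the baselines here are made up, not taken from the screenshots), here's what 7% looks like in frame times at low versus high frame rates:

Code:
# What a 7% FPS bump means in per-frame time at low vs. high frame rates.
# Baseline numbers are illustrative only.
for base_fps in (28.0, 100.0):
    boosted = base_fps * 1.07
    print(f"{base_fps:.0f} fps -> {boosted:.1f} fps "
          f"({1000/base_fps:.1f} ms -> {1000/boosted:.1f} ms per frame)")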
 
I would argue that the reverse is true. If you are at 100 fps, there really isn't any point to more fps, as you won't be able to see it anyway. At lower fps, every little bit helps, and 7% can easily mean the difference between playable and choppy.

But at a more realistic resolution for today's cards and games, that 7 percent is only going to get smaller.
 
But at a more realistic resolution for today's cards and games, that 7 percent is only going to get smaller.

Why would you think that? Increasing the resolution won't suddenly make optimizations less effective. A 7% improvement at 1280x1024 will likely stay pretty close to 7% at 1920x1200. And, of course, as you increase the resolution you increase the graphics load, making every optimization you can get even MORE important, not less.
 
But I'm sure it won't be 7 percent at higher res.
Posted via [H] Mobile Device
 
Yeah, going from 8 fps to 8.4. Now you've got the headroom to crank AA all the way up.
Posted via [H] Mobile Device

Well, if you're at 8 fps you have the settings way too high for the card, or you're playing Crysis :p

But seriously, a free increase is a free increase. Damn ATI/AMD for giving me faster FPS for free with no image difference. I mean, come on: if NVIDIA had this and not ATI, we wouldn't hear the end of this "awesome" feature from the fanboys. (I go by price/performance and still like NVIDIA as well, BTW... well, not the 9xxx series, that was a dud.)
 
Well, if you're at 8 fps you have the settings way too high for the card, or you're playing Crysis :p

But seriously, a free increase is a free increase. Damn ATI/AMD for giving me faster FPS for free with no image difference. I mean, come on: if NVIDIA had this and not ATI, we wouldn't hear the end of this "awesome" feature from the fanboys. (I go by price/performance and still like NVIDIA as well, BTW... well, not the 9xxx series, that was a dud.)

Where is the performance difference more notable: with my dual-GPU setup or his single-GPU setup?
 
But I'm sure it won't be 7 percent at higher res.
Posted via [H] Mobile Device

Why would you think that? If I can render 1 pixel 7% faster, then I can render 1,000,000 pixels 7% faster. So yes, it will be very close to 7% faster at *ALL* resolutions. I don't expect it to be completely linear, but it will be very close.
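
Here's the same argument as a toy calculation, assuming the workload is purely per-pixel (which is the assumption doing all the work here):

Code:
# Toy model of the per-pixel argument: if each pixel gets 7% cheaper,
# the relative FPS gain is the same at any resolution.
# Costs are arbitrary units, purely illustrative.
def fps(pixels, cost_per_pixel):
    return 1.0 / (pixels * cost_per_pixel)

base, optimized = 1.0, 1.0 / 1.07          # 7% less work per pixel
for w, h in ((1280, 1024), (1920, 1200)):
    gain = fps(w * h, optimized) / fps(w * h, base) - 1
    print(f"{w}x{h}: {gain:+.1%}")         # +7.0% at both resolutions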
 
OK, so run the Crysis benchmark at a higher resolution with and without A.I., and I'll shut up if it's what you're claiming it would be.
 
Why would you think that? If I can render 1 pixel 7% faster, then I can render 1,000,000 pixels 7% faster. So yes, it will be very close to 7% faster at *ALL* resolutions. I don't expect it to be completely linear, but it will be very close.

Well, many factors affect how much faster it'll be able to render at different resolutions. You may need more VRAM at higher resolutions. You may be limited by ROPs/TMUs with higher settings. Any of these may end up bottlenecking performance at a given combination of settings.
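
A crude way to picture it (the stage costs below are invented numbers, not measurements):

Code:
# Crude bottleneck model: a frame is limited by its slowest stage, so a 7%
# shader speedup only shows up fully when shading is the bottleneck.
def frame_ms(shader_ms, rop_ms, vram_ms):
    return max(shader_ms, rop_ms, vram_ms)

shader_bound = frame_ms(30.0, 20.0, 18.0) / frame_ms(30.0 / 1.07, 20.0, 18.0) - 1
rop_bound    = frame_ms(18.0, 30.0, 20.0) / frame_ms(18.0 / 1.07, 30.0, 20.0) - 1
print(f"shader-bound: {shader_bound:+.1%}")   # about +7%
print(f"ROP-bound:    {rop_bound:+.1%}")      # +0.0%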
 
Well, many factors affect how much faster it'll be able to render at different resolutions. You may need more VRAM at higher resolutions. You may be limited by ROPs/TMUs with higher settings. Any of these may end up bottlenecking performance at a given combination of settings.

Very true, of course, but it is quite possible for ATI to optimize for those bottlenecks as well, meaning that at higher resolutions gains of MORE than 7% are entirely possible.

OK, so run the Crysis benchmark at a higher resolution with and without A.I., and I'll shut up if it's what you're claiming it would be.

I can't. I don't have Crysis, and I don't have a 1920x1200 monitor.
 
The numbers would probably be better with A.I. on in a more ATI-friendly game; Crysis is a notoriously poor ATI and Crossfire performer. Don't get me wrong, I appreciate the numbers being run on Crysis, but of all the games out currently, Crysis is one of the most un-ATI-friendly, so it will show the least improvement from ATI optimizations.
 