Good Oblivion Benchmarks on Firingsquad

Mav451 said:
Lol, quack/quake actually working in favor of ATi. I dunno if this is irony in its finest moment...

Not really like the Quack incident, since the problem here was with the game. It's like forcing widescreen by modifying the INI. :)
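For reference, the sort of INI edit I mean looks like this — a minimal sketch, assuming the usual Oblivion.ini in My Documents\My Games\Oblivion and a 1680x1050 panel (swap in your own monitor's numbers):

Code:
[Display]
iSize W=1680
iSize H=1050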
 
John Reynolds said:
What I find odd is why a 7900 GTX would lose almost 33% of its frame rate going from 10x7 to 12x10 with 4x AA and 8x AF enabled?

ATi fillrate advantage ;) or nV fillrate bug showing up again. I will take the first though :)
 
killerD said:
So the 1900XTX now takes the cake on this, BF2, and FEAR (at least at higher resolutions). Those are the three most advanced engines out there. Throw in superior IQ, and it's time to acknowledge the 1900XTX is the fastest card you can buy.

That being said, I'm still not going to buy one. I'd like to keep my heat and energy costs down, and I won't be playing at the resolutions most of you guys play at.
 
Oblivion is a blast even with the crappy 360 GUI.

I am really glad I got the X1900XT, and I'm even happier now with a Zalman VF900-Cu on the way :D
 
razor1 said:
ATi fillrate advantage ;) or nV fillrate bug showing up again. I will take the first though :)

I thought they had the same fillrate????



Mav451,
it's been stated earlier in the thread that there is no xfire support for this game yet; until then you can use the AFR rename trick on most D3D games to see if AFR mode works for them. Basically that's a back door to force any game to use AFR mode in xfire....
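If anyone wants to try it, here's a rough sketch from a command prompt — the install path and the target filename are assumptions on my part (the whole point is just to run the game under an exe name the driver already treats as AFR-friendly), so keep a backup of the original:

Code:
cd "C:\Program Files\Bethesda Softworks\Oblivion"
rem keep an untouched copy of the original exe
copy Oblivion.exe Oblivion-backup.exe
rem run the game under a name the driver has an AFR profile for
ren Oblivion.exe AFR-FriendlyD3D.exe
rem then launch AFR-FriendlyD3D.exe directly instead of through the launcher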


Wonder if ATI will be even faster after they cook their drivers like NV did???
 
Jbirney said:
I thought they had the same fillrate????


ATI's memory controller gives it a fillrate and bandwidth advantage. Theoretical numbers aside, a game like this, with a huge amount of outdoor rendering, will stress fillrates. This is why the advantage ATi sees is only in outdoor areas. Exact same thing we saw in FEAR. Shaders don't bottleneck the x1900 or the gf7900; it's purely AA that's causing the advantage. But the x1800s are shader bottlenecked in some areas, especially the indoor areas, where fillrates aren't as important.
 
zzzVideocardzzz said:
eww 10x7 that sh1t is nasty

Actually, with some AA it's not bad at all. The performance hit for going up to 12x10 is a little higher for me than going 10x7 with AA, and after going back and forth between the two while playing, I have to admit the lower res with AA seems to do better at reducing the jagz.
 
acidic said:
Why is it called trolling when we point out obvious bias in your analysis, OP?

See the nice thing about not being an ATI or Nvidia worshipper like yourself, is that I can revise and change my opinion. I don't feel the need to change my original posts like some people do. I also tend to read through a thread before commenting so I don't look like an ass.
 
I'm playing with an X2 3800+, 2GB of Corsair 2-3-3-6 memory, and a 7800GT at 1024x768 with HDR and everything on max.
It seems my system can't come anywhere near the fps in FiringSquad's 7800GT test for indoor scenes.
 
Can I ask why they are using Windows XP Service Pack 1?

Is the game less buggy on SP1?....
 
The only important thing to me is that Oblivion runs easily on all of these cards, and there are tons of quality sliders to dial back when it doesn't. Maybe I'd care more if I had an LCD, but my 21" CRT can give me any resolution I need.

Like somebody said earlier, this game is going to be a war for ATI/NVIDIA.
 
Cabezone said:
See the nice thing about not being an ATI or Nvidia worshipper like yourself, is that I can revise and change my opinion. I don't feel the need to change my original posts like some people do. I also tend to read through a thread before commenting so I don't look like an ass.


:rolleyes:
 
Cabezone said:
dual card setups.


I still find it hard to believe this is something people care about.

Matas said:
And it seems that Oblivion wants more than 2Gb memory.


Are you sure? I only have 1GB, and while the game runs a little slow, I think it's only due to my X800XT PE... It runs well enough for me to play this game at 1776x1000 on my HDTV.


Strangely, indoors is often slower for me than outdoors, especially in the Mages Guild halls, and it's sometimes noticeable in caves.
 
killerD said:
So the 1900XTX now takes the cake on this, BF2, and FEAR (at least at higher resolutions). Those are the three most advanced engines out there. Throw in superior IQ, and it's time to acknowledge the 1900XTX is the fastest card you can buy.

This is a joke right?

Oblivion is probably the worst engine ever programmed. Its performance is a joke. It's a console port built for the X360 (ATI chipset) and will never be used ever again. That is, unless Bethesda releases expansions.

You can judge a cards performance on those three games but I don't see any future games using those "advanced" engines. RTCW and Quake Wars are two blockbuster titles using the Doom 3 engine. For everyone who likes HL2, Valve is releasing 4 new expansions using the source engine with HDR and new eye-candies.

Unless there is a FEAR 2, I wouldn't worry that ATI is faster in those 3 games.
 
For this round, ATI I think has the better single card solutions at the high end. The X1900 is great hardware, and if I wanted the best single card performance I could get now, I'd go with the X1900.

That said, ATI's multi-GPU solutions are a little behind, and for the best performance at high resolutions and detail settings, well, no single card is really close to the high-end SLI solutions. SLI gives this game a huge boost.

So it's just a matter of perspective, and most seem to agree on this: X1900 best single card solution overall, 7900GTX best multi-GPU solution overall.
 
entre nous said:
This is a joke right?

Oblivion is probably the worst engine ever programmed. Its performance is a joke. It's a console port built for the X360 (ATI chipset) and will never be used ever again. That is, unless Bethesda releases expansions.

You can judge a cards performance on those three games but I don't see any future games using those "advanced" engines. RTCW and Quake Wars are two blockbuster titles using the Doom 3 engine. For everyone who likes HL2, Valve is releasing 4 new expansions using the source engine with HDR and new eye-candies.

Unless there is a FEAR 2, I wouldn't worry that ATI is faster in those 3 games.

Its programming is not that far off. It's a lot better than a shitload of people expected. If you ever played Morrowind you would understand. What really rollercoasters performance is those damn SpeedTree renders, even on the 360. The game was made PC-first, so calling it a port is a poor excuse.

The engine will be used again. I'm pretty sure I read it's being used for Bethesda's take on Fallout 3.

The FEAR engine probably won't be, but FEAR's effects sure as hell will be. Soft shadows will soon become standard, and so will those multiple parallax maps. There's also Condemned, which is coming out this month.

ATI is doing well enough in Doom 3; being only 3-6 fps behind Nvidia with the X1800XT vs the 7800GTX in a ground-up Nvidia-based game is great. Nvidia cards in ATI-oriented games are a whole different story, such as Black & White 2.

I say the cards TIE in HL2, but image quality is damn noticeable in that game with all its beautiful textures.
 
Brent_Justice said:
I am

I'm doing a roundup of 3 MSI cards

You'll have info for a 7800 GTX 512, X1800 XT and X1900 XTX in Oblivion at 4:3 resolution and 16:10 widescreen resolution when I'm done

Great, looking forward to it :)
 
Cabezone said:
See the nice thing about not being an ATI or Nvidia worshipper like yourself, is that I can revise and change my opinion. I don't feel the need to change my original posts like some people do. I also tend to read through a thread before commenting so I don't look like an ass.

Your best bet would be not to look like an ass right off the bat. Usually keeps the flames and non relevant posts to a minimum. :D
 
razor1 said:
ATI's memory controller gives it a fillrate and bandwidth advantage. Theoretical numbers aside

Well, theoretically NV has a slight advantage; like you say, in real life ATI's is more efficient. But you did not state which one you are talking about :) Still not 100% sure why 3DMark's single texture fillrate test shows the NV parts way out in front...
 
ivzk said:
Your best bet would be not to look like an ass right off the bat. Usually keeps the flames and non relevant posts to a minimum. :D

Actually there is no way to protect myself from fanATIcs like yourself except by hiding in a dark corner and unplugging my computer. I choose not to, however, and won't let you fools chase me away.
 
Cabezone said:
Actually there is no way to protect myself from fanATIcs like yourself except by hiding in a dark corner and unplugging my computer. I choose not to, however, and won't let you fools chase me away.

The title of the thread states, and I quote

" Good Oblivion Benchmarks on Firingsquad "

After that you state, again I quote

" Ati eeks out a win in the single card, but Nvidia absolutely crushes them in the dual card setups "

Your Nvidia bias is clear as day. Nobody is trying to chase Nvidiots like yourself into a dark corner. Why should they? Threads like yours put a smile on many people's faces. :D

PS

If I had made a thread with the title you used, and then proceeded to state

" ATI absolutely crushes Nvidia in single card benchmarks, and SLI barely beats a single ATI card."

I would most definitely look like an ass without twisting the facts much more than you did.
However, this is your thread, started by you. Put two and two together.

Anyways, could someone with an x-fire setup try this and let us know if it actually works?

http://rage3d.com/board/showthread.php?p=1334274801#post1334274801
 
!!!!!!s are too funny

"UPDATE: It's come to our attention that CrossFire support can be forced by renaming the Oblivion.exe executable file to "AFR-FriendlyD3D.exe". We've confirmed that this fix works and we're in the process of re-running our CrossFire benchmarks now. We'll have updated performance numbers for CrossFire shortly. "

I wonder if this will be fixed in the next driver release.

I'm SLIing GTs for this game, though apparently from some other threads I'm an ATi !!!!!!. I didn't know. I'm sure it's been posted, but it's too funny. Use logic and you always get called a !!!!!!.
 
Jbirney said:
Well, theoretically NV has a slight advantage; like you say, in real life ATI's is more efficient. But you did not state which one you are talking about :) Still not 100% sure why 3DMark's single texture fillrate test shows the NV parts way out in front...


Fillrates are also affected by the shaders used, so those theoretical fillrates are meaningless in 3DMark.
 
texuspete00 said:
!!!!!!s are too funny

"UPDATE: It's come to our attention that CrossFire support can be forced by renaming the Oblivion.exe executable file to "AFR-FriendlyD3D.exe". We've confirmed that this fix works and we're in the process of re-running our CrossFire benchmarks now. We'll have updated performance numbers for CrossFire shortly. "

I wonder if this will be fixed in the next driver release.

I'm SLIing GTs for this game, though apparently from some other threads I'm an ATi !!!!!!. I didn't know. I'm sure it's been posted, but it's too funny. Use logic and you always get called a !!!!!!.


The poster of the thread I linked to isn't suggesting you rename the Oblivion executable to AFR-FriendlyD3D.exe. He's suggesting that you rename it to the executable of a known game that works in AFR mode, maybe fear.exe. That's what I got out of it anyways.

Hey texaspete, did you make that quote up or is that in the FiringSquad article? Can't get to it from work.

EDIT: After re-reading a couple of times, I think the OP in the Rage3D thread IS suggesting you actually rename the Oblivion executable to AFR-FriendlyD3D.exe.
 
ivzk said:
The title of the thread states, and I quote

" Good Oblivion Benchmarks on Firingsquad "

After that you state, again I quote

" Ati eeks out a win in the single card, but Nvidia absolutely crushes them in the dual card setups "

Your Nvidia bias is clear as day. Nobody is trying to chase Nvidiots like yourself into a dark corner. Why should they? Threads like yours put a smile on many people's faces. :D

PS

If I had made a thread with the title you used, and then proceeded to state

" ATI absolutely crushes Nvidia in single card benchmarks, and SLI barely beats a single ATI card."

I would most definitely look like an ass without twisting the facts much more than you did.
However, this is your thread, started by you. Put two and two together.

Anyways, could someone with an x-fire setup try this and let us know if it actually works?

http://rage3d.com/board/showthread.php?p=1334274801#post1334274801


Here's how a non-worshipper would have responded to my initial post:

" I disagree that the ATI cards eek out a win, they clearly have a 30% advantage in the foliage areas. Thats the most demanding part of the game and I feel that makes it a solid ATI win."

Where I would have responded:

" If you read a little farther into the thread you'd see that I brought that up and agree with you."

Wow, that looks like a normal conversation, doesn't it?


Instead what I got was:

" Well it's obvious that the origional poseter is a loyal Nvidiot"

Then I replied in kind. I may have read through the review too quickly the first time; that doesn't make me biased. People who are not looking for bias in every corner probably read further than the first post of a thread. They don't jump down someone's throat the minute they spot something that goes against their fanatically held belief system.

I'll point to Fallguy's reply where he responds with a well thought out post.
 
They updated it with crossfire benchmarks. Even the x1800xt beats the 7900GTX in crossfire, although it's not quite working correctly. Either Nvidia has more optimization to do, or the ATI cards are better built for today's games.
 
I believe there is definitely something wrong driver-wise if the X1800XT is beating the 7900GTX, crossfire vs SLI. I think both companies are gonna make this game shine very soon. I think both will squeeze at least another 15% boost in this game when better drivers come out. Nvidia's 84.25 was a desperate last-minute release, and with proper time I'm sure they can get more going too. ATI should get better performance as well, possibly through memory controller tweaks and better crossfire support. Also, the game is long overdue for a patch, which will play a role here too.
 
ivzk said:
The poster of the thread I linked to isn't suggesting you rename the Oblivion executable to AFR-FriendlyD3D.exe. He's suggesting that you rename it to the executable of a known game that works in AFR mode, maybe fear.exe. That's what I got out of it anyways.

Hey texaspete, did you make that quote up or is that in the FiringSquad article? Can't get to it from work.

EDIT: After re-reading a couple of times, I think the OP in the Rage3D thread IS suggesting you actually rename the Oblivion executable to AFR-FriendlyD3D.exe.

Yeppers from the link. Meh it's been covered anyways, my bad. Seemed some people on page1 were questioning the rationale of others that said it will most definitely be fixed. But I'd hope it wasn't insinuated others were !!!!!!s for thinking, hey it will be fixed soon. Of course, it is annoying that Crossfire is not yet SLI. But as PC gamers we put up with these things all the time.
 
Cabezone said:
They updated it with crossfire benchmarks. Even the x1800xt beats the 7900GTX in crossfire, although it's not quite working correctly. Either Nvidia has more optimization to do, or the ATI cards are better built for today's games.


Amazing. I wonder if they'll do mem-controller tweaks and X1xx cards will see a boost
like after Q4 came out. :D
That would be nice. My card does pretty well with this game, but could always use some extra help. :p
 
This one is funny, the NV cards ain't even able to reach ATI's min fps

Oblivion Performance 1600x1200x32 - Foliage Area 4xAA/8xAF
Code:
Card                       Min FPS Max FPS 
GeForce 7900 GTX SLI       35      47 
Radeon X1900 XTX CrossFire 48      57

What's not so funny is that Oblivion is unplayable with crossfire; just have to hope they have it fixed in the 6.4 drivers.
 
Spank said:
What's not so funny is that Oblivion is unplayable with crossfire; just have to hope they have it fixed in the 6.4 drivers.


Hopefully.
There are almost 100 people out there counting on it! :rolleyes:

;)
 
ATI cards have always had better image quality in "my eyes", and a lot of people usually agree with this. Usually these cards are neck and neck.

Something to consider: Microsoft doesn't like Nvidia too much right now. Maybe ATI cards get more optimization in DX games, and that's why the better Oblivion benchmarks. Who knows these things. Nvidia does better with John Carmack's games, which are OpenGL, but there are more DX games than OpenGL games.

If the price is similar I would get an ATI card. If you choose Nvidia because you're brainwashed, that's good too.
 
http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html

This is odd, they show the two competing cards giving and taking in different areas. In this one ATI crushes the indoor results, but Nvidia has a much higher minimum FPS outdoors. Either ATI crossfire is more whacked than we thought, or this game is impossible to benchmark properly.

One thing I noticed about the game on my Nvidia card is that enabling HDR doesn't seem to impact FPS on my system. There's gotta be something wrong in the driver for that to happen, right? Shouldn't I take an FPS hit with HDR? I think I'll try disabling Nvidia's dual core optimizations tonight, since they can have a negative impact on multithreaded games.
 
Cabezone said:
http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html

This is odd, they show the two competing cards giving and taking in different areas. In this one ATI crushes the indoor results, but Nvidia has a much higher minimum FPS outdoors. Either ATI crossfire is more whacked than we thought, or this game is impossible to benchmark properly.

One thing I noticed about the game on my Nvidia card is that enabling HDR doesn't seem to impact FPS on my system. There's gotta be something wrong in the driver for that to happen, right? Shouldn't I take an FPS hit with HDR? I think I'll try disabling Nvidia's dual core optimizations tonight, since they can have a negative impact on multithreaded games.

I think you should take another look at the graph
 
entre nous said:
I think you should take another look at the graph

I should have specified that I was talking about dual card setups at 1600x1200. The crossfire setup has a higher average than its single card version, but a much lower minimum, outdoors.
 