Mav451 said: Lol, quack/quake actually working in favor of ATi. I dunno if this is irony in its finest moment...
Not really like the Quack incident, since the problem here was with the game. It's like forcing widescreen by modifying the INI.
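For anyone who hasn't done that particular hack: it's just a couple of lines in the [Display] section of Oblivion.ini (the resolution here is only an example for a 1920x1200 panel):

[Display]
iSize W=1920
iSize H=1200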
John Reynolds said: What I find odd is why a 7900 GTX would lose almost 33% of its frame rate going from 10x7 to 12x10 with 4x AA and 8x AF enabled?
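Back-of-the-envelope: 12x10 is 1280x1024 = 1,310,720 pixels versus 1024x768 = 786,432 for 10x7, about 67% more. If the card were purely pixel-bound you'd expect frame rate to drop to 786,432 / 1,310,720 ≈ 60% of what it was, i.e. a ~40% loss, so a 33% hit with 4x AA and AF stacked on top isn't actually outside what raw pixel scaling predicts.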
killerD said: So the 1900XTX now takes the cake on this, BF2, and FEAR (at least at higher resolutions). Those are the three most advanced engines out there. Throw in superior IQ, and it is time to acknowledge the 1900XTX is the fastest card you can buy.
razor1 said: ATi fillrate advantage or nV fillrate bug showing up again. I'll take the first, though
Jbirney said: I thought they had the same fillrate????
zzzVideocardzzz said: eww 10x7 that sh1t is nasty
acidic said: Since when is it called trolling when we point out obvious bias in your analysis, OP?
Cabezone said: See, the nice thing about not being an ATI or Nvidia worshipper like yourself is that I can revise and change my opinion. I don't feel the need to change my original posts like some people do. I also tend to read through a thread before commenting, so I don't look like an ass.
Cabezone said: dual card setups.
Matas said: And it seems that Oblivion wants more than 2GB of memory.
killerD said: So the 1900XTX now takes the cake on this, BF2, and FEAR (at least at higher resolutions). Those are the three most advanced engines out there. Throw in superior IQ, and it is time to acknowledge the 1900XTX is the fastest card you can buy.
entre nous said: This is a joke, right?
Oblivion is probably the worst engine ever programmed. Its performance is a joke. It's a console port built for the X360 (ATI chipset) and will never be used again, unless Bethesda releases expansions.
You can judge a card's performance on those three games, but I don't see any future games using those "advanced" engines. RTCW and Quake Wars are two blockbuster titles using the Doom 3 engine. For everyone who likes HL2, Valve is releasing 4 new expansions using the Source engine with HDR and new eye candy.
Unless there is a FEAR 2, I wouldn't worry that ATI is faster in those 3 games.
Its programming is not that far off. It's a lot better than a shitload of people expected. If you ever played Morrowind you would understand. What really sends performance on a rollercoaster is the SpeedTree rendering, even on the 360. The game was made PC first, and it being a port is a poor excuse.
Brent_Justice said: I am
I'm doing a roundup of 3 MSI cards
You'll have info for a 7800 GTX 512, X1800 XT and X1900 XTX in Oblivion at 4:3 resolution and 16:10 widescreen resolution when I'm done
Great, looking forward to it
Cabezone said: See, the nice thing about not being an ATI or Nvidia worshipper like yourself is that I can revise and change my opinion. I don't feel the need to change my original posts like some people do. I also tend to read through a thread before commenting, so I don't look like an ass.
razor1 said: ATI's memory controller gives it a fillrate and bandwidth advantage, theoretical numbers aside.
ivzk said: Your best bet would be not to look like an ass right off the bat. Usually keeps the flames and non-relevant posts to a minimum.
Cabezone said: Actually the only way to protect myself from fanATIcs like yourself is by hiding in a dark corner and unplugging my computer. I choose not to, however, and won't let you fools chase me away.
Jbirney said: Well, theoretically NV has the slight advantage; like you say, in real life ATI's is more efficient. But you did not state which one you are talking about. Still not 100% sure why 3DMark's single-texture fillrate test shows the NV parts way out in front...
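Going from the paper specs (from memory, so worth double-checking): both chips run at 650 MHz, but the TMU counts differ, which would explain the 3DMark single-texture result:

7900 GTX:  650 MHz x 24 TMUs ≈ 15.6 Gtexels/s texture fillrate; 16 ROPs → 650 MHz x 16 ≈ 10.4 Gpixels/s
X1900 XTX: 650 MHz x 16 TMUs ≈ 10.4 Gtexels/s texture fillrate; 16 ROPs → 10.4 Gpixels/s

Identical pixel fillrate on paper, but NV's extra TMUs put it way ahead on single-texture fillrate.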
texuspete00 said: !!!!!!s are too funny
"UPDATE: It's come to our attention that CrossFire support can be forced by renaming the Oblivion.exe executable file to "AFR-FriendlyD3D.exe". We've confirmed that this fix works and we're in the process of re-running our CrossFire benchmarks now. We'll have updated performance numbers for CrossFire shortly. "
I wonder if this will be fixed in the next driver release.
I'm SLIing GTs for this game, though apparently from some other threads I'm an ATi !!!!!!. I didn't know. I'm sure it's been posted, but it's too funny. Use logic and you always get called a !!!!!!.
ivzk said: The title of the thread states, and I quote
" Good Oblivion Benchmarks on Firingsquad "
After that you state, again I quote
" Ati eeks out a win in the single card, but Nvidia absolutely crushes them in the dual card setups "
Your Nvidia bias is clear as day. Nobody is trying to chase Nvidiots like yourself into a dark corner. Why should they? Threads like yours put a smile on many people's faces.
PS
If I had made a thread with the title you used, and then proceeded to state
" ATI absolutely crushes Nvidia in single card benchmarks, and SLI barely beats a single ATI card."
I would most definitely look like an ass without twisting the facts much more than you did.
However, this is your thread, started by you. Put two and two together.
Anyways, could someone with a CrossFire setup try this and let us know if it actually works?
http://rage3d.com/board/showthread.php?p=1334274801#post1334274801
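For anyone trying it, the rename is a single command from the game's folder (the path here assumes a default install; keep a note of the original name so you can rename it back):

cd "C:\Program Files\Bethesda Softworks\Oblivion"
ren Oblivion.exe AFR-FriendlyD3D.exe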
ivzk said: The poster of the thread I linked to isn't suggesting you rename the Oblivion executable to AFR-FriendlyD3D.exe. He's suggesting you rename it to the executable of a known game that works in AFR mode. Maybe fear.exe. That's what I got out of it, anyways.
Hey texaspete, did you make that quote up, or is that in the FiringSquad article? Can't get to it from work.
EDIT: After re-reading a couple of times, I think the OP in the Rage3D thread IS suggesting renaming the Oblivion executable to AFR-FriendlyD3D.exe.
Cabezone said: They updated it with CrossFire benchmarks. Even the X1800 XT beats the 7900 GTX in CrossFire, although it's not quite working correctly. Either Nvidia has more optimization to do, or the ATI cards are better built for today's games.
Card                         Min FPS   Max FPS
GeForce 7900 GTX SLI            35        47
Radeon X1900 XTX CrossFire      48        57
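That works out to roughly a 37% higher minimum for CrossFire (48/35 ≈ 1.37) and a 21% higher maximum (57/47 ≈ 1.21), and the minimum is the number that matters for playability.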
Spank said: What's not so funny is that Oblivion is unplayable with CrossFire; just have to hope they have it fixed in the 6.4 drivers.
John Reynolds said: http://www.beyond3d.com/forum/showpost.php?p=733800&postcount=183
Gee, dare I try this at 1920x1200 with 4x AA and HDR enabled on two X1800 XTs?
Cabezone said: http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html
This is odd; they show the two competing cards giving and taking in different areas. In this one ATI crushes the indoor numbers, but Nvidia has a much higher minimum FPS outdoors. Either ATI's CrossFire is more whacked than we thought, or this game is impossible to benchmark properly.
One thing I noticed about the game on my Nvidia card is that enabling HDR doesn't seem to impact FPS on my system. There's gotta be something wrong in the driver for that to happen, right? Shouldn't I take an FPS hit with HDR? I think I'll try disabling Nvidia's dual-core optimizations tonight, since they can have a negative impact on multithreaded games.
entre nous said: I think you should take another look at the graph.