Some Oblivion benchmark

zzzVideocardzzz

Benchmarks: I'm guessing an X1900XTX would score 36 or 37 in the 1600x1200 high-end benchmark, since they didn't include it. That's almost on par with 2x 7900 GTX.
 
Crossfire didn't work with the game until about a week ago. What should be noticed is that an X1800XT keeps up with a 7900GTX. It shows what good marketing Nvidia has over ATI right now. People continue to buy 7900s, not realizing ATI's older offering can keep up with or beat NV's current offering in most new games, and they don't care. Not to mention the IQ differences.


To give you an idea, my X1900s in Crossfire never drop below 50-55 fps with the "Chuck" patch.
 
Every time I'm ready to upgrade, a new game or mode of operation appears and current-generation cards struggle. I stopped gaming over 3 years ago and I want to come back, but there is no way I'm spending big dollars on a new game rig that plays a current-gen game at a mediocre frame rate on max settings. I've grown patient in my older age, and I guess I'm going to wait again to see what the market offers by the end of 2006.

/say 'no' to low FPS :p
 
R1ckCa1n said:
People continue to buy 7900s, not realizing ATI's older offering can keep up with or beat NV's current offering in most new games, and they don't care. Not to mention the IQ differences.

I bought two 7900GTXs not because of marketing, but because of engineering: I can't use CF on my mobo, so SLI is my only option.

Anyway, I heard the Chuck patch creates texture flashing in CF mode. Is this true?
 
I guess I'll be the first to say this, and I'll only say it because I'm not fully convinced or sold on many of the things that are practically force-fed to us in this forum sometimes.

First, to the OP (original poster):

My SLI 7900 GTX rig gets 50% better results than that; with a modified ini file, about 75% better. Those results are a bit low on the 7900 side.

Now, the real comment I want to make: many people are saying that ATI's new architecture is faster in shader-intense games. The only game where I've really noticed this is Oblivion, and Oblivion, IMHO (in my honest opinion), is very buggy at this point, at least on 7900 hardware. I've been told by Bethesda support that the game currently doesn't support these cards and uses a 7800 rendering path, which is very similar but not quite the same thing. When the patch comes out, that will be addressed. In addition, the game has crashing issues and all kinds of performance problems on the 7900 cards. Sometimes I get low fps out of nowhere in non-intense moments, then suddenly great performance. I believe there are bugs in Nvidia's drivers, a lack of Oblivion optimizations in the driver, game-support issues on 7900 GPUs, or a combination of the three.

Other than Oblivion, where are these examples of the 7900GTX losing in shader-intense games? FEAR has been mentioned; turn on soft shadows and run another benchmark, and Nvidia would probably win. Finally, regarding the GameSpot benchmark, I bet they really did see those results, but if they had run around for a few more seconds, their fps would have gone up, giving more favorable numbers and reinforcing what I've noticed in this game.

Will the 7900GTX end up better than the X1900XTX in Oblivion? Probably not.

Do I think it can do better in Oblivion and come close to X1900XTX single and Crossfire performance? Absolutely.

Does ATI have a better architecture for future games? Although I seem to be force-fed this in these forums, I'm not sold yet.

I'm not a !!!, but a logical demonstration to the contrary is welcome.

Final point: lately, the X1800XT seems to be keeping up with the 7900GTX in this game. If that isn't an indication of a problem with the game and the 7900s, I don't know what is. Remember, the X1800XT does not have the R580 architecture. It should be inferior to the 7900GTX in this game, but it isn't. There is no advantage in that architecture, which further reinforces my belief that this is an Oblivion-and-7900 issue, not an advantage of the R580 architecture. Time will tell for sure, but for now I'm just stating what I've noticed.
 
Lord_Exodia said:
I guess I'll be the first to say this, and I'll only say it because I'm not fully convinced or sold on many of the things that are practically force-fed to us in this forum sometimes.

First, to the OP (original poster):

My SLI 7900 GTX rig gets 50% better results than that; with a modified ini file, about 75% better. Those results are a bit low on the 7900 side.

Now, the real comment I want to make: many people are saying that ATI's new architecture is faster in shader-intense games. The only game where I've really noticed this is Oblivion, and Oblivion, IMHO (in my honest opinion), is very buggy at this point, at least on 7900 hardware. I've been told by Bethesda support that the game currently doesn't support these cards and uses a 7800 rendering path, which is very similar but not quite the same thing. When the patch comes out, that will be addressed. In addition, the game has crashing issues and all kinds of performance problems on the 7900 cards. Sometimes I get low fps out of nowhere in non-intense moments, then suddenly great performance. I believe there are bugs in Nvidia's drivers, a lack of Oblivion optimizations in the driver, game-support issues on 7900 GPUs, or a combination of the three.

Other than Oblivion, where are these examples of the 7900GTX losing in shader-intense games? FEAR has been mentioned; turn on soft shadows and run another benchmark, and Nvidia would probably win. Finally, regarding the GameSpot benchmark, I bet they really did see those results, but if they had run around for a few more seconds, their fps would have gone up, giving more favorable numbers and reinforcing what I've noticed in this game.

Will the 7900GTX end up better than the X1900XTX in Oblivion? Probably not.

Do I think it can do better in Oblivion and come close to X1900XTX single and Crossfire performance? Absolutely.

Does ATI have a better architecture for future games? Although I seem to be force-fed this in these forums, I'm not sold yet.

I'm not a !!!, but a logical demonstration to the contrary is welcome.

Final point: lately, the X1800XT seems to be keeping up with the 7900GTX in this game. If that isn't an indication of a problem with the game and the 7900s, I don't know what is. Remember, the X1800XT does not have the R580 architecture.
It should be inferior to the 7900GTX in this game, but it isn't. There is no advantage in that architecture, which further reinforces my belief that this is an Oblivion-and-7900 issue, not an advantage of the R580 architecture. Time will tell for sure, but for now I'm just stating what I've noticed.

We're not here to compare your computer, we're not here to use a "custom ini," and most of all we're not here to compare SLI vs CF.
 
zzzVideocardzzz said:
We're not here to compare your computer, we're not here to use a "custom ini," and most of all we're not here to compare SLI vs CF.

Then tell me why we are here. You're the original poster, correct? The original post specifically compared SLI to the X1900XTX. Isn't that the point of your thread?

I'm just stating things that need to be said. The benchmarks look the way they do for a reason, and I believe I know why. I'm just laying all the cards on the table, since some people could be misled by your post.

If you feel I'm violating any rules in your thread, set some ground rules in the initial post. A forum is just that: a place for people to prove their point or state their argument. Sorry if you feel I crossed a line, but I'm within the guidelines of your initial post.

If you want no in-between posts from unbiased people, or if you don't want any favorable Nvidia posts, you may want to post these types of threads in the ATI subforum. If you want subjective replies, then you posted accordingly.
 
Funny to see some people just can't believe NV isn't the best, and insist the benches must be wrong.

J-Mag said:
I bought two 7900GTXs not because of marketing, but because of engineering: I can't use CF on my mobo, so SLI is my only option.

Anyway, I heard the Chuck patch creates texture flashing in CF mode. Is this true?

I too had an SLI board, but swapped it for a Crossfire board. Now I have a better motherboard and sweeter graphics.

No, the Chuck patch does not create texture flashing; that came from renaming the .exe to get Crossfire working in Oblivion. You no longer have to rename it with the 6.4 drivers, and thus no more flashing textures.
 
fallguy said:
Funny to see some people just can't believe NV isn't the best, and insist the benches must be wrong.

I too had an SLI board, but swapped it for a Crossfire board. Now I have a better motherboard and sweeter graphics.

No, the Chuck patch does not create texture flashing; that came from renaming the .exe to get Crossfire working in Oblivion. You no longer have to rename it with the 6.4 drivers, and thus no more flashing textures.
Same for me, although I sold my 7800GTX for too little...
 
Xeero said:
Thank you for your subjective post.

Thanks.

Xeero said:
Better motherboard? The way I see it, the CF board is either on par with or worse than nForce4 boards. I would not consider either chipset better.

Well, there is the way you see it, and the way I and most reviewers see it. I had the A8N-SLI, and it's a great board. However, the A8R-MVP is better to me and to most reviewers. Asus fixed a few key errors in the layout, and the A8R overclocks better, keeping HTT at 5x rather than forcing you to drop it. The chipset also runs cooler. So yes, I think it's better.

Xeero said:
Sweeter graphics? Define "sweeter graphics": eye candy? Frame rate? Once again, it depends on the game which one performs better; most of the time they perform on par with each other. Are you talking about ATI's advantage with HDR+AA?

Sweeter meaning I can run higher settings than SLI'd 7900s, with better frames. It also means HDR+AA; that's sweeter to me. This thread is about Oblivion, not any other game.

Xeero said:
I'm not sure if you've noticed, but even CF X1900s with HDR+AA will chug running Oblivion at 1920x1200 with all the in-game settings turned up. IMO a graphics card needs to sustain a pretty good frame rate before you turn up all the eye candy.

Considering I have X1900s in Crossfire and a 2405FPW at 1920x1200, I think I know how it plays. It certainly doesn't chug. All I need in Oblivion is around 30 frames, and I get more than that. It's not a Quake-type shooter; you don't need a consistent 60+ frames.

Xeero said:
What good are the features if you can't even use them at a decent frame rate? And no, don't tell me to turn down the resolution. I didn't get a sweet LCD just so I could run games at 1280x1024 with full AA+HDR at a good frame rate.

I can use them at a decent frame rate. One person's "decent" is not the same for everyone. I have the option to use HDR+AA in Oblivion, and it is very playable. And it looks better; again, that's sweeter.

Xeero said:
And HQ AF? According to the screenshots posted by [H], the AF difference is barely noticeable between the two.

I'm not trying to be biased against one card. I'm just saying both cards offer pretty much the same thing, so I don't understand why there is a need to prove ATI superior.

Then you need to look again. HQ AF provides a better picture than NV can manage. Again, it's subjective.

I wasn't trying to prove anything. He said he had an SLI board, so SLI was his only option. What I said was, I sold mine and got a CF mobo, and I'm not the only one here who did that. I already commented on the "sweeter" graphics. I think you'll be in the vast minority if you don't think ATI has better graphics in Oblivion, given the simple fact that X1900s run faster, letting you enable higher settings and/or resolution, and the fact that you can enable HDR+AA.
 
I was reading this and almost died laughing. If the OP really thinks an X1900XTX is as fast as two 7900GTXs, then all I have to say is that ATI is the one with better marketing... more like brainwashing powers, to make someone actually believe that.
Then comparing the X1800 to a 7900GTX is just disrespectful. Like it or not, the X1900 and 7900 are evenly matched cards that beat each other in different games by small fps margins, so all this "this owns that" is totally childish.
Both are great cards. Get whichever works better in the games you like and enjoy it!
 
I definitely think Oblivion has crashing issues with the 7900GT(X) cards, although for some reason it usually only crashes on loading/exiting. Anyway, ATI cards are leading in Oblivion right now due to HDR+AA, though I don't care, as I can play at 1280x960 with very good frame rates just fine.
 