zzzVideocardzzz (Weaksauce) - Joined: Mar 23, 2006 - Messages: 69
I'm guessing an X1900XTX would score 36 or 37 in the 1600x1200 high-end benchmark, since they didn't include it; that's almost on par with 2x 7900 GTX.
R1ckCa1n said: People continue to buy 7900s, not realizing ATI's older offering can keep up with or beat NV's current offering in most new games, and they don't care. Not to mention the IQ differences.
Lord_Exodia said: I guess I'll be the first to say this, and I'll only say it because I'm not fully convinced or sold on many of the things that are practically force-fed to us in this forum sometimes.
First, to the OP (original poster):
My SLI 7900 GTX rig gets 50% better results than that, and with a modified .ini file, about 75% better (typical tweaks of that sort are sketched after this post). Those results are a bit low on the 7900 side.
Now, the real comment I want to make: many people are saying that ATI's new architecture is faster in shader-intense games. The only game where I have really noticed this is Oblivion, and Oblivion, IMHO, is very buggy at this point, at least on 7900 hardware. I've been told by Bethesda support that the game currently doesn't support these cards and is using a 7800 rendering path, which is very similar but not quite the same thing; when the patch comes out, that will be addressed. In addition, the game has crashing issues and all kinds of performance problems on the 7900 cards. Sometimes I get low fps out of nowhere in non-intense moments, then suddenly great performance. I believe this comes down to bugs in NVIDIA's drivers, a lack of Oblivion optimizations in the driver, game support issues on 7900 GPUs, or a combination of the three.
Other than Oblivion, where are these examples of the 7900GTX losing in shader-intense games? FEAR has been mentioned: turn on soft shadows and run another benchmark, and NVIDIA would probably win. Finally, as for the benchmark on GameSpot, I bet they really did see those results, but if they had run around for a few more seconds their fps would have gone back up, giving more favorable numbers and reinforcing what I've noticed in this game.
Will the 7900GTX end up better than the X1900XTX in Oblivion? Probably not.
Do I think it can do better in Oblivion and come close to the X1900XTX's single and CrossFire performance? Absolutely.
Does ATI have the better architecture for future games? Although I seem to be force-fed this in these forums, I'm not sold yet.
I'm not a !!!, but a logical demonstration to the contrary is welcome.
Final point: lately the X1800XT's architecture seems to be keeping up with the 7900GTX in this game, and if that isn't an indication of a problem with the game and the 7900s, I don't know what is. Remember, the X1800XT does not have the R580 architecture.
It should be inferior to the 7900GTX in this game, but it isn't. There is no special advantage in that architecture, which further reinforces my belief that this is an Oblivion-and-7900 issue rather than an advantage of the R580 architecture. Time will tell for sure, but for the present I'm just stating what I've noticed.
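For anyone wondering what a "modified ini" means here: the poster doesn't list his exact changes, but Oblivion reads its settings from Oblivion.ini (under My Documents\My Games\Oblivion), and performance tweaks of that era typically adjusted cell buffering and preload values. A minimal sketch of the kind of edits involved follows; the values shown are the game's documented defaults and are illustrative only, not his actual settings:

    [General]
    ; how many exterior cells are loaded around the player (default 5)
    uGridsToLoad=5
    ; cells kept buffered in memory; raising these trades RAM for fewer hitches
    uInterior Cell Buffer=3
    uExterior Cell Buffer=36
    ; preload cache size in bytes (default is roughly 26 MB)
    iPreloadSizeLimit=26214400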
zzzVideocardzzz said: We're not here to compare your computer, we're not here to use a "custom ini", and most of all we're not here to compare SLI vs. CF.
J-Mag said: I bought two 7900GTXs not because of marketing, but because of engineering... I can't use CF on my mobo, so SLI is my only option.
Anyway, I heard the Chuck patch creates texture flashing in CF mode. Is this true?
Same for me, although I sold my 7800GTX for too little.
fallguy said: Funny to see some people just can't believe NV isn't the best, and that the benches must be wrong.
I too had an SLI board, but changed it out for a Crossfire board. Now I have a better motherboard and sweeter graphics.
No, the Chuck patch does not create texture flashing; that was from renaming the .exe to get Crossfire working in Oblivion. You no longer have to rename it with the 6.4 drivers, and thus no more flashing textures.
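As for why renaming the .exe ever worked: the Catalyst drivers of that period chose a CrossFire rendering mode by matching the executable name against a built-in profile list, so renaming a game's .exe to one with a known AFR profile forced AFR on. A rough illustration from the Windows command line; the commonly cited target was AFR-FriendlyD3D.exe, but treat the exact name as an assumption:

    ren Oblivion.exe AFR-FriendlyD3D.exe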
Xeero said: Thank you for your subjective post.
Xeero said: Better motherboard? The way I see it, the CF board is either on par with or worse than nForce4 boards. I would not consider either chipset better.
Xeero said: Sweeter graphics? Define sweeter graphics. Eye candy? Frame rate? Once again, it depends on the game which one performs better; most of the time they perform on par with each other. Are you talking about ATI's advantage with HDR+AA?
Xeero said: I'm not sure if you've noticed, but even CF 1900s with HDR+AA will chug while running Oblivion at 1920x1200 with all the in-game settings turned up. IMO a graphics card needs to sustain a pretty good frame rate before having all the eye candy turned up.
Xeero said: What good are the features if you can't even use them at a decent frame rate? And no, don't tell me to turn down the resolution; I didn't get a sweetass LCD just so I can run games at 1280x1024 with full AA+HDR at a good frame rate.
Xeero said: And HQ AF? According to the screenshots posted by [H], the AF difference is barely noticeable between the two.
I'm not trying to be biased against one card; I'm just saying both cards offer pretty much the same thing, so I don't understand why there is a need to prove ATI superior.