Matas said:Which card do you think will be faster in Oblivion: 7800 GTX 512MB or X1800XT?
In every way you will be able to notice, they will most likely be virtually identical.
PWMK2 said:I wouldn't specifically look at the X360 version to make comparisons in that sense.
For instance, the PC version of COD2 traditionally works better with nVidia cards (heck, they ran a big COD2 promo with nVidia a while back), but the X360 has an ATI chip in it.
werrrd said:Actually, an X1800XT or X1900 series card owns CoD2 pretty hard.
PWMK2 said:What I am trying to say is that nVidia cards work slightly better for COD2 than ATI cards...
dderidex said:Same reason I'd guess 'Oblivion' will do better on a dual-core CPU vs. a faster single-core (well, first, the devs specifically SAID the game was heavily multithreaded - but, again, consider the XBox360. 3 cores! Any game optimized to run best on that is clearly going to run better with 2 cores on the PC than just one!)
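The scaling argument here is easy to sketch in miniature. The toy C++ below (a hypothetical workload, nothing from Oblivion's actual engine) splits one frame's worth of work across one, two, or three threads and times it; on a dual-core machine the two-thread split lands near half the single-thread time, while a third thread adds little, which is the shape of the claim being made.

```cpp
// Illustrative only: a toy "frame workload" split across worker threads,
// in the spirit of the point that an engine threaded for the Xbox 360's
// three cores should scale from one PC core to two.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

// A stand-in for per-frame work (AI, physics, sound) that can be
// partitioned into independent chunks.
static double busy_work(int iterations) {
    double acc = 0.0;
    for (int i = 0; i < iterations; ++i)
        acc += static_cast<double>(i % 97) * 0.5;
    return acc;
}

// Run the same total workload on `threads` worker threads and time it.
static double run_frame(int total_iterations, int threads) {
    using clock = std::chrono::steady_clock;
    auto start = clock::now();

    std::vector<std::thread> workers;
    std::vector<double> results(threads, 0.0);
    for (int t = 0; t < threads; ++t)
        workers.emplace_back([&, t] { results[t] = busy_work(total_iterations / threads); });
    for (auto& w : workers) w.join();

    // Keep the result live so the work isn't optimized away.
    volatile double sink = std::accumulate(results.begin(), results.end(), 0.0);
    (void)sink;

    return std::chrono::duration<double, std::milli>(clock::now() - start).count();
}

int main() {
    const int total = 200'000'000;  // hypothetical per-frame workload
    for (int threads : {1, 2, 3}) {
        double ms = run_frame(total, threads);
        std::printf("%d thread(s): %.1f ms\n", threads, ms);
    }
    // On a dual-core CPU the 2-thread split runs roughly twice as fast as
    // 1 thread; the 3-thread split gains little more, since only two cores exist.
}
```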
dagon11985 said:Some websites have a clear-cut bias. I've read reviews that have put the 5200 in the same league as the 9700... obviously the 5200 was never on par with the 9700...
Simply put - don't think every review you read is golden. People do b.s.
It better be the X1900. The game was built on ATI and it's probably in ATI's Get in the Game program.
Sharky974 said:I'd sooner say that the later the game, the better it will probably run on the X1900 vs. the 7800. So Oblivion should, theoretically, go in favor of ATI.
Stereophile said:Actually, they admitted in an interview that you're not going to notice the difference between single- and dual-core performance on PC.
That is an interesting statement. Since the game is supposedly heavy on PS3.0, and ATI just recently released a video card that supports SM3.0, I would figure it would have been developed on an nVidia card.
roflcopter said:The game was built on ATI and it's probably in ATI's Get in the Game program.
dderidex said:ATI *totally* pwnz0rs nVidia in 'Call of Duty 2' - it's not even in the same *ballpark*. Heck, the GTX-512 (roughly comparable to the X1900 generally) only barely manages to edge out the X1800XT! The GTX-256 is hopelessly outclassed, getting not even two-thirds of an X1800XT's performance at 1600x1200 with 4xAA.
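To make the "not even two-thirds" figure concrete, here is a back-of-the-envelope check with placeholder frame rates; the real review numbers aren't in this thread, so these figures are assumptions, not benchmarks.

```cpp
// Hypothetical CoD2 numbers at 1600x1200 4xAA, standing in for the
// review data dderidex is paraphrasing.
#include <cstdio>

int main() {
    const double x1800xt_fps = 45.0;  // assumed baseline
    const double gtx256_fps  = 28.0;  // assumed
    const double gtx512_fps  = 47.0;  // assumed

    std::printf("GTX-256 vs X1800XT: %.0f%%\n", 100.0 * gtx256_fps / x1800xt_fps);
    std::printf("GTX-512 vs X1800XT: %.0f%%\n", 100.0 * gtx512_fps / x1800xt_fps);
    // 28/45 = 62%, i.e. under the two-thirds (66.7%) mark cited above;
    // 47/45 = 104%, a card that "only barely edges out" the X1800XT.
}
```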
dderidex said:Or - alternatively - maybe....just maybe....you shouldn't dismiss an argument out of hand because you don't like the position, without first doing a little bit of research on it. Which is, of course, to say that something being optimized for an XBox360 says *VERY MUCH* about which video card it will run better on when ported to PC (the most XBox360-like GPU....otherwise known as the 'Radeon X1900').
Elder Scrolls 3: Morrowind is part of ATI's GITG program, that's for sure. I read somewhere that Oblivion devs are/were using ATI 9800 cards.
kcthebrewer said:That is an interesting statement. Since the game is supposedly heavy on PS3.0, and ATI just recently released a video card that supports SM3.0, I would figure it would have been developed on an nVidia card.
altcon said:ATI will most probably own Oblivion, as it seems the demos shown to reviewers were running on X1900XT machines. If nVidia had an edge here, they most probably would've been in the rigs for the showcase....
Bad news is... they were running 1280x720 (if I'm not mistaken), no AA, no AF, and you still got some stutters. What does that tell you about Oblivion? Probably just like Morrowind when it came out... some very unoptimised code (remember the ugly, awful shadows which killed most GPUs at the time?).
sabrewolf732 said:Um, I have a question for the OP: why the 1800? Why not the 1900? I'm willing to bet the 1800/1900 will do better though, basing that off of how they do in shader-intensive games now.
Stereophile said:I'd like to know how the X1800 line compares to the X1900 in this game. 20-30% behind, or is it massive? @1280x1024
sculelos said:First, the videos you've seen are from a buggy December build.
Second, they were using X1800XTs with a 3.4GHz Pentium processor and 2GB of RAM, and they were playing at a high resolution like 1600x1200 or 1920x1200 with everything on high/max. That was with the recent preview; the reviewers have said area load times are quick and the frame rate only bogs down when there are a lot of enemies on screen (like getting the whole Mages Guild upset at you and going out into the streets). I'm sure there will be settings you can tone down if you want a better frame rate.
Apple740 said:Just like they did in FEAR.
From the Oblivion FAQ:
Radeon X1900XTX, PCI-e with 512MB video RAM - This is unquestionably the most powerful card in existence, and will remain so until the next generation of cards arrives, which will be after the release of Oblivion. It handles pixel shaders far better than any GeForce card, and can handily defeat a GeForce 7800GTX SLI setup in F.E.A.R. Oblivion is a shader-intensive game as well, so you should expect close to the same thing.
http://www.elderscrolls.com/forums/index.php?showtopic=250534
dderidex said:You do realize, right, that you are posting from a document composed by a forum member that is OUTRIGHT WRONG in places? For example, it claims FP16 HDR can be run with FSAA on an SM2.0 card, because "that's what RTHDRIBL does".
Which is obviously 100% wrong.
Apple740 said:It could, but it requires blending in the shader = crappy performance. I was indeed very surprised to read that, because it is not realistic.
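Apple740's point, in concrete terms: hardware that can't alpha-blend into an FP16 render target can emulate blending by "ping-ponging" between two targets, with the shader reading the previous contents and doing the blend math itself. The C++ below is a CPU-side model of that read-modify-write pattern, not real D3D code; the buffer sizes and blend factor are made up for illustration.

```cpp
// Illustrative model of "blending in the shader": when the GPU can't
// alpha-blend into an FP16 target, the shader must read the previous
// contents from a second copy of the buffer (ping-pong) and blend manually.
#include <cstdio>
#include <vector>

struct Pixel { float r, g, b; };

// One "draw" of manual blending. A hardware blend unit would combine the
// incoming fragment with the target in one pass; here the "shader" does
// the read-modify-write itself, costing an extra full-screen texture
// fetch per blended pass -- the bandwidth hit behind "crappy performance".
void blend_pass(const std::vector<Pixel>& previous,  // last target, bound as a texture
                const std::vector<Pixel>& incoming,  // new fragments
                std::vector<Pixel>& output,          // the other target
                float alpha) {
    for (size_t i = 0; i < previous.size(); ++i) {
        output[i].r = previous[i].r * (1.0f - alpha) + incoming[i].r * alpha;
        output[i].g = previous[i].g * (1.0f - alpha) + incoming[i].g * alpha;
        output[i].b = previous[i].b * (1.0f - alpha) + incoming[i].b * alpha;
    }
}

int main() {
    const size_t pixels = 1280 * 720;  // illustrative resolution
    std::vector<Pixel> targetA(pixels, {0.2f, 0.2f, 0.2f});
    std::vector<Pixel> fragments(pixels, {1.0f, 0.8f, 0.4f});
    std::vector<Pixel> targetB(pixels);

    // Each transparent layer forces another ping-pong between A and B.
    blend_pass(targetA, fragments, targetB, 0.5f);
    blend_pass(targetB, fragments, targetA, 0.5f);

    std::printf("pixel 0 after two layers: %.2f %.2f %.2f\n",
                targetA[0].r, targetA[0].g, targetA[0].b);
}
```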
sculelos said:Yes, and he also said that one X1900XTX is faster than 7800GTX SLI, which is simply not true. True, in certain instances it performs about the same, but that's only in certain circumstances, like F.E.A.R. at 16x10 with 4xAA/8xAF and soft shadows. If you lower the AA or the resolution, the 7800GTX SLI easily becomes faster.