Which will be better in Oblivion?

Matas

Which card do you think will be faster in Oblivion: the 7800 GTX 512MB or the X1800 XT?
 
Possibly the latter, though I do recall hearing word of some Shader Model 3.0 features which would naturally only work on the former. I'll hedge my bets that Bethesda hasn't done much with SM3.0, however.

An X1800 XT knocking out a monstrous GTX 512? Maybe. We'll see what sort of tangled web Bethesda has woven on March 20th, I suppose.
 
Only work on the former? X1800XT takes advantage of SM3.0 (save VTF), I'm pretty sure...

Anyway, the X1800XT performs close to a GTX512 in games like CoD2 and FEAR, so it's possible.
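(For what it's worth, here's a rough sketch of how that "save VTF" caveat shows up to a developer - just an illustrative D3D9 caps probe of my own, not anything from Bethesda. The X1800/X1900 report vertex shader 3.0 but expose no vertex-texture formats, while NV40/G70 expose the 32-bit float ones.)

// Hypothetical probe: does this card do SM3.0, and does it do vertex texture fetch?
// Assumes plain D3D9 on the default adapter; link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    printf("Vertex shader 3.0:           %s\n",
           caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) ? "yes" : "no");

    // Vertex texture fetch check: NV40/G70 accept R32F (and A32B32G32R32F),
    // while R520/R580 reject every format here - hence "SM3.0 save VTF".
    HRESULT vtf = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_VERTEXTEXTURE, D3DRTYPE_TEXTURE, D3DFMT_R32F);
    printf("Vertex texture fetch (R32F): %s\n", SUCCEEDED(vtf) ? "yes" : "no");

    d3d->Release();
    return 0;
}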
 
Matas said:
Which card do you think will be faster in Oblivion: the 7800 GTX 512MB or the X1800 XT?
In every way you will be able to notice, they will most likely be virtually identical.
 
Ah, so they are SM3.0 compatible. I was thinking of the X8xx series, for some reason.
 
Bethesda has done a load of work in the area of SM3.0. That is going to be one of the most shader-intensive games released this year.
 
FWIW, Bethesda developed the game for the XBox360 and PC.

Recall that the XBox360 basically has a Radeon X1900 GPU in it - so I'd expect that if there was a difference at all, it'd be towards the ATI card.

Same reason I'd guess 'Oblivion' will do better on a dual core CPU vs faster single-core (well, first, the devs specifically SAID the game was heavily multithreaded - but, again, consider the XBox360. 3 cores! Any game optimized to run best on that is going to clearly run better with 2 cores on the PC than just one!)
 
I wouldn't specifically look at the X360 version to make comparisons in that sense.

For instance, the PC version of COD2 traditionally works better with nVidia cards (heck, they ran a big COD2 promo with nVidia a while back), but the X360 has an ATI chip in it.
 
PWMK2 said:
I wouldn't specifically look at the X360 version to make comparisons in that sense.

For instance, the PC version of COD2 traditionally works better with nVidia cards (heck, they ran a big COD2 promo with nVidia a while back), but the X360 has an ATI chip in it.


Actually, an X1800XT or X1900-series card owns CoD2 pretty hard.
 
werrrd said:
Actually, an X1800XT or X1900-series card owns CoD2 pretty hard.

:rolleyes: Yes... you are correct.

What I am trying to say is that nVidia cards work slightly better for COD2 than ATI cards...
Not saying an X1900XTX wouldn't kick butt. Actually, it'd probably be the best right now. But once 7900GTX comes around, it will probably have slightly better performance in COD2 than the X1900XTX. Simply because they optimized the game for NV cards.
 
PWMK2 said:
What I am trying to say is that nVidia cards work slightly better for COD2 than ATI cards...

Youresmokingwhatnow?

:confused:

ATI *totally* pwnz0rs nVidia in 'Call of Duty 2' - it's not even in the same *ballpark*. Heck, the GTX-512 (roughly comparable to the X1900 generally) only barely manages to edge out the X1800XT! The GTX-256 is hopelessly outclassed, getting not even two-thirds of an X1800XT's performance at 1600x1200 with 4xAA.
 
Some websites have a clear-cut bias. I've read reviews that put the 5200 in the same league as the 9700... obviously the 5200 was never on par with the 9700.

Simply put - don't think every review you read is golden. People do b.s.
 
dderidex said:
Same reason I'd guess 'Oblivion' will do better on a dual core CPU vs faster single-core (well, first, the devs specifically SAID the game was heavily multithreaded - but, again, consider the XBox360. 3 cores! Any game optimized to run best on that is going to clearly run better with 2 cores on the PC than just one!)

Actually, they admitted in an interview that you're not going to notice the difference between single-core and dual-core performance on the PC.
 
dagon11985 said:
Some websites have a clear-cut bias. I've read reviews that put the 5200 in the same league as the 9700... obviously the 5200 was never on par with the 9700.

Simply put - don't think every review you read is golden. People do b.s.

Everyone's numbers are the same!

But that can't be right, because it would mean that random posters on [H] with no firsthand experience are not more knowledgeable than reviewers!

It must be a conspiracy!

Oh, noes!!

Or - alternatively - maybe....just maybe....you shouldn't dismiss an argument out of hand because you don't like the position, without first doing a little bit of research on it. Which is, of course, to say that something being optimized for an XBox360 says *VERY MUCH* about what video card it will run better on when ported to the PC (the most XBox360-like GPU....otherwise known as the 'Radeon X1900').

Not that in games truly developed simultaneously I would expect to see anywhere near such a gap in performance between two manufacturers' high-end cards...but for 'quick & dirty ports'...oh, yeah, there will be a difference.
 
Oh snap, time for the mods to bust out the water hose!!!!!!!!!!!!
 
dderidex said:
...the most XBox360-like GPU....otherwise known as the 'Radeon X1900'.

I wouldn't put too much stock in that. The X360 GPU uses unified shaders.

I'd say, rather, that the later the game, the better it will probably run on the X1900 vs. the 7800. So Oblivion should, theoretically, go in favor of ATI.

It's actually the same reason ATI does well in COD2: because it's recent. FEAR, BF2, COD2, etc. are the games ATI does best in, simply because they're the latest and stress the cards the most.
 
Sharky974 said:
I'd say, rather, that the later the game, the better it will probably run on the X1900 vs. the 7800. So Oblivion should, theoretically, go in favor of ATI.
It better be the X1900. The game was built on ATI hardware and is probably in ATI's Get in the Game program.
 
Stereophile said:
Actually, they admitted in an interview that you're not going to notice the difference between single-core and dual-core performance on the PC.


which is sooooooooooooooooooooooooooooooooooooooo whack >:[~
 
roflcopter said:
The game was built on ATI hardware and is probably in ATI's Get in the Game program.
That is an interesting statement. Since the game is supposedly heavy on PS3.0, and ATI just recently released a video card that supports SM3.0, I would figure it would have been developed on an nVidia card.
 
dderidex said:
ATI *totally* pwnz0rs nVidia in 'Call of Duty 2' - it's not even in the same *ballpark*. Heck, the GTX-512 (roughly comparable to the X1900 generally) only barely manages to edge out the X1800XT! The GTX-256 is hopelessly outclassed, getting not even two-thirds of an X1800XT's performance at 1600x1200 with 4xAA.

dderidex said:
Or - alternatively - maybe....just maybe....you shouldn't dismiss an argument out of hand because you don't like the position, without first doing a little bit of research on it. Which is, of course, to say that something being optimized for an XBox360 says *VERY MUCH* about what video card it will run better on when ported to the PC (the most XBox360-like GPU....otherwise known as the 'Radeon X1900').

Wow. No it doesn't.
1. COD2 was developed for the PC first. As it was being developed, it was ported over to the 360. It wasn't ported after the PC version was already done, but the PC version was their number-one priority. It wasn't a port from the 360 version to the PC; it was the other way around: they made the PC code fit the 360. I'm sure there were snags at points, but I'm sure nobody said,
"Hey guys, since the Xbox 360 version we're porting to has an ATI chip, let's go optimize the PC version for ATI cards even though we're already getting better performance with nVidia cards."
The Xbox 360 has a PowerPC processor in it... are we going to start saying it runs better on Macs?
2. Of course the X1900 series will own every game out there! Also, the GTX-512 is NOT comparable to the X1900XTX. The X1900XTX will generally outperform that card noticeably.
3. The GTX-256 isn't down at two-thirds of an X1800XT's performance in COD2.
4. The chip in the 360 is NOT like an X1900. You can't even call it that. It's an R500-based chip, as opposed to the R580 in the X1900 series. Also, given that the console has 512MB of unified memory (which makes it kind of like a really advanced onboard video chip), and everything runs through unified shaders... it's not anything like it.
 
Regardless, a lot of people have bought ATi X1900s specifically because of results like FEAR and, yes.... CoD2. I wouldn't be surprised if they keep that lead, since it is, relatively speaking, pretty damn big. We'll have to see if NV can find 30% more fps in CoD2. But NV is currently a little faster in this game.... HA! You would think most people posting on here would care enough to have read a few reviews. Could just look at the pretty graphs. :D edit: On the flipside though, it appears a GTX512 does eke out an 1800 sometimes. Unless that's what was meant... hmm.

The CoD2 results influenced me to get an XT. In fact, many are thinking these increased shaders and so on may just be the ticket for future titles a la Oblivion. Should be pretty competitive against the next Nvidia offering, but we'll see.
 
Thanks for agreeing with me. Err... sorry if I confused you. By a GTX512 eking out an 1800, I meant that yes, the 1800XT and the GTX512 are about equal. ATi is generally considered fastest at CoD2 because of the X1900s. I'm just saying, after disagreeing with the NV guy, that if he was talking about the GTX512 vs. the X1800 series then I was ranting about nothing. All I'm talking about is the same graph that's posted.

edit: I revise my comments. A 1900 is 9% faster than a GTX512 in CoD2? Where do YOU come up with this stuff? Maybe at different IQ, like with angle-independent aniso. Otherwise, yeah, you're nipping futz too. :p

Consider next-gen NV vs. ATi for Oblivion, because ATi does beat NV significantly right now in FEAR, CoD2, etc. Seriously, it's about as big as one can expect. I mean, NV's a solid company, of course they weren't going to get blasted into a newer hemisphere even by a slightly newer part, but it ain't a handful of frames, I'll tell you that.
 
Oh, I didn't see those CoD2 benchmarks. Very strange how close all the cards are to each other in those benchmarks. I'm not sure if I can trust them, because I play CoD2 @ 1920x1200 with max settings and I get more than 30fps with my 7800GTX512. I've never heard of some of those sites either.
 
ATi will most probably own Oblivion, as it seems the demos shown to reviewers were running on X1900XT machines. If Nvidia had an edge here, they most probably would've been in the rigs for the showcase....
Bad news is... they were running 1280x720 (if I'm not mistaken), no AA, no AF, and you still got some stutters. What does that tell you about Oblivion? Probably just like Morrowind when it came out... some very unoptimised code (remember the ugly, awful shadows which killed most GPUs at the time?).

I'm just hoping it will be playable at native LCD res with some stuff turned down (and no, I will not upgrade this gen for Oblivion). Also, considering many effects have been cut and the graphics downgraded for release, maybe it will run OK...

About COD2: yes, ATi owns that game, not by a meaningful 30% scenario, but they are generally faster, and I suspect it's a trend we'll live with for a while. Any shader-intensive game will probably run better on ATi, and whenever eye candy is on, it's usually faster on ATi (the only downside I see to ATi right now is the noise factor).
 
Shameless plug ;)

We did an interview with Pete Hines regarding Oblivion. Check it out here.
 
kcthebrewer said:
That is an interesting statement. Since the game is supposedly heavy on PS3.0, and ATI just recently released a video card that supports SM3.0, I would figure it would have been developed on an nVidia card.
The Elder Scrolls III: Morrowind is part of ATI's GITG program, that's for sure. I read somewhere that the Oblivion devs are/were using ATI 9800 cards.
 
altcon said:
ATi will most probably own Oblivion, as it seems the demos shown to reviewers were running on X1900XT machines. If Nvidia had an edge here, they most probably would've been in the rigs for the showcase....
Bad news is... they were running 1280x720 (if I'm not mistaken), no AA, no AF, and you still got some stutters. What does that tell you about Oblivion? Probably just like Morrowind when it came out... some very unoptimised code (remember the ugly, awful shadows which killed most GPUs at the time?).

First, the videos you've seen are from a buggy December build.

Second, they were using X1800XTs with a 3.4GHz Pentium processor and 2GB of RAM, and they were playing at a high resolution like 1600x1200 or 1920x1200 with everything on high/max. But that was the recent preview; the reviewers have said area load times are quick and the frame rate only bogs down when there's a lot of enemies on screen (like getting the whole Mages Guild upset at you and going out into the streets). I'm sure there will be settings you can tone down if you want a better framerate.
 
Um, I have a question for the OP: why the 1800? Why not the 1900? I'm willing to bet the 1800/1900 will do better though, basing that off of how they do in shader-intensive games now.
 
sabrewolf732 said:
Um, I have a question for the OP: why the 1800? Why not the 1900? I'm willing to bet the 1800/1900 will do better though, basing that off of how they do in shader-intensive games now.


I think it's a foregone conclusion the X1900 will beat a 512MB GTX in Oblivion due to the shader power.

I'd like to know how the X1800 line compares to the X1900 in this game. 20-30% behind, or is it massive? @ 1280x1024
 
I'd say the game will run well on either, but you will only be able to run HDR and AA at the same time on the ATI card. As far as I remember, SM3.0 is only being used for optimizations and HDR. SM2.0 will get bloom.
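(If anyone is curious why "HDR and AA at the same time" splits along vendor lines, here's a hypothetical D3D9 probe of my own - not anything out of Oblivion. Both cards can render to an FP16 target, but asking for multisampling on that FP16 surface fails on the GeForce 7 series and succeeds on the X1800/X1900.)

// Hypothetical check: can we get an FP16 HDR render target, and can it be multisampled?
// Assumes plain D3D9 on the default adapter; link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // FP16 texture as a render target: both vendors' SM3.0 parts pass this.
    HRESULT fp16rt = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // Multisampling on that FP16 surface: X1800/X1900 pass, GeForce 7 fails,
    // which is why HDR + AA in one pass is ATI-only this generation.
    HRESULT fp16msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        TRUE, D3DMULTISAMPLE_4_SAMPLES, NULL);

    printf("FP16 render target: %s\n", SUCCEEDED(fp16rt) ? "yes" : "no");
    printf("FP16 + 4x MSAA:     %s\n", SUCCEEDED(fp16msaa) ? "yes" : "no");

    d3d->Release();
    return 0;
}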
 
Stereophile said:
I'd like to know how the X1800 line compares to the X1900 in this game. 20-30% behind, or is it massive? @ 1280x1024

Just like they did in FEAR.

From the Oblivion FAQ:

Radeon X1900XTX, PCI-e with 512MB video RAM - This is unquestionably the most powerful card in existence, and will remain so until the next generation of cards arrives, which will be after the release of Oblivion. It handles pixel shaders far better than any GeForce card, and can handily defeat a GeForce 7800GTX SLI setup in F.E.A.R. Oblivion is a shader-intensive game as well, so you should expect close to the same thing.

http://www.elderscrolls.com/forums/index.php?showtopic=250534
 
sculelos said:
First, the videos you've seen are from a buggy December build.

Second, they were using X1800XTs with a 3.4GHz Pentium processor and 2GB of RAM, and they were playing at a high resolution like 1600x1200 or 1920x1200 with everything on high/max. But that was the recent preview; the reviewers have said area load times are quick and the frame rate only bogs down when there's a lot of enemies on screen (like getting the whole Mages Guild upset at you and going out into the streets). I'm sure there will be settings you can tone down if you want a better framerate.

He is not referring to the December videos, but to the plethora of previews that just got dumped on the net from the recent weekend (well, okay, last Friday technically) series of previews.

The system in question was:
* AMD Athlon64 '3400+'
* Radeon X1900 video card (no details on if it was the XT or XTX)
* 1GB RAM
* 1280 x 720 cinema display

The game was described as '95% complete' to the participants (and it just went gold the same day, so this was probably closer to complete than let on). I would suppose we can presume all the settings were cranked and it was running perhaps 2xFSAA?

Virtually every one of the previewers noted that it 'seemed slow in places' - heavy action scenes with a lot of monsters on the screen at once, for example. No overt stuttering, though, which seems to put to rest RAM and CPU concerns.

But, clearly, the game is going to need a BEAST of a video card to really max out the graphics.
 
Apple740 said:
Just like they did in FEAR.

From the Oblivion FAQ:

Radeon X1900XTX, PCI-e with 512MB video RAM - This is unquestionably the most powerful card in existence, and will remain so until the next generation of cards arrives, which will be after the release of Oblivion. It handles pixel shaders far better than any GeForce card, and can handily defeat a GeForce 7800GTX SLI setup in F.E.A.R. Oblivion is a shader-intensive game as well, so you should expect close to the same thing.

http://www.elderscrolls.com/forums/index.php?showtopic=250534

You do realize, right, that you are posting from a document composed by a forum member that is OUTRIGHT WRONG in places? For example, it claims FP16 HDR can be run with FSAA on an SM2.0 card, because "that's what RTHDRIBL does".

Which is obviously 100% wrong.
 
Whoever said there's no benefit to dual core with Oblivion obviously hasn't read any of the interviews. They've quite clearly stated in no uncertain terms that dual core chips will bring a definite advantage.
 
dderidex said:
You do realize, right, that you are posting from a document composed by a forum member that is OUTRIGHT WRONG in places? For example, it claims FP16 HDR can be run with FSAA on an SM2.0 card, because "that's what RTHDRIBL does".

Which is obviously 100% wrong.

It could, but requires blending in the shader = crappy performance. I was indeed very surprised to read that because it is not realistic.
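(Rough idea of what "blending in the shader" means in practice - this is a hypothetical D3D9 sketch of mine, not anything RTHDRIBL or Bethesda actually do: since the hardware can't alpha-blend into the FP16 target, you snapshot the destination into a texture and do the blend math in the pixel shader yourself.)

// Hypothetical helper: emulate alpha blending into an FP16 target on hardware
// that can't blend FP16. The names and sampler convention here are made up.
#include <d3d9.h>

void DrawTransparentPassManualBlend(IDirect3DDevice9* dev,
                                    IDirect3DSurface9* hdrTarget,    // current FP16 render target
                                    IDirect3DTexture9* hdrCopyTex,   // FP16 texture, same size, render-target usage
                                    IDirect3DPixelShader9* blendShader)
{
    // 1. Snapshot the destination: GPU-side copy of the HDR buffer.
    IDirect3DSurface9* copySurf = NULL;
    hdrCopyTex->GetSurfaceLevel(0, &copySurf);
    dev->StretchRect(hdrTarget, NULL, copySurf, NULL, D3DTEXF_NONE);
    copySurf->Release();

    // 2. Bind the snapshot so the pixel shader can read the "destination" color.
    dev->SetTexture(1, hdrCopyTex);

    // 3. Draw the transparent geometry with hardware blending OFF; the shader
    //    samples the snapshot and outputs src*a + dst*(1-a) itself.
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
    dev->SetPixelShader(blendShader);
    // ... dev->DrawIndexedPrimitive(...) for each transparent batch ...

    // Every overlapping transparent layer needs a fresh snapshot (step 1 again),
    // which is exactly why this path is so much slower than real FP blending.
}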
 
Apple740 said:
It could, but requires blending in the shader = crappy performance. I was indeed very surprised to read that because it is not realistic.

Yes, and he also said that one X1900XTX is faster than 7800GTX SLI, which is simply not true. True, in certain instances it performs about the same, but that's only in certain circumstances, like FEAR at 16x10 with 4xAA/8xAF and soft shadows; if you lower the AA or the resolution, the 7800GTX SLI easily becomes faster.
 
sculelos said:
Yes, and he also said that one X1900XTX is faster than 7800GTX SLI, which is simply not true. True, in certain instances it performs about the same, but that's only in certain circumstances, like FEAR at 16x10 with 4xAA/8xAF and soft shadows; if you lower the AA or the resolution, the 7800GTX SLI easily becomes faster.

Maybe he meant ultra-high res; here the XTX is faster than 2x 7800GTX 256. :)

[FEAR benchmark graph: fear2048.gif]
 
I don't think any game is really quite comparable to Oblivion at the moment. From all indications, most areas aren't going to be heavily shader intensive. Outdoor areas typically seem to be using fairly standard texture mapping on most surfaces, with some specular highlights, low-resolution bumps and, of course, high dynamic range lighting. There's no question that nVidia excels in textured pixel performance. Any shaders that are not math-intensive (those that may use specular maps, for instance) will generally render somewhat faster on current nVidia hardware. From most indications thus far, nVidia also seems to have a slight edge in floating-point buffer HDR, with the obvious disadvantage of no multisampling with this technique.

That being said, indoor dungeon areas seem to be an entirely different ballgame. Virtual displacement maps are very computationally intensive shaders and will, with little doubt, fly on X1900 cards. Any higher-order, math-intensive shaders will again excel on X1900 hardware. This is dependent on how many computational shaders lie within the view frame, but I think there will be enough to fill up R580's shader processors with ease.

When we're talking about VRAM, I no longer believe that Oblivion is going to be the end-all-be-all game to flaunt your VRAM. Texture resolution has been pulled back since early in development (obviously). Therefore, the X1800XT and GTX 512 may very well end up within the same framerate ballpark. It's far too early to tell - I find it unreasonable to really speculate further.
 