x1900xt or 7900GTX for TES4: Oblivion

Djee

I want to know what you guys think. What would be better for playing TES4: Oblivion with the best visual quality possible, the X1900XT or the 7900GTX? (Since they are about the same price.) I know the 7900GTX is faster, but is it a lot faster? By the way, I don't plan to upgrade to SLI or CrossFire.





AMD 64 3500+
MSI K8N neo4 platinum
2x1gb Corsair XMS DDR PC3200
MSI GeForce 6800 TD 128mb
 
Well, I'm stepping up to the 7900GTX from the 7800 GTX/256.

The system I just retired had a 9700 Pro and I loved it, but I have serious concerns about the noise from an X1900XT.

In terms of graphical quality in Oblivion, I don't think it matters that much which card you go with. So it comes down to things like noise, heat, power consumption, price, etc. I also got the GTX because I wanted to buy an eVGA, and they do not make ATI cards. Customer service means a lot to me.

Have fun with Oblivion either way!
 
^
I think it will matter with Oblivion. It looks like the kind of game the X1900 would excel at.
Personally I would go with the X1900 for the IQ and HDR+AA. Most benches show that when the NVIDIA quality settings in the control panel are set to match ATI's IQ, NVIDIA takes a larger performance loss and ATI takes the lead.
 
I would also go for the X1900, simply because of the features it has over the 7900.

- Adaptive AA
- High-quality AF
- HDR + AA
- Better IQ on the whole

Besides that, the performance is pretty similar between the two.
 
HDR+AA the X1900 does have; however, the Oblivion developers have said HDR+AA won't be supported.
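For background on why that's even a question: Oblivion's HDR renders to a floating-point (FP16) surface, and whether MSAA can be applied on top of that is a per-card capability the engine has to check (the X1900 can multisample FP16 render targets, the GeForce 7 series can't). Just as an illustration, not Oblivion's actual code, here's roughly how a Direct3D 9 app asks the driver that question:

// Sketch of a generic D3D9 capability check - not taken from Oblivion.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Can the FP16 render target format used for HDR be multisampled at 4x?
    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,        // 64-bit floating-point (FP16) format
        FALSE,                       // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,
        &quality);

    printf(SUCCEEDED(hr) ? "HDR + 4x AA supported by this card\n"
                         : "HDR + 4x AA not supported by this card\n");

    d3d->Release();
    return 0;
}

Even when the hardware says yes, the game still has to be written to render into a multisampled FP16 target, which is the part the devs are saying no to.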
 
For IQ/performance I'm leaning toward the X1900XTX at this point. The 7900GTX in HQ mode and the X1900XTX in HQ mode really seem to even out in performance, and then you're left with noise/heat and IQ. IQ seems to go to the X1900 and heat/noise goes to the 7900.
 
I'd go with the X1900 for the better IQ; plus it's more "future proof" than the 7900. But if you are really nitpicky about noise, power, and heat, then go with the 7900.
 
Just ordered my X1900XTX today from MWAVE.com for $556.57
I'm going from a 6800GT AGP to a PCIe system.

Gotta love Ebay for helping sell shit so I could make this purchase.
 
PSYKOMANTIS said:
Just ordered my X1900XTX today from MWAVE.com for $556.57
I'm going from a 6800GT AGP to a PCIe system.

Gotta love Ebay for helping sell shit so I could make this purchase.

lol
 
A review on IGN had an X1900 XT in their test setup with a P4 3.2.

It had better visuals than the Xbox 360, so I would say X1900 XTX, cause that'll pwn, and with an AMD CPU it'll be even better.
 
Thanks... I think I'll go buy an X1900XT. The only thing that sucks is that Oblivion won't support HDR+AA at the same time :(
 
BTW, the difference in performance between the XT and XTX versions is very small, right? Is it worth the extra $$$?
 
No, there isn't much of a difference, and you could just flash it to the XTX BIOS. People say the XTX chips are a bit better quality though, and overclock even better.
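For a rough sense of what the extra money buys (stock clocks from memory, so double-check before buying):

X1900 XT  : ~625 MHz core / ~725 MHz (1450 effective) memory
X1900 XTX : ~650 MHz core / ~775 MHz (1550 effective) memory
Core: 650 / 625 = ~1.04, about 4% higher
Mem : 775 / 725 = ~1.07, about 7% higher

That's why the in-game gap between the two is usually only a few percent.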
 
If you don't prefer one brand over the other, I suggest waiting for benchmarks to decide. This isn't the FX series with a PS2.0 game.
 
Doesn't matter. It's going to run like shit no matter what you get.
 
phide said:
Doesn't matter. It's going to run like shit no matter what you get.

waste post/10? lol

And don't listen to that; plenty of reviews have stated the X1900 can do max settings in Elder Scrolls quite well, and so will the 7900. The 7900 should do better because the game is based on the old Unreal engine (Quake 4, Doom 3, etc.).
 
Shinryu said:
And don't listen to that; plenty of reviews have stated the X1900 can do max settings in Elder Scrolls quite well, and so will the 7900. The 7900 should do better because the game is based on the old Unreal engine (Quake 4, Doom 3, etc.).
Oblivion uses the "GameBryo" engine, which is not the Unreal engine or the D3 engine. :p

http://www.waiting4oblivion.com/technical_details.html

I expect relatively poor performance all around since it's a cross-platform engine that runs on consoles and PCs.
 
From what I've seen/heard, Oblivion is going to bring even the best of systems [quad SLI, 3GHz FX-60] to its knees at resolutions higher than 1280x1024 with max settings. That doesn't bode well for the 95% of us that don't have systems like that.
 
Bullshit. Quad SLI can run two instances of Oblivion, lol. Come on, what's the deal? Is everyone coding like EA did with BF2?
 
For a game with a long draw distance, an RPG with a lot of outdoor scenes, I think ATI is the better choice: no shimmering + HQ AF.
 
From what I've seen/heard, Oblivion is going to bring even the best of systems [quad SLI, 3GHz FX-60] to its knees at resolutions higher than 1280x1024 with max settings. That doesn't bode well for the 95% of us that don't have systems like that.
Quad, no. Single GPUs at 1280x1024, yes. Dual GPUs at 1600x1200 sounds about right, and quad, only God knows.
 
phide said:
Getting tired of the FUD.

Maybe you should quote my whole statement?

It's a game where he will be slowly moving through outdoor areas, and you can take a second and look around at stunning visuals =p. In these kinds of games it helps to have NO SHIMMERING and HQ AF. I've had a 7800 GTX and a GT, and they both sucked for Lineage 2 (another RPG with a lot of outdoor areas).
 
Brent_Justice said:
I'd say we aren't gonna know until the game is out, eh

Given how shader heavy the game is going to be, it's pretty easy to give a guesstimate, especially when we have seen FEAR performance. But Brent is right, the game is going to be out in 6 days, so people with upgrade money burning a hole in their wallets should wait one more week :)

Digital Viper-X- has a very good point when it comes to draw distance and HQ AF; in these kinds of games that feature really shines!
 
Brent_Justice said:
Show me your time travel device, I want a ride.

No time travel device, but wanna bet a steak dinner? :D

Ok, to be politically correct:

ATI does better the more shader-intensive a game is. By most accounts Oblivion should set the record here.

It's Direct3D where ATI does well.

The devs really are biased toward ATI hardware. I read an Oblivion developer post where the guy basically said ATI cards are the gold standard... the Oblivion devs are somewhat like Carmack is toward NVIDIA, maybe. AKA they favor that brand heavily.

Do the math! It's about 95% certain...
 
ATI does better the more shader-intensive a game is.

Theoretically (by the numbers) the 7900 GTX can fill more pixels in a single clock than the X1900 XTX.

But hopefully as you know (and if you don't know now you do) that means diddly squat in real world game performance.

You cannot make an assumption that broad. Each game is going to be different.
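To put rough paper numbers on that (and paper specs are exactly the kind of thing I'm saying not to lean on): both cards pair 16 ROPs with a core clock around 650 MHz, so raw pixel output is roughly 16 x 650 MHz = ~10.4 Gpixels/s either way. The GTX's per-clock edge is in texturing, 24 texture units vs 16, so roughly 15.6 vs 10.4 Gtexels/s, while the XTX's edge is its 48 pixel shader processors. Which of those matters depends entirely on the game.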


It's Direct3D where ATI does well.

Don't know where you got that, ATI and NV both do fine using the D3D API.

The devs really are biased toward ATI hardware. I read an Oblivion developer post where the guy basically said ATI cards are the gold standard... the Oblivion devs are somewhat like Carmack is toward NVIDIA, maybe. AKA they favor that brand heavily.

Pure assumption and conjecture.


Try this: don't assume something is going to be better on one GPU than another before the game is even out. Wait until people get it in their hands and real, honest-to-goodness testing can be done to evaluate the experience with each GPU; then you will have facts instead of opinion.
 
I understand what you're saying.

But come on, we both know ATI is going to be faster in Oblivion, well, at the high end anyway. In the mid-range NVIDIA usually kicks butt, so they probably won't be altogether overcome.

Unless you guys are testing the game already and know something I don't :eek:

Drawing more pixels... that pretty much means zilch. I'm no graphics expert but I do have a general idea. The 48 pipes in R580 are why it's so strong in shader-intensive games. In those it should/does overcome the NVIDIA cards quite easily. ATI wants that infamous 3:1 math:texture ratio, but from what I heard it's really more like 9:1 for R580 to shine. The new games are barely beginning to touch this.
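Rough back-of-the-envelope on those unit counts (paper specs only, so treat it as a sketch): R580 has 48 pixel shader processors to 16 texture units, 48/16 = 3:1 ALU:texture, while G71 is 24 shader pipes to 24 texture units, 1:1. So the more of a frame's cost that is shader math instead of texture fetches, the more that extra ALU width should show up.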

If I had to guess I'd say ATI will win in Oblivion by a larger margin than in any game yet...
 
I don't know if ATI or NV is going to be better in Oblivion, and that's exactly my point.

I can't understand how people can come out and say "xxx GPU" is going to be faster in "xxx Game" before the game is even out and no evaluations have been done to test its performance.

I guess some people just like to live in their own little world.


Me, I'll buy the game on the 20th and find out then which GPU provides me the best gameplay experience.
 
pxc said:
Oblivion uses the "GameBryo" engine, which is not the Unreal engine or the D3 engine. :p

http://www.waiting4oblivion.com/technical_details.html

I expect relatively poor performance all around since it's a cross-platform engine that runs on consoles and PCs.


Yeah, and the Source engine is based on the Havok engine, but do they call it Havok 2.3? No. I've been reading info on that game for a long time, and they used the Unreal engine as their starting point.

And to be honest you're totally overestimating the game; yeah, it's multiplatform, but it was made for PC originally anyway, and it's been nearly 6 years in the making.
 
All I know is that I'm picking up the 360 version and sparing my PC the frustration of dealing with configuring the quality settings.

Thank god there's a console version so you don't have to worry about high-quality graphics settings for a change.
 
Shinryu said:
Yeah, and the Source engine is based on the Havok engine, but do they call it Havok 2.3? No. I've been reading info on that game for a long time, and they used the Unreal engine as their starting point.
No, you're still totally wrong. The GameBryo engine is not related to the Unreal engine, or as you mixed it up, the Unreal engine isn't related to the Doom 3 engine either.

BTW, Source (3D engine) is not based on Havok (physics) but Source engine games do include a modified Havok physics engine just like dozens of other games do. http://www.havok.com/content/view/72/57/

:rolleyes:
 
pxc said:
No, you're still totally wrong. The GameBryo engine is not related to the Unreal engine, or as you mixed it up, the Unreal engine isn't related to the Doom 3 engine either.

BTW, Source (3D engine) is not based on Havok (physics) but Source engine games do include a modified Havok physics engine just like dozens of other games do. http://www.havok.com/content/view/72/57/

:rolleyes:


Reviews said ages ago the engine was based on an old OpenGL engine and they added better physics, so don't give me any bull.

Secondly, the Source engine is based on Havok physics; what is your definition of "based"?
Can you program? Do you understand what it means?
 
If you must buy now and have it ready for the release, then I would recommend the X1900 because of the shimmering on NVIDIA and the preferable IQ in general. I now have a 7600GT and I must say I really didn't think the shimmering would be this noticeable. I actually prefer NVIDIA's drivers for competitive FPS, somehow they feel more responsive, but for IQ I prefer ATI hands down, and in a game like Oblivion, IQ is what matters most.
 