x1900xt or 7900GTX for TES4: Oblivion

bobrownik said:
yes, they could have added 8 more pipes but they didn't.

you know why? because they knew you were gonna pay for anything they come out with anyway,

so they just make a whole bunch of money off of you right now, then they add the 8 pipes in the next gen and make you pay for it again

they also could have fixed the hardware-based "performance optimizations" since they were improving on the core, but they didn't
Nvidia is a business out to make money; it's too bad ATI wasn't competitive enough to make Nvidia throw more onto their core, but they compete well enough as it is.

ATI has performance optimizations as well; they aren't exactly rosy in this regard either.

More money for Nvidia means more money for research and development while keeping them in the black; it also allows them to charge lower prices on cheaper products.
 
coldpower27 said:
More money for Nvidia means more money for research and development while keeping them in the black; it also allows them to charge lower prices on cheaper products.

Of course, cards like the 7800 GTX 512, which features all kinds of IQ-lowering optimizations and whose price was about 1000 dollars at one point.

That's the great hardware and prices you get from Nvidia.
 
bobrownik said:
Of course, cards like the 7800 GTX 512, which features all kinds of IQ-lowering optimizations and whose price was about 1000 dollars at one point.

That's the great hardware and prices you get from Nvidia.

Dude, if you don't have anything intelligent to say, just STFU and quit being a fanboy.
 
bbordwell said:
I vote to make it a [H]ard rule to never compare the number of pipelines between two cards, as it would clear up this whole argument. I think if you want to compare theoretical performance you should have to state the count of all stages of the pipeline. I think the word pipeline is becoming something like the word planet in the scientific world, with all the debate about Pluto.

I second that vote. Also, PLUTO IS NOT A PLANET, DAMNIT. :p

I have this game preordered, and now I'm thinking I'm going to have to reduce my res to something like 160X120 to get acceptable framerates with my setup. I need to upgrade, but have vowed to go M2 and DX10. :(
 
Sharky974 said:
Quake 4 was programmed predominantly on Nvidia cards by the makers throughout its development cycle. That's why Nvidia does well.

I don't believe this is a fair statement. nVidia does well for a few reasons: a wonderful OpenGL implementation, stencil shadow culling, and mixed-precision shaders. There are those that exploit the features of nVidia cards and those that don't. Using those features to their fullest is just vendor-specific optimization and is in no way related or limited to development hardware. Valve, on the other hand...

Just because I don't own a tractor doesn't mean I don't know how to use it.
 
One of the PC Gamer reviewers has a review copy of Oblivion and says that on his 4800+, 1GB RAM, single X1900 rig he plays at 1680x1050 with HDR and all in-game settings maxed out. Using Fraps to monitor the frame rate, he says the game stays pegged at 60 (Vsync is enabled, and I suspect the steady frame rate is largely due to his high-end dual-core CPU) except for heavy fights, where it dips into the 30s. A reviewer buddy/co-worker is also playing the game with a 7800 GT and says it runs great, though more details of his system or game settings weren't given. Sounds like anyone with a 7800/7900 or X1800/X1900 is going to be in great shape, and those with 6800/X800s should be OK. Older cards are going to struggle, though, with higher in-game settings.
 
so what you're saying is that if you're upgrading your system to play this game it's just worthwhile to buy an xbox 360 :|
 
tim-x said:
so what you're saying is that if you're upgrading your system to play this game it's just worthwhile to buy an xbox 360 :|

Depends on what you want. The 360 gives 1280x720 with I believe 2x AA and HDR. Techies with disposable income and high-end machines may prefer Oblivion on the PC as a showcase title that finally delivers some of the eyecandy that justifies spending a lot of $$ on their gaming rigs.
 
John Reynolds said:
Depends on what you want. The 360 gives 1280x720 with I believe 2x AA and HDR. Techies with disposable income and high-end machines may prefer Oblivion on the PC as a showcase title that finally delivers some of the eyecandy that justifies spending a lot of $$ on their gaming rigs.

Bravo... well put.

If $ is a concern, then get the Xbox and quiet down.
 
I may have missed it somewhere else, but is Oblivion multi-threaded to take advantage of dual-core CPUs?
 
John Reynolds said:
Depends on what you want. The 360 gives 1280x720 with I believe 2x AA and HDR. Techies with disposable income and high-end machines may prefer Oblivion on the PC as a showcase title that finally delivers some of the eyecandy that justifies spending a lot of $$ on their gaming rigs.

Well, except XBox360 = AA + HDR and PC = AA or HDR, but not both.
 
dderidex said:
PC = AA or HDR, but not both.
No current video cards support the mode that this game is using. It does not mean that a future card won't support it.
 
kcthebrewer said:
No current video cards support the mode that this game is using. It does not mean that a future card won't support it.
AND NO, the ATI HDR AA does NOT work with Oblivion before anyone suggests it.
 
bobrownik said:
yes, they could have added 8 more pipes but they didn't.

you know why? because they knew you were gonna pay for anything they come out with anyway,

so they just make a whole bunch of money off of you right now, then they add the 8 pipes in the next gen and make you pay for it again

they also could have fixed the hardware-based "performance optimizations" since they were improving on the core, but they didn't

I could come up with a lot of dirtier comments about the X1800 -> X1900 transition, but will refrain from doing so.

DigitalN said:
AND NO, the ATI HDR AA does NOT work with Oblivion before anyone suggests it.
Not that you would enjoy playing at 10fps....

7900GTX, since both are equal performance-wise, nVidia drivers > ATI drivers, and the nVidia is much quieter.
 
InorganicMatter said:
7900GTX, since both are equal performance-wise, nVidia drivers > ATI drivers, and the nVidia is much quieter.

Where are the benches that say both are equal in Oblivion? Your opinion that NV drivers are better than ATi drivers is just that, your opinion. Not fact.
 
DigitalN said:
AND NO, the ATI HDR AA does NOT work with Oblivion before anyone suggests it.


If you read their forums, it seems to be more an issue of them not implementing the option to do it; technically the ATI card can do that form of HDR + AA.
 
InorganicMatter said:
7900GTX, since both are equal performance-wise, nVidia drivers > ATI drivers
It doesn't work like that. :p

We'll see on Monday or Tuesday.
 
Heh, apparently both companies are quickly working on updated drivers for this game. The race to win the Oblivion benchmark battle is on!
 
John Reynolds said:
Heh, apparently both companies are quickly working on updated drivers for this game. The race to win the Oblivion benchmark battle is on!

Which will hopefully mean a better experience for the end user.
 
Brent_Justice said:
Which will hopefully mean a better experience for the end user.

Hey, if both companies can eke out another 15-20% (enough to be noticeable while playing) by tuning their drivers for the game now that they have their hands on final code, more power to them. I heard NVIDIA has it running with QuadSLI (I can only drool at the thought of playing it at 2560x1600 on Dell's 3007 display with HDR enabled) and ATI has it working in Crossfire.
 
Link to ATI working on an enhanced driver for Oblivion, please?

I saw the Nvidia one on [H].
 
John Reynolds said:
Heh, apparently both companies are quickly working on updated drivers for this game. The race to win the Oblivion benchmark battle is on!

Better double-check your sources. Linkage

sireric (ATI Employee) said:
All I know is that 6.4 should add better Oblivion performance on Xfire, using AFR. We haven't done any other changes for Oblivion in the drivers at this point. Perhaps later.

Posted on Mar 17, 2006. So far, ATI has just improved performance by adding an AFR profile for 'Oblivion', and done no other performance tweaks.
 
Heh, two more days left, then Oblivion gaming goodness will be possible on my dual 7900GT rig. I'm not sure who will win this race, but I suspect both cards will be close.
 
Narius said:
Link to ATI working on an enhanced driver for Oblivion, please?

I saw the Nvidia one on [H].

I don't think ATI needs a special driver for this game; I hear it has been favoring ATI hardware from the get-go. No huge surprise, after all it's a big 360 game also.
 
The X1900 XTX is a marginally faster card in certain shader-intensive games, at a much larger cost in comparison to Nvidia. Remember, Nvidia's 7900 GTX is only 278 million transistors with a die size of 196mm2, compared to the Radeon X1900 XTX with 384 million transistors and a die size of 352mm2, which is much more expensive to make than the 7900.
Nvidia could easily have added 8 more pipelines to this card, still come in under the transistor count of the X1900 XTX, and STILL be cheaper to make, as Nvidia has better transistor density on a given process than ATI. Why should they, though, when in the present situation they are even with ATI anyway and make tons more money in the process.
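For what it's worth, here's a quick back-of-the-envelope check on those density figures (a rough sketch only, using the transistor counts and die sizes quoted above; it says nothing about wafer cost or yields):

```python
# Rough density check using the figures quoted above (estimates, not official numbers).
g71_transistors, g71_area = 278e6, 196.0     # 7900 GTX: ~278M transistors, ~196 mm^2
r580_transistors, r580_area = 384e6, 352.0   # X1900 XTX: ~384M transistors, ~352 mm^2

g71_density = g71_transistors / g71_area     # ~1.42 million transistors per mm^2
r580_density = r580_transistors / r580_area  # ~1.09 million transistors per mm^2

print(f"G71 density:  {g71_density / 1e6:.2f} M transistors/mm^2")
print(f"R580 density: {r580_density / 1e6:.2f} M transistors/mm^2")
print(f"R580 die area is {r580_area / g71_area:.1f}x that of G71")  # ~1.8x
```

So if those numbers are right, G71 packs roughly 30% more transistors per mm2 on the same 90nm process, and the X1900 XTX die is about 1.8x the size of the 7900 GTX die.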

Who cares about all this die size stuff? Is it all of a sudden this super important thing since Nvidia is emphasizing it? It never was before...

If Nvidia wants to drop prices from the parity they are at with ATI right now, great. But so far they aren't doing that. Nvidia likes its 40% profit margins. Great, but they aren't helping the consumers... they are just being greedy so far. If they drop the 7900GTX to $399 and the 7900GT to $229 or something... THEN I'll be impressed.

Also, since the prices are so close, perhaps ATI is getting better yields because they've already been on 90nm for a long time? This is just a theory as to why prices are the same.

Nvidia would LIKE to add 8 pipes, I bet... the reason they didn't is they didn't expect R580 to be so much of a leap over R520. They just misjudged the competition, so don't be giving them too many kudos for that either. As somebody said, "refreshes" are always just clock speed increases... Nvidia continued on that path; they didn't expect ATI to do a massive rework. Now it takes so long to do a spin on a new chip design that there is nothing they can do about it. Nvidia would have had to make that decision 8 months ago; it is too late now.

As it is, they have lesser image quality and will be falling further behind in performance as long as it's R580 vs G71. I'm not sure they're sitting as pretty as you'd like to think. And new cards will likely not be out until September, so that is a long time...

Also, ALL the Oblivion graphs in this thread have been proven FAKE, so we have nothing to go on with that game...
 
Who cares about all this die size stuff? Is it all of a sudden this super important thing since Nvidia is emphasizing it? It never was before...

If Nvidia wants to drop prices from the parity they are at with ATI right now, great. But so far they aren't doing that. Nvidia likes its 40% profit margins. Great, but they aren't helping the consumers... they are just being greedy so far. If they drop the 7900GTX to $399 and the 7900GT to $229 or something... THEN I'll be impressed.

Also, since the prices are so close, perhaps ATI is getting better yields because they've already been on 90nm for a long time? This is just a theory as to why prices are the same.

Nvidia would LIKE to add 8 pipes, I bet... the reason they didn't is they didn't expect R580 to be so much of a leap over R520. They just misjudged the competition, so don't be giving them too many kudos for that either. As somebody said, "refreshes" are always just clock speed increases... Nvidia continued on that path; they didn't expect ATI to do a massive rework. Now it takes so long to do a spin on a new chip design that there is nothing they can do about it. Nvidia would have had to make that decision 8 months ago; it is too late now.

As it is, they have lesser image quality and will be falling further behind in performance as long as it's R580 vs G71. I'm not sure they're sitting as pretty as you'd like to think. And new cards will likely not be out until September, so that is a long time...

Also, ALL the Oblivion graphs in this thread have been proven FAKE, so we have nothing to go on with that game...
That's your response, who cares? Bah, rofl. No, if you can get away with having significantly reduced costs and still charge about the same as the competition, then you should do so to reap higher margins; these are corporations, remember, not charities. They are supposed to maximize profits if possible. Having a reduced die size gives Nvidia more flexibility to cut prices if they feel ATI is too much of a threat, and judging from the pricing, Nvidia is not that worried that ATI will be.

Nvidia has been working on 90nm for a fairly long time now; they did have 90nm products out last year, with some integrated graphics chipsets and the Go 7300/7400. So their yields should be fine on 90nm products, as evidenced by their higher transistor densities compared to ATI.

That's possibly one scenario; however, most likely Nvidia knew they were competitive with the X1900 series simply by raising clock speeds to match ATI, and they are, so they decided to do it this way to reap higher margins and make more money. If ATI's card had been more of a threat, then Nvidia would have decided to go to an 8-quad design. I will give kudos to whoever I believe is deserving of it, thank you, and I believe the strategy Nvidia is taking is an effective one.

Again, the image quality issue is a subjective one; it's not something everyone perceives identically. Using the shimmering issue as an example, not everyone can see it unless it is specifically pointed out. If it were a major issue, then everyone should have no problem identifying it. "Falling behind", we will have to see if that plays out that way. The only games where Nvidia has "major" issues are possibly the ATI-programmed games F.E.A.R. and Oblivion, which were predominantly programmed on ATI hardware; don't bother with the TWIMTBP sticker, F.E.A.R. was programmed on ATI hardware for the bulk of its development cycle, and even then Nvidia improved performance in F.E.A.R., and with the rumors of an Oblivion Forceware, that game as well.

That's fine, I don't like to trust pre-release benches anyway; I would rather wait for the final numbers from reputable sites such as HardOCP and TechReport.
 
FEAR is fast on ATi hardware mainly due to ATi's shader advantage.
I don't know the history of the FEAR development, but I highly doubt it is due to some ATi bias in the programming. This will just turn into another one of those "HL2 was made for ATi" or "D3 was made for Nvidia" arguments. Oblivion is a different story most likely, but it's still way too early to act like the coding enhances ATi cards specifically.
When ATi moves to 80nm they will be able to shrink the X1900XTX beast for better yields, I suppose. Even today it's fine though. And the IQ difference between ATi and Nvidia is obvious.
 
dderidex said:
So far, ATI has just improved performance by adding an AFR profile for 'Oblivion', and done no other performance tweaks.

Eric told me that in e-mail Friday too. Bear in mind he's not on the software team. I was told by someone else, a hardware reviewer at a major site who chats with IHV employees quite a bit, both companies are working on updated drivers to tune performance now that they have final code (that's where my QuadSLI comment comes from). We'll see, it's certainly not worth arguing, though as the other poster suggested NVIDIA might have more work to get the game running as smoothly. A game reviewer who writes for CGW told me in e-mail last night that he ran the game on a 7800 GTX and then a X1900 XT and said, "wow, what a difference." His words, not mine.
 
FEAR is fast on ATi hardware mainly due to ATi's shader advantage.
I don't know the history of the FEAR development, but I highly doubt it is due to some ATi bias in the programming. This will just turn into another one of those "HL2 was made for ATi" or "D3 was made for Nvidia" arguments. Oblivion is a different story most likely, but it's still way too early to act like the coding enhances ATi cards specifically.
When ATi moves to 80nm they will be able to shrink the X1900XTX beast for better yields, I suppose. Even today it's fine though. And the IQ difference between ATi and Nvidia is obvious.
No, this is a game where the X850 XT PE was competitive with the 7800 GTX; it is not entirely because of a shader advantage, as ATI favourables would like to believe. Your opinion, but it was an ATI GITG title until the public beta and hence programmed predominantly on ATI hardware.

When ATI moves to the 80nm process the die size will reduce to about R520 levels, which in itself is still larger than G71, so still more expensive nonetheless. To you perhaps; image quality is not a quantifiable unit but a qualitative one, and hence whether or not someone can notice a difference will vary.
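Just to put a rough number on that 80nm point: an ideal linear shrink (which real half-node shrinks rarely achieve, so treat this purely as an optimistic sketch using the die sizes quoted earlier in the thread) works out something like this:

```python
# Idealized 90nm -> 80nm shrink estimate (area scales with the square of the linear shrink).
# Real shrinks rarely scale this cleanly, so this is an optimistic best case.
r580_area_90nm = 352.0                   # X1900 XTX die size quoted earlier, mm^2
g71_area_90nm = 196.0                    # 7900 GTX die size quoted earlier, mm^2

scale = (80.0 / 90.0) ** 2               # ~0.79
r580_area_80nm = r580_area_90nm * scale  # ~278 mm^2

print(f"Ideal 80nm shrink of the X1900 XTX die: ~{r580_area_80nm:.0f} mm^2")
print(f"Still ~{r580_area_80nm / g71_area_90nm:.2f}x the 90nm G71 die")  # ~1.42x
```

Even in that best case it would still be roughly 1.4x the size of G71, which is the point being made about relative cost.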
 
John Reynolds said:
Eric told me that in e-mail Friday too. Bear in mind he's not on the software team. I was told by someone else, a hardware reviewer at a major site who chats with IHV employees quite a bit, both companies are working on updated drivers to tune performance now that they have final code (that's where my QuadSLI comment comes from). We'll see, it's certainly not worth arguing, though as the other poster suggested NVIDIA might have more work to get the game running as smoothly. A game reviewer who writes for CGW told me in e-mail last night that he ran the game on a 7800 GTX and then a X1900 XT and said, "wow, what a difference." His words, not mine.
Hmm, though the 7800 GTX is a fairly old board compared to the X1900 series, any idea on the 7900 GTX itself???
 
That's your response, who cares? Bah, rofl. No, if you can get away with having significantly reduced costs and still charge about the same as the competition, then you should do so to reap higher margins; these are corporations, remember, not charities. They are supposed to maximize profits if possible. Having a reduced die size gives Nvidia more flexibility to cut prices if they feel ATI is too much of a threat, and judging from the pricing, Nvidia is not that worried that ATI will be.

My point is, if the end prices are similar to us, why do we care about die sizes? I know why you do: because Nvidia has a temporary advantage there and is playing it up right now. I guess if it makes you feel better that Nvidia is raking in profits on these chips (allegedly), great.

If these chips by Nvidia are half the size, where's my half price?!

hence programmed predominantly on ATI hardware.

Proof?

Were Doom 3 and Quake 4 programmed predominantly on Nvidia hardware?

I believe the answer is YES, so I guess that's why Nvidia wins those benchmarks? Yes or no?

Umm, these cards have different strengths, so games will perform differently...

EVERY NEW GAME WILL PERFORM BETTER ON ATI. We explained to you why... maybe Nvidia can do some driver stuff to catch up... that's all they've got.

Heads up, the 7900 is kinda old tech... SORRY. Guess that's your price for half the die size... think about it, dude. You don't get the ring bus, memory controller, more shaders, better dynamic branching, image quality... you get what you pay for, think about it! Nvidia is doing amazingly well to keep up in most games with a half-size chip... that much is true.
 
Sharky974 said:
My point is, if the end prices are similar to us, why do we care about die sizes? I know why you do: because Nvidia has a temporary advantage there and is playing it up right now. I guess if it makes you feel better that Nvidia is raking in profits on these chips (allegedly), great.

If these chips by Nvidia are half the size, where's my half price?!

Proof?

Were Doom 3 and Quake 4 programmed predominantly on Nvidia hardware?

I believe the answer is YES, so I guess that's why Nvidia wins those benchmarks? Yes or no?

Umm, these cards have different strengths, so games will perform differently...

EVERY NEW GAME WILL PERFORM BETTER ON ATI. We explained to you why... maybe Nvidia can do some driver stuff to catch up... that's all they've got.

Heads up, the 7900 is kinda old tech... SORRY. Guess that's your price for half the die size... think about it, dude. You don't get the ring bus, memory controller, more shaders, better dynamic branching, image quality... you get what you pay for, think about it! Nvidia is doing amazingly well to keep up in most games with a half-size chip... that much is true.
No, your point doesn't stand. Nvidia doesn't have to give us the same margin levels that ATI gets if they have cost benefits; you're acting as if the cost benefits should ALL be passed on to the consumer, which is ridiculous. It depends on how competitive ATI is in relation to Nvidia: if ATI cannot provide enough pricing pressure to make Nvidia cut pricing, then Nvidia can simply reap higher margins and charge a competitive price. The move right now is up to ATI if they wish to cut pricing further, which they have less leeway to do due to higher costs, as already explained.

The X850 XT PE is competitive with the 7800 GTX in this game prior to the improved drivers, so that's evidence enough. It was also an ATI GITG title till the public beta, so it was programmed predominantly on ATI hardware.

Of course different cards have different strengths; that is a very general no-brainer statement.

Correction: you believe every new game will perform better on ATI, and you provided reasons why you think that's the case. Don't try to pass this off as fact; we will have to see with final benches how each game stacks up.

I never argued the issue of how new or old the G7x architecture is; what matters is performance. Those new features are performance-enhancement related, short of the image quality comment, and if those extra features don't provide enhanced performance, then they represent wasted silicon die area.

The image quality issue again is a subjective one and varies from person to person.

Regarding Doom 3 and Quake 4, Nvidia didn't have to optimize performance in these games as they designed the nV4x/G7x architecture around the Doom 3 engine, and hence had great performance from the start; it wasn't until recently that ATI became more competitive with the ring bus memory controller optimization implemented in the R5xx architecture. Not to mention Nvidia has better OpenGL drivers than ATI.
 
ATI GITG title till the public beta, so it was programmed predominantly on ATI hardware.
But real proof? Do you really know what machines they used?
The texture-to-shader ratio in FEAR is quite favorable for ATi.
 
As somebody said, "refreshes" are always just clock speed increases... Nvidia continued on that path; they didn't expect ATI to do a massive rework.

I wasn't aware the 1900 was a massive rework. The exact design it's using was released right around the same time as the 1800. The 1900XT is basically just a 1600XT with higher clocks. They all have the same ALU/tex ratios etc.

As for Oblivion, if they are using the 9:1 ratio like I've heard thrown around a few times, ATI hardware will likely perform rather well on it. The huge advantage they have in shader power should really come into play if that is the case.
 
{NG}Fidel said:
But real proof? Do you really know what machines they used?
The texture-to-shader ratio in FEAR is quite favorable for ATi.
Yes, I have heard it's a 7:1 ratio for F.E.A.R., but I also heard of an 8:1 ratio for Splinter Cell Chaos Theory, so meh.

Like I said, the fact that the X850 XT PE is pretty competitive with the 7800 GTX prior to Forceware 84.20 is pretty damning evidence in itself, as the 7800 GTX is more advanced in terms of shader power, which flies in the face of the shader-heavy argument.
 