X1900XT or 7900GTX for TES4: Oblivion

Brent_Justice said:
huh what?

http://www.beyond3d.com/forum/showpost.php?p=716900&postcount=40

This post is a guess by a fairly technical poster at B3D, but I wouldn't be surprised if he's right. The 360 can do HDR + AA, but the developer has said no existing PC graphics chip can, which means an FP16 format certainly isn't being used for the game's HDR, and Gavin Carter, the game's producer, flat-out stated during the B3D interview that it's a blendable format in use, so. . . .
 
John Reynolds said:
http://www.beyond3d.com/forum/showpost.php?p=716900&postcount=40

This post is a guess by a fairly technical poster at B3D, but I wouldn't be surprised if he's right. The 360 can do HDR + AA, but the developer has said no existing PC graphics chip can, which means an FP16 format certainly isn't being used for the game's HDR, and Gavin Carter, the game's producer, flat-out stated during the B3D interview that it's a blendable format in use, so. . . .
Edit: I read the article. The version of HDR + AA they're using isn't supported.
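For the curious, this is easy to probe at the API level. Here's a minimal D3D9 sketch (assuming, per the speculation above, that the HDR path wants a multisampled blendable float target; Oblivion's actual format is not public) that asks whether the adapter can pair 4x MSAA with FP16. GeForce 6/7 parts fail this query while X1K parts pass it, which is part of why the poster infers a plain FP16 target isn't what's in use:

```cpp
// Minimal sketch: can this adapter multisample an FP16 render target,
// i.e. the combination that HDR + AA needs on this hardware generation?
// Illustrative only -- Oblivion's real render-target format is unknown.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // 64-bit FP16 surface
        FALSE,                     // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &quality);

    printf("FP16 + 4x MSAA: %s\n",
           SUCCEEDED(hr) ? "supported" : "NOT supported");
    d3d->Release();
    return 0;
}
```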
 
For me it's a no-brainer: 7900GTX SLI > CrossFire X1900XTX. Single card, though, I'm not sure. I'll wait for the benchmarks, which are sure to be available soon. I think the X1900XTX may be slightly faster, but I wouldn't be surprised if the 7900GTX wins either; they're that close in performance. As for image quality, the slight edge has to go to the X1900XTX, for reasons mentioned by many a poster before me.
 
Here's a crazy thought....

If you have SLI video cards, would it be theoretically possible to have one card do the AA and the other do the HDR?
 
Advil said:
Here's a crazy thought....

If you have SLI video cards, would it be theoretically possible to have one card do the AA and the other do the HDR?


What's the point of that? The cards already split the load in half, essentially. :confused:
 
Um, stupid question (which I am good at, btw): is DX10 applicable to this title? IMO, why have this debate if the DX10 cards wax the shit outta the existing tech?
 
Brent_Justice said:
huh what?

Linkage

lancekt said:
"On PC, HDR and AA are not supported simultaneously at all, on any video chipset, because of the way we do HDR.
If you absolutely need antialiasing (and the HDR effect looks great without it, frankly), you can switch to the bloom effect."

....

"I don't think "any chipset" was ambiguous. It's not supported, sorry. It's true there are ways you can make HDR+AA work on ATI X1K cards, but it would have required substantial changes, and you want the game this year, right?

All I gotta say is... "so much for the Xbox and PC versions 'looking identical'."

HDR + AA is confirmed to work fine on the XBox360

Course, the dev summed it up better than I:

lancekt said:
Yeah. The Xbox uses both. On PC you have to choose. Clear enough?

Yup. Clear enough. Now that I understand where they sit on PC vs. console graphics quality, me = passing on this game. Too bad, it looked cool, too. :mad:
 
Ghost Recon runs slow as crap on the Xbox 360. Looks good, though; it must have some serious effects enabled.

Still, DX10?
 
Lord_Exodia said:
For me it's a no-brainer: 7900GTX SLI > CrossFire X1900XTX. Single card, though, I'm not sure. I'll wait for the benchmarks, which are sure to be available soon. I think the X1900XTX may be slightly faster, but I wouldn't be surprised if the 7900GTX wins either; they're that close in performance. As for image quality, the slight edge has to go to the X1900XTX, for reasons mentioned by many a poster before me.

Just for the record, and it is only a benchmark: ATI CrossFire was the first to break a 20,000 score, so score one for ATI. This non-game sells a lot of motherboards and video cards. I call the two cards equal, but NV fans say the 7900GTX is faster. I say prove it. Don't say it, prove it. Use the 6.3 drivers and prove it.

Run every game available and prove it. Don't select a few that they're both good at; run all of them. I'll bet ATI wins 85% of them.
 
$BangforThe$ said:
Just for the record, and it is only a benchmark: ATI CrossFire was the first to break a 20,000 score, so score one for ATI. This non-game sells a lot of motherboards and video cards. I call the two cards equal, but NV fans say the 7900GTX is faster. I say prove it. Don't say it, prove it. Use the 6.3 drivers and prove it.

7900GTX SLI beats X1900XTX CrossFire in almost any benchmark, with 6.2 or 6.3. I don't find any need to prove this to anyone, as enough people have the reviews from the 7900 launch fresh in their heads.
 
Lord_Exodia said:
7900GTX SLI beats X1900XTX CrossFire in almost any benchmark, with 6.2 or 6.3. I don't find any need to prove this to anyone, as enough people have the reviews from the 7900 launch fresh in their heads.

Links, please. No O/C'd cards, factory or not.
 
Right now I'm running the X1900XTX, and as soon as my local computer store gets the 7900GTX in, I'm replacing the 1900. I just can't stand the noise. I'm also still not too sure about Crossfire, whereas I've had SLI in two rigs so far, and it works VERY well. Also, the NVIDIA card allows you to use two monitors with two different wallpapers, and a lot more configurability. (Is that even a word??)
 
$BangforThe$ said:
Links, please. No O/C'd cards, factory or not.

$Bang, check it out, dude. You have a store, right? I'm gonna give you the benefit of the doubt. I want you to test, in your store, an SLI setup vs. a CrossFire setup. I guarantee the SLI setup will be faster in most games, if not all the ones you test. Then publish the results here on the [H] forums. You have said in countless other posts that you have a shop. Take pictures of the two setups and the scores. Most people already know SLI is faster; you're the only person saying CrossFire can keep up with SLI or beat it. You prove it. You have the resources, right? ;)
 
I know SLI is faster right now. I also know NV is using optimizations to skew performance. Turn NV's optimizations off and run the test: ATI wins. An ethical company would not sacrifice IQ for a few frames a second. NV does.
 
$BangforThe$ said:
I know SLI is faster right now. I also know NV is using optimizations to skew performance. Turn NV's optimizations off and run the test: ATI wins. An ethical company would not sacrifice IQ for a few frames a second. NV does.

ROFL, ATI started the optimizations; they invented driver optimizations. NVIDIA followed suit later, but ATI was the first to enable auto-overclocking and the trilinear optimizations and all that. ATI and NVIDIA both optimize. Either way, my point still stands.
 
$BangforThe$ said:
I see a lot of people saying CCC sucks; I for one like it. Here's my wife's score in 3DMark05 on an Intel @ 3.4 GHz. Now, if this was an AMD X2 @ 2.8, I would think the score would be close to 8000. You guys place way too much value in a few FPS.

http://service.futuremark.com/compare?3dm05=1865103

I don't think CCC sucks either. I have used it and found it okay; it's not horrible or even bad. I prefer NVIDIA's stuff, but I get to play with both of these cards like you do. I even get to fuck with Quadro and FireGL cards. Both companies have a hell of a product, but people will like one over the other for their own personal reasons. For my gaming needs, for example, I have a 21-inch monitor @ 1680x1050, and in most reviews I have read, SLI owns this resolution, with high IQ settings or otherwise. ATI CrossFire does wonderfully, matching, winning, or sometimes losing at higher resolutions like 1920x1200 or 2560x1600.

I was seriously considering getting an RD580 X1900XTX CrossFire setup. A few factors swayed me in a different direction; the software was definitely not the major one.

I do agree that CCC does not suck.
 
Let's sum it up, shall we?

Single card will probably go to the X1900s; from all info, Oblivion appears shader-limited.
Dual card could go either way; it will have to be tested.
Quad SLI goes without saying, really. There is no competition. Good for the millionaires out there.
 
Lord_Exodia said:
$Bang, check it out, dude. You have a store, right? I'm gonna give you the benefit of the doubt. I want you to test, in your store, an SLI setup vs. a CrossFire setup. I guarantee the SLI setup will be faster in most games, if not all the ones you test. Then publish the results here on the [H] forums. You have said in countless other posts that you have a shop. Take pictures of the two setups and the scores. Most people already know SLI is faster; you're the only person saying CrossFire can keep up with SLI or beat it. You prove it. You have the resources, right? ;)

I have never sold an SLI system. I am currently working with CrossFire, but unless the performance picks up to at least 80% efficiency, I will not sell these or SLI to customers; it's not worth it. Now, I believe CrossFire and Conroe may change my mind. At any rate, I will have CrossFire in my personal Conroe/Vista/R600 PC, and my wife also wants CrossFire, but that does not mean I will sell them. By the time Vista arrives, I think CrossFire will look totally different than it does today. SLI or CrossFire in a low-res system just doesn't make sense.

Low-res by today's standards = 1280x1024; these machines do not require SLI or CrossFire.


Here's a fair question: how many games are based on OpenGL, and how many on DX?

There are not many OpenGL games, and yet they are given preferential treatment. This is prejudice. I have never seen BHD in a benchmark review. NV is the best I have ever seen at spinning. I like NV, I have bought their cards, but their IQ sucks, and if you turn off the optimizations to equal ATI, their performance is less.
 
obs said:
This should have been the end of the thread.

If that's all the thread was about.

But let's recap.

Q: Which is the CHEAPEST card to get for 'Oblivion'? Radeon X1900 or GeForce 7900?
A: For comparable models, Radeon X1900 right now.

Q: Which will RUN better in 'Oblivion'? Radeon X1900 or GeForce 7900?
A: No way to know until it comes out. Probably equal, or very close.

Q: Which will LOOK better in 'Oblivion'? Radeon X1900 or GeForce 7900?
A: The Xbox 360. The Radeon X1900 and GeForce 7900 should look the same as each other, but with at least one major graphical feature (HDR + AA) already known to be disabled on PC that only Xbox 360 users get.
 
I will buy this game, as so many people are talking about it. I bet my wife's machine won't run it, though.
 
VolvoR said:
Right now I'm running the X1900XTX, and as soon as my local computer store gets the 7900GTX in, I'm replacing the 1900. I just can't stand the noise. I'm also still not too sure about Crossfire, whereas I've had SLI in two rigs so far, and it works VERY well. Also, the NVIDIA card allows you to use two monitors with two different wallpapers, and a lot more configurability. (Is that even a word??)

Do you want to sell that card, and how much do you want for it?
 
dderidex said:
If that's all the thread was about.

But let's recap.

Q: Which is the CHEAPEST card to get for 'Oblivion'? Radeon X1900 or GeForce 7900?
A: For comparable models, Radeon X1900 right now.

Q: Which will RUN better in 'Oblivion'? Radeon X1900 or GeForce 7900?
A: No way to know until it comes out. Probably equal, or very close.

Q: Which will LOOK better in 'Oblivion'? Radeon X1900 or GeForce 7900?
A: The Xbox 360. The Radeon X1900 and GeForce 7900 should look the same as each other, but with at least one major graphical feature (HDR + AA) already known to be disabled on PC that only Xbox 360 users get; also, ATI X1900s get the advantage of HQ AF.


that looks better =p
 
Sharky974 said:
No, Carmack is a dev that's made it pretty clear he likes Nvidia cards. He designs for certain features (double z rate?) of nvidia cards so that his games run faster on them.

Yes. John Carmack only used z-fail (Carmack's Reverse) stencil shadows because NV40 could process 32 Z/stencil operations per clock cycle. Give me a break.

Sharky974 said:
And X1900 has 48 PIPES.

Since when were arithmetic logic units considered pipelines?
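For reference, the z-fail technique being alluded to is just a render-state recipe, and it's easy to see why it loves fast Z/stencil-only rendering. A minimal sketch in Direct3D 9 terms (hypothetical; Doom 3 itself is an OpenGL renderer, so this only illustrates the states involved):

```cpp
// Z-fail ("Carmack's Reverse") stencil shadow volume pass, sketched in
// D3D9. Color and depth writes are off for the whole pass, which is the
// state where NV4x-class double-speed Z/stencil hardware applies.
#include <d3d9.h>

void SetupZFailStencilPass(IDirect3DDevice9* dev)
{
    // No color or depth writes while rasterizing shadow volumes.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0);
    dev->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
    dev->SetRenderState(D3DRS_ZFUNC, D3DCMP_LESS);

    // Two-sided stencil: both volume faces in one draw call.
    dev->SetRenderState(D3DRS_STENCILENABLE, TRUE);
    dev->SetRenderState(D3DRS_TWOSIDEDSTENCILMODE, TRUE);
    dev->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE);

    // Back (CCW) faces increment the stencil where the depth test FAILS...
    dev->SetRenderState(D3DRS_CCW_STENCILFUNC, D3DCMP_ALWAYS);
    dev->SetRenderState(D3DRS_CCW_STENCILZFAIL, D3DSTENCILOP_INCR);
    dev->SetRenderState(D3DRS_CCW_STENCILPASS, D3DSTENCILOP_KEEP);

    // ...front (CW) faces decrement on depth fail. Nonzero stencil
    // afterwards marks pixels in shadow.
    dev->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_ALWAYS);
    dev->SetRenderState(D3DRS_STENCILZFAIL, D3DSTENCILOP_DECR);
    dev->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);
}
```

Whether or not Carmack picked it to flatter NV hardware, the pass maps well onto any GPU with a fast no-color Z/stencil path.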
 
$BangforThe$ said:
NV is the best I have ever seen at spinning. I like NV, I have bought their cards, but their IQ sucks, and if you turn off the optimizations to equal ATI, their performance is less.

NV IQ does not suck; only the most extreme ATI FanATIcs can say that. No customer will walk into your shop, look at a screen running SLI with 8x AA, 16x AF, supersampling transparency AA, gamma-correct antialiasing, and SM3.0 in Splinter Cell: Chaos Theory, and say those graphics suck. Show me someone who says that and I'll be glad to call them a FanATIc.

The one thing you said that is true is that if you drop their optimizations, NVIDIA's performance will go down. Not much, but it will. Slower than ATI, yes. However, turn down ATI's optimizations and the same happens: it will go down. I would love to see unoptimized, high-quality benchmarks of the 7900 vs. the X1900XTX. I've yet to see that happen, but I would love to.

The only good thing for ATI is that their optimizations don't have as notable a negative impact on the eye as NVIDIA's do. Either optimization, however, is a cheat, trading image quality for performance, and should be outlawed IMHO.
 
phide said:
Since when were arithmetic logic units considered pipelines?

How many pipes in an X1800?

How many ALUs in an X1800?

How many pipes in an X800XT?

How many ALUs in a 6800 Ultra?

How many pipes in a 6800 Ultra?

How many ALUs in an X800XT?

How many ALUs in a 7800 GTX?

How many pipes in a 7800 GTX?

How many pipes in a 7900 GTX? ALUs?

Please answer these questions real quick for me. Thanks.
 
There is a new video released on Oblivion's homepage. What a teaser it is; it just confirms yet again that this game will look terrific!
 
Sharky974 said:
Please answer these questions real quick for me. Thanks.

ALUs are components of pixel and vertex pipelines. I'm not sure why you want me to give you the specifications for all of these cards. What purpose would this serve?
 
phide said:
ALUs are components of pixel and vertex pipelines. I'm not sure why you want me to give you the specifications for all of these cards. What purpose would this serve?


Well, if you don't mind, will you please do it?

The purpose it will serve is to show that you are silly!
 
Sharky974 said:
How many pipes in an X1800?

How many ALUs in an X1800?

How many pipes in an X800XT?

How many ALUs in a 6800 Ultra?

How many pipes in a 6800 Ultra?

How many ALUs in an X800XT?

How many ALUs in a 7800 GTX?

How many pipes in a 7800 GTX?

How many pipes in a 7900 GTX? ALUs?

Please answer these questions real quick for me. Thanks.


For the last time!!!

"Pipelines" are nothing more than 2+ functional units in sequence.

Current NV architecture has TMU-ALU pipes, with decoupled ROPs. Thus the number of pipes equals the number of ALUs. The 7900s have 24 TMU/ALU pipes and 16 ROPs.

Current ATI architecture has TMU-ROP pipes, with decoupled ALUs. Thus the number of pipes is not directly related to the number of ALUs. The X1900s have 16 TMU/ROP pipes and 48 ALUs.

Not really hard now, is it?
 
Okay, Manic One. Shouldn't be too hard then for you to take my challenge, eh (listing the ALUs/pipes for those cards)? But I'd really like phide to do it.

I'm wondering why you guys are so wary of it. It's not too hard, really. You don't have to educate me on anything. Just list the pipes/ALUs for the models I listed, please.
 
My point is you don't get it, Sharky.

Prior to the GeForce 6 and Radeon X series cards, graphics "pipelines" consisted of three units in sequence: TMU-ALU-ROP.

From the GeForce 6 on, NV decoupled the ROPs (meaning removed them from the pipe sequence), which can be seen in the 6600s (which had 8 pipes and 3 ROPs).

With the Radeon X1K series, ATI decoupled the ALUs. The X1800s having 16 ALUs was a design decision, not something governed by the architecture. Sure, they have both 16 pipes and 16 ALUs, but that is not due to any sort of 1:1 relationship. With the X1900s, ATI upped the ALUs to 48 while maintaining 16 pipes.

In NV's 6 series, the 6800s had 16 pipes and 16 ROPs, but again this was not due to an architectural relationship. NV realised that 16 ROPs was excessive (ROPs are not a bottleneck in any current GPU). This is why the 6600s had 8 pipes but only 3 ROPs.

Conclusion: X1900s do not have 48 pipes (remember, a pipe is a sequence of 2+ functional units, in this case TMU-ROP), but rather 48 ALUs. The ALUs are not part of the pipeline.

Sorry for answering what you wanted from others, but your reason for asking was flawed. Recent GPU generations have seen architectural changes from both ATI and NV, and what equals a pipe for ATI does not equal a pipe for NV. With unified shader architectures, don't be surprised to see the "pipeline" disappear altogether, with all functional units becoming decoupled from one another, as GPU designers strive for efficiency.
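To make the decoupling concrete, here's a toy C++ sketch of the idea: a "pipe" counts only the units still fused in sequence, while decoupled units are tracked separately. The counts simply restate the numbers used in this thread; they're for illustration, not a hardware query:

```cpp
// Toy model: "pipes" count coupled unit sequences; decoupled units
// (ROPs on NV4x/G7x, ALUs on R5xx) are independent totals.
// Counts follow the posts in this thread.
#include <cstdio>

struct Gpu {
    const char* name;
    int pipes;  // coupled sequences (TMU-ALU on NV, TMU-ROP on ATI R5xx)
    int alus;   // shader ALUs
    int rops;   // raster output units
};

int main()
{
    const Gpu gpus[] = {
        { "6600",     8,  8,  3 },  // NV: ROPs decoupled below pipe count
        { "7900GTX", 24, 24, 16 },  // NV: ALU in the pipe, so pipes == ALUs
        { "X1800XT", 16, 16, 16 },  // ATI: 16 ALUs by choice, not by ratio
        { "X1900XTX",16, 48, 16 },  // ATI: ALUs decoupled -> 16 pipes, 48 ALUs
    };
    for (const Gpu& g : gpus)
        printf("%-8s: %2d pipes, %2d ALUs, %2d ROPs\n",
               g.name, g.pipes, g.alus, g.rops);
    return 0;
}
```

Same exercise, two architectures: counting "pipes" tells you almost everything about NV's shader power and almost nothing about ATI's.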
 
I already know what you've stated. Why won't you answer my question?

Just please list the pipes on those cards. Why is it so hard?

I don't care if you use your definition, my definition, or ATI's definition... just please be consistent.

Nobody has taken this challenge yet. Why not?

It's so simple.

Use the methodology you just explained to me. I don't care.
 
X1800: 16 pipes / 16 ALUs
X800XT: 16 pipes / 16 ALUs
6800U: 16 pipes / 16 ALUs
7800GTX: 24 pipes / 24 ALUs
7900GTX: 24 pipes / 24 ALUs

And your point is? That the R580 has the most ALUs? I think that's kind of established.

Sorry, forgot to add:

X1900: 16 pipes / 48 ALUs

Do you get it? Reread what I posted, dude: the X1900's "pipes" don't contain ALUs. They are decoupled. Thus it is possible to have 16 pipes / 48 ALUs.

Second edit: a further example is the FX 5900/5950. Although reported as 8 pipes, they actually weren't; they contained 2 ALUs in each pipeline (still coupled). They would appear as:

FX5900/5950: 4 pipes / 8 ALUs

Thus they were not an 8x1 part as reported (pipes x ALUs per pipe) but rather a 4x2.
 
How many pipes in an X1800? 16

How many ALUs in an X1800? 32 (all ATI cards have had the same basic ALU config since the 9700, with a full ALU and a mini-ALU per pipe)

How many pipes in an X800XT? 16

How many ALUs in a 6800 Ultra? 32 (the 2nd ALU is mainly used for texturing and is much weaker than the 2nd ALU in the G70 series, though it can still do some math ops)

How many pipes in a 6800 Ultra? 16

How many ALUs in an X800XT? 32

How many ALUs in a 7800 GTX? 48

How many pipes in a 7800 GTX? 24

How many pipes in a 7900 GTX? ALUs? 24, 48

Take the pipeline counts with a rock of salt. The fact is, everyone should stop referring to pipes, because graphics cards are not arranged that way anymore. Both sides are decoupling units, making the whole pipeline idea obsolete. The whole reason you see ATI referring to 48 ALUs rather than 48 pipelines is that "pipeline" is no longer an applicable description.
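As a back-of-envelope illustration of why the distinction matters for a shader-heavy game like Oblivion, here's a tiny C++ sketch. It uses unit counts from this thread and the commonly cited ~650 MHz reference clocks for both flagships, and it assumes (simplistically) one issue per unit per clock:

```cpp
// Peak-rate arithmetic from the thread's unit counts. Clocks are the
// commonly cited reference speeds; one issue per unit per clock is a
// deliberate simplification, not a real benchmark.
#include <cstdio>

int main()
{
    struct Gpu { const char* name; int pipes; int alus; double mhz; };
    const Gpu gpus[] = {
        { "X1900XTX", 16, 48, 650.0 },
        { "7900GTX",  24, 48, 650.0 },
    };
    for (const Gpu& g : gpus) {
        double texelRate  = g.pipes * g.mhz / 1000.0; // GTexels/s (1 TMU per pipe)
        double shaderRate = g.alus  * g.mhz / 1000.0; // G shader issues/s
        printf("%-9s: %4.1f GTexels/s, %4.1f G shader ops/s\n",
               g.name, texelRate, shaderRate);
    }
    return 0;
}
```

By this crude measure the 7900GTX leads on texturing while the two are even on raw shader issue rate, which is exactly why a shader-limited title blurs the "more pipes" advantage.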
 