X2900XT preview

What's really confusing me is: what are those 320 shaders doing? Are they just sitting there, fondling themselves? What is the new memory interface for? Why 512-bit if 256-bit was enough to beat the new 512? What is the point of all this new crap if it can't beat last gen's stuff?

To me, it seems that there is no possible way that ATI can release a card that is worse than the last generation. Maybe I'm just hoping that it isn't so, but you can't blame one for hoping. If in the end it does turn out this way, and the R600 is nothing more than a load of pig excrement, then I will purchase the G80.

It's not worse than last generation. One thing I noticed from the "preview" is that there is virtually no frame-rate drop between 4xAA and 8xAA. Also, at the higher resolutions the R600 was faster than last generation. Having all of those parts is one thing, but making that many changes and having everything work out perfectly is another story, especially if all of the changes are done in one generation.
 
I have to say it is kind of interesting, if they are real, just looking at the difference in numbers between 4xAA and 8xAA @ 2560:

COH: <0.01% decrease
FEAR: 7% decrease
FarCry: 2% decrease
Oblivion: 8% decrease
Prey: 30% decrease
Stalker: 0% decrease - this looks very suspect for all 3 cards
SC: 2% increase
X3: 20% decrease

Those are some pretty good numbers IMO, with an average 8% decrease in performance.

When you compare it to Nvidia's drop in framerate going from 4xAA to 8xAA at the same resolution, it comes out to roughly a 33% average drop just from enabling 8xAA.

I can't say whether these numbers are legit or not, but if they are, it might be a good sign for high-res/high-AA/high-AF gamers indeed.
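
If anyone wants to sanity-check that kind of average themselves, here's a minimal sketch of the per-game drop calculation; the framerate pairs in it are made-up placeholders, not numbers from the preview.

```python
# Rough sketch of the 4xAA -> 8xAA drop calculation discussed above.
# The framerate pairs below are made-up placeholders, NOT the preview's numbers.
fps_4x_vs_8x = {
    "FEAR":     (54.0, 50.2),
    "Oblivion": (38.0, 35.0),
    "Prey":     (60.0, 42.0),
}

drops = []
for game, (fps_4x, fps_8x) in fps_4x_vs_8x.items():
    drop_pct = (fps_4x - fps_8x) / fps_4x * 100  # percent lost going 4x -> 8x
    drops.append(drop_pct)
    print(f"{game}: {drop_pct:.1f}% decrease")

print(f"average: {sum(drops) / len(drops):.1f}% decrease")
```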

Not that it matters if these are all fake, but the S.T.A.L.K.E.R. numbers make sense in light of the fact that, with dynamic lights enabled, AA settings have no effect--the post-processing for the lighting effects prevents AA from being applied. So you could set AA to anything and framerates wouldn't change.
 
While these early previews don't look all that impressive, some guys over at Rage3D are saying that ATI has disabled half of the ring bus in these early drivers to artificially cripple the card and make it appear slower than the final version. I'm not too sure why they would do this, but if it is the case, the final numbers should be much faster than these early results show.

Only four more days and we'll see what this card is capable of...

I love this--I wish the Rage3D true believers would give me some of what they're smoking!
 
I'm not sure it would be completely unthinkable to effectively cripple the cards to keep numbers under control. We know it's easily possible based on what they could do with the 1900's. It would be a way of keeping the card working for developers that needed the cards early without revealing the numbers.

Based on the numbers we're seeing there are a few things that we can fairly safely conclude.

1) R600 isn't limited by bandwidth. If it were, they wouldn't be sticking 800MHz GDDR3 on the cards.
2) R600 has plenty of math and shading power to back it up when used efficiently. The high 3DMark scores and the "teraflop in a box" demo seem to point to this.

In the case of the FX series the math power simply wasn't there. Nvidia resorted to all kinds of hacks and optimizations to improve performance. With R600 it looks like the card has the potential but due to drivers it probably isn't being realized. And don't take this as driver problems in the sense of what Nvidia had. All indications are that the drivers are stable and run well on all OSes, but performance isn't quite there yet.
 
Even if these numbers are true, this is still a great card which should kick the GTS's ass in DX10 games. It's the expectation that it would beat the GTX that is causing all the bad press.
 
Even if these numbers are true, this is still a great card which should kick the GTS's ass in DX10 games. It's the expectation that it would beat the GTX that is causing all the bad press.

Have you read anything on the R600... at all? Or did you just see the name of the thread and click "Post Reply" before putting on your ATI4LYFE tard-o-vision glasses and posting?
 
I'm not sure it would be completely unthinkable to effectively cripple the cards to keep numbers under control. We know it's easily possible based on what they could do with the 1900's. It would be a way of keeping the card working for developers that needed the cards early without revealing the numbers.

Based on the numbers we're seeing there are a few things that we can fairly safely conclude.

1) R600 isn't limited by bandwidth. If it were, they wouldn't be sticking 800MHz GDDR3 on the cards.
2) R600 has plenty of math and shading power to back it up when used efficiently. The high 3DMark scores and the "teraflop in a box" demo seem to point to this.

In the case of the FX series the math power simply wasn't there. Nvidia resorted to all kinds of hacks and optimizations to improve performance. With R600 it looks like the card has the potential but due to drivers it probably isn't being realized. And don't take this as driver problems in the sense of what Nvidia had. All indications are that the drivers are stable and run well on all OSes, but performance isn't quite there yet.

You are contradicting yourself...
If the card is performing poorly due to drivers, then the drivers suck. Plain and simple. If the drivers are good, then the card simply does not have the potential that you mentioned.
The drivers can't be good, while they completely cripple the card's performance. It makes no sense...

Also, the optimization tricks were used by both NVIDIA and ATI. I think it was Kyle himself who found those tricks, during some tests of ATI's 8xxx family, in Quake 3 or some similar game.
 
WTF? There isn't enough hype....or better yet 'buzz'...for your liking? :eek: O-kay.....


For enthusiasts? Sure, this is normal. But they create their own 'mega' buzz anytime something geeky is ready to blow.

What I mean is a paper launch or soft launch. If they had a card that would oust any of the competition's cards with room to grow, they wouldn't be holding out so long with so many delays and allowing so much ♥♥♥ to be spread. That's all I'm sayin'. Take it for what you will; it's just my gauge or perception of the current climate of things.

You have to try and remember two things: 1) Of the people making up the 20-40% market share that the green or red team gets, "enthusiasts" make up less than 5% (if that). 2) We're talking about the Green and Red team, two rivals who have worse (IMO, because they're supposed to be businessmen and engineers) cat-fights and spats than the biggest grudges in college football; and that's saying something.

Both of these companies have taken a 'cut-throat' approach many times in the past, so why delay this 'round'? AMD's not gonna tolerate it anymore? Hardly enough of a reason. But I 'spose.


Edit: LOL, I'll give you all 1 guess as to what word is no longer allowed on the [h] and has been doused in sugar (hearts).
 
You are contradicting yourself...
If the card is performing poorly due to drivers, then the drivers suck. Plain and simple. If the drivers are good, then the card simply does not have the potential that you mentioned.
The drivers can't be good, while they completely cripple the card's performance. It makes no sense...

Also, the optimization tricks were used by both NVIDIA and ATI. I think it was Kyle himself who found those tricks, during some tests of ATI's 8xxx family, in Quake 3 or some similar game.

If the drivers suck, then games are crashing, you're getting BSODs, and parts of scenes aren't being rendered at all; I haven't heard any reports of this being a problem. The potential of the card seems to be fairly well documented through all the specs that have been released. Realizing that potential remains to be seen. Even with the 1900s there were driver releases giving 30% performance increases, and performance wasn't what I would call bad to begin with. And that was mainly ATI creating profiles for their memory controller. I wouldn't rule out R600 having other programmable parts that now need to be optimized, so it could take some time to figure out what works best.

And while both sides used optimizations, ATI wasn't exactly cutting every corner trying to bring FPS up to par. There is no denying that Nvidia released some underpowered cards at the time.
 
If the drivers suck, then games are crashing, you're getting BSODs, and parts of scenes aren't being rendered at all; I haven't heard any reports of this being a problem. The potential of the card seems to be fairly well documented through all the specs that have been released. Realizing that potential remains to be seen. Even with the 1900s there were driver releases giving 30% performance increases, and performance wasn't what I would call bad to begin with. And that was mainly ATI creating profiles for their memory controller. I wouldn't rule out R600 having other programmable parts that now need to be optimized, so it could take some time to figure out what works best.

So, by your definition, "good" drivers are just those that do not crash, even if they are causing the card to perform poorly...
I actually find it amusing that you are "OK" with having a $400-500 card that performs poorly due to drivers, as long as it doesn't crash in games.
IMHO, good drivers are those that provide excellent performance while being stable.

Also, if you go by specs alone, even the HD 2900 XT should be destroying an 8800 GTX completely.
So, if these unconfirmed numbers are indeed confirmed, there's a serious problem of efficiency with the R600 architecture, because its specs are just amazing.

And while both sides used optimizations, ATI wasn't exactly cutting every corner trying to bring FPS up to par. There is no denying that Nvidia released some underpowered cards at the time.

Actually, they were, and it seems you're trying to play up NVIDIA's mistakes to hide ATI's. Why is that?
Both companies did the same thing to boost FPS counts. Period. There's no excuse for either of them. That's the only thing there's no denying.
 
ATI didn't cut corners with the 8500. Yeah, there was a problem that looked like they cheated, but they released a new driver right after that fixed it with the same performance. :rolleyes:
 
So, by your definition, "good" drivers are just those that do not crash, even if they are causing the card to perform poorly...
I actually find it amusing that you are "OK" with having a $400-500 card that performs poorly due to drivers, as long as it doesn't crash in games.
IMHO, good drivers are those that provide excellent performance while being stable.

Also, if you go by specs alone, even the HD 2900 XT should be destroying an 8800 GTX completely.
So, if these unconfirmed numbers are indeed confirmed, there's a serious problem of efficiency with the R600 architecture, because its specs are just amazing.



Actually, they were, and it seems you're trying to play up NVIDIA's mistakes to hide ATI's. Why is that?
Both companies did the same thing to boost FPS counts. Period. There's no excuse for either of them. That's the only thing there's no denying.

Who said they're making the card perform poorly? From the leaked specs I've seen the scores at least looked reasonable. IMHO it's much more important to have drivers that let you actually play the game than just getting high FPS while crashing or rendering objects incorrectly.

Going by the specs, it doesn't look like it would be that far ahead of an 8800 GTX. It has about 1/3 more shading power, 1/6 more bandwidth, and its texturing power looks significantly lower. Most games still seem to be texture bound, so it wouldn't be unrealistic to think that's what's holding it back atm. It's also curious why they clocked the card as low as they did. If anything I'd think the numbers should balance out fairly well. There don't seem to be any really shader-heavy games out there to really unleash the potential of the card.
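
As a rough back-of-the-envelope check on those ratios, here's a minimal sketch; the clocks and unit counts are the commonly rumored figures at the time (and the G80's dual-issue MUL is ignored), so treat them as assumptions rather than confirmed specs.

```python
# Theoretical-throughput comparison; all figures are assumptions based on
# commonly rumored specs of the time, not confirmed numbers.
def gflops_madd(shader_units, clock_mhz):
    # Counting a MADD as 2 flops per ALU per clock.
    return shader_units * clock_mhz * 2 / 1000.0

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    # GDDR3/GDDR4 transfer twice per memory clock (double data rate).
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000.0

r600_flops = gflops_madd(320, 742)    # ~475 GFLOPS (assumed 742 MHz core)
g80_flops  = gflops_madd(128, 1350)   # ~346 GFLOPS (MADD only, assumed 1.35 GHz shader clock)

r600_bw = bandwidth_gbs(512, 825)     # ~105.6 GB/s (assumed 825 MHz GDDR3)
g80_bw  = bandwidth_gbs(384, 900)     # ~86.4 GB/s (assumed 900 MHz GDDR3 on the GTX)

print(f"shader power ratio: {r600_flops / g80_flops:.2f}x")  # ~1.37x, i.e. about 1/3 more
print(f"bandwidth ratio:    {r600_bw / g80_bw:.2f}x")        # ~1.22x, give or take, depending on clocks
```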

And I'm not playing up Nvidia's mistakes; they made those on their own. While both sides were making optimizations, only one side was blatantly impacting image quality to get there. And don't count a bugged driver that simply wasn't rendering parts of a scene as an optimization.
 
Who said they're making the card perform poorly?

You said that drivers were the cause of the poor use of the card's potential. And here is the quote:

With R600 it looks like the card has the potential but due to drivers it probably isn't being realized.

Anarchist4000 said:
From the leaked specs I've seen the scores at least looked reasonable. IMHO it's much more important to have drivers that let you actually play the game than just getting high FPS while crashing or rendering objects incorrectly.

First of all, you don't know how the R600 drivers are. Second, and I'm sure you are referring to NVIDIA's drivers, those problems happened mostly in Vista, which wasn't even out yet when NVIDIA launched G80. The drivers were bad, but they sorted out (and are sorting out) the problems, and according to what I read in NVIDIA's sub-forum, the drivers are much better.

Also, you confirmed what I understood as your definition of "good" drivers. For you, a $400-500 card that has stable drivers but doesn't deliver good performance is OK. I guess it's a matter of perspective, since I really don't agree with you there, as I mentioned before.
For me, good drivers are not only stable, but offer the performance that's expected from the card.

Anarchist4000 said:
Going by the specs, it doesn't look like it would be that far ahead of an 8800 GTX. It has about 1/3 more shading power, 1/6 more bandwidth, and its texturing power looks significantly lower. Most games still seem to be texture bound, so it wouldn't be unrealistic to think that's what's holding it back atm. It's also curious why they clocked the card as low as they did. If anything I'd think the numbers should balance out fairly well. There don't seem to be any really shader-heavy games out there to really unleash the potential of the card.

You are joking, right? No shader-heavy games? What do you call Oblivion?
You're basically saying that the card doesn't look as good, in those unconfirmed numbers, because there are no games out there that use its potential?
Right... God forbid it's the hardware itself not being efficient enough at distributing the workload between its stream processors... No, it must be the games...

Anarchist4000 said:
And I'm not playing up Nvidia's mistakes; they made those on their own. While both sides were making optimizations, only one side was blatantly impacting image quality to get there. And don't count a bugged driver that simply wasn't rendering parts of a scene as an optimization.

LOL, bugged driver? So NVIDIA does the scam and it's "blatantly impacting image quality", but ATI does it and it's "a driver bug". Please, no more flag waving :)

NVIDIA and ATI are the same. They'll do what's necessary to stay on top of each other, including those kinds of scams.
 
Smell that?
Smells like... fanboys.

The benchmarks here don't look promising, but I've never heard of that site before, and the entire review reeks of "huuuuuh?"

Giving any kind of credibility to this place, positive or negative, is just downright stupid.

Wait until some credible places review it before you start saying how screwed/divine the X2x00 line is.
 
The card with its current drivers looks to be performing quite nicely. I still feel the drivers are likely holding it back a bit, but just because there is room for fine tuning doesn't mean the drivers are bad. I guess we'll have to wait until Monday for the full story, but I haven't heard of any reports of certain games having really low framerates, crashes, or compromised IQ.

And I don't know what the driver situation is, but AMD seems to be releasing drivers daily, and each time FPS keeps going up substantially, according to leaks. And I never said that good drivers were simply stable. They are stable while at least delivering respectable performance given the hardware, so no extremely poor performance in certain games. Just because there is a fair amount of room for improvement doesn't mean performance isn't already good.

What makes you think Oblivion is shader heavy? It's one of those games that managed to be both VS and PS limited. It's much better suited to a unified architecture than to just having a lot of shader power. A shader-intensive game is one that is doing a lot of math to achieve the final results. Most games just use shaders to texture everything like crazy. "Shader intensive" should really be measured by the ratio of ALU to TEX operations, and I can't think of any game out there where the shaders are the bottleneck. None of the games out there have even started to push DX9, IMHO. With DX10 things should get a bit more shader intensive, as the GS will primarily be math, along with the VS.
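
To make the ALU:TEX idea concrete, here's a toy sketch with made-up instruction counts; it's only an illustration of the ratio, not data about any real game.

```python
# Toy illustration of the ALU:TEX ratio; instruction counts are made up.
# A shader doing lots of math per texture fetch is "shader intensive";
# one that mostly fetches textures is texture bound.
shaders = {
    "mostly_texturing": {"alu_ops": 6,  "tex_fetches": 4},   # ALU:TEX = 1.5
    "math_heavy":       {"alu_ops": 40, "tex_fetches": 4},   # ALU:TEX = 10.0
}

for name, s in shaders.items():
    ratio = s["alu_ops"] / s["tex_fetches"]
    print(f"{name}: ALU:TEX = {ratio:.1f}")
```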

And you can spin and twist things all you want, but ATI's "bug" was in there for one driver version and was fixed. I'm not sure Nvidia ever removed theirs. All those reviews where parts of a scene were missing but framerates were good; then a week after the reviews a new driver arrives fixing those issues and framerates start falling. Maybe you haven't been following graphics cards long enough to remember it, but the difference was rather obvious. And if anyone here is flag waving, it's you. You're in an ATI forum promoting Nvidia and pointing out everything you can, true or false, to discredit anything ATI/AMD does.
 
NVIDIA and ATI are the same. They'll do what's necessary to stay on top of each other, including those kinds of scams.
Quoted for truth.

Both will do everything they can to put money in the bank.
 
Another new review is up at PCAdvisor magazine. Their take? If you want an economical card for playing at lower detail settings, the X2900 is a good choice. If you want to be playing with all the eye candy turned on and have the muscle to run newer DX10 titles, the G80 is the way to go.

http://www.pcadvisor.co.uk/reviews/index.cfm?reviewid=834

Once again, take this review with a grain of salt. The guys at Rage3D say that these numbers are also out to lunch, and that they must be using older drivers or a version of the card with the crippled ring bus. Apparently, there is a new set of drivers floating around that only a few people have that gives the card about a 30% performance boost, putting it in GTX-killing territory. I just don't understand ATi's strategy of giving review sites cards with half of the memory bandwidth disabled, as well as defective GDDR4 memory for the XTX, combined with drivers that are apparently holding the card back. It just doesn't make sense. ??

Could it be the cards are working properly, and that the highly anticipated R600 is actually just slower than the GTS?
 
Could it be the cards are working properly, and that the highly anticipated R600 is actually just slower than the GTS?

If it doesn't at least match GTS performance, I don't know how they expect to sell any.
Therefore, I think it will be at least as good as an 8800 GTS.
 
Another new review is up at PCAdvisor magazine. Their take? If you want an economical card for playing at lower detail settings, the X2900 is a good choice. If you want to be playing with all the eye candy turned on and have the muscle to run newer DX10 titles, the G80 is the way to go.

http://www.pcadvisor.co.uk/reviews/index.cfm?reviewid=834

Once again, take this review with a grain of salt. The guys at Rage3D say that these numbers are also out to lunch, and that they must be using older drivers or a version of the card with the crippled ring bus. Apparently, there is a new set of drivers floating around that only a few people have that gives the card about a 30% performance boost, putting it in GTX-killing territory. I just don't understand ATi's strategy of giving review sites cards with half of the memory bandwidth disabled, as well as defective GDDR4 memory for the XTX, combined with drivers that are apparently holding the card back. It just doesn't make sense. ??

Could it be the cards are working properly, and that the highly anticipated R600 is actually just slower than the GTS?

OR all the fanboys are simply in denial... that's 3 reviews now reporting the same thing AND SAMPSA from XS... if there were new drivers I guarantee he has them...
 
If it doesn't at least match GTS performance, I don't know how they expect to sell any.
Therefore, I think it will be at least as good as an 8800 GTS.
Can anyone else see the graphs it refers to? Because from that wording it reads to me like it did test faster than the 640 GTS on most of the tests, so it is largely a match. Also it seems a bit fuzzy about what "details" they are talking about. Do they mean resolution?
 
The card with its current drivers looks to be performing quite nicely. I still feel the drivers are likely holding it back a bit, but just because there is room for fine tuning doesn't mean the drivers are bad. I guess we'll have to wait until Monday for the full story, but I haven't heard of any reports of certain games having really low framerates, crashes, or compromised IQ.

And I don't know what the driver situation is, but AMD seems to be releasing drivers daily, and each time FPS keeps going up substantially, according to leaks.

I love this... So these leaks/rumors/unconfirmed benchmarks are true, because they don't mention any driver anomaly. Yet, all of a sudden, since they put the HD 2900 XT in the GTS 640 performance realm, they are most likely fake and should be taken with a huge grain of salt...

Amazing...just amazing...

What makes you think Oblivion is shader heavy? It's one of those games that managed to be both VS and PS limited. It's much better suited to a unified architecture than to just having a lot of shader power. A shader-intensive game is one that is doing a lot of math to achieve the final results. Most games just use shaders to texture everything like crazy. "Shader intensive" should really be measured by the ratio of ALU to TEX operations, and I can't think of any game out there where the shaders are the bottleneck. None of the games out there have even started to push DX9, IMHO. With DX10 things should get a bit more shader intensive, as the GS will primarily be math, along with the VS.

Because Oblivion IS a game that makes heavy use of shaders. And as you said, DX10 will increase that use even more.

And you can spin and twist things all you want, but ATI's "bug" was in there for one driver version and was fixed. I'm not sure Nvidia ever removed theirs. All those reviews where parts of a scene were missing but framerates were good; then a week after the reviews a new driver arrives fixing those issues and framerates start falling. Maybe you haven't been following graphics cards long enough to remember it, but the difference was rather obvious. And if anyone here is flag waving, it's you. You're in an ATI forum promoting Nvidia and pointing out everything you can, true or false, to discredit anything ATI/AMD does.

Spin and twist? As I recall, you were the one saying that ATI's optimization was a bug.
It must be, as we in software development call it, an "undocumented feature" in ATI's drivers. It couldn't possibly be an optimization... :rolleyes:
All I can do is suggest you read Kyle's review on that "driver bug".

And I'm promoting NVIDIA? I've been trying to explain to you in my last couple of posts that NVIDIA and ATI are exactly the same. They'll do everything they can to be on top of the market, including the optimization or driver bug (as you called it) scam and dubious marketing strategies. If that's promoting NVIDIA, so be it.
 
I thought [H] was based in Texas, but the forums must run on servers hosted in Egypt. Everywhere I look in this thread, I see de Nile... :eek:
 
Yes, but that's not even in question. R600 is already a "fiasco", even without confirmed numbers to back it up, since it's so late in the game.
However, if they at least have a card, such as the HD 2900 XT, which beats NVIDIA's second best and is priced at around the same level ($350-370), then they may have a saving grace.

The average price of the 8800 GTS 640 is higher than $370, and I wouldn't compare it with the 8800 GTS 320. IMO, it's not a fiasco as you call it; that's a bit harsh IMO. I understand how upset you must feel due to the late launch (if that's what you mean). Honestly, the launch of this card with its questionable performance ratings leaves a bit to be desired. However, it does appear this video card's performance gets better with each new driver release. We have to wait and see. It's very possible that the best performance is not had until after the hard launch, which is nothing new... Let us not forget to look at how well this card scales when overclocked. For example, let's say that the HD 2900 XT performs on par with the 8800 GTS 640, but when overclocked on air it performs on par with a mildly overclocked 8800 GTX. That's enough to get people to buy the card, IMO. Especially if the Vista drivers prove stable.
 
OR all the fanboys are simply in denial... that's 3 reviews now reporting the same thing AND SAMPSA from XS...
All those others were also referring to the 2900XT card as a "winner"?

.dwight, unable to swing a mouse without hitting a "fanboy"
 
The average price of the 8800 GTS 640 is higher than $370, and I wouldn't compare it with the 8800 GTS 320. IMO, it's not a fiasco as you call it; that's a bit harsh IMO. I understand how upset you must feel due to the late launch (if that's what you mean). Honestly, the launch of this card with its questionable performance ratings leaves a bit to be desired. However, it does appear this video card's performance gets better with each new driver release. We have to wait and see. It's very possible that the best performance is not had until after the hard launch, which is nothing new... Let us not forget to look at how well this card scales when overclocked. For example, let's say that the HD 2900 XT performs on par with the 8800 GTS 640, but when overclocked on air it performs on par with an 8800 GTX (that has a mild OC). That's enough alone to get people to buy the card, IMO. Especially if the Vista drivers prove stable.

Well, it's a fiasco because it's a full product cycle late. And because of that, we have NVIDIA's Ultra, which may be an indication of prices to come if AMD/ATI doesn't provide any sort of competition.

And what I said about the XT was that if it proves to be faster than the GTS 640 while costing the same (is it $350-370 or $400?), it will surely be a good buy. But it does need to be faster; otherwise, it's better to invest in a GTS 640, for the simple fact that it has more memory.
 
I love this
And I'm promoting NVIDIA? I've been trying to explain to you in my last couple of posts that NVIDIA and ATI are exactly the same. They'll do everything they can to be on top of the market, including the optimization or driver bug (as you called it) scam and dubious marketing strategies. If that's promoting NVIDIA, so be it.

Well, it's a fiasco because it's a full product cycle late. And because of that, we have NVIDIA's Ultra, which may be an indication of prices to come if AMD/ATI doesn't provide any sort of competition.

And what I said about the XT was that if it proves to be faster than the GTS 640 while costing the same (is it $350-370 or $400?), it will surely be a good buy. But it does need to be faster; otherwise, it's better to invest in a GTS 640, for the simple fact that it has more memory.

Hmm, I thought you were not promoting Nvidia. It appears what Anarchist4000 said about you is true after all. And the term fiasco doesn't define the R600. For one, there are no official benchmarks to warrant such a baseless opinion.

I see no sudden collapse of ATI or the R600.
I do not see proof that the R600 is a complete failure.
Therefore, your use of the term is out of context.
Furthermore, the average price of the 8800 GTS 640MB is not $350-$370. Please read my other post for the link. You know what, it's fine that you don't like ATI or the R600. However, you discredit yourself when you attempt to discredit others because they have an opinion that does not agree with your negativity about the R600.
 
Does R600's bandwidth mitigate the difference?


If page flipping is occurring, bandwidth doesn't help out at all.

Hmm, I thought you were not promoting Nvidia. It appears what Anarchist4000 said about you is true after all. And the term fiasco doesn't define the R600. For one, there are no official benchmarks to warrant such a baseless opinion.

I see no sudden collapse of ATI or the R600.
I do not see proof that the R600 is a complete failure.
Therefore, your use of the term is out of context.
Furthermore, the average price of the 8800 GTS 640MB is not $350-$370. Please read my other post for the link. You know what, it's fine that you don't like ATI or the R600. However, you discredit yourself when you attempt to discredit others because they have an opinion that does not agree with your negativity about the R600.


The R600 is a disappointment, and might be a fiasco because of performance and the amount of energy it uses to get to the performance of the GTS and a little above it. Price it with the extra power you draw out of your wall, and a beefier power supply. I really don't think this card is going to get the same reception the G80 did, that's for sure.
 
If page flipping is occurring, bandwidth doesn't help out at all.




The R600 is a disappointment, and might be a fiasco because of performance and the amount of energy it uses to get to the performance of the GTS and a little above it. Price it with the extra power you draw out of your wall, and a beefier power supply. I really don't think this card is going to get the same reception the G80 did, that's for sure.

This is all based on assumption. That's the problem here. There is no solid proof at this time. Wait until the official benchmarks come out first.
 
This is all based on assumption. That's the problem here. There is no solid proof at this time. Wait until the official benchmarks come out first.


Rephrase that: you don't have proof ;). Fiasco depends on how you look at it; they lost the high end, it's that simple, and I think that is a fiasco. Unless you think AMD wanted to give up the high-end market?
 
Rephrase that: you don't have proof ;). Fiasco depends on how you look at it; they lost the high end, it's that simple, and I think that is a fiasco. Unless you think AMD wanted to give up the high-end market?
LOL, what source are you referencing that said that the XT was ATI's flagship? Also, the definition of fiasco is very clear cut; how you use it depends on your understanding of the word :rolleyes:. Also, as I mentioned before, Silus is promoting Nvidia based on his earlier post.
 
The thing is that if the reviews were reporting that the R600 was a GTX killer, all of you would be accepting the results from the previews. But since they leave a sour taste, you are in denial... It's OK, it will be revealed Monday, and then I, Sampsa, [H], and the leaked reviews can say "I told you so."
 
LOL, what source are you referencing that said that the XT was ATI's flagship? Also, the definition of fiasco is very clear cut; how you use it depends on your understanding of the word :rolleyes:. Also, as I mentioned before, Silus is promoting Nvidia based on his earlier post.

Hmm, let me guess...... let's see where I get my info from........ someone else can tell ya that :p

a mishap caused by something suddenly falling down or caving in

This is the definition from your first link. What was everyone expecting to come out? I think something that would be equal to, if not better than, the GTX; that's what I was expecting until last week, and that's for the XT. As I said, it's a point of view. If you were thinking the XT would not compete with the GTX, I guess that is what you were saying: you weren't disappointed and this isn't a fiasco. It's your point of view.
 
The thing is that if the reviews were reporting that the R600 was a GTX killer, all of you would be accepting the results from the previews. But since they leave a sour taste, you are in denial... It's OK, it will be revealed Monday, and then I, Sampsa, [H], and the leaked reviews can say "I told you so."
1. There is no source (that I know of) stating that the XT is ATI's flagship video card.
2. There is no official source (that I know of) that implied or expressed that the XT was a GTX killer. Are you sure you are not getting your posts mixed up?
3. No one is in denial because they are not interested in an Nvidia video card (as you are implying).
 
Hmm, I thought you were not promoting Nvidia. It appears what Anarchist4000 said about you is true after all. And the term fiasco doesn't define th R600. For one, there are no official benchmarks to warrant such a baseless opinion.

I see no sudden collapse of ATI or the R600
I do not see proof that the R600 is a complete failure
Therefore, your use of the term is out of context.
Furthermore, the average price of the 8800GS 640meg is not at $350-$370. Please read my other post for link. You know what, it's fine that don't like ATI or the R600. However, you discredit yourself when you attempt to discredit others because they have an opinion that does not agree with your negativity about the R600.

First of all, I don't really care if you think I'm against ATI. In your mind, I'm against it just because I think R600 is not what it was supposed to be. Fine, think what you will.

Second, at Newegg the GTS 640 is mostly around $370.
Third, no, it's not out of context. It's a fiasco because it's late and it seems it will not deliver what was expected, which is to beat G80. Again, do you think that if AMD had a product that could beat G80, they wouldn't be the first to show it to the public, instead of dragging and delaying it for 6 months now?

Last but not least, so I'm promoting NVIDIA if I'm suggesting the better card between the XT and GTS 640?
Since you didn't seem to understand what I wrote the first time, I'll write it again: the XT NEEDS to be faster than the GTS 640; otherwise, at the same price (is it $350, $370, or $400?), the GTS 640 is a better investment, for the simple fact that it has more memory.

Do you understand now? If the XT ends up being roughly the same in performance as the GTS 640, it's a better investment to buy a GTS 640, for the simple fact that it has more memory (the XT only has 512MB), which will, no doubt, be used in future games.
 
2. There is no official source (that I know of) that implied or expressed that the XT was a GTX killer. Are you sure you are not getting your posts mixed up?

Not official sources, but when people read articles from the INQ like this:

http://www.fudzilla.com/index.php?option=com_content&task=view&id=948&Itemid=1

they tend to get crazy ideas that R600 will be the G80 killer. Then you have people with the mindset that since the R600 is coming out 6 months later than G80, the card had better have a large margin of performance over G80.
 
1. There is no source (that I know of) stating that the XT is ATI's flagship video card.
2. There is no official source (that I know of) that implied or expressed that the XT was a GTX killer. Are you sure you are not getting your posts mixed up?
3. No one is in denial because they are not interested in an Nvidia video card (as you are implying).

So you're saying you were expecting that the R600 high-end model would NOT beat G80?
That's great; then from your perspective, it surely does not seem to be a fiasco. It must be everything you ever wanted. When the actual reviews are out, and if these unconfirmed numbers are confirmed, I'm sure you'll be the happiest guy in the world to buy a card that was delayed 6 months and yet did not beat the competition's most powerful offerings. What great expectations you have of ATI. I thought ATI fans believed ATI's products would be better than NVIDIA's. I guess you are the exception to the rule. If that's the case, you're right, there's no denial.
 
Let's argue about this after Monday :rolleyes:

Anyway, if the HD 2900 XT offers better or similar performance to an 8800 GTS at the same price, I'm sold. I'm looking forward to playing my 2D space shooter (Subspace Continuum, if anyone is curious, lol) again, which for some freaking reason just won't work on the 8800. Annoyed the **** out of me.
 
Not official sources, but when people read articles from the INQ like this:

http://www.fudzilla.com/index.php?option=com_content&task=view&id=948&Itemid=1

they tend to get crazy ideas that R600 will be the G80 killer. Then you have people with the mindset that since the R600 is coming out 6 months later than G80, the card had better have a large margin of performance over G80.

Exactly why I said, so many times in other threads (and others did too), that the hype surrounding R600 would be its downfall. Being so late to the game, everyone would be expecting it to completely destroy NVIDIA's 8800 GTX. Not only because it has killer specs, but because the delays would be used by AMD/ATI to tweak the card further and improve its performance.
 