Recent R600 Reports have got to be total FUD

swetmore

Supreme [H]ardness
Joined
Oct 11, 2004
Messages
7,904
Hi all,

You must be like me... holding out for the release of the R600 variants before deciding whether to go with an 8800 or a 2900.

I find it really strange that AMD is so tight-lipped about the release, but then again the 8800 GTX came out of nowhere and hit AMD/ATI hard around Christmas time. So maybe we will all be rewarded for waiting patiently.

Hopefully all the recent reports are just FUD. I do think the top of the line card will be a power hog, and probably sound like a leaf blower too. If you are like me, you will probably put it on water anyhow.

Hopefully AMD will come through and surprise us all, and with good drivers these specs will really mean something and blow Nvidia out of the water:

Superscalar unified shader architecture with 64 Vec5 unified shaders, for a total of 320 stream processing units (counted differently from NVIDIA's stream processors), plus a 512-bit, 8-channel memory interface.
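Worth spelling out why the two "stream processor" counts aren't directly comparable; a rough sketch of the arithmetic (the 64x Vec5 layout is itself only rumored):

Code:
r600_units, r600_width = 64, 5
r600_sps = r600_units * r600_width   # 320 "stream processing units"
g80_sps = 128                        # G80 (8800 GTX) counts individual scalar ALUs
print(r600_sps, "vs", g80_sps)       # 320 vs 128, but each "SP" does different work per clock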
 
I agree. Even for the diehard Nvidia fanboy, a weak ATI is not a good thing. A strong ATI will drive prices down, which is good for everybody.

However, ALL the news is bad, and it can't ALL be wrong.
 
R600 is a flop...the sooner you deal with it, the better off you'll be.
 
Rumors can be right or wrong, but everyone is saying the same thing, which suggests it's true. It still doesn't make sense to me; how could this be possible?

1GB of memory, a 512-bit memory bus, 320 stream processing units, 140GB/s memory bandwidth. All these numbers are better than Nvidia's, so how come it is slow? There is definitely some driver issue, or everything I stated up there is not true.
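For reference, that bandwidth figure falls straight out of the bus width and the rumored memory clock (the ~2.2 Gbps GDDR4 speed is itself just a rumor):

Code:
# memory bandwidth = bus width (bits) * data rate (Gbit/s per pin) / 8
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8.0

print(bandwidth_gbs(512, 2.2))   # 140.8 -> the rumored ~140 GB/s R600 XTX figure
print(bandwidth_gbs(384, 1.8))   # 86.4  -> the 8800 GTX, for comparison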
 
We should ignore everything except that it consumes 240W, does HDMI and physics, and comes with Half-Life 2: Episode Two / Team Fortress 2. Anandtech's/HardOCP's/Guru3D's reviews will be the ones that are completely legit, so only 10+ days left :).
 
Who knows, they may have gone so far as to make it a killer in DX10 while being not so hot at DX9.
Just have to wait and see.
 
Rumors can be right or wrong, but everyone is saying the same thing, which suggests it's true. It still doesn't make sense to me; how could this be possible?

From what I have seen, Fudo, DailyTech, and Kyle are the only sources that are all doom and gloom. Everything else I've been reading has said it's about even with an 8800GTX and that the power/temp reports have been overstated. What it's like in real life, who knows, but at this point I'd rather wait for the benches before I make a decision either way.
 
From what I have seen, Fudo, DailyTech, and Kyle are the only sources that are all doom and gloom. Everything else I've been reading has said it's about even with an 8800GTX and that the power/temp reports have been overstated. What it's like in real life, who knows, but at this point I'd rather wait for the benches before I make a decision either way.

Which means that, once again, ATI is left with only sloppy seconds, because they are once again at performance parity with Nvidia and only 6 months late (this happened with the highly anticipated R520 too). ATI slid into a performance lead with the x1900 series, but that only took NINE MONTHS, after which Nvidia was heavily entrenched.

ATI hasn't wowed the video card community since the introduction of the 9700 Pro. I'm not holding my breath on this one. Competition is a good thing, but ATI is a little late to this party - Nvidia went ahead and released midrange parts without them, and because there is no competition, the value in the 8500 / 8600 sector is weak.
 
Which means that, once again, ATI is left with only sloppy seconds, because they are once again at performance parity with Nvidia and only 6 months late (this happened with the highly anticipated R520 too). ATI slid into a performance lead with the x1900 series, but that only took NINE MONTHS, after which Nvidia was heavily entrenched.

ATI hasn't wowed the video card community since the introduction of the 9700 Pro. I'm not holding my breath on this one. Competition is a good thing, but ATI is a little late to this party - Nvidia went ahead and released midrange parts without them, and because there is no competition, the value in the 8500 / 8600 sector is weak.

R520 was arguably faster than the 7800GTX and was roughly ~2 months late, iirc. The x1900 raped when it was released. R4XX wasn't much of a letdown, it was just a paper launch; however, it is arguable that ATI's x800XT was faster than a 6800 Ultra. The 6800GT was amazing though.
 
My X800XL is getting really long in the tooth, especially on my 24" 1080p display and running Vista. I try to get 2 years out of my video cards. This time I have to have a card with 512meg or more. My rig is as follows:

X2 3800 939 @ 2.4Ghz
Nforce4 (non SLI)
2 gig pc3200
Antec True Control 550W
2 320 gig 7200.10 in raid 0
1 320 gig 7200.10 storage
X800 XL @ 440core/ 567mem (on water) PCI express.
HDTV Wonder
SB Audigy 2

Think I will end up with an 8800GTX since the power requirements may not require me to replace my power supply. I do not plan on ever running SLI. I will go Quad core in 6 months to a year, so I will get 2 years out of my MB, Proc, and Mem.

Do you guys think my power supply will cut it for now?
 
Your power supply is probably the only thing that can handle an 8800GTX in your rig.:(
 
R520 was arguably faster than the 7800GTX and was roughly ~2 months late, iirc. The x1900 raped when it was released. R4XX wasn't much of a letdown, it was just a paper launch; however, it is arguable that ATI's x800XT was faster than a 6800 Ultra. The 6800GT was amazing though.

The x800 XT was faster than a 6800 Ultra, but the difference was less than 10-20%, and it was actually slower in OpenGL. Only the x800 XT PE and the later x850 XT revisions were NOTICEABLY faster. The x800 Pro (12 pipe) was actually an embarrassment to ATI because it was toasted by the 16-pipe 6800 GT (same price). Didn't help that ATI also got raped on the midrange because they refused to release the x700 XT to compete with the 6600 GT, and they were late with an AGP version of the x700 (the 9800 Pro couldn't even compete with the 6600 GT). Their only midrange wins were with the x800 XL and x800 GTO, and those were LATE in the product cycle. ATI ruled the high-end, but blew the midrange (you'll see this pattern many times in my post).

Release date of 7800 GTX: 6/22/05. This was also a hard launch.
Release date of x1800 XT: 11/5/05. That's 5 months later, and the launch had poor availability. R520 WAS faster than the 7800 GTX, but not the 7800 GTX 512MB (which had the same poor availability). However, the x1800 XT was NOT more than 10 or 20 percent faster in a given game, which meant there was little impetus for 7800 GTX owners to upgrade (especially with SLI available).

The x1900 series was impressive at launch, but Nvidia came back and offered >80% of the performance of an x1900 XT for almost $200 less (the 7900 GT, priced at $299). While there are people who buy the ultra high-end cards, the real money is made in the $200-300 bracket - this is why Nvidia cleaned up despite having a pathetic top end: the 7900 GT had zero competition for four months (once the x1800 XTs were liquidated). And since the x1900 GT (four months later) was too little, too late, ATI had to wait until the release of the x1950 Pro (8 months later) before they could compete on price and performance with the 7900 GT...and by then, they had to sell it for under $200, cutting into profits.

The pathetic showing of the x1600 versus the 7600 GT also didn't help things. In fact, poor performance in the midrange hurt ATI, because even though they had the performance crown for almost the whole year of 2006, they actually lost marketshare that year.

So, here we are again, it's practically a repeat of 2005. ATI is six months late with their power part, and the competition has had time to release lucrative upper-midrange solutions (8800 GTS 320) and hog potential sales. In fact, things are worse than 2005: even Nvidia's midrange cards have beaten ATI to market, giving them little chance to compete.

I'm sick of watching ATI dig themselves a deeper grave with every iteration. This was supposed to be the generation where they beat Nvidia to market with a better product, thanks to their investment in the Xbox 360 GPU, but that obviously didn't help things. R600 may be better, but once again it will be a marginal improvement, discouraging upgrades from the 8800 GTX. If the midrange is anything like the x1600 (and x700 Pro before it), you can kiss that goodbye DESPITE the poor performance of Nvidia's 8500 and 8600 lineup.
 
My X800XL is getting really long in the tooth, especially on my 24" 1080p display and running Vista. I try to get 2 years out of my video cards. This time I have to have a card with 512meg or more. My rig is as follows:

X2 3800 939 @ 2.4Ghz
Nforce4 (non SLI)
2 gig pc3200
Antec True Control 550W
2 320 gig 7200.10 in raid 0
1 320 gig 7200.10 storage
X800 XL @ 440core/ 567mem (on water) PCI express.
HDTV Wonder
SB Audigy 2

Think I will end up with an 8800GTX since the power requirements may not require me to replace my power supply. I do not plan on ever running SLI. I will go Quad core in 6 months to a year, so I will get 2 years out of my MB, Proc, and Mem.

Do you guys think my power supply will cut it for now?

That's roughly what I have,
and I just got an 8800GTS and love it :D I would say if you have the cash (and room) for a GTX, go for it.
 
We should ignore everything except that it consumes 240W, does HDMI and physics, and comes with Half-Life 2: Episode Two / Team Fortress 2. Anandtech's/HardOCP's/Guru3D's reviews will be the ones that are completely legit, so only 10+ days left :).

It "might" only consume 240 watts if it's overclocked.
Simple math:
PCI-e connector #1 - 75 watts
PCI-e connector #2 - 75 watts
PCI-e slot - 75 watts

The max it could draw is only 225 watts. That mysterious 2x8 plug might give more, but it's not REQUIRED for use.
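Laying that out (the connector limits are the standard PCI Express ones; the 240W figure is the rumor being questioned):

Code:
slot_w    = 75            # PCIe x16 slot
six_pin_w = 75            # per 6-pin PCIe power connector

stock_budget = slot_w + 2 * six_pin_w
print(stock_budget)       # 225 W -- already below the rumored 240 W draw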
 
It could be that a few people, throwing DX9 games at the card and using immature drivers, have...

... insert thunderclap echoing over the peaks and down through the valleys...

... proclaimed as if the voice of Hardware Heaven itself that the R600 has lost. Or it could be that it's a card that can handle today well, and might prove the better once a tomorrow with DX10-based games arrives. And a lot of awfully loud and edgy people are perhaps terrified that if the latter were true, that means they paid almost $600 to one camp for what they could have got for $400 and a tub of patience from the other.
 
The x800 XT was faster than a 6800 Ultra, but the difference was less than 10-20%, and it was actually slower in OpenGL. Only the x800 XT PE and the later x850 XT revisions were NOTICEABLY faster. The x800 Pro (12 pipe) was actually an embarrassment to ATI because it was toasted by the 16-pipe 6800 GT (same price). Didn't help that ATI also got raped on the midrange because they refused to release the x700 XT to compete with the 6600 GT, and they were late with an AGP version of the x700 (the 9800 Pro couldn't even compete with the 6600 GT). Their only midrange wins were with the x800 XL and x800 GTO, and those were LATE in the product cycle. ATI ruled the high-end, but blew the midrange (you'll see this pattern many times in my post).

Release date of 7800 GTX: 6/22/05. This was also a hard launch.
Release date of x1800 XT: 11/5/05. That's 5 months later, and the launch had poor availability. R520 WAS faster than the 7800 GTX, but not the 7800 GTX 512MB (which had the same poor availability). However, the x1800 XT was NOT more than 10 or 20 percent faster in a given game, which meant there was little impetus for 7800 GTX owners to upgrade (especially with SLI available).

The x1900 series was impressive at launch, but Nvidia came back and offered >80% of the performance of an x1900 XT for almost $200 less (the 7900 GT, priced at $299). While there are people who buy the ultra high-end cards, the real money is made in the $200-300 bracket - this is why Nvidia cleaned up despite having a pathetic top end: the 7900 GT had zero competition for four months (once the x1800 XTs were liquidated). And since the x1900 GT (four months later) was too little, too late, ATI had to wait until the release of the x1950 Pro (8 months later) before they could compete on price and performance with the 7900 GT...and by then, they had to sell it for under $200, cutting into profits.

The pathetic showing of the x1600 versus the 7600 GT also didn't help things. In fact, poor performance in the midrange hurt ATI, because even though they had the performance crown for almost the whole year of 2006, they actually lost marketshare that year.

So, here we are again, it's practically a repeat of 2005. ATI is six months late with their power part, and the competition has had time to release lucrative upper-midrange solutions (8800 GTS 320) and hog potential sales. In fact, things are worse than 2005: even Nvidia's midrange cards have beaten ATI to market, giving them little chance to compete.

I'm sick of watching ATI dig themselves a deeper grave with every iteration. This was supposed to be the generation where they beat Nvidia to market with a better product, thanks to their investment in the Xbox 360 GPU, but that obviously didn't help things. R600 may be better, but once again it will be a marginal improvement, discouraging upgrades from the 8800 GTX. If the midrange is anything like the x1600 (and x700 Pro before it), you can kiss that goodbye DESPITE the poor performance of Nvidia's 8500 and 8600 lineup.

I remember back in the early discussions about the Xenos GPU, someone theorized that when nVidia got the contract for the GPU on the original XBox, the resources that the project consumed helped to create the GeForce FX disaster, and that ATi better watch out that the "XBox curse" didn't get them too. Almost seems to be a prophecy fulfilled, now...
 
I remember back in the early discussions about the Xenos GPU, someone theorized that when nVidia got the contract for the GPU on the original XBox, the resources that the project consumed helped to create the GeForce FX disaster, and that ATi better watch out that the "XBox curse" didn't get them too. Almost seems to be a prophecy fulfilled, now...
Nvidia's problem during the Xbox development wasn't the NV30; it was that the Xbox caused all products during and for some time after it (GF3 through NV30) to be late.

The problems with the NV30 had to do with bad assumptions about SM2 design; the fixes made during development couldn't overcome the design shortcomings, yet still caused it to be late.

I don't think it was the same situation since ATI handed off the 360 GPU to MS almost 2 years ago. ATI has had problems with releasing products on time that go back much longer.

Of course developing more products at once will be a distraction from the core PC GPU business, but ATI had a lot more on its plate than Xenos. Don't forget the desktop chipsets that never went anywhere and the Wii GPU refresh.
 
I remember back in the early discussions about the Xenos GPU, someone theorized that when nVidia got the contract for the GPU on the original XBox, the resources that the project consumed helped to create the GeForce FX disaster, and that ATi better watch out that the "XBox curse" didn't get them too. Almost seems to be a prophecy fulfilled, now...

But the Xbox chip gave Nvidia a nice foot in the door on chipset development. Basically, if MS hadn't contracted Nvidia, I wouldn't have my nice NF4 SLI board right now.
 
But the Xbox chip gave Nvidia a nice foot in the door on chipset development. Basically, if MS hadn't contracted Nvidia, I wouldn't have my nice NF4 SLI board right now.

I agree, I always saw the xbox deal as a big plus for nVidia. NV30 happened for what seemed like completely separate reasons.
 
It "might" only consume 240 watts if it's overclocked.
Simple math:
PCI-e connector #1 - 75 watts
PCI-e connector #2 - 75 watts
PCI-e slot - 75 watts

The max it could draw is only 225 watts. That mysterious 2x8 plug might give more, but it's not REQUIRED for use.

The 8-pin PCI-e provides 150 watts.
 
Your PSU is fine. I've been running my GTX on this PSU since December without a hiccup. Check my sig for details.
 
What else would they throw at it? DX10 games? Oh wait, those don't exist.

Exactly my point; they aren't available yet, and until they are we won't know either way how HD 2900XT(X) and 8800GT(X/S) stack up against each other. So, one can't say (regardless of their stature on the web) the HD 2900XT is a bum. It's irresponsible to suggest it because there's no way of knowing yet. It may well be a bum, but taking sides so early is a blatant bias towards one camp or another, and for many web editors it sullies their objectivity. I don't know about you, but I try to avoid such 'fanboy' websites. They're fun if you own the product and like it, but they steer you into a bad purchase if you haven't yet decided.
 
The mental gymnastics of rationalization are a fascinating spectator sport. First the G80 cards were "irrelevant, premature, and/or unnecessary" because DX10 was so far off and ATi cards were "good enough" in DX9. Now "it doesn't matter" if R600 cards are NOT "good enough" in DX9 because they "may show their true superiority" in DX10. Or in the words of Chicago Cubs fans everywhere, "Just you wait til next year!"

I understand that real info has not yet been revealed, just for the record. The above is simply based on the premise being put forward by you.
 
I don't think the 8800 GTX was a bad thing; I think Nvidia had a good chance of making some real money from it, even if everyone keeps saying that the real money comes from the mid-range cards.
Think about it: you are the only card vendor with something that does DX10, and it can even do DX9 on a huge display like the 30-inch LCDs with very high image quality and high fps. So you can say you are the only one the HPBs (High Paying Bastards :p) can buy from.
They had an insane price for those things, come on, 600+ dollars on a video card?
I think that AMD/ATI being late to market with the R600 hurts not because DX10 or Vista are already on the market, but because they could not get the easy money that comes from being alone in a certain market.

Everyone seems to forget that ATI will sell those things for 400 bucks.
I do have a 24" Dell wide LCD, and you guys know how hard it is for a video board to push all that res in full glory (4x+ AA and 16x HQ AF), but I was not willing to pay another 600+ dollars to do that; 400 is a nice amount. If Nvidia drops the price of the G80 to 400, the only one who gets to win is the consumer.
So in my view, the launch is good, not for ATI or Nvidia, but for us. The performance of an 8800GTX or more, for 400 dollars, is a hell of a good deal for me.

Santroph.
 
I don't think the 8800 GTX was a bad thing; I think Nvidia had a good chance of making some real money from it, even if everyone keeps saying that the real money comes from the mid-range cards. Think about it: you are the only card vendor with something that does DX10, and it can even do DX9 on a huge display like the 30-inch LCDs with very high image quality and high fps. So you can say you are the only one the HPBs (High Paying Bastards :p) can buy from.
They had an insane price for those things, come on, 600+ dollars on a video card?
I think that AMD/ATI being late to market with the R600 hurts not because DX10 or Vista are already on the market, but because they could not get the easy money that comes from being alone in a certain market.

Everyone seems to forget that ATI will sell those things for 400 bucks.
I do have a 24" Dell wide LCD, and you guys know how hard it is for a video board to push all that res in full glory (4x+ AA and 16x HQ AF), but I was not willing to pay another 600+ dollars to do that; 400 is a nice amount. If Nvidia drops the price of the G80 to 400, the only one who gets to win is the consumer.
So in my view, the launch is good, not for ATI or Nvidia, but for us. The performance of an 8800GTX or more, for 400 dollars, is a hell of a good deal for me.

Santroph.

The thing is, so far all the leaks say the same thing: it's slow
and barely able to keep up with the GTS,
and the paper specs back this up. LOTS of memory bandwidth, so it can do a lot of AA with little speed hit, but the lack of ROPs says it's fillrate-limited, and that's bad at high res.

I would bet the card kicks ass at less than 1920x1080; say, 1600x1200 is where it really shines, with 4x to 8x AA and 16x AF.
But the lack of fill will mean that at high res it's going to slow down, and I think that's where the "free" AA is coming from:
since it's limited in fill and has a ton of bandwidth, you'll see no hit from AA at higher res, but you'll also see no gain from not having it on.

This is just my guess anyway.
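A quick back-of-the-envelope comparison along those lines; the R600 numbers are the rumored ones, nothing confirmed:

Code:
def fill_gpix(rops, core_mhz):
    return rops * core_mhz / 1000.0          # Gpixels/s

def bw_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8.0     # GB/s

# rumored R600 XT: 16 ROPs at ~740 MHz, 512-bit GDDR3 at ~1.65 Gbps
print(fill_gpix(16, 740), bw_gbs(512, 1.65))   # ~11.8 Gpix/s, ~106 GB/s
# 8800 GTX: 24 ROPs at 575 MHz, 384-bit GDDR3 at 1.8 Gbps
print(fill_gpix(24, 575), bw_gbs(384, 1.8))    # ~13.8 Gpix/s, ~86 GB/s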
 
You know, you all can debate this CRAP all day. Here is my Radeon History.

Radeon 64MB DDR
Radeon 8500 64MB
Radeon 9800XT 128MB
Radeon X800 Pro
Radeon X850 XT
Radeon X1900 XTX

I really don't care how much you love/hate ATI/Nvidia. ATI "screwed the pooch" on this one, and we should show them with our money, not our fucking loyalty BS. Our loyalty is getting us a lackluster card for the money and getting us outright lied to and burned. Oh, and that 1800XT was the biggest fuck-up in ATI history, only for them to turn around and release the 1900 series 3 months later?? Hope they do that with this turd, for their sake.

I am venting because of my LOYALTY to this company that doesn't fulfill its promises. I have sold all my stock of theirs after not seeing anyone use any of their past technologies.

TruForm. WOW. Looked GOOD!! NEVER EVER TAKEN SERIOUSLY AND USED!
WOW. 48 shaders in the 1900XT!!! NEVER TAKEN ADVANTAGE OF BY ANY GAMING COMPANY! The evidence is in benchmarks against the GeForce 7900 series. ATI has been dragging us along long enough with NEW TECHNOLOGIES that never get fully utilized in gaming unless you are a folder. But GPUs are primarily gaming cards anyway.

I say we let them bite it one round. Let them know we are sick of waiting. Delays started with the 8500 series and continue through the HD 2900. As consumers, buy a card in the 8000 series from Nvidia and let ATI get the message. It pains me to do so, but my next card is gonna have to be an 8800 GTX. :(
 
I agree with most of what you say, Snow. I'm a huge ATI fan too, and yeah, these now-accepted delays are just ridiculous, getting even longer with every release, and they don't really seem to care about it. You would think that after the x800 and x1800 stupid-ass launches they'd be changing their strategy.


Anyways, there's gotta be something we can do. Not buying ATI products would mean Nvidia can do whatever they want with their new releases, right?

I'm actually going to buy the 8800GTS the day the R600 comes out lol.
 
Agree with the above posts. Even if the ATI card IS faster or just as fast as the 8800s, it's still too late to regain all the potential customers lost through delays. I buy the best card I can for my system regardless of which company makes it. Letting Nvidia have almost half a year or more to sell their top-end card is just letting customers slip away who might have bought ATI cards if they had been out on time.
Best for now to just let the cards come out, then make a decision on what's best after people have had a chance to use them for a few weeks or so and can do reviews based on facts, not on hearsay.
 
I totally don't agree with all of those above. As if ATI/AMD did this on purpose? Even I screw up at work sometimes. Shit happens. Shit also happens on a large scale, like a chain of events. Even nV with NV30.

Absolute performance only makes sense if $ is not relevant.
For some it will be; for most people, not.

Rumors are the HD2900XT performs between the GTS and the GTX, but at the price of the GTS. So it has a good price/performance ratio.

nV had its shit happen with NV30, which I don't have a problem with.
What I do have a problem with was the extreme cheating. Cheating on me. How can I take reviews as representative of games seriously? This is what makes me dislike nV. Be very suspicious when they run behind and then regain the lead out of the blue. Sure, NV40 did well, and G70 and G80, but it doesn't sweep away the black history.
Even if you don't care about the cheating,
the HD2900XT could be a competitive product, but partly due to its price point,
not absolute performance.
And then DX10 does away with the device capability caps; every GPU must support the full feature set or more, not less.
However, R600 has a tessellation unit. That might explain the better geometry performance claimed by the MSFSX and DICE dev teams, also rumored. Maybe DX10.1 features?

Then there's a non-public Crysis beta demo around? Rumor is the R600XT beats the GTX with it. So the first indication of DX10 performance looks better for R600. But for now it is just a rumor.

But keep in mind, if the demo is in the hands of some of the reviewers on May 14th, we might get some preview indication of how DX10 can perform on R600 and G80, if this rumor is true.
So it will be less of a gamble which might be the best DX10 game accelerator.
And another rumor: R600 overclocks to 1GHz on the stock cooler :) I'm wondering about that rumor, and about the performance and power draw that would bring.

So it doesn't look that bad.
But then again, wait and see how it turns out to be.

On June 1st I'll decide what to do; in my case, which DX9 or DX10 ATI GPU it will be.

Also, if you screw up work and finish it late, that doesn't mean the result will be better. So because R600 is late it will have refresh-level performance, like it's become an R650? That's ridiculous. It's like building a kit car yourself, a buggy, where you run into major problems, and then way later you have a Zonda?
So my expectation is in the middle: something that might compete with G80.

Maybe I'll wait for Crysis first, or some other DX10 game that's important to me. The HD2900XTX might be out by then.

For me DX10 performance is important, as I keep my card for more than 3 years.
 
R600 is a flop...the sooner you deal with it, the better off you'll be.

Great post....I especially love all the detailed info you provided......
it all makes sense now...I hope you will continue with this valuable info ..I feel I can learn a lot from you.
Thanks again.
 
You guys argue like a bunch of whores in a cathouse :p

I bought a GTS, I had the money, it was time to upgrade.

ATI and Nvidia have served me well in the past (I had a GF2 GTS for a very long time, replaced by a Ti4200, then a Radeon 9800, then an X800XL, then a 7900GS, and now my 8800GTS).

Wait and see if you want, if not Nvidia is a good option.

But QUIT FUCKING FIGHTING!
 
Rumors can be right or wrong, but everyone is saying the same thing, which suggests it's true. It still doesn't make sense to me; how could this be possible?

1GB of memory, a 512-bit memory bus, 320 stream processing units, 140GB/s memory bandwidth. All these numbers are better than Nvidia's, so how come it is slow? There is definitely some driver issue, or everything I stated up there is not true.

I love how I keep seeing this fallacy... "I just bought all the parts for a corvette but it doesn't go as fast as my mustang... wtf?".

Having crazy specs doesn't matter if you put them together improperly, or fail to do so free of any bottlenecks.
 
The 8-pin PCI-e provides 150 watts.

True, however the R600 will run with 2x 6-pin. The 8-pin is optional, as has already been said.

Therefore, the 2x 75W from the connectors plus 75W from the PCI-e slot still totals 225W max. If the optional 8-pin is connected, it provides additional power, which will be REQUIRED for overclocking.
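Putting both cases side by side (the 150W figure for the 8-pin connector is the spec limit, not a measured draw, and assumes the second socket takes either plug):

Code:
slot_w      = 75     # PCIe x16 slot
six_pin_w   = 75     # each 6-pin PCIe connector
eight_pin_w = 150    # the optional 8-pin PCIe connector

stock     = slot_w + 2 * six_pin_w               # 225 W ceiling with 2x 6-pin
overclock = slot_w + six_pin_w + eight_pin_w     # 300 W ceiling with the 8-pin fitted
print(stock, overclock)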
 