ATI R600 Confirmations on Size and Power

Your posts define your caliber, and by extension your credibility. I find both of them lacking.

In this thread I agree, but I'm lowering myself to the level of the room. Just like Razor above.

Had this been an intellectual discussion from the start, with thoughtful back and forth between the author and his critics, I likely would've written a far briefer and far more respectful post.
However, that is NOT this thread.
 
Heck, the basis for my criticism is even in what Kyle himself writes; like I said before, comparing 'configuration' specs to single-card specs is a poor basis for the claims made.

Now you are adding things after the fact that weren't in your original argument.

Not when he appears to be taking the information out of context. Perhaps he should've checked with AMD/ATi whether the figures were for a single card or for the whole configuration, as the wording implies. Was that even attempted? The article doesn't say 'we tried to contact ATi over the past 5 months we had this spec sheet about whether it's single card or system, but they didn't return our calls'; at least that would show some effort was made to get the information right. The only thing Kyle does have in hand can be interpreted in many ways if it states 'configuration' and not card. The same thing happened with previous generations of cards too, where everyone got overexcited about the requirements ATi and nV provided because they started talking about 500+W PSUs.


Again, read above.

'Very wrong'? Is this similar to your MPG numbers? :rolleyes:
'start readed articles properly'? Sure thing, coach! :p

Well, looking at the way you post, it seems to be your knack ;)

Well, it's pretty easy to confirm that the OEM fan requires significantly more power than the current retail solutions from ATi, nV, and add-on vendors like AC. So going with that and the actual pictures of the cooler as the basis for this assumption is better than the idea that it's impossible to conceive of the retail boards without that extra 3" of OEM fan and its load.

Show me the proof for the R600.


No, like I said, the InQ is a crapshoot, but entertaining, like the Weekly World News; I don't take them as a 'news' source, but as a rumour mill. And my definition of good journalism would be Kyle contacting ATi/AMD and saying 'can you clarify the following, as I'm about to write an article (omitting the sensationalizing aspect, of course) and I want to be sure I understand this correctly'. That was obviously not done if all he's coming back to is the fact sheet, with its iffy wording at best, and not saying ATi confirmed that number to be good for a single card or a system.

Show me where you heard the real deal



Well, nothing more than reviewers' statements of actually seeing A13 at CES firsthand at the time of the article, and now A15 being the revision bandied about in the latest sightings. But of course, their word at seeing the actual die firsthand counts for more than Kyle's extrapolation of other information to make claims about a physical item without seeing it firsthand, let alone measuring it. Between my assumption that there has been a spin or two since November, based on info from people who've seen the dies firsthand, and Kyle's assumption that the 'configuration' specs from 4+ months ago are the same as the single-card requirement, my assumption seems more reasonable.

And I'm not putting it out there as news, so the onus, like I said, remains on the author to check his facts, especially if pretending to be better than the InQ; I have no such aspirations or claims, unlike Kyle. And based on previous statements in this thread, it seems like double-checking the specs on the sheet with AMD didn't even cross his mind, and he was quite resistant at the beginning of this thread despite people's concerns being expressed with tactfulness and deference to the grand poobah, only to be met with sarcasm and dismissed. The concept of measure twice (or ten times) and cut once seems lost on this article.

Adding small caveats later follows the same journalistic practice of front-page news, page-10 retraction, long after the rumour makes the rounds of the forums and spreads the FUD. Speaking of edits to the original article, why not include the fact that it's not 'ATi stating' but 'ATi stated in a Nov '06 document'? Because that would show the information to be as stale as most information from last year.

Which reviewer told you they saw a die shot? And which one saw the spin and metal-layer numbers? To my knowledge, no one has yet.

At this point in time it doesn't much matter anymore; the article is unstickied from the front page and will likely fade away without a correction. However, if you want, we can revisit this at launch time and see what's what. I expect a 'that was then, this is now' type of defense, which pretty much applies to the spec sheet from November.


BS, my story won't change. I don't expect the retail R600 to have 300 watts of power consumption, and the article says the same; you might want to go read it again.


I'm pretty mellow; I write this while sitting on the couch watching TV and solving world hunger (less people / more food, done!), and I'm far from being an idiot, but if that characterization helps you, go with it. :cool:

Trust me, you are an idiot. Everything you say is a one-way street in your mind, when the article clearly casts doubt on the retail boards. The way you post is pathetic; so far you haven't shown anyone your "proof", which was asked for a while back. Are you afraid there is no proof at the moment?

And I don't know you as a person, but the way you post is why I think you are an idiot.
 
In this thread I agree, but I'm lowering myself to the level of the room. Just like Razor above.

Had this been an intellectual discussion from the start, with thoughtful back and forth between the author and his critics, I likely would've written a far briefer and far more respectful post.
However, that is NOT this thread.

You were the cause of the lowering :eek:, and I responded in an equivalent tone, just to show you how it feels. I guess you didn't like it. Your tone seems to have changed a bit, which is good; it would have been better if you had started off with that tone, because no one would have gotten upset.

And just for an added bonus, where was the uproar when Kyle wrote this:

http://www.hardocp.com/news.html?news=MjE1NTksLCxobmV3cywsLDE=

Well now, do you really think he is as one-sided as you say?
 
Can we not dissect posts for pages and pages... let's get back to speculating.

Maybe the rumored shorter retail card features TEC-assisted cooling in order to maintain low temperatures with low noise and reasonable dimensions, though it sounds like that would be far too expensive...

Why haven't we seen TEC-powered coolers like the Vigor Monsoon appear for video cards?
 
You were the cause of the lowering :eek:, and I responded in an equivalent tone, just to show you how it feels. I guess you didn't like it. Your tone seems to have changed a bit, which is good; it would have been better if you had started off with that tone, because no one would have gotten upset.

My tone didn't change because of you; my tone changed in the previous posts because of the request to change the original post. You, on the other hand, still get things factually incorrect when the information is right there for everyone to see in their own hands, just like your 130nm Low-K R9800 BS you posted at B3D (thanks for reminding me which tool you were) or the X800SE never existing. You're really good at getting the information wrong, and then calling people arrogant idiots for pointing it out to you. Now I see why you think people pick sides: because they're never on your side. Looking back at that old B3D thread and others, I see you're often on the wrong side of reality.

Well now, do you really think he is as one-sided as you say?

Did I say he was one-sided? I think this is the bazillionth time I've said it doesn't matter about sides; it's simply sensationalistic, not to be pro- or anti-AMD, but to get page hits, and that still stands. That he got the G80 wrong too just reinforces my point about the accuracy of these assumptions, and it also proves that ATi or nV doesn't matter; it's the "The Sky Is Falling" effect he's looking for when writing them.

BS, my story won't change. I don't expect the retail R600 to have 300 watts of power consumption, and the article says the same; you might want to go read it again.

I'm not talking about just the retail R600, and if the OEM R600 comes in at well under 300W, then what? I don't expect you to own up to it any more than any of your previous errors; at least you can blame Kyle for this one.

The way you post is pathetic; so far you haven't shown anyone your "proof", which was asked for a while back. Are you afraid there is no proof at the moment?

Proof of what? Prove a negative? Prove that Kyle doesn't have a board and he's guessing, like he was with the G80? What point did you want me to prove? The onus is on Kyle to back up his 'news'. My statements can easily be backed by other public reviews, and for this forum that'd be more than enough; Kyle is the author of this public 'news', and without proof it's looking more like an error or fabrication, even when talking about just the OEM card, let alone retail. Like I said, I don't pretend to be more reliable than the InQ or Kyle, but Kyle is pretending that this is factual, current, hardcore news, and claiming to be better than the InQ or DailyTech.

And I don't know you as a person, but the way you post is why I think you are an idiot.

And based on your post history, it's obvious that you post more BS than most.

Until the article can be supported and fact-checked, it's just as sensational and pie-in-the-sky FUD as the G80 article. The onus is on the author, pure and simple.
 
Let's see... 1 8-pin @ 150W, 1 6-pin @ 75W, PCI-e slot @ 75W.

Amazingly, that adds up to 300W!!! That is maximum spec plug draw, guys. If you honestly think this thing is gonna pull 300W alone at stock then you are fools.

I also doubt the R600 will use 300W, but your power estimations are incorrect on the 6-pin and 8-pin connectors, and on how cards are powered when they include aux power connections.

See near the bottom of this page for the 6- and 8-pin connector power ratings: http://www.playtool.com/pages/psuconnectors/connectors.html

A 6-pin PCIe power cable has a maximum rating of 288W and an 8-pin power cable has a maximum rating of 336W. nVidia was very conservative in adding two 6-pin power connectors on the 8800GTX (140W-170W), IMO probably to be safe with PSU compatibility.

I would think ATI is doing the same (i.e. the R600 cards will likely use less power than 288W), and that the 8-pin connector isn't a necessity over a 6-pin connector -- it's there as a convenience in case your PSU only has 1 6-pin and 1 8-pin plug. If you have 2 6-pin plugs, you can use those instead.

A big BUT is that the huge copper cooler and 24W fan imply that the card uses a lot of power. And ATI's documents also imply the highest power usage of any consumer card so far.
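To make the two sets of numbers above concrete, here's a quick back-of-envelope sketch in Python (the per-plug budgets are the PCIe spec figures quoted in this thread, the cable maximums are the playtool.com numbers above, and the R600 plug layout is the rumored one, not a confirmed spec):

```python
# PCIe power budgets per the spec figures quoted in this thread.
PCIE_SLOT_W = 75        # PCIe x16 slot
PCIE_6PIN_SPEC_W = 75   # 6-pin aux connector
PCIE_8PIN_SPEC_W = 150  # 8-pin aux connector

# Physical cable maximums per the playtool.com page linked above.
PCIE_6PIN_CABLE_MAX_W = 288
PCIE_8PIN_CABLE_MAX_W = 336

# Rumored R600 layout: slot + one 6-pin + one 8-pin.
spec_budget = PCIE_SLOT_W + PCIE_6PIN_SPEC_W + PCIE_8PIN_SPEC_W
print(f"In-spec plug budget: {spec_budget}W")       # 300W -- the headline number

# The same plugs could physically carry far more, which is why the
# 300W figure reads as a spec ceiling rather than a measured draw.
cable_ceiling = PCIE_SLOT_W + PCIE_6PIN_CABLE_MAX_W + PCIE_8PIN_CABLE_MAX_W
print(f"Physical cable ceiling: {cable_ceiling}W")  # 699W
```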
 
My tone didn't change because of you; my tone changed in the previous posts because of the request to change the original post. You, on the other hand, still get things factually incorrect when the information is right there for everyone to see in their own hands, just like your 130nm Low-K R9800 BS you posted at B3D (thanks for reminding me which tool you were) or the X800SE never existing. You're really good at getting the information wrong, and then calling people arrogant idiots for pointing it out to you. Now I see why you think people pick sides: because they're never on your side. Looking back at that old B3D thread and others, I see you're often on the wrong side of reality.

Funny, I remember you were wrong on that account. I was wrong too, but I didn't insult anyone like you did, remember?
Thank you for clarifying my viewpoint of you, as you did for many others here too ;).

Oh, you want to look at how many times Dave was on the wrong side of reality too? Yeah, he makes a lot of mistakes as well; it doesn't matter. I don't go bashing without being provoked first, and you did provoke first here as well. Your tone, your mannerisms, your arrogance blind you.

Anyway, the rest of it I'm not going to argue with you anymore; it's pretty obvious where you're coming from. BTW, you did get laughed out of Rage with some of your posts there too, and with your viewpoints, that's gotta be bad.

Just to add one more line: Kyle is right on the specs, they are recommended. Just wait and see.
 
Yes, while the connectors are physically capable of carrying more current, the numbers he was referencing are the PCI-Express 1.1/2.0 spec limits for power to video cards.

I also doubt the R600 will use 300W, but your power estimations are incorrect on the 6-pin and 8-pin connectors, and on how cards are powered when they include aux power connections.

See near the bottom of this page for the 6- and 8-pin connector power ratings: http://www.playtool.com/pages/psuconnectors/connectors.html

A 6-pin PCIe power cable has a maximum rating of 288W and an 8-pin power cable has a maximum rating of 336W. nVidia was very conservative in adding two 6-pin power connectors on the 8800GTX (140W-170W), IMO probably to be safe with PSU compatibility.

I would think ATI is doing the same (i.e. the R600 cards will likely use less power than 288W), and that the 8-pin connector isn't a necessity over a 6-pin connector -- it's there as a convenience in case your PSU only has 1 6-pin and 1 8-pin plug. If you have 2 6-pin plugs, you can use those instead.

A big BUT is that the huge copper cooler and 24W fan imply that the card uses a lot of power. And ATI's documents also imply the highest power usage of any consumer card so far.
 
I also doubt the R600 will use 300W, but your power estimations are incorrect on the 6-pin and 8-pin connectors, and on how cards are powered when they include aux power connections.

See near the bottom of this page for the 6- and 8-pin connector power ratings: http://www.playtool.com/pages/psuconnectors/connectors.html

A 6-pin PCIe power cable has a maximum rating of 288W and an 8-pin power cable has a maximum rating of 336W. nVidia was very conservative in adding two 6-pin power connectors on the 8800GTX (140W-170W), IMO probably to be safe with PSU compatibility.

I would think ATI is doing the same (i.e. the R600 cards will likely use less power than 288W), and that the 8-pin connector isn't a necessity over a 6-pin connector -- it's there as a convenience in case your PSU only has 1 6-pin and 1 8-pin plug. If you have 2 6-pin plugs, you can use those instead.

A big BUT is that the huge copper cooler and 24W fan imply that the card uses a lot of power. And ATI's documents also imply the highest power usage of any consumer card so far.


The power supply recommendation is probably that high because of weaker PSUs that don't have enough sustained amperage on the 12-volt rail for the card.
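To put rough numbers on that 12V-rail point (just a sketch; it assumes the whole card draw lands on +12V, and the wattages are only the rumored or guessed figures bandied about in this thread):

```python
# Amps a PSU must sustain on +12V for a given card draw.
def rail_amps(card_watts: float, rail_volts: float = 12.0) -> float:
    return card_watts / rail_volts

for watts in (250, 300):  # guessed and rumored R600 draws from this thread
    print(f"{watts}W card -> {rail_amps(watts):.1f}A on the 12V rail")
# 250W -> 20.8A, 300W -> 25.0A
# Many PSUs of the era advertised big totals but couldn't sustain 20A+
# on the 12V rail(s), hence the inflated PSU recommendations.
```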
 
ATI R600 BLOWS, GET 8800 GTX!

Yeah, my 8800GTX works great :rolleyes:

(attached image: stillfreakingerrors.jpg)


If the R600 blows, the 8800 drivers SUCK.

This IMHO
 
Interesting, I just went to Vista 4 days ago and haven't seen that yet, Rick. Any particular game or app that happens with? I've heard about it a lot; that's why I didn't go to Vista until a little while back. But I'm dual-booting, just in case something goes wrong...

My printer still doesn't work either. MS finally got an update out for their mass storage driver, which now seems to work on more motherboards.
 
Yes it does: he got the G80 wrong, and now he's got the R600 wrong too.
Nothing new, and it has nothing to do with red vs green.

Direct Quote from article:

"ATI is stating that a SINGLE R600 high end configuration will require 300 watts of power (+/-9%) and a DUAL R600 "CrossFire" high end configuration will require, as you might guess, 600 watts of power (+/-9%)."


As opposed to:

"Note the two six-pin power connectors and water cooler on one of the cards seems to still be prototype. This goes a long ways to confirm all of the 300 watt consumption rumors that have been circulating."

(direct quote from GTX spy picture news item)

There is a difference between "ATI is stating" and "Here are some spy pics, they support the rumors that have been circulating".
 
:rolleyes:

Have you tried not trolling every R600 thread? I don't think you'll be happy until Nvidia is putting up candidates for Congress.

I'd say it takes a bigger "fan=boy" to take every word the EIC says as law until you've seen the proof yourself.

Let's not bring personal attacks into this. I am pissed at both ATI and NVIDIA right now, so no need to get personal. NVIDIA sucks because they've had 3+ years to make a decent Vista driver and failed miserably, and ATI sucks because they've had 18+ months to make a decent DX10 part, and also failed miserably. Now, I want you to read my specs in my sig very carefully and look at what brand of video card I am using in my gaming machine, then I want you to read this thread before you begin more childish insulting.

Let's see... 1 8-pin @ 150W, 1 6-pin @ 75W, PCI-e slot @ 75W.

Amazingly, that adds up to 300W!!! That is maximum spec plug draw, guys. If you honestly think this thing is gonna pull 300W alone at stock then you are fools.

Mark my words: 300W, just like at every other recent high-end GPU launch, is a deliberate overstatement.

And InorganicMatter, if you honestly think someone's position at an online tech site directly relates to their ability to comment, then you, too, are a fool. Stop kissing ass.

I am sorry, but I am much more apt to trust the EIC of the largest tech site in existence than fan=boys on a forum who have their heads in the sand screaming "its not that bad, srsly!" This would not be the first time AMD produced a part that underperforms relative to its power draw.

Direct Quote from article:

"ATI is stating that a SINGLE R600 high end configuration will require 300 watts of power (+/-9%) and a DUAL R600 "CrossFire" high end configuration will require, as you might guess, 600 watts of power (+/-9%)."


As opposed to:

"Note the two six-pin power connectors and water cooler on one of the cards seems to still be prototype. This goes a long ways to confirm all of the 300 watt consumption rumors that have been circulating."

(direct quote from GTX spy picture news item)

There is a difference between "ATI is stating" and "Here are some spy pics, they support the rumors that have been circulating".

Finally, another voice of reason.

I think this thread needs a lock.

And another voice of reason.
 
Wow, the mods should lock this thread!

People have their freedom, but threads like this one, which started off with valid content, just give way to tearing down the integrity of this forum. Respectability to ignorance!

Now I know why they don't let people on here with Yahoo, Hotmail, or Gmail accounts; maybe they should add a few more to the list.
 
Nice editing (and deliberate paraphrasing) of my post, InorganicMatter. Love how you call me an ATI fan=boy, yet you remove this part of my comment:

"Removing emotion from the debate, who knows how good this card will be. Obviously if its power draw dramatically exceeds its supposed "superiority", then it will fail, simple as that. Debating true power draw (still independantly unmeasured) before any true 3D numbers are known is not only stupid, its against any scientific or logical method."


I don't give a rat's ass who makes the best card (I will buy from either vendor, just make it teh shizzle!).

Oh, and my definition of a fan=boy would be someone who trashes a product before it even makes an appearance. Sound like anyone to you?


Edit: Just to make sure you don't edit it back in, here's your original "quote":

Originally Posted by ManicOne View Post
Let's see... 1 8-pin @ 150W, 1 6-pin @ 75W, PCI-e slot @ 75W.

Amazingly, that adds up to 300W!!! That is maximum spec plug draw, guys. If you honestly think this thing is gonna pull 300W alone at stock then you are fools.

Mark my words: 300W, just like at every other recent high-end GPU launch, is a deliberate overstatement.

And InorganicMatter, if you honestly think someone's position at an online tech site directly relates to their ability to comment, then you, too, are a fool. Stop kissing ass.
 
Something actually in context with the thread, WOW!!!

A quote from PTaylor's weblog (an FSX dev, I believe):

"Given the state of the NV drivers for the G80 and that ATI hasn’t released their hw yet; it’s hard to see how this is really a bad plan. We really want to see final ATI hw and production quality NV and ATI drivers before we ship our DX10 support. Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."

Source: http://blogs.msdn.com/ptaylor/default.aspx

Edit: This was spotted over at B3D, so no creds to me for the discovery.
 
Nice editing (and deliberate paraphrasing) of my post, InorganicMatter. Love how you call me an ATI fan=boy, yet you remove this part of my comment:

"Removing emotion from the debate, who knows how good this card will be. Obviously if its power draw dramatically exceeds its supposed "superiority", then it will fail, simple as that. Debating true power draw (still independantly unmeasured) before any true 3D numbers are known is not only stupid, its against any scientific or logical method."

I was not deliberately changing your quote, I just snipped out the irrelevant parts. I was replying to the personal-attack part of your post, and the part of your post that said the R600 won't draw a full 300W (because it will draw 300W ;)).
 
I was not deliberately changing your quote, I just snipped out the irrelevant parts. I was replying to the personal-attack part of your post, and the part of your post that said the R600 won't draw a full 300W (because it will draw 300W ;)).

I'm going to hold you to that, m8. 300W at stock, hey? Guess it will be the first completely un-overclockable high-end card (no room left in the spec for OC'ing). Good luck backing this one up.
 
Just a thought... how much oomph do you need for a CrossFire setup with dual R600s and a quad AMD setup? That's got to be just insane. When you have to rewire your house to support your PC... wow, that's [H]!
 
All I can say is thank God DirectX 10 cards aren't required anytime soon.

I'll be waiting for the next round from both ATI and nVidia.
 
I'm going to hold you to that, m8. 300W at stock, hey? Guess it will be the first completely un-overclockable high-end card (no room left in the spec for OC'ing). Good luck backing this one up.

Do you have any idea how absurd what you just said is?
 
I'm going to hold you to that, m8. 300W at stock, hey? Guess it will be the first completely un-overclockable high-end card (no room left in the spec for OC'ing). Good luck backing this one up.

I have no doubts that it will draw 300W. Look:

ATI said:
a SINGLE R600 high end configuration will require 300 watts of power
Source.

That is extremely black-and-white. Leaves no room for this +/- 30W people are talking about.
 
Do you have any idea how absurd what you just said is?

Enlighten me. Are the PCI-e 1.1/2.0 specs a load of crap? You can pull more power through the connectors but that would mean ATI has a card that must break the specs to OC. Is that what you are getting at?
 
Go spend more time with your family and stop reading/writing Op-Eds for a while. :(

Sounds like you have some serious issues with me reporting what was stated in an ATI document. Sorry you feel that way.
 
From reading the 300-watt spec per high-end R600 card the way ATI is stating it, I would GUESS that the card is likely to use closer to 250 watts when it is fully loaded, and that ATI is suggesting to the system integrators that they need at least an additional 300 watts of PSU rating to cover the card's needs safely. That said, 250 watts is still a freaking monster. One of the scary things about the G80 is how much wattage it burns while it is idling along in 2D. Hopefully ATI has done some VERY progressive gating on their part to keep it from being a space heater while you are not gaming. We will see when we get a card. :)
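Putting rough numbers on those two possible readings (a sketch only; the ±9% comes from the document quote earlier in the thread, and 250W is just the guess above, not a measurement):

```python
# Two readings of "300 watts per R600 high end configuration".
rating_w = 300.0
tolerance = 0.09  # the +/-9% from the quoted ATI document

# Reading 1: 300W is the actual card draw, within tolerance.
low, high = rating_w * (1 - tolerance), rating_w * (1 + tolerance)
print(f"As a draw spec: {low:.0f}W to {high:.0f}W")  # 273W to 327W

# Reading 2 (the guess above): ~250W real draw, with 300W of PSU
# headroom recommended to system integrators.
guessed_draw_w = 250.0
print(f"Margin if the card pulls {guessed_draw_w:.0f}W: "
      f"{rating_w - guessed_draw_w:.0f}W")  # 50W of slack
```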
 
Enlighten me. Are the PCI-e 1.1/2.0 specs a load of crap? You can pull more power through the connectors but that would mean ATI has a card that must break the specs to OC. Is that what you are getting at?

What I mean is the idea of staying within any spec while overclocking is absurd.
 
What I mean is the idea of staying within any spec while overclocking is absurd.

Remember that each new adapter requires PCI-SIG verification to meet the PCI-e spec. Taking hardware out of spec (the very definition of OC'ing) is different from releasing a product that breaks a regulatory body's rules when pushed hard. That is my point.
 
Remember that each new adapter requires PCI-SIG verification to meet the PCI-e spec. Taking hardware out of spec (the very definition of OC'ing) is different from releasing a product that breaks a regulatory body's rules when pushed hard. That is my point.

That's not what you said, though. You said there would be no room in the spec for overclocking. I'm not so sure PCI-SIG is concerned with what end users do with a card, just that the card has to be within spec to pass, and at 300W the R600 would still be in spec (at least on the power requirements) and would pass, even if just barely.
 
Actually, that is what I said. The 6/8-pin connectors can physically handle more power than the specs supply them with, but they aren't being fed more power. If the R600 were to pull 300W at stock, then a whole new OC-limiting factor comes into play, independent of the actual adapter hardware limitations. Thus my use of the term "un-overclockable". Sorry if my terminology causes you grief.
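The 'un-overclockable' argument in numbers (a sketch; both figures are this thread's rumored values, not measurements):

```python
# If the in-spec plug budget and the stock draw are both 300W,
# any overclock pushes the card outside the PCIe power spec.
spec_budget_w = 300   # slot (75W) + 6-pin (75W) + 8-pin (150W)
stock_draw_w = 300    # the draw some posters insist on

print(f"In-spec OC headroom: {spec_budget_w - stock_draw_w}W")  # 0W
# At ~250W stock (the guess earlier in the thread), there would be
# roughly 50W of in-spec headroom instead.
```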
 
Well, in that case, I stand corrected on the 300W number; I was only going by what was on the front page (which said an actual 300W draw, not a "recommended" PSU allocation of 300W). Regardless, 250W of "actual" draw is still one hell of an unreasonable amount of consumption. :eek: An all-AMD gaming rig will burn 900W in CPU/GPU power alone; throw in 100W of peripherals and you've maxed out the best of the best PSUs out there! :eek:


What you read on the news page was a direct quote from ATI documentation. I did not think it was a good place for me to start making guesses. And honestly, I am just guessing above.

And yes, you are still very much correct about the entire thing. At this point, I don't think that two of them drawing 500 or 600 watts makes much difference in the grand scheme of things. That picture on the front page did show a 1000-watt PCP&C, so at least we know that is enough juice for a CrossFire rig.
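And a rough tally for the all-AMD CrossFire scenario above (pure speculation built from this thread's numbers; the CPU and peripheral figures are the poster's placeholders, not measurements):

```python
# Hypothetical all-AMD rig tally using the thread's rumored numbers.
crossfire_w = 2 * 300    # dual R600 at the quoted 300W each
cpu_w = 300              # placeholder for the quad AMD CPU setup
peripherals_w = 100      # drives, fans, board, etc. (placeholder)

total_w = crossfire_w + cpu_w + peripherals_w
print(f"Ballpark system draw: {total_w}W")  # 1000W
# Which lines up with the 1000W PCP&C in the front-page photo.
```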
 