ATI R600 Confirmations on Size and Power

i wonder how fast the above noob post will get deleted

Yes, you wouldn't want to question Kyle; that would be sacralicious. :rolleyes:

I'd rather see him reply to the post with facts. Of course he could choose to simply delete the post, but the questions remain, and Kyle should knuckle up and answer them.
Previous e-mail experience with him leads me to believe he doesn't like being questioned, but he has at least answered questions when challenged.
 
lolZ... Well this guy won't last long... heh j/k :D

i wonder how fast the above noob post will get deleted

Yeah, he's gonna be gone in a hurry!! :rolleyes:

Yes, you wouldn't want to question Kyle; that would be sacralicious. :rolleyes:

I'd rather see him reply to the post with facts. Of course he could choose to simply delete the post, but the questions remain, and Kyle should knuckle up and answer them.
Previous e-mail experience with him leads me to believe he doesn't like being questioned, but he has at least answered questions when challenged.

I agree with some of the things you're saying, but there is such a thing as polite disagreement that doesn't involve bolding huge words.

There is a bit of an aura around here that nobody really questions Kyle for fear of getting banned, and I think it's sad. You should be able to question anybody, to an extent, without worrying about your account getting banned just because they have power over you.
 
I was really looking forward to upgrading to R600 in my SFF case, but I'm rather limited in my upgrade options by my 400 watt PSU, even if it is a high-quality one. I'm still hopeful that the 9.5 inch version will draw wattage on par with an 8800 GTX, which my system can support, if just barely.

Gosh I wonder how the drivers are. That's what has kept me away from Nvidia, but the company has been making some strides recently and their G80 is a much more viable option to me now than it was back in December when I originally built my rig. Looks like my stopgap X1950 XT is actually going to turn into a more permanent solution, though.
 
What you're doing Kyle is similar to complaining about the fuel efficiency & length of a Bugatti Veyron!

Actually, the Bugatti has very good fuel economy for a W18 engine given its 2-ton-plus weight, its performance, and its AWD; it has better fuel economy than a Hummer or Ferrari Enzo, so your analogy is very poor. Now, where did Kyle say anything bad about the performance or the power used? He just stated the facts that were given to him, along with his opinions about them; is that wrong?

not enough page views recently since you don't have an early board, and you're obviously Pi$$ed at being left out, unlike VR-Zone, Overclockers, etc.?
Either that, or, to give you the benefit of the doubt, it's just poor fact checking (who edits the editor?). Either way, nothing new; you're no better than the InQ despite your comments. At least they're known as a 50/50 crapshoot rumour mill; you pretend to be a source for [H]ard News.

You obviously need a break from the site, dude; your objectivity is gone completely, and you're definitely no longer [H]ardcore if, instead of performance, you're not just more concerned about these issues but so concerned about them that they drive you to wild hysterics, clouding your critical thought.

PROVE me wrong, BUT PROVE IT for FAQ sake!

What you posted is something I'd expect from a n00b who just found the level505 site and said 'OMFG look what I found about the R600!' without questioning it or doing a bit of research.

I guess it might be wrong if you don't like it?

Edit: in the article he did state that these might not be the final boards; if you actually read the entire article you would know that.
 
There is a bit of that aura around here that nobody really questions Kyle for fear of getting banned and all that and I think it's sad. You should be able to question anybody to an extent without worrying about your account getting banned just because they have power over you.

I've questioned Kyle and Brent on a few occasions; there is a certain way to write a post which shows respect to the person, any person, and they don't have to be a mod or the owner of a site.
 
Why comment on the 'space required' versus actual length, Kyle? Oh, that's right, you don't have one.
Still, you could've checked the actual cards in other people's hands, like the one first posted by VR-Zone showing 12.5 inches for the OEM version. Could it be that the story isn't as poignant if your 13+" number isn't bigger than the GF7900GX2's length? :confused:
So ATi recommends the R600 requires 13.6"; what does the 7900GX2 require, 14+"?

Or did you just forget that the GF7900GX2 was bigger than the GF7950GX2 and the GF8800GTX as well, and from the looks of things it's bigger than the R600 too?

And since it's the OEM version, and the GX2 was OEM-only too, the OEM can pick a case that fits it when building their UberRig. I think AlienDELL and VoodooHP can figure out how to get it in there, and as long as it has effective cooling and good performance it'll be another option, just like the GTX.

Also, on the power requirements: you state that ATi says the SYSTEM will require 300W, and then you go on to comment on the GTX only requiring 180W (which is for the card alone, not the system). And BTW, Kyle, the fan on the OEM version runs at 12V 2A (i.e. ~24W for the FAN ALONE!), hence the OEM version drawing more than the typical retail cooler with its 12V @ 0.13-0.5A HSF (hey, wow, that's about the ~20W difference people are talking about!).

http://overclockers.com/articles1411/pic2.jpg
http://www.arctic-cooling.com/pics/ati5_04h.jpg
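The fan-power claim above is simple arithmetic, P = V × I. A quick sketch using only the figures quoted in this post (the voltage and current ratings are this poster's numbers, not confirmed specs):

```python
# Fan power draw, P = V * I (watts).
# The ratings below are the ones quoted in the post, not confirmed specs.

def fan_watts(volts, amps):
    """Electrical power in watts drawn by a fan at the given rating."""
    return volts * amps

oem_fan = fan_watts(12, 2.0)       # claimed OEM cooler rating: 12 V @ 2 A
retail_lo = fan_watts(12, 0.13)    # claimed retail cooler, low speed
retail_hi = fan_watts(12, 0.5)     # claimed retail cooler, full speed

print(f"OEM fan: ~{oem_fan:.0f} W")                           # ~24 W
print(f"Retail fan: ~{retail_lo:.1f} to ~{retail_hi:.1f} W")  # ~1.6 to ~6 W

# At full speed the OEM cooler would draw roughly 18 W more, in the
# ballpark of the ~20 W OEM-vs-retail gap claimed above.
print(f"Gap at full speed: ~{oem_fan - retail_hi:.0f} W")
```

If the 2A rating is accurate, the fan alone would account for most of the claimed OEM/retail power gap.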

BTW, how do you get 300W out of a 75W PCIe slot and an 8+6 pin configuration? Because you KNOW they aren't about to further narrow the market segment by making PCIe 2.0 a requirement (meaning it could be 2.0, but then one power connector would be optional [a la the GF6 Ultra] for when you plug into PCIe 1.1 slots).
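For what it's worth, 300W does sum from the standard connector budgets in the PCI Express CEM spec: 75W from the x16 slot, 75W from a 6-pin connector, and 150W from an 8-pin connector. A sketch of the budget arithmetic (these are spec ceilings on delivery, not measured R600 draw):

```python
# Maximum power delivery budgets per the PCI Express CEM spec (watts).
SLOT_X16 = 75   # delivered through the x16 slot itself
PIN_6 = 75      # 6-pin auxiliary connector
PIN_8 = 150     # 8-pin auxiliary connector

# The 8+6 pin configuration discussed above:
board_budget = SLOT_X16 + PIN_6 + PIN_8
print(board_budget)  # 300

# If the 8-pin were optional and fed as a second 6-pin instead
# (the GF6 Ultra-style fallback the post speculates about):
fallback_budget = SLOT_X16 + PIN_6 + PIN_6
print(fallback_budget)  # 225
```

Again, these numbers are what the connectors may deliver, not what the card actually consumes.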

ALSO, who gives a flying FAQ about power and size concerns when talking about these eP3n1$ boards for Bungholio feeders? And if your room is 90F in the summer, get a damn air conditioner, and don't tell me about saving the F'in planet, Kyle; if you aren't using an EDEN CPU with integrated VIA/S3 graphics, you're not eco-friendly. So suck it up and stop complaining about stuff that should only concern people who don't build killer rigs and are more concerned with cool and quiet rigs (for things like HTPC or professional editing use [where a passive cooler is hardware security]). Seriously, you run a review site for a living; make your home office accommodate this, dude. It's not like you aren't getting paid and are simply doing this in your spare time. :rolleyes:
What you're doing Kyle is similar to complaining about the fuel efficiency & length of a Bugatti Veyron! :mad:

Did you come up with the 'Length' porno reference before writing the body of the article? It looks like you tailored the content to match the title rather than writing the title after finishing the article. While I don't think you're pro either IHV, you do love to sensationalize every once in a while when the site gets dull, similar to other sites; it just makes you a questionable source for 'news'. Is S3/VIA next on the list? Do their DX10.1 VPUs run on the souls of newborns?

Smells like a little number cooking @ [H], FUDged for effect, Kyle. Not enough page views recently since you don't have an early board, and you're obviously Pi$$ed at being left out, unlike VR-Zone, Overclockers, etc.?
Either that, or, to give you the benefit of the doubt, it's just poor fact checking (who edits the editor?). Either way, nothing new; you're no better than the InQ despite your comments. At least they're known as a 50/50 crapshoot rumour mill; you pretend to be a source for [H]ard News. :rolleyes:

You obviously need a break from the site, dude; your objectivity is gone completely, and you're definitely no longer [H]ardcore if, instead of performance, you're not just more concerned about these issues but so concerned about them that they drive you to wild hysterics, clouding your critical thought. :eek:

PROVE me wrong, BUT PROVE IT for FAQ sake!

What you posted is something I'd expect from a n00b who just found the level505 site and said 'OMFG look what I found about the R600!' without questioning it or doing a bit of research.

Go spend more time with your family and stop reading/writing Op-Eds for a while. :(


i wish your post was emo so it would cut itself
 
I've questioned Kyle and Brent on a few occasions; there is a certain way to write a post which shows respect to the person, any person, and they don't have to be a mod or the owner of a site.

Bravo

It's time for a nap; go get your bottle and lie down.
Anyone should know that when you start using font and color changes to grab attention, all it does is discredit you.

I think you just need a hug, emailthatguy.

Back to the R600

I find it hard to believe that ATI would make a card so large...
Didn't AMD put out a document a few weeks ago showing that they believe most PCs will be SFF (or close to it) in the near future?
<EDIT>
Found it: http://hardocp.com/image.html?image=MTE3MTkyNTYyMVQ2ODJzU2JzTERfMV8xX2wuanBn
 
I agree with some of the things you're saying, but there is such a thing as polite disagreement that doesn't involve bolding huge words.

That was definitely for effect, and if that's the part that's inappropriate, then that would surprise me; I would've been certain it would be my tone at issue. The two enlarged segments (one also bolded, another just colour-bolded) are meant to get the point across quickly in a long post; the length is required by the information it carries and by the fact that it tries to reply in the same tone Kyle has kept in this thread. I would've replied with a better post had I seen better posts from him leading up to mine. I'm fine with self-editing the bolding to be somewhat smaller, since it came out bigger than expected... like the cards. But it's not like you guys were having a tea social here, and Kyle wasn't very open in the first part of the thread, so the comments stand, and I still think the last sentence is the best advice.

Considering the number of conclusions being drawn in this thread from the limited amount of information (especially with no first-hand experience, obviously), it makes me laugh, and reminds me why I only read the reviews nowadays. For logical, sane talk about the released information, Beyond3D is the place to be. And I'm sure Kyle will be there and elsewhere once he has something more to 'share', when he's privileged with an actual board, because the article he wrote isn't the kind of fare he'd post on other sites; it'd get ripped apart.

Off to edit a bit, but of course emailthatguy should learn how to selectively quote so as to allow that. :rolleyes:
 
For logical, sane talk about the released information, Beyond3D is the place to be.

Then I would suggest that you go make this type of post over there and see if they approve. My guess is that they would not. Razor1 is a member there, as am I, and several other people commenting in this thread.

The people here on this forum are just as open to sane discussion as anywhere else, they just do it with a little less self-importance and snobbery than other sites; it doesn't mean that abject rudeness is tolerated here.
 
I feel the need to add something. I sincerely hope that this doesn't offend anyone, but it is really time that this was said.

Some (not by any means all) members at Beyond3D have characterized [H]forum members as though they are all overclocking epen0r grunts.

In the last couple of weeks, though, since the rumors have abounded that the R600 requires a lot of power and is not necessarily practical for the normal end user, suddenly out of the woodwork come all of these "Hardcore Overclockers".

A few ATI-biased members at Beyond3D have already indicated that they believe no one here cares about power consumption or heat, just epeen0rs. What is amusing is that many of the same members who unjustly characterized everyone here as Nvidiots are now looking to our community to save ATI.

We are all hardcore overclocking grunts, right? We don't care if it drains the neighborhood for a minor FPS gain, right? All of us want to run out and buy new PSUs, new cases and fans, and hike the air conditioner up 10 extra degrees, right? None of us have children or bills to pay, or are even remotely concerned about our power bills, because that might make us less [H], right?

WRONG. [H] means that you get the most out of what you own, whether you scavenged it out of a dumpster or spent $10,000. Being [H] means you don't have to flog your epeen0r, because everyone knows your system kicks ass anyway. It also includes being intelligent. Only a fool would purchase a product that, at the time of purchase, heated up the house, required Dremelling the case, and required a kilowatt power supply. If you overclock the crap out of something and it needs all of that, then see above: that's [H]. If you buy it already in that state, you are a fool.

There are intelligent people here. Let's not try to characterize everyone over here as ignorant savages just because we have a more relaxed attitude and are enthusiastic about all aspects of hardware.
 
Actually, the Bugatti has very good fuel economy for a W18 engine given its 2-ton-plus weight, its performance, and its AWD; it has better fuel economy than a Hummer or Ferrari Enzo, so your analogy is very poor.

The Veyron is a W16, not a W18 (two 9-cylinder engines, were you thinking? Interesting). And the R600 may be 'efficient' for what it brings to the table, but you don't know what that is yet, do you? Obviously neither does Kyle, and that's the point. If all you can complain about is size and power/heat concerns, then really, how [H]ardcore are you? If the R600 does more for less energy consumption than X1950XTX Xfire (like the GTX often does), then it's a step forward, isn't it? It should be no surprise that something with much more performance consumes far more power, and it would actually take up less space than an Xfire assembly, although it'll be more awkward and may not fit in some cases Xfire would. That's similar to the GTX initially: some people had to decide how hardcore they were and whether a case upgrade was also needed at the time; otherwise there was always the GTS option, and the same goes here: there's always the XL option for those of you with little feet. Or do you not see how that relates?

Now where did Kyle say anything bad about performance or the power used, he just stated the facts that were given to him, and his opinions about them is that wrong?

It's the way he presented them; some of the things he stated were factually incorrect, and he either purposefully manipulated the article for effect or simply mixed up system power recommendations one moment and single-card requirements the next. You decide whether that's wrong, when the facts he has don't match the arguments he's trying to make. If he goes back far enough to invoke the venerable Voodoo series and say that the R600 would be the longest, he needs to remember something more recent in the GX2 series and find out whether his statement still applies. The people who weren't scared off by its size, with the potential for Quad SLi, will be the same people who don't care about the length or power consumption of the R600, as long as it ends up being number 1, even if only by a coupla lousy fps. That's the nature of this segment of the market.

Edit: in the article he did state that these might not be the final boards; if you actually read the entire article you would know that.

I read the entire article, the entire thread, the full source article, and other people's takes on it. Kyle's reaction is the anomaly, and his reaction at the beginning of this thread is equally unusual. I'm not accusing him of !!!!!!sm, I know he's not, but I am accusing him of the same poor journalism that he derides the InQ for. And going back and saying 'may not be final board' without changing the statements made on that assumption is like accusing someone of something and putting the Jim Rome 'allegedly' after it to cover yourself from lawsuits. It's FUD more than fact, and the way it's written pretends this wasn't a concern for the same group of people who were probably already calculating their PSUs for a Quad GTX configuration when news leaked of its possible support (and even asymmetric SLi).

You may not understand the Veyron analogy, but it's very accurate: while the Veyron is longer and more fuel-consuming than a McLaren F1, it's also faster, which is really the point for the people paying for it, and even for VW, which is also partially paying for it.

At this point in time, if Matrox came out with a card that alone consumed 300W but was twice as fast as the GTX, offered more features, and still allowed the multi-GPU option, all for around the same price, do you not think it'd be a success for the target market?

Personally, I'm not going to buy one until Lasso is a reality (for reasons other than the poster's above), but I'm not complaining about them either. I am interested in the technology and design, which is how I got sucked into Kyle's article under the heading 'News', of which there is very little [H]ard News in the article.

This talk of power requirements and card length alone is for people who can't afford to pay their electricity bill and don't know how to upgrade or mod their case. OEMs are concerned with power and heat envelopes for their home/office machines, not their Alienware/Voodoo marquee machines.

Until the R600 fully launches and demos in its true environment, 3D/gaming, all this talk about power consumption and heat is like complaining about the cost of a new drug before you know whether it can cure cancer and AIDS at the same time, or whether it's a placebo with bad side effects. Hence this article, and the thread it spawned, is composed primarily of FUD, not news.

Other sites at least talk about the implications of the teraflop numbers and the composition of the shaders; here it's all about the things eMachines buyers should be concerning themselves with.
 
The people here on this forum are just as open to sane discussion as anywhere else, they just do it with a little less self-importance and snobbery than other sites; it doesn't mean that abject rudeness is tolerated here.

That may be the case, but would you expect Wavey Dave, Geo, or anyone else to post a similarly FUDdy article/thread on the front page? Hardly.

And my tone matches Kyle's in the first pages of this thread. Rudeness I'll admit to freely, but not as its only practitioner.

I've said my piece about this piece, and that was my whole intention. It's not what I was originally planning on writing, but after reading the thread that led up to it, it definitely seemed in keeping with the spirit so far.

The reviews at [H] are great, this Op-Ed is nowhere near that calibre IMO.
 
It also includes being intelligent. Only a fool would purchase a product that, at the time of purchase, heated up the house, required Dremelling the case, and required a kilowatt power supply. If you overclock the crap out of something and it needs all of that, then see above: that's [H]. If you buy it already in that state, you are a fool.

However, if it overclocks well beyond that, then what? Is it now justified?

What you described is part of the story, but complaining before you even know what you're complaining about is even more foolish. The GTX consumes as much or more energy and is longer than the GF7950GX2; was it worth it? I'd say so.

Now ask yourself if you can even begin to make the same statements about the R600, considering very little is known of its performance or overclocking potential.
 
http://www.hexus.net/content/item.php?item=7995 - Just saw this, seems external is in the works. Maybe they are banking on this?

Lasso is kinda old news (right after the external PCIe spec announcement), but it's nice to see pics from Hexus, meaning it's more than just a codename and a thought but a part well into development.

I doubt, however, that AMD would rely heavily on this in any way; it's likely to be niche for some time, and once they and nV move to more modular designs, the need for a tiny breakout box will remain small. But the technology could be moved toward a full-size bay like the QuadroPlex, which would make more sense for an external gamer device and have far-reaching application for gamers and professionals (for rendering or as GPGPU units).

So while there is likely a lot to come from Lasso, it's unlikely that it will have any influence on the high end this generation, except turning otherwise micro portables into capable gaming laptops.
 
Wow, a lead-in to Lasso technology, mentioning the Teraflop in an application no one here will most likely run, and dropping hints about spectacular performance all in one post.

That certainly overshadowed every piece of ill news I've heard about R600. Wow, I'm sold - I'll take two now.
 
If you have a problem with the numbers I would suggest you tell ATI that they should change their design requirements since those are where the numbers are coming from. Complaining to the messenger is just silly.

Why comment on the 'space required' versus actual length, Kyle? Oh, that's right, you don't have one.
Still, you could've checked the actual cards in other people's hands, like the one first posted by VR-Zone showing 12.5 inches for the OEM version. Could it be that the story isn't as poignant if your 13+" number isn't bigger than the GF7900GX2's length? :confused:
So ATi recommends the R600 requires 13.6"; what does the 7900GX2 require, 14+"?

Or did you just forget that the GF7900GX2 was bigger than the GF7950GX2 and the GF8800GTX as well, and from the looks of things it's bigger than the R600 too?

And since it's the OEM version, and the GX2 was OEM-only too, the OEM can pick a case that fits it when building their UberRig. I think AlienDELL and VoodooHP can figure out how to get it in there, and as long as it has effective cooling and good performance it'll be another option, just like the GTX.

Also, on the power requirements: you state that ATi says the SYSTEM will require 300W, and then you go on to comment on the GTX only requiring 180W (which is for the card alone, not the system). And BTW, Kyle, the fan on the OEM version runs at 12V 2A (i.e. ~24W for the FAN ALONE!), hence the OEM version drawing more than the typical retail cooler with its 12V @ 0.13-0.5A HSF (hey, wow, that's about the ~20W difference people are talking about!).

http://overclockers.com/articles1411/pic2.jpg
http://www.arctic-cooling.com/pics/ati5_04h.jpg

BTW, how do you get 300W out of a 75W PCIe slot and an 8+6 pin configuration? Because you KNOW they aren't about to further narrow the market segment by making PCIe 2.0 a requirement (meaning it could be 2.0, but then one power connector would be optional [a la the GF6 Ultra] for when you plug into PCIe 1.1 slots).

ALSO, who gives a flying FAQ about power and size concerns when talking about these eP3n1$ boards for Bungholio feeders? And if your room is 90F in the summer, get a damn air conditioner, and don't tell me about saving the F'in planet, Kyle; if you aren't using an EDEN CPU with integrated VIA/S3 graphics, you're not eco-friendly. So suck it up and stop complaining about stuff that should only concern people who don't build killer rigs and are more concerned with cool and quiet rigs (for things like HTPC or professional editing use [where a passive cooler is hardware security]). Seriously, you run a review site for a living; make your home office accommodate this, dude. It's not like you aren't getting paid and are simply doing this in your spare time. :rolleyes:

What you're doing Kyle is similar to complaining about the fuel efficiency & length of a Bugatti Veyron! :mad:

Did you come up with the 'Length' porno reference before writing the body of the article? It looks like you tailored the content to match the title rather than writing the title after finishing the article. While I don't think you're pro either IHV, you do love to sensationalize every once in a while when the site gets dull, similar to other sites; it just makes you a questionable source for 'news'. Is S3/VIA next on the list? Do their DX10.1 VPUs run on the souls of newborns?

Smells like a little number cooking @ [H], FUDged for effect, Kyle. Not enough page views recently since you don't have an early board, and you're obviously Pi$$ed at being left out, unlike VR-Zone, Overclockers, etc.?
Either that, or, to give you the benefit of the doubt, it's just poor fact checking (who edits the editor?). Either way, nothing new; you're no better than the InQ despite your comments. At least they're known as a 50/50 crapshoot rumour mill; you pretend to be a source for [H]ard News. :rolleyes:

You obviously need a break from the site, dude; your objectivity is gone completely, and you're definitely no longer [H]ardcore if, instead of performance, you're not just more concerned about these issues but so concerned about them that they drive you to wild hysterics, clouding your critical thought. :eek:

PROVE me wrong, BUT PROVE IT for FAQ sake!

What you posted is something I'd expect from a n00b who just found the level505 site and said 'OMFG look what I found about the R600!' without questioning it or doing a bit of research.

Go spend more time with your family and stop reading/writing Op-Eds for a while. :(
 
The Veyron is a W16, not a W18 (two 9-cylinder engines, were you thinking? Interesting). And the R600 may be 'efficient' for what it brings to the table, but you don't know what that is yet, do you? Obviously neither does Kyle, and that's the point. If all you can complain about is size and power/heat concerns, then really, how [H]ardcore are you? If the R600 does more for less energy consumption than X1950XTX Xfire (like the GTX often does), then it's a step forward, isn't it? It should be no surprise that something with much more performance consumes far more power, and it would actually take up less space than an Xfire assembly, although it'll be more awkward and may not fit in some cases Xfire would. That's similar to the GTX initially: some people had to decide how hardcore they were and whether a case upgrade was also needed at the time; otherwise there was always the GTS option, and the same goes here: there's always the XL option for those of you with little feet. Or do you not see how that relates?

It's the way he presented them; some of the things he stated were factually incorrect, and he either purposefully manipulated the article for effect or simply mixed up system power recommendations one moment and single-card requirements the next. You decide whether that's wrong, when the facts he has don't match the arguments he's trying to make. If he goes back far enough to invoke the venerable Voodoo series and say that the R600 would be the longest, he needs to remember something more recent in the GX2 series and find out whether his statement still applies. The people who weren't scared off by its size, with the potential for Quad SLi, will be the same people who don't care about the length or power consumption of the R600, as long as it ends up being number 1, even if only by a coupla lousy fps. That's the nature of this segment of the market.

I read the entire article, the entire thread, the full source article, and other people's takes on it. Kyle's reaction is the anomaly, and his reaction at the beginning of this thread is equally unusual. I'm not accusing him of !!!!!!sm, I know he's not, but I am accusing him of the same poor journalism that he derides the InQ for. And going back and saying 'may not be final board' without changing the statements made on that assumption is like accusing someone of something and putting the Jim Rome 'allegedly' after it to cover yourself from lawsuits. It's FUD more than fact, and the way it's written pretends this wasn't a concern for the same group of people who were probably already calculating their PSUs for a Quad GTX configuration when news leaked of its possible support (and even asymmetric SLi).

You may not understand the Veyron analogy, but it's very accurate: while the Veyron is longer and more fuel-consuming than a McLaren F1, it's also faster, which is really the point for the people paying for it, and even for VW, which is also partially paying for it.

At this point in time, if Matrox came out with a card that alone consumed 300W but was twice as fast as the GTX, offered more features, and still allowed the multi-GPU option, all for around the same price, do you not think it'd be a success for the target market?

Personally, I'm not going to buy one until Lasso is a reality (for reasons other than the poster's above), but I'm not complaining about them either. I am interested in the technology and design, which is how I got sucked into Kyle's article under the heading 'News', of which there is very little [H]ard News in the article.

This talk of power requirements and card length alone is for people who can't afford to pay their electricity bill and don't know how to upgrade or mod their case. OEMs are concerned with power and heat envelopes for their home/office machines, not their Alienware/Voodoo marquee machines.

Until the R600 fully launches and demos in its true environment, 3D/gaming, all this talk about power consumption and heat is like complaining about the cost of a new drug before you know whether it can cure cancer and AIDS at the same time, or whether it's a placebo with bad side effects. Hence this article, and the thread it spawned, is composed primarily of FUD, not news.

Other sites at least talk about the implications of the teraflop numbers and the composition of the shaders; here it's all about the things eMachines buyers should be concerning themselves with.

Depends on which model you want to talk about, GrapeApe: the original concept Veyron engine was a W18 with three 6-cylinder banks; I didn't know the engine changed for the 16.4 Veyron ;). The Veyron consumes more fuel than the F1? You really have to look into your cars again: the Veyron on average gets 16 mpg, the F1 gets 12 mpg, and the Enzo gets 14 mpg. That's why I didn't understand your analogy.

Your underlying tone is not acceptable at B3D either; in no way did Kyle's article take sides, nor was it passionately biased like your post and your continuing comments. It was actually more words of caution about what might hurt AMD's sales of an R600 card if it were released like that for retail. I noticed in your user profile that you are studying for a master's in economics; your way of handling something that displeases you will not serve you well in the IT business field.

Actually, I don't really care about the power issue; that's why you haven't seen me say much about it. I do understand there is a very good chance the X2900 is a very fast chip, but it will come at a price: power and heat. I've been saying this for quite some time too; it was one of the earlier rumors that the R600 was going to consume more than the G80. And if you want to talk to me about this, you'd better have something more than what you pull out of your purple ass. Don't take that seriously, it's just a joke :) ; I was a fan of the Great Grape Ape when I was a kid too :).

I suggest you back up your comments against Kyle's articles with something other than rambling nonsense. His article is backed up; I can confirm the power specs on the R600, and I still believe the retail cards will use less power, which Kyle's article mentions. You seem to take a tangential approach when reading the article. Even enthusiasts have their limits, Mr. Ape. Trust me, I could buy a QuadroPlex tomorrow if I really wanted to, but would I? I do quite a bit of art rendering, where it would come in handy with Gelato, but is the money worth the extra performance I'd be getting? Not really.
 
Interesting ideas there from ATI, but yeah seems a bit on the large side.

FWIW I'd advise caution when speaking with Kyle, just an FYI.
 
Hypothetical situation: is there ever going to be a point where some of you think they've finally gone too far with power requirements?

What if you need a 220V circuit to run your computer? How about then? Like when you plug in your brand new X3900 blah blah and it trips a breaker? Wife comes unglued and hits you in the head with a newspaper. lol

At what point does it stop being "whining" and end up being a legit issue?
 
When a system draws more than 2400 watts, I think that's when I would start bitching about it. When we have to start using more than one power socket to power one system, that's the point when, IMHO, it's way too much.
 
When a system draws more than 2400 watts, I think that's when I would start bitching about it. When we have to start using more than one power socket to power one system, that's the point when, IMHO, it's way too much.
A lot of people who live in houses built in the fifties would have a problem getting over 1500 right now. AFAIK that is about when one circuit is gonna start flipping a breaker. Plugging another power supply into another wall connection won't do you any good unless you pull from a separate circuit in the house. Extension cord maybe? lol!
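For anyone curious, that circuit limit is easy to sketch out. The numbers below assume a typical US 120 V residential branch circuit and the common 80% continuous-load rule of thumb; your panel and wiring may differ:

```python
# Back-of-the-envelope estimate of continuous power one household
# circuit can supply. 120 V and the 80% derating are assumptions
# based on typical US residential wiring, not measurements.

def usable_watts(breaker_amps, volts=120.0, derate=0.80):
    """Continuous wattage a branch circuit can safely deliver."""
    return breaker_amps * volts * derate

for amps in (15, 20):
    print(f"{amps} A breaker: about {usable_watts(amps):.0f} W continuous")
```

On those assumptions, a 15 A circuit tops out around 1440 W continuous, which is roughly where the 1500 W figure above comes from.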
 
Hypothetical situation: is there ever going to be a point where some of you think they've finally gone too far with power requirements?

What if you need a 220V circuit to run your computer? How about then? Like when you plug in your brand new X3900 blah blah and it trips a breaker? Wife comes unglued and hits you in the head with a newspaper. lol

At what point does it stop being "whining" and end up being a legit issue?

For me, it stops at having to buy a new case and power supply every six months just to upgrade. If you look in my signature at my machines, it isn't because I cannot afford it. It's because this is getting ridiculous. The 8800 GTX was bad enough.
 
Wow, it takes a serious fan=boy to say "I know it's not that bad!" when the EIC himself is saying "I've got the top-secret ATI specs that aren't even available to you in front of me, it really is that bad." Bravo!
 
A lot of people who live in houses built in the fifties would have a problem getting over 1500 right now. AFAIK that is about when one circuit is gonna start flipping a breaker. Plugging another power supply into another wall connection won't do you any good unless you pull from a separate circuit in the house. Extension cord maybe? lol!

LOL yeah, that's true. It's pretty bad when a computer eats up more power than a washer and dryer combined ;)
 
Are you suggesting that AMD processors aren't energy efficient? Don't you remember the entire P4 generation of CPUs? You know, the ones that idle at 50°C+. The only reason Intel seems to "get it" now is because users complained, reviewers glowered, and they lost market share.

Yes, I am saying some of them are power hungry. I am aware of Intel's misadventures in this area, as I own a few Intel Northwood/Prescott systems. It had very little to do with user requests.

I've questioned Kyle and Brent on a few occasions. There is a certain way to write a post that shows respect to the person, any person; it doesn't have to be a mod or the owner of a site.

As have I, and I have never been banned. It's called being tactful and polite.

Then I would suggest that you go make this type of post over there and see if they approve. My guess is that they would not. Razor1 is a member there, as am I, and several other people commenting in this thread.

The people here on this forum are just as open to sane discussion as anywhere else, they just do it with a little less self-importance and snobbery than other sites; it doesn't mean that abject rudeness is tolerated here.

I have seen some very ignorant/fanboy-type posts over on Beyond recently, as well as Kyle and [H] bashing, and none of it has been poo-poo'd :) "Kyle isnt the sharpest tool in the Shed"
..... Anyone who knows of Google can find it, and the site it's on.... 3+ mods have seen it and done nada. Abject rudeness does seem OK over there. :)
 
A lot of people who live in houses built in the fifties would have a problem getting over 1500 right now. AFAIK that is about when one circuit is gonna start flipping a breaker. Plugging another power supply into another wall connection won't do you any good unless you pull from a separate circuit in the house. Extension cord maybe? lol!

I'm already running into issues where I'm sure my study is just a few amps away from maxing the circuit. The breaker box is 10 feet away in the basement.
This is the truth:
If my PC sucks up any more power I'm going to have to run a new line.

I've got 2 PCs pulling ~1000 watts with monitors. Add in the 32" TV and other goodies and I'm not far off. :(
 
Let's see.... 1 8-pin @ 150W, 1 6-pin @ 75W, PCI-e slot @ 75W.

Amazingly, that adds up to 300W!!! That is the maximum spec plug draw, guys. If you honestly think this thing is gonna pull 300W alone at stock, then you are fools.

Mark my words, 300W, just like at every other recent high-end gpu launch, is a deliberate overstatement.
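For the record, that connector arithmetic is easy to verify. The figures below are the per-connector spec maxima quoted above, not measured draw from any actual card:

```python
# Sum of the PCI Express power-delivery limits quoted in the post.
# These are spec maxima per power source, not measured consumption.
power_limits_w = {
    "8-pin PCIe connector": 150,
    "6-pin PCIe connector": 75,
    "PCIe x16 slot": 75,
}
total_w = sum(power_limits_w.values())
print(f"Maximum spec draw: {total_w} W")  # prints 300 W
```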

Removing emotion from the debate, who knows how good this card will be. Obviously, if its power draw dramatically exceeds its supposed "superiority", then it will fail, simple as that. Debating true power draw (still independently unmeasured) before any true 3D numbers are known is not only stupid, it's against any scientific or logical method.


And InorganicMatter, if you honestly think someone's position at an online tech site directly relates to their ability to comment, then you too, are a fool. Stop kissing ass.
 
I have seen some very ignorant/fanboy-type posts over on Beyond recently, as well as Kyle and [H] bashing, and none of it has been poo-poo'd :) "Kyle isnt the sharpest tool in the Shed"
..... Anyone who knows of Google can find it, and the site it's on.... 3+ mods have seen it and done nada. Abject rudeness does seem OK over there. :)

I didn't want to come right out and say that, but since you did :) Many of them are just really biased, and several are also a little too full of themselves for my tastes.
 
The Veyron consumes more fuel than the F1? You really have to look into your cars again: the Veyron averages 16 mpg, the F1 gets 12 mpg, and the Enzo gets 14 mpg. That's why I didn't understand your analogy.

Where are you getting your figures from? Even at its best the Veyron doesn't get 16 MPG according to the gov; its average, according to them, is 10 MPG, with 8 city / 15 highway;
http://www.fueleconomy.gov/feg/noframes/22661.shtml
The Enzo is worse with 8/9/12;
http://www.fueleconomy.gov/feg/noframes/18305.shtml

Unfortunately no McLaren numbers there. But your research seems wanting.

Regardless of that, at speed the Veyron has increased drag and goes through fuel faster than any other production car on earth, while also going faster than any production car on earth. And that's my point: we aren't talking about idle here, we're talking about top speed/performance. People complaining about power consumption are like people whining about the fuel efficiency of their supercar.

Your underlying tone wouldn't be acceptable at B3D either,

Sure, there have never been harsh words there :rolleyes: ; yes, surprisingly there too, considering the usual calibre of participants, but please, c'mon. ;) The reality is that B3D is a different place IMO. There wouldn't be the need for such words to writers, because such mediocre articles would be criticized in the forum, nor would there be FUD of similarly poorly researched quality on the front page. Yet it still has its share of name calling now and then.

Heck, I wouldn't even be reading/replying to this if I hadn't been advised to by a friend who can't comment. I don't frequent here much anymore (I did more in the earlier part of the decade); there's too much A vs. B to begin with, and this article spawned even more. Heck, you even accuse me of it.

in no way did Kyle's article take sides nor was it passionately biased like your post and your continuing comments,

Who said it/I took sides? That was other people, not me. I said it was sensationalized 'the sky is falling' FUD to generate hype. I specifically said he's not a F-Boi, just a sensationalist when it suits the page-hit need. It's not that he's pro or anti ATi/AMD; it's that he's over-reacting, which is something someone in his position and with his influence shouldn't do. You or I have the luxury of over-reacting (and I'll use it here now, thanks), but he has a responsibility to the readers if he wants to be considered a 'news source'.

it was actually more a word of caution that an R600 card might hurt AMD's sales if it were released like that for retail.

How is this going to hurt sales based on Kyle's misread? And what is it supposed to be, a cautionary tale after the design is finalized? As if they could dope the chips and change SOI composition at the last minute, or move to 65nm? When it comes down to it, the high end is not about efficiency anyway, it's about performance. If we were talking about the X2600/GF8600 or lower, then I'd say there's more of a case for power consumption concerns and such an article.

I noticed in your user profile you are studying master in economics, your way of handling something that displeases you will not make you good in the IT business field.

The profile is old; I completed the degree, and I'm doing quite well at work, thanks. As for the advice, though, your and Kyle's fact-checking ability would make you worse for the job after your figures kept being way off the mark. Tech businesses don't like it when your numbers don't match, and now neither does the gov't. But thanks for your concern and advice.

Actually, I don't really care about the power issue, which is why you haven't seen me say much about it. I do understand there is a very good chance the X2900 is a very fast chip, but it will come at a price: power and heat. I've been saying this for quite some time too; it was one of the earlier rumors that the R600 was going to consume more than the G80. And if you want to talk to me about this, you'd better have more than something pulled out of your purple ass.

See, that's the difference: I'm not talking about A vs. B. I expect the R600 to consume more than the G80, based on design, transistor count, rumour, and ATi/AMD's own words, but I don't distort that into something it's not. Nor do I think it'll matter much. Personally I think SLI and Crossfire are a ridiculous extravagance that is less than 1% of the overall market, but a large portion of the enthusiast market; however, like the Veyron and Humvees, they have their market, and those people don't care. Like I said, Quad SLI would sell a ton tomorrow if nV enabled it (with good drivers, of course), and the same with the R600 should it launch with quad support. I've had dual CPUs since 8 years ago for my video editing rigs, and really, it was just as extravagant at the time; for the last build 5 years ago I bought a then-HUGE 460W Enermax, because you buy what you need/want, not because someone else tells you to or not to.

His article is backed up; I can confirm the power specs on the R600, and I still believe the retail cards will use less power, which Kyle's article mentions.

Show me these Power Specs you can confirm, because Kyle's statement as written is misleading;
"ATI is stating that a SINGLE R600 high end configuration will require 300 watts of power (+/-9%) and a DUAL R600 "CrossFire" high end configuration will require, as you might guess, 600 watts of power (+/-9%). Compare that to a single GeForce 8800 GTX that will pull 150 to 180 watts. Add in a CPU to that mix and you overtake most power supplies’ peak ratings on the retail shelves today."

So either Kyle's adding a CPU to an already high-end 'CONFIGURATION' that includes a CPU, or he's claiming that AMD is saying a single R600 draws 300W (which is clearly what he says in a later post). That section needs editing, because it's borked like the other numbers. And his replies when people pointed out the OEM nature were far from receptive.
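As a side note, the ±9% tolerance in that quoted statement is plain arithmetic on the quoted nominals; nothing below is a new measurement:

```python
# Arithmetic check of the +/-9% tolerance in the quoted statement.
# The 300 W and 600 W nominal figures come straight from the quote.

def power_range(nominal_w, tolerance=0.09):
    """Return the (low, high) watt bounds for a nominal +/- tolerance."""
    return nominal_w * (1 - tolerance), nominal_w * (1 + tolerance)

for nominal in (300, 600):
    low, high = power_range(nominal)
    print(f"{nominal} W +/-9%: {low:.0f} to {high:.0f} W")
```

Taking the quote at face value, a single "configuration" lands somewhere around 273 to 327 W, which is exactly why the 'configuration' vs. 'card' wording matters.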

If Kyle is going with 'facts', then why is his information so out of line with everyone else's, people who actually have their hands on the boards? Writing such an article with such weak info amounts to nothing more than FUD. The onus is on Kyle to get the numbers correct, like the 300W vs. 240W, or 'configuration' vs. single card. It was further compounded by his comments later;
"I have yet to see the short reatail very with a cooler on it.....I can't imagine trying to dissapate 300 watts of heat out of there considering how hot the 8800s get pushing 165."

Meaning he is talking about the card/chip alone and not the system. And he's going based on a Crossfire recommendation sheet from November; there have been at least two re-spins since then (likely two this year alone), so saying that can't change is laughable. Who knows if they didn't try to improve some leakage with those.

A lot of the rest of my reply is in direct response to the beginning of the thread, especially THIS gem;
"I own this site and reported the information. Dailytech and the Inq report as much right as they do wrong, so using them to hold up your arguments does not mean much to me."

Doesn't sound like Kyle was trying to fact-check anything there, just casting aspersions on other sites while claiming to be better than them. BTW, the InQ was his call, which is why I throw it back; we know the InQ is questionable, but entertaining. Kyle's claiming to be better, specifically with this article.

Anywhoo, like I said, I've said my piece. I think the article needs a rewrite, because right now it comes off as FUD completely contrary to other information out there from people with cards in hand.
 
Wow, it takes a serious fan=boy to say "I know it's not that bad!" when the EIC himself is saying "I've got the top-secret ATI specs that aren't even available to you in front of me, it really is that bad." Bravo!

:rolleyes:

Have you tried not trolling every R600 thread? I don't think you'll be happy until Nvidia is putting up candidates for Congress.

I'd say it takes a bigger "fan=boy" to take every word the EIC says as law until you've seen the proof yourself.
 
Check around a bit; you will find the gas mileages that I posted. I actually have quite a few automobile books here that I got them from. Actually, the highway numbers there seem low; it usually gets around 20. I just looked up the gas mileage, and I think the book had a typo. I was looking at Popular Science magazine, looked it up on their site on the web, and they have it as 12. So: 12 for the Veyron, 12 for the McLaren. At full acceleration, 3 for the Veyron; I can't find anything on the McLaren, but I can say my Audi S4 at full acceleration doesn't get more than 4 mpg either, so I don't expect a McLaren to get much more than what the Veyron gets.

I'm going to have to talk like you now.

Watch the name calling - Dr.Evil

The way you are trying to support a screwed-up view is more than idiotic, it's moronic. You have no basis to back anything up other than rumors and speculation from other websites; what Kyle has is a spec sheet from AMD, which carries much more weight than anything else that has been said around the web.

Actually, you talk like this there too. If you remember, we had an argument there once about the 9800 being used as a mid-range product to fight the 6600, because the X600-X700 wasn't good enough?

I don't care where you post, but you post the same way everywhere even when you are very wrong, so it seems to be a habit for you. Bias in you seems to be an inherited trait; if you don't want to look like an attention-deficient baby, start changing the way you post, and start readed articles properly.

240 watts is presumed at this point; no one has seen the retail boards yet! You want him to confirm something that is not possible at this point? So in your definition, good journalism is when he lies or bases a story on rumors? That is what the Inq does, so by your own words you want to believe the Inq? He did mention these might not be final boards. As I said, your thoughts are tangential; you are talking about sides and focusing on parts of the article. November till now, how many months? You are telling me AMD is on what respin right now? How do you know that? I want to see your proof, and don't base it on rumors.

Watch the name calling - Dr.Evil For your sake I hope you mellow out.
 

Sure there's never been words there :rolleyes: , yes, surprisingly there too considering the usual calibre of participants, but please c'mon. ;)


The caliber of the participants in any discussion is reflected by the manner in which they conduct themselves.

We are at [H]forum. Kyle owns this site, and pays for your ability to even post here. In any place in the world, it is the height of rudeness to enter someone's home as a guest and proceed to insult them. I enjoy the frank discussion that takes place here at [H]forum. I am also a guest here. And as one guest to another, disagreement with an article, or even a flaming of an article or opinion is one thing; personal attacks are quite another.

Your posts define your caliber, and by extension your credibility. I find both of them lacking.
 
Check around a bit; you will find the gas mileages that I posted.... ..but I can say my Audi S4 at full acceleration doesn't get more than 4 mpg either, so I don't expect a McLaren to get much more than what the Veyron gets.

I didn't say the McF1 gets much more (and at that level 'much more' is a relative term; 4 vs. 2 is a bigger gap than the 240W versus 180W difference we're talking about for the G80 versus R600). So at faster-than-McF1 speeds of 250+ MPH the Bugatti is worse, and the figures quoted are in the low 2 MPG range, but it's also going faster at the same time. So once again the analogy fits: more performance for more power/consumption, and the only counter-argument you've offered so far doesn't match gov't tests on a public site. So let's call that a dead horse: the Bugatti at top speed is both faster and more fuel-consuming than the McF1; the former is a given, the latter is reported.

You have no basis to back anything up other than rumors and speculation from other websites,

Heck the basis for my criticism is even in what Kyle himself writes, like I said before comparing 'configuration' specs to single card specs is a poor basis for the claims made.

what Kyle has is a spec sheet from AMD, which carries much more weight than anything else that has been said around the web.

Not when he appears to be taking the information out of context. Perhaps he should have checked with AMD/ATi whether the figures were for a single card or for the whole configuration, as the wording implies. Was that even attempted? The article doesn't say 'we tried to contact ATi over the past 5 months we had this spec sheet to ask whether it's single card or system, but they didn't return our calls'; at least that would show some effort was made to get the information correct. The only thing Kyle does have in hand can be interpreted in many ways if it states 'configuration' and not card. The same thing happened with previous generations of cards too, where everyone got overexcited about the requirements ATi and nV provided because they started talking about 500+W PSUs.

Actually, you talk like this there too. If you remember, we had an argument there once about the 9800 being used as a mid-range product to fight the 6600, because the X600-X700 wasn't good enough?

And that argument can go many ways, especially depending on the context of the discussion. At launch the cards weren't much different, and there were situations where each was preferable, so depending on context it has always been able to go either way, just like the X800XT vs. GF7600/X1600. Depending on the discussion, you'd have to look at what was said, because it's unlikely to be due to A>B in the IHV so much as which card has which features.

I don't care where you post, but you post the same way everywhere even when you are very wrong, so it seems to be a habit for you. Bias in you seems to be an inherited trait; if you don't want to look like an attention-deficient baby, start changing the way you post, and start readed articles properly.

'Very wrong', is this similar to your MPG numbers? :rolleyes:
'start readed articles properly'? Sure thing, coach! :p

240 watts is presumed at this point; no one has seen the retail boards yet! You want him to confirm something that is not possible at this point?

Well, it's pretty easy to confirm that the OEM fan requires significantly more power than the current retail solutions from ATi, nV, and add-on vendors like AC. So going with that and the actual pictures of the cooler as the basis for this assumption is better than the idea that it's impossible to conceive of the retail boards without that extra 3" of OEM fan and its load.

So in your definition, good journalism is when he lies or bases a story on rumors? That is what the Inq does, so by your own words you want to believe the Inq?

No, like I said, the InQ is a crapshoot, but entertaining, like the Weekly World News; I don't take them as a 'news' source, but a rumour mill. And my definition of good journalism would be Kyle contacting ATi/AMD and saying, 'can you clarify the following, as I'm about to write an article (omitting mention of the sensationalizing aspect, of course) and I want to be sure I understand this correctly.' That was obviously not done if all he's coming back to is the fact sheet, with its iffy wording at best, and not saying ATi confirmed that number to be good for a single card or system.

November till now, how many months? You are telling me AMD is on what respin right now? How do you know that? I want to see your proof, and don't base it on rumors.

Well, nothing more than reviewers' statements of actually seeing A13 at CES first hand at the time of the article, and now A15 being the spin bandied about in the latest sightings. Their word, having seen the actual die firsthand, counts for more than Kyle's extrapolation of other information to make claims about a physical item without seeing it first hand, let alone observing his claims. Between my assumption that there has been a spin or two since November, based on info from people who've seen the dies first hand, and Kyle's assumption that the 'configuration' specs from 4+ months ago are the same as the single-card requirement, my assumption seems more reasonable. And I'm not putting it out there as news, so the onus, like I said, remains on the author to check his facts, especially if pretending to be better than the InQ; I have no such aspirations or claims, unlike Kyle.

Based on previous statements in this thread, it seems like double-checking the specs on the sheet with AMD didn't even cross his mind, and he was quite resistant at the beginning of this thread despite people's concerns being expressed with tactfulness and deference to the grand poobah, yet met with sarcasm and dismissed. The concept of measure twice (or ten times) and cut once seems lost in this article. Adding small caveats later follows the same journalistic practice of front-page news, page-10 retraction, long after the rumour makes the rounds of the forums and spreads the FUD. Speaking of edits to the original article, why not include the fact that it's not 'ATi stating' but 'ATi stated in a Nov '06 document'? Because that would show the information to be as stale as most information from last year.

At this point it doesn't much matter anymore; the article is unstickied from the front page and will likely fade away without a correction. However, if you want, we can revisit this at launch time and see what's what. I expect you will offer a 'that was then, this is now' type of defense, which pretty much applies to the spec sheet from November.

You are one real arrogant idiot. For your sake I hope you mellow out.

I'm pretty mellow; I write this while sitting on the couch, watching TV and solving world hunger (fewer people / more food, done!), and I'm far from being an idiot, but if that characterization helps you, go with it. :cool:
 