ATI R600 Confirmations on Size and Power

What the hell happened to this place?

Good God Almighty, this is the HARD FORUM, right? What the hell happened to you guys? Since when is 100+ more watts a big deal to the HARD user out there? Since when is having a 9+ inch card seen as something terrible and not PC Hardware Homo-erotic?

Jesus H. Super Christ, it used to be that if it took a damn reconditioned V8 to get an extra 50MHz on a Celeron, that person was seen as a Prophet on here. Something akin to Moses with an E-Penis and a fetish for silicon.

Too long of an OEM card? :rolleyes: WHO THE HELL HERE BUYS AN OEM CARD OR AN OEM PC ANYWAY? IF YOU DO, GET LOST! THIS ISN'T THE PC HELLO KITTY ISLAND ADVENTURE FORUMS, IT'S HARDOCP.

Who gives a rat's ass if the damn thing requires that your neighborhood lights dim a little? Who the hell cares if you need to cut a hole so your card will stick out? IF IT KICKS THE LIVING SHIT OUT OF ANYTHING OUT ON THE MARKET, WHO THE HELL CARES?

Anyone here who has been in this little hobby for more than 13 years has already spent thousands on LONG, HOT, POWER HUNGRY parts that were mere performance increments compared to what we see today, and we all walked around with our E-Penises proudly then.

Good God, what the hell happened to you guys? All we care about is the real-life experience, and a small dose of real benchmarks to sink our teeth into. It used to be that you would mod your fridge to be your new case, with your damn Pentium 2 right next to last night's chicken salad, all for what, 100MHz?

All I care about is how fast and how good the damn thing makes my games look. If it requires me to kidnap some fat kid and have him run on some treadmill/generator contraption with a cake dangling in front of him, so be it. It's worth the extra 20 FPS!!!!

Yes, I am saying you're NOT HARD for caring about 100+ extra watts.
Yes, I am saying you're NOT HARD for caring about an extra 2+ inches.

I am all for being green, but cry me a freaking river. I use fluorescent bulbs, drive a Honda, and recycle. I deserve my power-hungry makeshift nuclear reactor just so I can shoot your ass just a little bit quicker in BF2, all the while sucking down a chicken salad sandwich whose odor reminds me of what being HARD is really about.

Pick up your skirt ladies, quit your crying and let's see some benches.
 
The performance will make or break it. As long as it's fast and has great features relative to the competition, it will succeed. Why complain about needing a better PSU when you're dropping $600 on a vc in the first place?


PSUs used to last a while, but now it seems like I buy a new one with every cpu/mb/vc upgrade anyways. :rolleyes:
 
I personally couldn't care less about the length of the card. But the reason I am not installing 8800 SLI is that I do not want two of those bastards roasting in my office all summer long. They produce a lot of heat at idle, and a lot more when they are under load. From what it looks like here, one R600XTX will produce as much heat at idle as two 8800s at idle. I just don't want that kind of heat in my office. It already gets to 90°F in here during the summer...
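For a rough sense of what that heat means for the room, here is a back-of-envelope Python sketch: every watt a card draws ends up as about 3.412 BTU/hr of heat dumped into the office. The idle-draw figures are hypothetical placeholders, not measured numbers for either card.

[code]
# Back-of-envelope heat math: every watt a card draws eventually ends up
# as heat in the room, at roughly 3.412 BTU/hr per watt. The idle-draw
# numbers below are hypothetical placeholders, not measured figures.

WATTS_TO_BTU_PER_HR = 3.412

def room_heat_btu_per_hr(idle_watts_per_card: float, num_cards: int = 1) -> float:
    """Heat the cards alone dump into the room at idle, in BTU/hr."""
    return idle_watts_per_card * num_cards * WATTS_TO_BTU_PER_HR

# Hypothetical: two cards idling at ~70 W each vs. one card idling at ~140 W
print(room_heat_btu_per_hr(70, 2))   # ~478 BTU/hr
print(room_heat_btu_per_hr(140, 1))  # ~478 BTU/hr -- the same heat either way
[/code]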
 
I'm just curious, why are we still going on about the length of the card? The one they are showing is the OEM version, or more likely a stream-only card. That is NOT the one you would buy unless you buy a Dell (or another OEM brand). The one you would buy from a place like Newegg is NOT 13.6 inches long. The most accepted rumor is 9.5 inches, which is shorter than the 8800GTX, by the way.

Also, the 300W is the maximum possible with the connectors. That does NOT mean it pulls 300W. First of all, you would never use the maximum amount of power the connectors could provide; you would always give it a safety margin. If it needs both the 8-pin and the 6-pin, I doubt it would pull anywhere near 300. If it did, they would go with two 8-pins for safety. Second, the new 8-pin connector is most likely backwards compatible with the 6-pin. The most accepted rumor right now is that you only need two 6-pins, and maybe a 6 and an 8 if you want to overclock. That would limit the maximum watts to under 225 at normal clocks. With a safety margin, I would say it comes close to 200W, which is what has been reported by places like EE Times.
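To put numbers on that reasoning, here is a minimal Python sketch of the connector arithmetic. The per-connector limits are the commonly cited spec figures (75 W slot, 75 W 6-pin, 150 W 8-pin); the 15% safety margin and the two-6-pin configuration are illustrative assumptions, not confirmed R600 specs.

[code]
# Sketch of the PCIe power-budget arithmetic discussed above. Connector
# limits are the commonly cited spec figures; the safety margin is an
# assumed value, not a confirmed R600 number.

PCIE_SLOT_W = 75     # power available through the PCIe x16 slot
SIX_PIN_W = 75       # 6-pin PCIe auxiliary connector
EIGHT_PIN_W = 150    # 8-pin PCIe 2.0 auxiliary connector

def max_deliverable(*aux_connectors: int) -> int:
    """Maximum power the slot plus auxiliary connectors can deliver."""
    return PCIE_SLOT_W + sum(aux_connectors)

def estimated_draw(connector_limit_w: int, margin: float = 0.15) -> float:
    """Apply an assumed engineering safety margin below the connector limit."""
    return connector_limit_w * (1.0 - margin)

print(max_deliverable(SIX_PIN_W, EIGHT_PIN_W))   # 300 W -- the headline figure
print(max_deliverable(SIX_PIN_W, SIX_PIN_W))     # 225 W -- the two-6-pin rumor
print(round(estimated_draw(225)))                # ~191 W, near the ~200 W reports
[/code]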

It is also possible that the ones they showed off at the flops demo were stream-only processors. Those may have more RAM and slightly different specs than the gaming cards, since those who would buy them would not care about power consumption.
 
This is nuts. I already have to have an exhaust fan in my computer room or it gets over ten degrees hotter than the rest of the house. Now I'd need to install a separate air conditioner. Next I'll need to add a generator or a 15kW solar array to prevent a meltdown of the main coming in off the street.
 
Just one comment on the power requirements:

You don't buy a Dodge Viper or a Porsche 911T to save gas =p


True, but you'd damn well better believe I'd bitch if your Porsche outruns my Viper, gets better gas mileage, and is cheaper (just a counter-analogy that isn't meant to show the actual relationship between the two cars, so don't bash me).
 
True, but you'd damn well better believe I'd bitch if your Porsche outruns my Viper, gets better gas mileage, and is cheaper (just a counter-analogy that isn't meant to show the actual relationship between the two cars, so don't bash me).

OMGZOR!11!!1111 you newb123423asdfasd, teh viper is teh beztezt car evar!(it is!!! jk)

But anyway, that's still to be determined. If the Porsche (8800GTX) ends up being faster than the Viper (R600), then you can bash ATI all you want =p. Until then, while it's a valid complaint from a user's perspective, I don't think that's ATI's or the high-end crowd's concern; most people with systems that have a GTX currently have 600+W PSUs. I think if it does outrun the GTX with enough space between the two, people will forget all about the 300W max rated power requirement. I don't think ATI/AMD is dumb enough to repeat the X1800 fiasco.
 
I just have to add this comment into the discussion for the people who are saying ATI is late.

I completely, 100% agree that ATI is late to the game. But as for it not getting any business because of this, I disagree. How many people do you think actually upgraded to the 8800? Sure, you go around here and see a lot of people with the 8800, but those people are still probably a minority. A lot of people are waiting for this card either to A) buy it or B) drive down Nvidia prices. There are also a lot of people waiting for the next generation so this generation gets cheap. There are still a lot of people running 6800s, X800s, 7800/7900s, and X1800/X1900s who are completely satisfied with those cards.

And the midrange market has yet to be seen, unless you count the 8800GTS 320MB as midrange. So ATI can still make significant progress in this group if they have a quality midrange card. Nvidia is getting ready to release the 8600s as well. There is also the low-end market, but I'm sure most of us don't even go that low.

Basically, all I'm saying is that even though the 8800 has been out for a while, there are still a lot of people who haven't jumped on the 8800 bandwagon.
 
What the hell happened to this place?

Good God Almighty, this is the HARD FORUM, right? What the hell happened to you guys? Since when is 100+ more watts a big deal to the HARD user out there? Since when is having a 9+ inch card seen as something terrible and not PC Hardware Homo-erotic?

:snip:
I am all for being green, but cry me a freaking river. I use fluorescent bulbs, drive a Honda, and recycle. I deserve my power-hungry makeshift nuclear reactor just so I can shoot your ass just a little bit quicker in BF2, all the while sucking down a chicken salad sandwich whose odor reminds me of what being HARD is really about.

Pick up your skirt ladies, quit your crying and let's see some benches.

Um, if all that you care about is performance, power and the fastest video card on the market, why are you running an X1900XTX?

Why not an NVidia 8800 GTX? Why no Crossfire? Why no SLI? Why no Core 2 Duo, Quad FX or Quad Core?
 
Um, if all that you care about is performance, power and the fastest video card on the market, why are you running an X1900XTX?

Why not an NVidia 8800 GTX? Why no Crossfire? Why no SLI? Why no Core 2 Duo, Quad FX or Quad Core?


BURNED!
 
Um, if all that you care about is performance, power and the fastest video card on the market, why are you running an X1900XTX?

Why not an NVidia 8800 GTX? Why no Crossfire? Why no SLI? Why no Core 2 Duo, Quad FX or Quad Core?

maybe he can't afford it or is waiting for the R600 =p?
 
Good power supplies that can handle this are in the $120-150 range, and the soon-to-arrive quad-core CPUs are gonna need more juice anyway. It's all the cost of doing business in the fast lane. I'm looking at the OCZ 850W GameXStream myself... the PowerStream 520 has been flawless these past couple of years.

I will want a good experience with Crysis when it comes out, so that means quad-core and Vista and an R600XTX. So be it.

I've NEVER settled for stock cooling on anything, so I couldn't give a shit what the stock cooling is.

So quit your bitchin', [H]SOFTies.

Kyle, since the ATI card has software-adjustable voltages, you can do what I do for 2D "idle" use on my X1900.... turn the voltage down to the minimum, lower the clock speeds to the minimum, and run at 30°C idle.

When it's time to game it up.... crank up the AC, baby!!!! :cool:
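For anyone wondering why the down-volt/down-clock trick cuts idle heat so much, here is a minimal Python sketch using the usual dynamic-power approximation (power scales roughly with frequency times voltage squared). The voltage and clock figures are hypothetical, not actual X1900 or R600 operating points.

[code]
# Why lowering voltage and clocks cuts idle heat: dynamic CMOS power scales
# roughly as P ~ C * V^2 * f. The operating points below are hypothetical,
# not actual X1900/R600 values.

def relative_power(volts: float, mhz: float, ref_volts: float, ref_mhz: float) -> float:
    """Dynamic power relative to a reference voltage/clock operating point."""
    return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

full_3d = (1.40, 650.0)   # hypothetical 3D load point: volts, MHz
idle_2d = (1.10, 300.0)   # hypothetical down-volted, down-clocked 2D point

print(relative_power(*idle_2d, *full_3d))  # ~0.28 -> under a third of the dynamic power
[/code]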
 
The only people who would bite on an R600 with double the heat output are those with water-cooling systems, since at least then you can lose the 3 inches the fan adds to the length (or, if that's just the OEM cooler, the bulky fan they'll have to put on the retail cards, which will dump a lot of heat inside the case). And all of that assumes it's significantly faster than whatever Nvidia puts out at that time.
 
Pick up your skirt ladies, quit your crying and let's see some benches.

I laughed SO HARD at the whole thing... because it's true to some extent.
What you dodge is that most of those extremes were JUST THAT: the extremists taking full advantage of their right to do whatever they could to the card/board/CPU. Not only for bragging rights (speed-wise) but MOSTLY for experimentation and the "cool factor". In short, because they could.

What made the "V8" and Fridge examples you mention so [H]ard? The LENGTHS they went to, to eek out that one extra milli-penis of performance. NOT the amount of performance itself in MOST of the cases. (And I have to agree, I still think most of em were cool as hell!! Even though wantonly wasteful...more power to them for doing them!)
They were doing "unreal things" with an item that was geared to the mass population.

The point, I guess, is that things are geared in the opposite direction where graphics in particular are concerned (considering what the current implications are).

Die sizes are shrinking, which should equate to less power, as well as cheaper pricing from better yields, thus getting BETTER technology into more boxes.
It should be a win-win. BUT what we see alongside the *expected* speed/playability increases is insane pricing, growing PCB sizes, and growing power draw.
________________________________________________________

*humors the conspiracy theories for fun* (since |CR|Constantine humored the epeen's) :D

I can almost hear the conversations: "But you know Americans...unless it guzzles more gas, and goes from 0-8000 fps, we'll never be able to make them believe it's worth $xxx.oo"
*debate ensues...*
"Well, ok, how about this? We build rows of additional resisters just to dissapate power onto the gpu, AND make it bigger! It'll continue to perpetuate the upgrade process for all our partners!! Everyone in the industry'll LOVE IT!!!" $$$$$$$$$$$$$

roflmao... sorry. couldn't resist :p
 
OMGZOR!11!!1111 you newb123423asdfasd, teh viper is teh beztezt car evar!(it is!!! jk)

But anyway, that's still to be determined. If the Porsche (8800GTX) ends up being faster than the Viper (R600), then you can bash ATI all you want =p. Until then, while it's a valid complaint from a user's perspective, I don't think that's ATI's or the high-end crowd's concern; most people with systems that have a GTX currently have 600+W PSUs. I think if it does outrun the GTX with enough space between the two, people will forget all about the 300W max rated power requirement. I don't think ATI/AMD is dumb enough to repeat the X1800 fiasco.

The X1800? The X1800 outperformed the 7800GTX (barely), so what fiasco? :confused:
 

Burned how?

Just because I did not feel the need to jump ship to Intel or Nvidia for their latest offerings does not mean that I can't point out the HARD wannabes complaining about a little more wattage. Excuse me, Melinda, if I think a bunch of so-called HARD users (or Nvidia die-hards, as I like to call them) come on here and bitch to high heaven about 100+ more watts.

Even though at this point in time I do not have an uber high-end rig, you don't hear me complaining and bitching about the R600 being too long. I might not go to crazy lengths anymore to get that little bit extra, but I sure as hell am not going to complain about a measly 100+ watts of pure innuendo to a company that has delivered fantastic performance in the past.

But if you must know, I purchased my first home back in August, and spending money on two companies I have no love for hardly seemed like the right choice, buttercup.
 
Even though at this point in time I do not have an uber high-end rig, you don't hear me complaining and bitching about the R600 being too long.

No, instead you bitched, moaned, and ridiculed everyone else for not sharing your view that this isn't an issue. You also ridiculed others for doing the very same thing you are currently doing: spending their money elsewhere.

Which possibly makes you someone with ulterior motives for discouraging enthusiasts from voicing their objections given the information presented in the article and this thread.

Or just a hypocrite, Buttercup ;).
 
was just talking about this thread with my girlfriend.

she said when she started dating me that she learned there was no such thing as "too long"
 
Burned how?

Just because I did not feel the need to jump ship to Intel or Nvidia for their latest offerings does not mean that I can't point out the HARD wannabes complaining about a little more wattage. Excuse me, Melinda, if I think a bunch of so-called HARD users (or Nvidia die-hards, as I like to call them) come on here and bitch to high heaven about 100+ more watts.

Even though at this point in time I do not have an uber high-end rig, you don't hear me complaining and bitching about the R600 being too long. I might not go to crazy lengths anymore to get that little bit extra, but I sure as hell am not going to complain about a measly 100+ watts of pure innuendo to a company that has delivered fantastic performance in the past.

But if you must know, I purchased my first home back in August, and spending money on two companies I have no love for hardly seemed like the right choice, buttercup.

You have just BURNED Ockie. :p
 
was just talking about this thread with my girlfriend.

she said when she started dating me that she learned there was no such thing as "too long"

Wow, so you would want 13.5 inches in your hole! LoL!

The R600... just speechless. People are wasting their energy; just wait and see what really becomes of the product.

Remember, computer people like us are supposed to be smart, intelligent, and peaceful, until the fragging starts.
 
was just talking about this thread with my girlfriend.

she said when she started dating me that she learned there was no such thing as "too long"

anything over like 8" hits the cervix and apparently feels like getting punched in the gut. :<
 
I can't believe this thread went 10 pages without making that joke before I did.

Damn, that's like an emo thread going 10 pages without the grass-cutting joke. You all are slippin'.
 
It looks like AMD's acquisition of ATI really hasn't changed that firm's devil-may-care engineering philosophy. I mean, on one hand they make high-end cards that deliver tremendous visuals, easily matching, and sometimes besting, their competition. On the other hand, their products are hot, loud, late to market, not easily or reliably overclockable, and now excessively power hungry.
 
OK - I'd call you "living in denial", Lazy Moron. :)

13 inches is absolutely freaking huge


Except for the FACT that no consumer will be purchasing this card, since it will only be OEM. Don't forget the FACT that the PCB (the actual card) is under 10". :eek:
 
It looks like AMD's acquisition of ATI really hasn't changed that firm's devil-may-care engineering philosophy. I mean, on one hand they make high-end cards that deliver tremendous visuals, easily matching, and sometimes besting, their competition. On the other hand, their products are hot, loud, late to market, not easily or reliably overclockable, and now excessively power hungry.

The R600 was in the pipeline LONG before AMD bought ATI. It would have been financial suicide to scrap it and start a completely new product. We won't see AMD's real influence on ATI until the next product cycle.
 
Not EVERYONE who comes here has that concept. I don't OC much, nor do I require the best and leave the rest. I come here to see product reviews and gain general knowledge in the forums.

I care about the power requirements because my pocketbook does. Many of us here simply cannot or will not fork it over. This is not the space race where unlimited funds abound. There are limits.

Sissy is not a nice name.


I totally agree, as I think many here forget that overclocking was all about (and still is, to me and many others) the best bang for the buck without spending a fortune.... This is why I just built an E4300 system @ 3.4 GHz.... low cost relative to high performance...

:p


My GTS gives a lot of rendering power for the amount of wattage it demands, as does my Conroe; it seems that Intel and Nvidia "get it" and others do not...
 
There is absolutely no way the R600 will pull 300 watts.

PCIe slot - Delivers 75 watts max
6 pin PCIe connector - Delivers 75 watts max
8 pin PCIe 2.0 connector - Delivers 150 watts max

That's 300 watts with a 0 watt tolerance. No video card company would ever design a card that would consume the maximum amount of power that can be supplied to it.
 
My GTS gives a lot of rendering power for the amount of wattage it demands, as does my Conroe; it seems that Intel and Nvidia "get it" and others do not...

Are you suggesting that AMD processors aren't energy efficient? Don't you remember the entire P4 generation of CPUs? You know, the ones that idle at 50°C+. The only reason Intel seems to "get it" now is because users complained, reviewers glowered, and they lost market share.
 
Kyle,
I remember the good ol' days....... have you gone soft on me? Banning, questionable titles, and others of that ilk.
(Or maybe you're just more sarcastic?)

The part I can't understand is how ATI is going to try to "sell" these cards to Dell, Gateway, and the like. They can't skimp on the power supply, and that is something they always love to do.
What am I saying? It will be a $500 upgrade, and sold as a feature :rolleyes:

<rantings of a [H] lurker>

A lot of Dell PSUs I've seen and heard about were under-rated. Besides, this will be an enthusiast feature, which usually indicates other high-end items that draw a lot of power (combined, that is). It's not like Dell is going to throw this into every Dimension desktop falling off the assembly line. So it's not going to be a big deal to include a high-end PSU along with the two R600s and quad-core CPUs in the (relatively) few systems they'll sell with those features.
 
A lot of Dell PSUs I've seen and heard about were under-rated. Besides, this will be an enthusiast feature, which usually indicates other high-end items that draw a lot of power (combined, that is). It's not like Dell is going to throw this into every Dimension desktop falling off the assembly line. So it's not going to be a big deal to include a high-end PSU along with the two R600s and quad-core CPUs in the (relatively) few systems they'll sell with those features.
I should clarify:
If the R600 gives somewhat similar performance to an 8800, and Dell and the others can already offer that as an option for their top-end machines, I could understand if they sidestep the R600. A bigger case and a bigger power supply..... for the few machines they might sell with it, is it cost-effective to even offer it?
(I understand "few" is still a large number, but still.)

Now, if the R600 creams the 8800s in performance, I agree with the statements of earlier posts.
There will be uber [H] users who have the $$ to put together a bastard box that needs a dedicated 15A line run from the fuse box.
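For what it's worth, the dedicated-circuit quip isn't far from a real calculation. Here is a minimal Python sketch of the household-circuit math, assuming hypothetical component wattages and PSU efficiency (a 15 A / 120 V branch circuit supplies 1800 W, and continuous loads are usually kept to about 80% of that).

[code]
# Household-circuit math behind the "dedicated 15 A line" quip. A 15 A / 120 V
# circuit supplies 1800 W; continuous loads are usually kept to ~80% of that.
# All component wattages and the PSU efficiency below are hypothetical.

CIRCUIT_AMPS = 15
MAINS_VOLTS = 120
CONTINUOUS_LOAD_FACTOR = 0.80

circuit_budget_w = CIRCUIT_AMPS * MAINS_VOLTS * CONTINUOUS_LOAD_FACTOR  # 1440 W

dc_load_w = 2 * 250 + 130 + 150          # two hypothetical GPUs + CPU + rest of system
psu_efficiency = 0.80                    # assumed PSU efficiency
wall_draw_w = dc_load_w / psu_efficiency + 120   # plus a monitor on the same circuit

print(f"~{wall_draw_w:.0f} W drawn against a ~{circuit_budget_w:.0f} W continuous budget")
# ~1095 W vs. 1440 W: fine on its own, but share the circuit with much else
# and you're flirting with a tripped breaker.
[/code]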
 
There is absolutely no way the R600 will pull 300 watts.

PCIe slot - Delivers 75 watts max
6 pin PCIe connector - Delivers 75 watts max
8 pin PCIe 2.0 connector - Delivers 150 watts max


That's 300 watts with a 0 watt tolerance. No video card company would ever design a card that would consume the maximum amount of power that can be supplied to it.
Very good point.;)
 
There is absolutely no way the R600 will pull 300 watts.

PCIe slot - Delivers 75 watts max
6 pin PCIe connector - Delivers 75 watts max
8 pin PCIe 2.0 connector - Delivers 150 watts max

That's 300 watts with a 0 watt tolerance. No video card company would ever design a card that would consume the maximum amount of power that can be supplied to it.

Not intentionally, I wouldn't think. Once it is made though, there isn't a whole lot to do about it but sell it and try to make it sound exciting and hardcore.

If it doesn't consume that much power and isn't that large, then now is the time for ATI to say so. If it weren't true, I would imagine they would be screaming at the top of their lungs. The silence and the delays tend to lend it credence.
 