R600 Pics

What's with the "throw power consumption out the window" mentality? I care about power consumption, and I somewhat-care about card length (owning a UFO has its advantages). Does this make me anti-ATI or pro-NVIDIA?

Last I checked, 200+ watts isn't "sissy pansy low wattage". 200 watts is a typical Dell rig running full tilt.

It seems like all nuts have finally arrived in this basket of a thread. Rest in peace, once valuable thread.


That's not the point either. My point, anyway, was that people are complaining that the R600 uses so much more power than an 8800 when the difference is only 40W. People are so SHOCKED about 240W, but they're perfectly fine with 200W :p It's not that 200W is so outrageous, it's that 240W is so much more outrageous than 200W.
 
What's with the "throw power consumption out the window" mentality? I care about power consumption, and I somewhat-care about card length (owning a UFO has its advantages). Does this make me anti-ATI or pro-NVIDIA?

Last I checked, 200+ watts isn't "sissy pansy low wattage". 200 watts is a typical Dell rig running full tilt.

It seems like all nuts have finally arrived in this basket of a thread. Rest in peace, once valuable thread.

Yeah, I guess there's nothing more to say really.
 
That's not the point either. My point, anyway, was that people are complaining that the R600 uses so much more power than an 8800 when the difference is only 40W. People are so SHOCKED about 240W, but they're perfectly fine with 200W :p It's not that 200W is so outrageous, it's that 240W is so much more outrageous than 200W.

Actually, an 8800 card (even the GTX) pulls only about 180 watts under load. So the difference is not "only" 40 watts.
But that's not the point either. The point is that 240 watts is a lot, and we, as consumers, should be worried about that. At least I am.
 
No, it's not about promoting NVIDIA...
If these rumors are true, the power consumption of these cards is edging toward a limit that we thought wouldn't be reached for a while.
NVIDIA had a problem with their Anisotropic Filtering and AA+HDR in most games. They fixed it. ATI had a problem with heat and power consumption and according to these rumors, they didn't fix it (at least as far as power is concerned). We're simply discussing what these rumors mean for us as consumers. And power consumption IS an issue.
Stop trying to turn this into a "flag waving" contest, because it's not.

Again, how is it that you're so worried about power consumption now? Did you make the same remark about the 200W 8800? Of course power consumption is an issue, but you are still purchasing a $600 video card. It's like buying two 7900 GTX or X1950 XTX cards for SLI/Crossfire: don't expect your 400W PSU to handle those. Just like people upgraded to 500-600W PSUs for an 8800 GTX, I don't think it's going to take yet another upgrade to run an R600.

I upgraded my 420W PSU so I could use an 8800 GTX; I'm not gonna hold a grudge against ATI because they also require a 500+W PSU for their card.
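
To put that PSU-sizing point in rough numbers, here's a quick back-of-the-envelope check (Python). Every wattage here is an illustrative assumption, not a measured spec:

# Rough PSU headroom check (illustrative numbers only, not specs).
# The point above: one ~240W card plus the rest of a typical rig still
# fits comfortably inside a decent 500-600W unit.

rig_without_gpu = 200.0   # assumed CPU, board, drives, and fans under load (W)
gpu = 240.0               # rumored R600 draw under load (W)
psu_rating = 550.0        # a mid-range PSU of the era (W)

load = rig_without_gpu + gpu
headroom = psu_rating - load
print(f"Estimated load: {load:.0f}W, headroom on a {psu_rating:.0f}W PSU: {headroom:.0f}W")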
 
Actually, an 8800 card (even the GTX) pulls only about 180 watts under load. So the difference is not "only" 40 watts.
But that's not the point either. The point is that 240 watts is a lot, and we, as consumers, should be worried about that. At least I am.

Sorry, are the specs on paper 180W? Someone said it's 200W.
 
I have never seen so much sissy pansy bullshit in my entire life.

Last time I checked this is HARD OCP where sheer raw power matters most.

If you want to talk about sissy pansy low wattage and a "Mini Me" form factor size then go to sites like Anand.

Who cares if the thing is as long as my penis in a cold shower.
Who cares if the thing draws more power than some small towns.

All I care about is how fast it can go and how good it makes a game look.

That's all that freaking matters.

Jesus Christ, what the hell happened to this place :eek: It used to be where you would come here and see how some guy rigged his Celeron to an outside generator, all the while keeping it stuffed into his wife's refrigerator for an extra 300MHz. Now I come here and half of you fellas are bitching about an extra 100 watts or so? :rolleyes: Oh, cry me a river, then jump in it with your plugged-in toaster.

You know what I am going to do? I am going to be putting two of these bad boys in my system, and instead of getting a faster PSU, I will be getting some homeless guy on a bike pedaling his ass off to a small generator in order to power those two cards. All the while I will be playing Oblivion at a crazy high resolution while sucking down a triple cheeseburger (from a Styrofoam container) and pumping out more methane to destroy the ozone from my arse.

This is HARD OCP not Hello Kitty Eco-PC Island Adventure.
QFT...that said...holy shit this card is huge. Don't let your small animals near that thing or else you are gonna need tweezers to get them out of the heatsink.
 
lol, I don't care if this thing is slow, I'm going to buy one when I upgrade just so I can get a reaction out of people. =)
(Assuming it fits in my case, that is.)
 
NVIDIA had a problem with their Anisotropic Filtering and AA+HDR in most games. They fixed it. ATI had a problem with heat and power consumption and according to these rumors

Nvidia did fix AF & AA+HDR, but I don't think you can call current G80 models "cool running and low wattage" cards. People gasped at X1800/X1900 card temperatures, but when the G80 puts up the same numbers temperature-wise, it's suddenly OK :rolleyes:

bobrownik, I think you just said what most of us were thinking :D
 
Silus said:
ATI had a problem with heat and power consumption and according to these rumors, they didn't fix it (at least as far as power is concerned).
Let's not jump any guns here. We have no idea what kind of performance per watt we're going to see with R600, so let's not run out and assume that ATi hasn't "fixed" their long-time tribulations with managing loaded power consumption (always remember that ATi cards typically do pretty damn well keeping consumption down under idle). Once we get some real numbers, we'll see where we are.

Silus said:
Actually, an 8800 card (even the GTX) pulls only about 180 watts under load. So the difference is not "only" 40 watts.
If the XTX ends up 30-40% faster than the GTX, we'll know where that power went. I'm not concerned about a card sucking power when that power gives it a performance edge, but I am concerned about a card that sucks down power without giving me anything in return.

We can't really put any perspective on the power issue until we get an idea of performance, but I also don't think we should instantly dismiss it with some sort of "who cares, upgrade your PSU" type response.
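
For what it's worth, here's a rough performance-per-watt sketch of that argument (Python). The 180W and 240W figures are just the thread's estimated/rumored numbers and the speedups are hypothetical, so treat the output as illustration only:

# Back-of-the-envelope performance-per-watt comparison (illustrative only).
# The 240W R600 draw is a rumor and the speedups are hypothetical.

gtx_watts = 180.0   # loaded 8800 GTX estimate quoted earlier in the thread
xtx_watts = 240.0   # rumored R600 draw under load

for speedup in (1.0, 1.2, 1.3, 1.4):   # hypothetical XTX performance vs. GTX
    gtx_ppw = 1.0 / gtx_watts           # normalize GTX performance to 1.0
    xtx_ppw = speedup / xtx_watts
    print(f"{speedup:.0%} of GTX perf -> XTX perf/W is {xtx_ppw / gtx_ppw:.2f}x the GTX's")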
 
In regard to power, people should care about it. GPUs are trending the opposite way from CPUs right now. The P4 --> C2D transition is a prime example: Intel got smart and capitalized on performance per watt, so the C2Ds draw less power and are more powerful. GPUs are headed in the opposite direction.

I think that will start to change, however. I mean, 1200W PSUs are ridiculous. My 620W PSU and 8800 GTX heat up my room like a sauna as it is. You will see a difference in your electric bill. You will increase your carbon footprint.

Know what else? You can keep cranking more power into GPUs, but until CPUs, multiple cores, and games are optimized for each other, you will have this thing called a BOTTLENECK. So your end-user experience in games will not really improve until everyone gets on the same page.
 
I'm with Constantine.
If you worry about power consumption, you're not [H].

Which is fine, it means you're just a regular consumer, and that's okay. But I think soon we'll have a new definition of [H]-ness: rig two R600s together and see if you can dim the lights in your house just by launching a 3D application.
 
Let's not jump any guns here. We have no idea what kind of performance per watt we're going to see with R600, so let's not run out and assume that ATi hasn't "fixed" their long-time tribulations with managing loaded power consumption (always remember that ATi cards typically do pretty damn well keeping consumption down under idle). Once we get some real numbers, we'll see where we are.

I didn't "jump any gun". I made a comment on these rumors, which, if true, maintain the trend of ATI producing power hungry cards. Of course we'll need to wait and see the actual numbers, but this is a rumor thread and that's where we base our speculations.

Anyway, since you're one of the few that actually try to discuss your opinion , instead of "waving a flag", I guess I'll leave this thread alone.
 
If you worry about power consumption, you're not [H].
Let me try to understand your stance a little better here...

Assume, for a moment, that the XTX is going to be 20% faster than the GTX across the board. Everything it does, it does faster, and the IQ is the same between both cards. Assume that both cards have the exact same price. Feature-wise, the cards are basically identical.

If the GTX uses around 180W, what is the threshold for how much power the XTX can consume before you reconsider your stance? If the XTX draws double, 360W, and uses some obscene number of power connectors, do you still buy the XTX? What about 720W? 540W? 1400 watts?

If you're drawing the line on who is and who isn't [H]ard, perhaps you'll also draw the line on how much power is too much power when we absolutely know that the advantage of the power-sucking card is around 20%.
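
Just to make the question concrete: for a fixed performance edge, the break-even point on performance per watt is simply the baseline wattage times the speedup. A tiny sketch (Python), using the thread's hypothetical numbers:

# Where does a faster card stop winning on performance per watt?
# Break-even wattage = baseline watts * (1 + performance advantage).
# All figures here are the thread's hypotheticals, not measurements.

gtx_watts = 180.0   # baseline 8800 GTX estimate from earlier posts

for advantage in (0.10, 0.20, 0.30, 0.40):   # hypothetical XTX performance edge
    break_even = gtx_watts * (1.0 + advantage)
    print(f"A {advantage:.0%} faster card matches GTX perf/W at {break_even:.0f}W; "
          f"anything above that is a net perf/W loss.")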
 
Where are the GTX numbers coming from? One says 200W, one says 180 =p What's next, 25W? Or 500W? Does anyone have a source, period, on the GTX power consumption numbers? Not TESTED numbers, numbers on paper, since that's all we have for R600.
 
I looked it up. The 8800 GTX is around 180W peak in 3D mode.

Performance per watt in video cards is about the last thing I worry about. You probably waste more electricity by leaving your front porch light on all night. Or, in my case, by leaving my tubed preamp on 24/7.
It truly is a red herring unless you're running one of those GameCube-looking PC cases. Isn't performance what really counts?
 
As far as I remember, more speed = more energy, and I'm sure ATI is not making the cards more power-hungry for shits and giggles. Besides, you guys are whining about a rumor, too.


I want a Dodge Viper, but I don't want 12/20 mpg. Whaaaaaaaaaaaaa!!!!!!!!!!!!!!!
 
I have never seen so much sissy pansy bullshit in my entire life. [...] This is HARD OCP not Hello Kitty Eco-PC Island Adventure.

100% agree with this rant. Good job putting it into words, Mr. Leary.
 
Let me try to understand your stance a little better here...

If you're drawing the line on who is and who isn't [H]ard, perhaps you'll also draw the line on how much power is too much power when we absolutely know that the advantage of the power-sucking card is around 20%.

You got it!
If you're [H], then you wouldn't be asking that question.
However, since I am [H] and am omnipotent, I shall answer.
The answer is 'infinity'.
To quote myself: "Thou shalt not consider power consumption in the relentless pursuit of 3D power."
 
Frankly I don't care that much about power consumption. My rent includes utilities and it's locked in for another year and a half so I could potentially modify my microwave to launch the space shuttle and I wouldn't have to cough up another dime.

Speed comes at a price. If you want 200fps at max res with all of the eye candy, expect to use some power to push all that hardware. I'm sure that we will see cards that use less power eventually, but don't count on that until the 65 or 45nm cards start coming out.

Length is certainly more of an issue than power consumption -- or at least it should be -- to most people. However, it's already been stated that the release cards will be 9 - 9.5 inches, so stop complaining about it.

As for all of this Nvidia vs ATI bullshit, it's the same as usual. Nvidia has the performance crown, and all of the Nvidia clowns come in here and tout 'their' company's achievements, while the ATI crowd tries to make outrageous claims about the next release from ATI. Two months from now, ATI will most likely have it, and then the roles will reverse. The rest of us who "buy whatever we can and like it" get caught in the crossfire (no ATI pun intended) and are stuck feeding semi-intelligible logic into these ridiculous rants. ;)

FWIW I'll be going AMD/ATI for my next video card. :eek:
 
You got it! If you're [H], then you wouldn't be asking that question. However, since I am [H] and am omnipotent, I shall answer. The answer is 'infinity'.
If you're going to give bullshit answers, the least you can do is make them realistic.

I asked a serious question, and I half-expected a serious answer. Can you answer the question?
 
No. Because if the new XTX turns out to be faster than whatever nVidia can offer, people need an excuse to say "uhmmm... hmmmm.... well..... it consumes more power than mine! so HAH! take that! yea..."

I hear that when I walk my friend's GSXR 600. "But you've got a bigger engine!"
Yeah, so? Not my problem you can't afford the cubes. :p
 
If you're going to give bullshit answers, the least you can do is make them realistic.

I asked a serious question, and I half-expected a serious answer. Can you answer the question?

Oh, it was a serious question?
Sorry.
I just automatically assume it's a joke when it comes from someone with 'nVidia zealot' in their sig, posting in an ATI forum.
My bad.
 
I just automatically assume it's a joke when it comes from someone with 'nVidia zealot' in their sig, posting in an ATI forum.
Nope. It was a serious question. The sig is something of a joke, sure.

So, knowing that, can you give the question another go?
 
I think we all need to remember that these specs are leaked, and may not be true. Also, we don't know how old that picture is, for all we know, it could have been taken in December. So let's all take a deep breath, and relax.


Also, I'd like to comment on the people who are saying "Omg! This consumes too many watts!"... You wouldn't go out and buy a brand new sports car, knowing it only gets 15 miles to the gallon, would you? Of course you would, because you want performance, not a hybrid.
 
I have never seen so much sissy pansy bullshit in my entire life. [...] This is HARD OCP not Hello Kitty Eco-PC Island Adventure.

I'm going to have to quote this again for truth!

I agree, I don't care how much power it draws; that's why I have a 610W PSU.
 
Another sneak peek at the retail version:

[image: sneakpeak.jpg]
 
I think we all need to remember that these specs are leaked, and may not be true. Also, we don't know how old that picture is, for all we know, it could have been taken in December. So let's all take a deep breath, and relax.


Also, I'd like to comment on the people who are saying "Omg! This consumes too many watts!"... You wouldn't go out and buy a brand new sports car, knowing it only gets 15 miles to the gallon, would you? Of course you would, because you want performance, not a hybrid.

It's all relative.

I wouldn't buy a regular sports car that got 4 miles to the gallon. If the car could hover and then fly over the top of traffic jams, well then hell, I'd buy it if it got 1/2 a mile to the gallon.

It just depends on what you are getting for that amount of energy consumption.
 
and... some rumored benchmarks!
Looks like an 11% spread in 3DM05 between the XTX and the GTX. If these numbers are accurate, I'd expect a wider spread in 3DM06, between 15-20%.

Funny -- that's pretty much how I called it.
 
Yeah, take those numbers with a serious grain of salt. They mention the R600 should be 800MHz, yet the numbers in the benchmarks show 700MHz? Plus, all that bandwidth is likely intended for AA/AF, which wouldn't show up in those benchmarks.
 
May have been a typo on VR-Zone's part. 700 is the number seen twice in the original post.

And yeah, with excess memory bandwidth, anti-aliasing is where the X2800s will shine. G80 takes a somewhat different route with coverage sampling anti-aliasing.

I'm not putting a great deal of faith in these either, but they look fairly realistic to me.
 
We'll probably all be in the old folks home by the time they come out.

I'm starting to believe the power consumption numbers. If they were less than that, I would think ATI would have come out and said "No, the R600 won't consume that much power."

I'm also getting so suspicious, I'm beginning to believe that nVidia is purposefully delaying the release of solid DX10 drivers so that they can pull a rabbit out of the performance hat when the R600 debuts, just to overshadow its release. If that turns out to be the case, there will be some very angry 8800 series owners. Well, angrier than some of them already are over the driver situation.

I think I'm completely sick of both companies. We need a third player. These two are so deadlocked against each other that we need someone to come along with a great graphics card and shake both of them up. It would a) force nVidia to get some drivers out, b) force ATI to actually release some hardware, and c) make both of them put up or shut up.

It's really too bad that the DeltaChrome sucks. I always did like S3, but they couldn't even write a driver for Windows 95; I'd hate to see what their XP driver (or, heaven help us, a Vista driver!) looks like. Intel might actually be the one to put a hurting on both of them, and as much as I have disliked Intel's business practices in the past, I'd truly like to see them step into the ring and get both companies back in the game.

This whole DX10 release has been plain awful. I'm already tired of it, and there aren't even any games yet to play on it. No cards from ATI, no drivers from nVidia, and most of the games I currently own and play aren't supported under Vista (though many supposedly work fine). I should have just bought a fast DX9 part last November, but oh no, I had to wait.

Here I sit four months later, and I still have no video card. nVidia has dribbled out a cut-down GTS and still wants too much for it, and ATI has no hardware available and probably won't have mass availability until April.

Grrr. I'm tired of both of them.
 
Even some of the other numbers seem strange. The numbers imply 28 ROPs/TMUs, and with the G80 listed as having 2x the shader operations, how is it even close? And I have no idea where those vertex numbers are coming from.

Also, if the card they were using was actually running 700MHz and is intended to run 800MHz, that's gonna up those margins significantly. If AA is involved, they will probably start going up in a hurry as well. ATI has always been fairly efficient with its AA methods compared to Nvidia, so the numbers could become downright crushing fast with all the bandwidth they have.
 
Even some of the other numbers seem strange. The numbers imply 28 ROPs/TMUs, and with the G80 listed as having 2x the shader operations, how is it even close? And I have no idea where those vertex numbers are coming from.

Also, if the card they were using was actually running 700MHz and is intended to run 800MHz, that's gonna up those margins significantly. If AA is involved, they will probably start going up in a hurry as well. ATI has always been fairly efficient with its AA methods compared to Nvidia, so the numbers could become downright crushing fast with all the bandwidth they have.

Those numbers are based on a guess: http://www.hardforum.com/showpost.php?p=1030617253&postcount=3
:rolleyes:
 