There has been a lot of complaining about heat lately and I just don't understand it. There are lots of cards out there if you want to build an HTPC or a laptop. But for an SLI setup? We are way beyond the linear price/performance portion of the curve there. At 190W vs. 240W (guessing?) you've got only a 100W difference between two cards. Even at 190W vs. 300W you are looking at a 220W difference, and stepping up from a quality 750-850W PSU to a quality 1kW PSU would be under $100. And if Fermi is in the 30% faster range, it is over a 50% faster system.
Show me a 500W video card that is 5 times as fast, uses 5 times the power, and costs 6 times as much as the competitor's 100W video card, and I'll show you the card enthusiasts are going to buy. I'd venture that while it's a good "look at me" stat, most people interested in enthusiast-class systems are much more interested in more FPS. Really, there is nothing efficient about SLI or CrossFire systems. People do it because they want more graphics power.
However, if you really want to look at power, because some people really are concerned about that, then let's have a good long look at it. And this is something you might seriously think about adding into your review(s): joules per frame (or frames per joule, of course). If Fermi takes 240W and delivers 60 FPS while an ATI card is delivering 45 FPS at 190W, then the Fermi is actually more energy efficient!
I don't have a crystal ball, nor am I under an NDA (I wish, but if I were I'd be forced to review it instead of play with it, although some say that is one and the same), so I don't know who wins in that department. But I do know that this is how we should be comparing energy numbers.
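The joules-per-frame idea above is simple to work through. Here is a minimal sketch using the post's hypothetical numbers (240W/60 FPS vs. 190W/45 FPS; these are guesses from the thread, not real benchmarks):

```python
# Energy per rendered frame: watts divided by frames per second
# gives W / (frames/s) = joules per frame. Lower is more efficient.

def joules_per_frame(watts: float, fps: float) -> float:
    """Energy consumed per rendered frame, in joules."""
    return watts / fps

# Hypothetical numbers from the post above -- not measured data.
fermi = joules_per_frame(240, 60)  # 4.00 J/frame
ati = joules_per_frame(190, 45)    # ~4.22 J/frame

print(f"Fermi: {fermi:.2f} J/frame, ATI: {ati:.2f} J/frame")
```

With these guessed numbers, the higher-wattage card comes out ahead on energy per frame, which is exactly the post's point: raw wattage alone doesn't tell you efficiency.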
just put all copper heatsinks on every gpu, problem solved.

Cost + weight = you still have a problem.
There are review sites out there that do performance-per-watt comparisons (techPowerUp is one). Personally, I'd rather have [H] stick to its normal testing strategies and only entertain other metrics if there's time left over. Anyway, considering power consumption simply as a cost or environmental factor is only scratching the surface, I think. Really, power consumption and performance-per-watt comparisons show the quality of a part.
For instance, and I'll admit right here I'm a big fan of the 58xx series, I think the 5870 is a fantastic part. Here you have a single-GPU part with the same performance as the previous generation's fastest dual-GPU card (GTX 295) that consumes only 60-65% of its load power and ~30% of its idle power. Quite amazing in my eyes.

Now fast forward to your comparison of Fermi vs. the 5870, with the same (very generous) numbers. What happens if you give the 5870 the extra 50W of power that Fermi is using? I know my 5870 @ 1.0GHz doesn't consume 240W. I'm not at home right now so I can't get numbers, but say you have a 1GHz 5870 (a ~17.6% increase in clock speed; let's say it nets a 15% performance boost) that consumes 220W. Now you've closed that gap and given the consumer a choice. How? Through efficiency. Efficiency yields flexibility.

The reason Fermi is projected to run so hot is that NVIDIA had to crank its clocks and power consumption to yield an acceptable performance delta. Well, any end user can do the same with Afterburner, thanks to the recent addition of software voltage adjustment on mid-range and high-end cards. Now, if at that time you can buy a 5870 for $300 while Fermi costs $400, guess what's going to be bought?
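The scenario above can be laid out as a quick perf-per-watt table. Every number here is the post's guesswork (stock 5870 normalized to 1.0 at 190W; Fermi guessed at +30% performance and 240W; the hypothetical 1GHz 5870 at +15% and 220W), not measured data:

```python
# Normalized performance divided by power draw. These are the thread's
# guessed figures, used only to illustrate the comparison.

def perf_per_watt(perf: float, watts: float) -> float:
    return perf / watts

cards = [
    ("5870 stock",    1.00, 190),
    ("5870 @ 1 GHz",  1.15, 220),  # 850 -> 1000 MHz is a +17.6% clock bump
    ("Fermi (guess)", 1.30, 240),
]

for name, perf, watts in cards:
    print(f"{name:14s} {perf_per_watt(perf, watts):.5f} perf/W")
```

Swapping in real measured performance and power numbers, once they exist, is the only way this table means anything.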
The reason Fermi uses more power is because of the difference in number of transistors/die size.
Fermi running hot has more to do with the necessary voltage required to run at the desired clock speed.
Because the design is inefficient, and hence they need more voltage and power. You could continue this though.
Which is driven by the total resistance in the circuit which is in turn driven by the number of transistors which at a given process size is given by the die size. Do continue though.
It's inefficient? You have magic performance numbers and a real TDP, then? If it is 30% faster than a 5870 with a TDP of 190W, then a TDP of 247W is exactly the same efficiency. Please explain how you have performance numbers and an actual TDP to back your claims.
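The equal-efficiency point above is just proportional scaling: 30% more performance at 30% more power leaves perf/W unchanged. A one-line check, using the thread's guessed 190W baseline:

```python
# If performance and power both scale by the same factor, perf/W is
# unchanged. 190 W and +30% are the thread's guesses, not measurements.

base_tdp = 190.0   # guessed 5870 TDP, watts
speedup = 1.30     # guessed Fermi performance advantage

equal_efficiency_tdp = base_tdp * speedup
print(round(equal_efficiency_tdp, 1))  # 247.0 -- the figure in the post
```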
Nvidia could have their standard-bearer have a TDP equal to that of Cypress, but it would undoubtedly have an undesirable level of performance (for Nvidia, since it would be much closer to the 5870's) and still cost a great deal more to make than the 5870. They would have to sell Fermi-based GPUs and boards for much less than they would want.

Based on the core difference, and assuming ECC memory accounts for the clock differences, you'll end up with a TDP of 257W. That is, of course, assuming the "safety factor" on TDP between a Tesla card and a GeForce is the same.
Tesla C2070 has a TDP of 225W with only 448 cores and likely a lower clock speed than the GeForce equivalent. It's likely that the GeForce Fermi's TDP (and actual consumption) will be much higher than that.
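The ~257W estimate in the thread falls out of scaling the Tesla figure by core count. A sketch, assuming a hypothetical full 512-core GeForce part and that power scales linearly with enabled cores (both assumptions, not confirmed specs):

```python
# Scale the quoted Tesla C2070 TDP (225 W, 448 cores) up to an assumed
# 512-core GeForce configuration, assuming linear power-per-core scaling.

tesla_tdp = 225.0     # watts, from the post above
tesla_cores = 448
geforce_cores = 512   # assumed full-die configuration

estimated_tdp = tesla_tdp * geforce_cores / tesla_cores
print(round(estimated_tdp, 1))  # ~257.1 W -- matching the thread's ~257 W
```

Linear scaling ignores clock and voltage differences, so this is a floor estimate at best.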
If you want to hold the technology to the standard of something that's over four months older, you go right ahead. I thought companies were supposed to be progressive. Wotan explained it best, but it seems like they will be "overclocking" the cards to get a desirable performance delta. With every piece of silicon, there is a point where you get diminishing returns in performance for power added; they may be going past this point, given the ~250W TDP being thrown around.
The Nvidia ION is now the best high-end graphics card because of its amazing power efficiency!
Has anyone actually crunched the numbers on this?

Actually, the ION/9400M isn't exceptionally power efficient, just low-power.
Hey, you wanted to compare power efficiency; now that I did, don't get your panties in a knot.

You're right, we should magically expect more out of a card when it is late to market. In those four months they are going to redesign TSMC's 40nm process in less time than it took TSMC to design it in the first place, so it is not a leaky POS!
We'll ignore the fact that it is late but appears to be significantly faster, and will instead focus on energy efficiency as the primary metric for judging a video card's performance. There's good news! The Nvidia ION is now the best high-end graphics card because of its amazing power efficiency!
Of course we expect more out of cards that arrive on the market later. Generally, consumers don't care whether the GPU has been under development for 6 months or 6 years. Consumers don't want excuses, even if those excuses are valid. Consumers only want products that are worth buying, and if GF100 comes to market after RV870, then it has to perform better to be worth buying. However, this is ignoring brand loyalty and marketing - I'm assuming consumers are well informed about the card's performance and impartial, which is not always the case.

Most people buying graphics cards are not well informed. Most consumers don't know that the GTX 280/285 has been out for ~18 months now and the 5870 was "just" released.
Hmm, well, for me it's not a matter of 250W at x performance vs. 200W at y performance vs. 150W at z performance. I will not do more than 200W on a single card (and I will only have a single card), and it had better run games pretty well with 2x to 4x AA. My whole computer shall never require more than a 500W power supply. Ever.
We are at a point where everything is becoming more and more efficient, from incandescent bulbs drawing 60-100W giving way to energy-efficient 8-20W bulbs, and so on. The same should occur for computer hardware, but given that computer hardware's performance is increasing at a good pace, I'm willing to settle for a 200W limit (even though, before, that would have been unthinkable).
So, bring it on, nV: if you have a 200W card, I will be comparing it to my current 8800 GTX and to ATI's HD 5870. If you don't, then I'll look as far down the lineup as it takes to find a worthy upgrade to my current card.
I didn't prooving someone wrong and effective use of sarcasim constitued me getting my panties in a knot. Guess I learned something new today.

"I didn't prooving [sic] someone wrong?" Yah, you really proved that one.
It's the idle power consumption that I worry more about, since my PC is on 24/7 but I'm not gaming 24/7.

That will be interesting to see as well. AMD's 5xxx series set the bar quite high.
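Idle draw dominates the bill for a 24/7 machine. A rough yearly-cost sketch; the wattages and the $0.12/kWh rate are illustrative assumptions, not figures from the thread:

```python
# Yearly electricity cost for a component: kW * hours/day * 365 * $/kWh.
# All inputs below are illustrative assumptions.

def yearly_cost(watts: float, hours_per_day: float,
                rate_per_kwh: float = 0.12) -> float:
    return watts / 1000 * hours_per_day * 365 * rate_per_kwh

# e.g. 20 h/day idling plus 4 h/day gaming at 250 W load:
low_idle = yearly_cost(30, 20) + yearly_cost(250, 4)   # 30 W idle card
high_idle = yearly_cost(90, 20) + yearly_cost(250, 4)  # 90 W idle card

print(f"low-idle card:  ${low_idle:.2f}/yr")
print(f"high-idle card: ${high_idle:.2f}/yr")
```

With these assumed numbers, the idle delta swamps the gaming-load cost, which is why a 24/7 user cares more about idle consumption than load consumption.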
The only thing you can come up with is that I double-tapped a letter? That's not funny, that's just plain sad.
You also forgot proper punctuation after "didn't," and you misspelled "sarcasm" as well (which is sad, considering most browsers have built-in spellcheckers). Considering your name is "vengence," I shouldn't be surprised. Either way, the forums aren't your place to have a temper tantrum. Go do another activity to blow off some steam, then come back to a discussion about video cards, as I won't be responding to any more of your childish banter.
So Fermi haters here think that a 40nm GPU at 200W will run as hot as a 90nm GPU at 200W?
lol.
What's "lol" about this is that they'll both produce the exact same amount of heat. Basic thermodynamics.
Because someone posted some wild-ass guess about next-gen hardware, and we've spent four-plus pages beating the hell out of it. Soapbox on a Friday, what can I say.
But dissipating it gets harder and harder the smaller you go.

Yep.
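The distinction in the last few posts is heat flow vs. heat flux: 200W is 200W of total heat either way, but the watts per square millimeter rise as the die shrinks. A sketch with illustrative die areas (not real GPU specs):

```python
# Total heat flow is the same for both parts (200 W is 200 W), but the
# heat flux through the die surface rises as the area shrinks.
# Die areas below are illustrative assumptions, not real GPU specs.

def heat_flux(watts: float, die_area_mm2: float) -> float:
    """Heat flux through the die surface, in W/mm^2."""
    return watts / die_area_mm2

print(round(heat_flux(200, 480), 2))  # larger 90 nm-class die: ~0.42 W/mm^2
print(round(heat_flux(200, 330), 2))  # smaller 40 nm-class die: ~0.61 W/mm^2
```

Higher flux through a smaller area is what makes the smaller chip harder to cool, even at identical total wattage.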
If you're worried about a leak, there are non-corrosive, non-conductive fluids you can buy to run in your system. I think even pure H2O would work, if I remember correctly, but check first.

Only if you can keep it pure during a spill, which you realistically can't, unless the inside of your case is a cleanroom.
Not trying to be a dick, so please don't take this that way. But if your comp is never going to require more than a 500W PSU why does your sig say a 620W PSU?
They're essentially the same. I was actually aiming for a 520W, and then NCIX had a 620W on sale for cheap, so I got this one. That doesn't mean that I'm going to push it to its limit. I think my system is 300W max. In fact, I don't think it will even get to 300W (honestly).