Fermi to be the hottest single chip ever

Wasn't the 2900XT TDP 215W? It was one of the greatest failures ever made.
 
Wasn't the 2900XT TDP 215W? It was one of the greatest failures ever made.
Lol, no, not at all. Either some of you are noobs who only saw this less-than-stellar release, or you're very forgetful. For instance, the FX5800 was staggeringly worse. The HD 2900 XT wasn't a bad card, it just wasn't great. It couldn't compete with the 8800GTX and Ultra in high-resolution gaming, but it was, for instance, the king of benchmarking. It was a new architecture, and AMD was testing its strengths and weaknesses.

Anyway, history lesson over: Fermi sure as hell seems like it. Considering two 6-pins give you 225W and NVIDIA went with the 6-pin/8-pin configuration, I'm going to guess we're looking at GTX 280 levels at least, and probably higher.
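For anyone who wants the napkin math behind those numbers, here's a quick sketch using the PCIe spec ceilings (75W from the slot, 75W per 6-pin, 150W per 8-pin). This is just connector math, not anything NVIDIA has confirmed about the board:

[CODE]
# PCIe power ceilings: slot plus auxiliary connectors (per-spec limits)
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PEG connector
EIGHT_PIN_W = 150  # 8-pin PEG connector

dual_six = SLOT_W + 2 * SIX_PIN_W                   # 225W ceiling
six_plus_eight = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300W ceiling

print(f"2x 6-pin:      {dual_six}W max")
print(f"6-pin + 8-pin: {six_plus_eight}W max")
[/CODE]

The actual TDP has to land somewhere under that 300W ceiling, which is why the 6+8 choice reads as "more than 225W" rather than a hard number.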
 
All in all, we believe that end users won’t mind, as long as it stays faster than ATI’s RV870-powered HD 5870 card.

What it all boils down to in the end. You're dealing with enthusiast-level cards, and the end user needs to understand power/cooling.
 
the FX5800 was staggeringly worse

Yes, the GeForce FX 5800 Ultra was the ultimate power hog.

What it all boils down to in the end.

Not true... if Fermi is only slightly faster than a 5870 then it will matter, as people will stick with ATI. If it's a lot faster then people won't care as much. Heat/power consumption are big issues for a lot of people, and ATI is gonna hammer this point in the next few months.
 
Not true... if Fermi is only slightly faster than a 5870 then it will matter, as people will stick with ATI. If it's a lot faster then people won't care as much. Heat/power consumption are big issues for a lot of people, and ATI is gonna hammer this point in the next few months.

IMO if you're dealing with enthusiast-level cards then you should have already accounted for power/heat and made the proper arrangements.

The price vs. heat vs. power etc. comparison will matter to some on here, but most of the time people just want the biggest stick.
 
Not true... if Fermi is only slightly faster than a 5870 then it will matter, as people will stick with ATI. If it's a lot faster then people won't care as much. Heat/power consumption are big issues for a lot of people, and ATI is gonna hammer this point in the next few months.
You would think they would, but AMD's marketing has been "meh" at best. Here's a golden opportunity for them. Personally, considering my setup, Fermi's power consumption/heat output is enough for me to pass it by.
 
IMO if you're dealing with enthusiast-level cards then you should have already accounted for power/heat and made the proper arrangements.

The price vs. heat vs. power etc. comparison will matter to some on here, but most of the time people just want the biggest stick.

Oh, how wrong you are. The way most people get started down the road of enthusiast-level hardware is by upgrading their store-bought, brand-name, pre-assembled PCs. Ours is a very small niche compared to the mainstream market that brings in the actual bulk of earnings for GPU vendors like Nvidia and AMD (ATI).
 
Oh, how wrong you are. The way most people get started down the road of enthusiast-level hardware is by upgrading their store-bought, brand-name, pre-assembled PCs. Ours is a very small niche compared to the mainstream market that brings in the actual bulk of earnings for GPU vendors like Nvidia and AMD (ATI).

I'm not even talking about how people upgrade... I'm just saying people need to be educated on their purchase and to expect trade-offs. Yes, most don't know, and we see posts dealing with "Can my Dell PSU handle this?" blah blah blah.
 
Why are you arguing over a piece of unreleased hardware? :D

Because someone posts some wild-ass guess about next-gen hardware and we spend four-plus pages beating the hell out of it. Soapbox Friday, what can I say.
 
I'll believe it when I see it. I *think* it might be a warm card, but we don't even know what clock speeds these cards will be released at.
 
How are we supposed to overclock this thing, much less cool it down when overclocked? :confused:

I mean, if the TDP is that sky-high, tri-slot coolers :p
 
The HD 2900 XT wasn't a bad card, it just wasn't great. It couldn't compete with the 8800GTX and Ultra in high-resolution gaming, but it was, for instance, the king of benchmarking. It was a new architecture, and AMD was testing its strengths and weaknesses.

Hahaha... come on, guy. By that rationale, nVidia is just testing Fermi's strengths and weaknesses.
 
How are we supposed to overclock this thing, much less cool it down when overclocked? :confused:

I mean, if the TDP is that sky-high, tri-slot coolers :p

My 4870X2s each have a TDP of about 270W, and I'm running them both on the stock dual-slot coolers and overclocking them. A 200W-TDP Fermi should not be that big of a deal, really.
 
Hahaha... come on, guy. By that rationale, nVidia is just testing Fermi's strengths and weaknesses.
And they will be. And if it so happens that it doesn't compare so well to the 5870, that doesn't mean it is "one of the greatest failures ever made" either. There are plenty of other examples of that; the phrase was used incorrectly there.
 
And they will be. And if it so happens that it doesn't compare so well to the 5870, that doesn't mean it is "one of the greatest failures ever made" either. There are plenty of other examples of that; the phrase was used incorrectly there.

Just having some good-natured fun... lol. Yeah, they will attempt to verify their design goals: did they miss, hit a home run, or land somewhere in between?
 
My 4870X2s each have a TDP of about 270W, and I'm running them both on the stock dual-slot coolers and overclocking them. A 200W-TDP Fermi should not be that big of a deal, really.

4870X2s are crap OCers generally. I had one, and it barely went over 810 on stock and 820 with an Accelero 4870X2 cooler installed, so I'm guessing you just maxed out CCC at 800 on the core, right?

Nice overclock on your Q9650, btw. Is that 24/7, and what OS are you using?
 
NVIDIA seems to lack efficiency in their designs or something. o_O All they ever seem to do is make their video cards get hotter and hotter as they get faster and "better." Sooner or later they're going to have to do something about this.

Perhaps my logic here isn't absolutely correct (though it's partly true in some senses). Hmm.
 
Water-cooled editions. I get the feeling we are going to see a lot more of them than we normally do, and I don't really consider that a bad thing. I would love for there to be more factory-installed water blocks on vid cards. It will bring competition to a market that sorely needs it. (A $400 vid card with a $50 water block tossed on it does not make a $600 product, imho.)
 
4870X2s are crap OCers generally.
Hopefully GT100 doesn't really need much overclocking, though I'll admit I'm not really that optimistic at this point :(

NVIDIA seems to lack efficiency in their designs or something.
Actually, NVIDIA does really well with respect to performance/watt (or at least they have historically). There are the obvious exceptions like the FX series, but for the most part, they've done pretty well in this area.
 
Seems like Nvidia has sold their soul to the devil to make Fermi better than ATI's 5000 series... performance at all costs, even if it means reaching critical mass in terms of heat and power. They need Fermi to succeed at all costs.

Trust me, once Fermi is out, six months down the line the real Fermi will be released with much lower TDP, heat output, and power requirements. Intel, AMD, Nvidia, etc. have all been talking about power consumption being a major focus of future products; look at the upcoming specs for Sandy Bridge or Istanbul. It's all about the low TDP.

The Fermi release in March is not what Nvidia wants to release; it's what they feel they have to put out to compete with ATI. All the early adopters are gonna get burned (literally) when Nvidia releases their Fermi refresh six months down the road.
 
Sure are a lot of opinions and statements on things we have no real clue about.
 
The Fermi release in March is not what Nvidia wants to release; it's what they feel they have to put out to compete with ATI. All the early adopters are gonna get burned (literally) when Nvidia releases their Fermi refresh six months down the road.

The early adopters getting burned for buying early? Here's a news flash: anyone buying a graphics card understands that there is always something better around the corner.
 
Sure are a lot of opinions and statements on things we have no real clue about.

True, but we can make a semi-educated guess that Fermi is going to run hot and consume a great deal of power due to its size and Nv's goal of crushing the 5870/5890. Nv is going to volt it up and clock it as high as possible to accomplish that, imho.
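To put the "volt it up, clock it up" point in perspective: dynamic power scales roughly with V²·f, so voltage bumps hurt quadratically. A minimal sketch with purely made-up numbers (nothing here is a leaked Fermi spec):

[CODE]
# Dynamic power scales roughly as P ~ C * V^2 * f
# (switched capacitance, core voltage, clock frequency).
def relative_power(v_ratio, f_ratio):
    """Power multiplier for a given voltage and clock scaling."""
    return v_ratio ** 2 * f_ratio

# Illustrative only: +10% voltage to hold a +15% clock bump
print(f"~{relative_power(1.10, 1.15):.2f}x dynamic power")  # ~1.39x
[/CODE]

A 15% performance chase can easily cost you nearly 40% more power once the voltage goes up with it, which is exactly how a big die ends up with a big TDP.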
 
Water-cooled editions. I get the feeling we are going to see a lot more of them than we normally do, and I don't really consider that a bad thing. I would love for there to be more factory-installed water blocks on vid cards. It will bring competition to a market that sorely needs it. (A $400 vid card with a $50 water block tossed on it does not make a $600 product, imho.)

Most full cover blocks for new cards are $100-$150, not $50. And you're paying for them to install the block and warranty it as well, so they're naturally going to mark it up over what you'd pay just buying the card and block separately.
 
My 4870X2s each have a TDP of about 270W, and I'm running them both on the stock dual-slot coolers and overclocking them. A 200W-TDP Fermi should not be that big of a deal, really.

Not trying to flame, but are you really comparing a dual gpu card to a single gpu card?
 
Water-cooled editions. I get the feeling we are going to see a lot more of them than we normally do, and I don't really consider that a bad thing. I would love for there to be more factory-installed water blocks on vid cards. It will bring competition to a market that sorely needs it. (A $400 vid card with a $50 water block tossed on it does not make a $600 product, imho.)

The day everything goes watercooling (without noob kits like the Corsair H50) is the day I stop gaming on my PC. Fuck that noise; that's just too much hassle for my tastes. I'm too paranoid about a leak...
 
Not trying to flame, but are you really comparing a dual gpu card to a single gpu card?

It's funny because it can be interpreted both ways. You can't compare a dual card, but at the same time, an 8-pin + 6-pin and all the hardware specs put this Fermi's TDP very close to a dual-GPU AMD card's.

Disclaimer: nobody knows the final power draw, but you can put 2 and 2 together to get a ballpark. And it's in that dual-card ballpark.
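To spell out the "2 and 2" (using the ~270W 4870X2 figure quoted earlier in the thread; the actual Fermi number is exactly what nobody knows yet):

[CODE]
# Ballpark only: the connector config caps the board, it doesn't set the TDP.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

fermi_ceiling = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300W for 6-pin + 8-pin
hd4870x2_tdp = 270  # ~270W, per the 4870X2 post above

print(f"Fermi ceiling: {fermi_ceiling}W; 4870X2 TDP: ~{hd4870x2_tdp}W")
[/CODE]

Same connector budget, same neighborhood. That's the whole argument.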
 
...heat/power consumption are big issues for a lot of people, and ATI is gonna hammer this point in the next few months

Heat I'll give you, but power consumption? Pfft, buzzwords for forum trolls. Nobody buys a high-end card based on power consumption.
 
Yo ATI, I’m really happy for you and imma let you finish, but NVIDIA's Fermi is the greatest GPU of all time!
 
All in all, we believe that end users won’t mind, as long as it stays faster than ATI’s RV870-powered HD 5870 card.

90% of the buyers out there won't even know or care.
 