NVIDIA's Fermi GF100 Facts & Opinions @ [H]

I just wanted to post after reading the thread and say that I think the GF100 series will probably do okay, since it was built to scale downward well, but it won't shine unless DX11 is adopted by game developers. From what I have seen, this card was built specifically with DX11 in mind.

You have the concentration on tessellation, along with the cache layout and the parallel pipelines. This card was built to keep a lot of the data and computation on the card instead of trading off to the CPU and system memory, and if I understand what is going on there, it will require DX11 programming for much of that to stay on the card rather than being sent to the CPU. Then, with those parallel pipelines for data to flow down, you have increased bandwidth for complex rendering. If the numbers are to be believed, it only costs 7% performance going from 8x CSAA to 32x CSAA.
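If it helps, the "keep it on the card" idea is easy to show in code. Here's a minimal CUDA sketch (my own toy example with made-up kernel names, not anything NVIDIA has published): the buffer is allocated once on the GPU, two passes run over it back to back, and only the final result is copied to system memory.

```cpp
// Toy sketch: allocate once on the GPU and chain kernels over the buffer,
// instead of copying results back to system memory between passes.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void pass1(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 2.0f;          // first on-card pass
}

__global__ void pass2(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] + 1.0f;          // second pass, no CPU round trip
}

int main() {
    const int n = 1 << 20;
    float* d_data;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));     // data lives on the card...

    int threads = 256, blocks = (n + threads - 1) / threads;
    pass1<<<blocks, threads>>>(d_data, n);        // ...and stays there between passes
    pass2<<<blocks, threads>>>(d_data, n);

    float result;                                 // copy back only the final output
    cudaMemcpy(&result, d_data, sizeof(float), cudaMemcpyDeviceToHost);
    printf("first element after both passes: %f\n", result);  // expect 1.0
    cudaFree(d_data);
    return 0;
}
```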

If DX11 gets adopted, I have a feeling the ATI series is going to hit a major memory bottleneck; yet, given the history of DX10 adoption, there is every likelihood it will be years before we see DX11 start to come into its own.

Fermi might be a couple of years ahead of its time and built with performance rather than profit in mind, but it looks to be a very powerful card if the software can catch up with it.

If they put out a $500 512 SP card on release, I might have to SLI them for a three-monitor 24" setup. If they only push out the 448 SP card, I'm not sure what I'll do.
 
I disagree, a lot.

I'd wager only 1 to 2 percent of the people buying these cards even know what kind of power consumption and heat output they come with.

Remember, you probably bounce between here and a couple of other enthusiast sites. Most people will go to CompUSA, eBay, Best Buy, et cetera, and buy the card whose box says it's the fastest one available.

They will see Fermi, see that it's a monster, and buy one. Then, if they need a bigger PSU, they'll just go back to the same store where they bought the video card and buy a bigger PSU.

I can almost guarantee this.

EDIT: And my opinion on the second part of your post; I disagree with you there too. When devs get more horsepower, they tend to use it. That's why we have games like Crysis, BioShock, et cetera, and we aren't still seeing new games come out that look and act like Pong. It might be a slow adoption process, but if devs know they have more horsepower, they use it.

I'd have to disagree. Many people who go to Best Buy to buy a video card will buy whatever the people working there tell them to buy. Not to mention that when they walk in, they will see crazy high prices for everything. Hell, they still have the 512 MB Radeon 4850 for $150 in the stores. People aren't going to want to buy a card that requires a brand new power supply.

They aren't going to want a card that requires a new case with better cooling


They don't want a card that will heat their house in the winter

They don't want a card that will drive up their bills


They want whatever is cheapest to run The Sims 3 or WoW, or maybe, just maybe, a shooter from the last year or so, and the majority of those will be directed at $100-$200 cards.
 
Well, of course. The majority of people aren't buying high-end graphics; maybe 0.5 to 1% of people have high-end graphics cards. It's the reason Intel is winning in the graphics market. Power users might be willing to shell out extra for a video card, but most people don't build their own computers. So most high-end users really are going to understand power and heat.
 
Well, of course. The majority of people aren't buying high-end graphics; maybe 0.5 to 1% of people have high-end graphics cards. It's the reason Intel is winning in the graphics market. Power users might be willing to shell out extra for a video card, but most people don't build their own computers. So most high-end users really are going to understand power and heat.

That's the other problem. It's the lower-end cards that will drive DX11, and Nvidia has none in sight still, meanwhile ATI's cards are racing down the price curve.

They are already down to the $100 market, the 55x0 series will be out in February, and they have been showing off a motherboard with integrated DX11 graphics.

So what's going to happen is that ATI will lead the development of the early DX11 games, as ATI had already shipped 2 million and counting DX11 parts to its partners and OEMs before Nvidia even announced their high-end part.
 
I've been quite tough on Fermi in previous replies, but I have to admit I'm still undecided about my next GPU.

Fermi is by far the best architecture in terms of GPU compute tasks, and let's face it, that is the way both gaming and general computing are going.
Also, Nvidia strongly supports and drives CUDA, and it is really the only game in town at the moment. Nvidia will continue to drive CUDA at the expense of OpenCL.
I have doubts that ATI will drive the development of OpenCL; their history shows that they are not proactive in developer support.
Also, ATI's compute architecture requires that code be specifically optimised, as it does not deal well with complex code but requires simpler code... again, will ATI have the leverage in the developer market and the drive to ensure this is done? History says no... all the while Nvidia will be working with developers optimising for their hardware, and I'm sure paying no regard to the requirements of ATI cards.
So personally I think that as things evolve, ATI hardware will turn out to be far less efficient at compute tasks than Fermi, and Fermi will be able to take advantage of both CUDA and OpenCL developments.
My only concerns are that Fermi is going to be too expensive, hot and power hungry, and perhaps ahead of its time... I'm not sure there would be enough GPU compute development within the lifecycle of my GPU upgrades (18 to 24 months, approx.) to warrant buying a more expensive, hotter and more power-hungry card at this time.
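To make the "simpler code" point concrete, here's a toy CUDA kernel (my own sketch, nothing from either vendor) of the kind of code that is said to suit a scalar design like Fermi's better than a wide VLIW one: data-dependent branches and a serial dependency chain leave a VLIW compiler little independent work to pack into its wide instruction words. Illustration only, not a benchmark.

```cpp
#include <cuda_runtime.h>
#include <cstdio>

__global__ void branchy(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float x = in[i];
    if (x > 0.5f) {                       // data-dependent branch: neighbouring
        for (int k = 0; k < 8; ++k)       // threads can take different paths
            x = x * x - 0.25f;            // serial dependency chain, little to pack
    } else {
        x = -x;
    }
    out[i] = x;
}

int main() {
    const int n = 1 << 16;
    float *d_in, *d_out;
    cudaMalloc(&d_in,  n * sizeof(float));
    cudaMalloc(&d_out, n * sizeof(float));
    cudaMemset(d_in, 0, n * sizeof(float));   // dummy input, just to run the kernel
    branchy<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
    cudaDeviceSynchronize();
    printf("done\n");
    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}
```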
 
The Radeons offer very good compute power. It's silly to think otherwise. Fermi may be a compute monster, but it's also a monster in every sense of the word.

If it is indeed a very hot-running chip that draws a ton of power, I don't know why anyone would invest in it long term.

Also, you have to factor in that Fermi adds almost 1B transistors over Cypress. So the question is what ATI can do with those 1B transistors in their next GPU, due out this fall.

Currently, with the news that SWTOR is set for spring 2011, and the fact that my computer runs Dragon Age, and most likely Mass Effect 2, along with anything else I want on just my lowly 4850, I doubt I'm buying a card till this fall or after.
 
The Radeons offer very good compute power. It's silly to think otherwise.

My concern is that that potential (albeit not as much potential as Fermi's) will not be realised... CUDA is already well developed and being actively driven by Nvidia.

Nvidia is not going to change tack and drive OpenCL to the advantage of their competitors... they are going to continue to drive CUDA and will still be in a position to use any OpenCL developments that occur.

ATI will not drive OpenCL development effectively, and in fact I doubt they will even ensure that software is specifically optimised to address the shortcomings of their architecture.

CUDA will be the only game in town for quite a while yet, and in terms of compute tasks ATI will end up being second rate due to unoptimised software, that is, if there is actually any compute stuff that will run on their hardware in the first place.
 
They don't want a card that will heat their house in the winter

It's not all bad. In winter, it saves on heating bills, which in turn gets countered by the cooling bills needed for it in summer. So it all balances out :)
 
I can't believe "personal heater" is now a plus when considering a new video card.
 
lol gonna carry that tower around room to room?

No, I figured I'd just close the vent to the room it was in, and that way only be heating it when I was in the room. If there were any real heat to begin with, that is. Seeing how we're still ignoring that the difference between Fermi and a 5870 is less than the heat from a 100W light bulb. But do go on.
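For what it's worth, the money side of that 100W gap is easy to ballpark. Every number below is an assumption I picked for illustration: the 100W delta from above, 4 hours of gaming a day, and $0.11 per kWh.

```cpp
// Rough arithmetic behind the "100 W light bulb" comparison above.
#include <cstdio>

int main() {
    const double delta_watts   = 100.0;  // assumed Fermi-vs-5870 gap under load
    const double hours_per_day = 4.0;    // assumed daily gaming time
    const double price_per_kwh = 0.11;   // assumed electricity rate (USD)

    double kwh_per_year  = delta_watts / 1000.0 * hours_per_day * 365.0;
    double cost_per_year = kwh_per_year * price_per_kwh;
    printf("extra energy: %.0f kWh/year, extra cost: $%.2f/year\n",
           kwh_per_year, cost_per_year);  // ~146 kWh, ~$16/year
    return 0;
}
```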
 
I can't believe "personal heater" is now a plus when considering a new video card.

Sad to say, I haven't had any heating on in my room for years due to the heat output of my computers.

I know I am not the only one either.
 
Fermi is already a failure for NV. Likely to be six months late to market, this generation is their nightmare. Their direct competitor has already capitalized on NV's mistakes and will continue to do so. If AMD manages to release their next generation in Q4, it won't matter how fast Fermi is; it will be obsolete too soon. Even if Fermi does blow away the 5870 at release, it will remain irrelevant if they cannot get full volume immediately on release, despite using a relatively large die on a now notoriously shaky 40nm process, yield-wise. Another problem, which will hurt regardless of Fermi availability, will come from AMD price cuts on the 58XX series cards and/or the appearance of an obvious 5890. AMD would, of course, be remiss if they did not drop prices if they are confronted with competition that is significantly faster. Of course, AMD price cuts may be irrelevant if Fermi's price tag cannot be kept under control... and, historically speaking, ouch. We know that they have to charge as much as they possibly can thanks to a late start. NV has now all but promised a significant performance increase over its AMD counterpart. If this thing doesn't reflect that same margin in real-life benches 3 months from now, people had better be mad. Though I'm sure the sheeple will do nothing but rave even if it shows a very unlikely 5% advantage.

Furthermore, I'm more than a bit disappointed by people's lack of skepticism regarding Fermi's performance and, in particular, this "preview". Actions always speak louder than words in industries like this. NV just told us they have the new champ and that everything is coming up roses... no problems, honest; it's just late because we wanted to perfect it, at great personal cost. Something stinks. No company that is getting its pride trampled as well as being killed financially will hold a press event for what they would have us believe is a superior product, and then proceed to offer up no final product details (frequencies, MSRP, release date, etc.). Look at the motivation. If they actually had a release date, price, or info on availability, they would tell us; they have everything to gain right now from transparency, because more people will save up their money for Fermi. If this is really all good news, they would let us have it, because it would just further hype the release and sweat AMD. On the flip side, if you really don't have your act squared away, you have an NDA and you show a few benchmarks using prototypes. Somewhat ironic is their love of tessellation. They were never too fond of such nonsense before, but we can be sure now that it is one thing their card will do well, because it is a large part of the precious little they gave us. Their actions speak volumes. Fermi is still under development, and for good reason. Instead, people come away from this magic show already talking about how this GPU will compete with the 5970 and retail for ~$500, all the while talking about what a good job NV did despite dropping it 6 months late. They just put you to work on a bunch of unconfirmed benchies and let your wild imaginations run with the story. They know how hungry everyone is for this part; they know all they have to do is dangle the mysterious shiny carrot and let the masses ruminate... they just saved (read: made) millions out of nothing, because you're all saving your scratch up again.

It's a shame. History has shown that when competition falters, the consumer will be made to pay. I get to look forward to NV apologists and AMD fleecing. SIGH! This should be the bipartisan perspective: I have love for my dollar, NOT for either of these mega-corps.
 
Wow.

Quote: "Fermi Is already a failure for NV"

Quote: "If AMD manages to release their next generation in Q4 It won't matter how fast Fermi is, it will be obsolete too soon. Even if Fermi does blow away the 5870 at release, it will remain irrelevant if They cannot get full volume immediately on release despite using a relatively large die on a now notoriously shaky 40nm process yield-wise."

Quote: "Furthermore, I'm more than a bit disappointed by peoples lack of skepticism regarding Fermis performance and, in particular, this "preview" Actions always speak louder than works in industries like this."

Sounds like you made up your mind as well, without any facts.
 
Fermi is already a failure for NV. Likely to be six months late to market, this generation is their nightmare. Their direct competitor has already capitalized on NV's mistakes and will continue to do so.
No, they actually have had huge supply problems and couldn't bring volume cards to market before the holiday season.

If AMD manages to release their next generation in Q4, it won't matter how fast Fermi is; it will be obsolete too soon.
No, it is a die shrink and not a new architecture. The 4890 hardly made the GTX 280 obsolete; this will be no different.

Even if Fermi does blow away the 5870 at release, it will remain irrelevant if they cannot get full volume immediately on release, despite using a relatively large die on a now notoriously shaky 40nm process, yield-wise.
It's funny how quickly you forgot ATI's volume problems, but now you claim that if Nvidia has them, all is lost, while for ATI it was all gain?

Another problem, which will hurt regardless of Fermi availability, will come from AMD price cuts on the 58XX series cards and/or the appearance of an obvious 5890. AMD would, of course, be remiss if they did not drop prices if they are confronted with competition that is significantly faster.
AMD will drop prices, no question there. However, you seem to believe Nvidia is going to price its cards as if price cuts aren't going to happen.

Of course, AMD price cuts may be irrelevant if Fermi's price tag cannot be kept under control... and, historically speaking, ouch. We know that they have to charge as much as they possibly can thanks to a late start.
Historically speaking, Nvidia has either been significantly faster or launched first. The 4XXX series was the first seriously competitive series from ATI in a while. ATI undercut Nvidia's prices INSTEAD of taking huge margins on those cards, SPECIFICALLY to take market share. ATI could have easily launched those cards at $100 more per card and just pocketed the change; it was a business decision on ATI's part to play a race-to-the-bottom game. I've already shown on these boards that Nvidia could easily "afford" to launch their GF100 card at a $400 price point without losing money, even with the horrific yields.
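For illustration, here's the back-of-the-envelope version of that math. Every input below is an assumption I picked for the sketch, not a known figure: the wafer price, the die size, and the yield are all guesses.

```cpp
// Rough wafer economics sketch: dies per 300 mm wafer, minus edge losses,
// scaled by an assumed yield. All constants are illustrative assumptions.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi        = 3.141592653589793;
    const double wafer_cost = 5000.0;  // assumed TSMC 40 nm wafer price (USD)
    const double wafer_diam = 300.0;   // mm
    const double die_area   = 530.0;   // assumed GF100 die size, mm^2
    const double yield      = 0.25;    // assumed fraction of good dies

    // Standard die-per-wafer estimate: usable area minus an edge-loss term.
    double r     = wafer_diam / 2.0;
    double gross = kPi * r * r / die_area
                 - kPi * wafer_diam / std::sqrt(2.0 * die_area);
    double good  = gross * yield;
    printf("gross dies: %.0f, good dies: %.0f, cost per good die: $%.0f\n",
           gross, good, wafer_cost / good);  // ~104 gross, ~26 good, ~$192 each
    return 0;
}
```

Even under those pessimistic assumptions, the bare die comes out well under $200; board, memory, cooler and margin come on top, which is the sense in which a $400 card need not be sold at a loss.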

NV has now all but promised a significant performance increase over its AMD counterpart. If this thing doesn't reflect that same margin in real-life benches 3 months from now, people had better be mad. Though I'm sure the sheeple will do nothing but rave even if it shows a very unlikely 5% advantage.
"People better be mad?" Why would you waste your life getting mad at a graphics card company? Just go buy the other camp's card and be done with it.

This is why you aren't running a multi-billion-dollar company. You think you should tell your competitors all your secrets, all while selling Ferraris at cost.
 
Lol.

One thing is for sure, people are stirred up about Fermi.

ATI fanbois better hope it comes out kicking. 'Cause if it does, it will mean lower prices for them red cards. That's good for you. It's good for everyone.
 
totenkopf is right! NV won't release it until it is faster than the 5800... even if it takes a year!
 
All this is win/win for me. My upgrade window is open for a while: either Fermi will kick ass and be priced well, or I'll get a 5870 at a cheaper price. :)
It will be nice to see them on the market soon, with multiple reviews and comparisons.
 
Yes, AMD had supply problems... that is my reasoning for potential problems with Fermi and its larger die, which could further delay the already late GPU. Regardless, it's a compounding problem. How many of anything has NV sold since the 58XX launched? Zero competition means higher demand for AMD, which certainly didn't help supply. Looking forward, AMD still has a few months to enjoy the current lack of competition, and they claimed 800k DX11 GPUs shipped by mid-December. NV better hope they ironed out all of those pesky fab issues and then some. Don't deny that this has been a windfall total victory for AMD.

Suppose the 6000 series starts shipping in Q4, though. If we speculate (not that we would EVER do such a thing) another two months of no supply, that will give Fermi what, 3-5 months out in force before the 6000 series hits? That would be a disaster, however unlikely.

A 5890 refresh would simply allow AMD to further dissect price points and own the mainstream. If AMD were together enough to release simultaneously with Fermi, it certainly wouldn't be good for NV, even though it will likely be slower. We still don't know if Fermi will scale to the popular price points or what it will cost NV to do so.

So where does Fermi get the flexibility to slash prices? The 5800 has already financed its own price cuts. If NV wants to sell at a loss, that's fine... even though they certainly don't seem to be willing to do that with their current lineup, I understand your optimism fully. If NV manages to launch at $400, I will literally eat my shorts... it won't happen. Regardless, that price will skyrocket with the huge demand and possible supply issues.

Good point, Vengence. People shouldn't be mad... unless they invest their hopes in a company, wait for months on end living off of false promises, and then find out their uber card is not 1.6x faster than the 5870 and that it might cost them over $500. The fact that you specifically addressed my post tells me you are just as invested.

Secrets? If AMD isn't already well into development on the 6000 series they are done for. I'm sure they already have something on the drawing board for the gen. after that. Besides, is AMD gonna steal their launch date and price? Copy their core freq.? Steal their fully disclosed test rig? No, and that's what I want. NV is pushing innovation hard... I've seen people innovate themselves right out of business before (not saying that will happen). Miss your demographic or sell to a ghost market... that's bad juju.

If this post is too long go to google translator and convert it into lazy/apathetic for free! ...or ignore it ;)
 
I intend to wait. It's win/win. Either I get a smokin' new Fermi or ATI drops their prices a little OR they release a refresh to compete. That's worth a couple months. Worst case scenario, Fermi is a flop and I end up with an ATI card I should have bought 2-3 months earlier for the same price. Nothing severe. Considering the potential upside, it's a gamble I'm willing to take.
 
It's the guy from my sig /points /laughs

Welcome to the forum, totenkopf. Ignore the free PR spinners; everything you said has some logic to it. AMD does have its next GPU on the roadmap for Q4 2010 - Q1 2011. He can laugh all he wants. A refresh will come way before that. So Fermi's debut and grand entrance may be short-lived.

Anyway, Fermi is late, as mentioned 1k times in this thread. People also don't realize there are several teams working on it: the 1st team on the initial release this year, the 2nd team on the refresh later, the 3rd and 4th and so on ~ all late.
It all trickles down.
 
Anyway, Fermi is late, as mentioned 1k times in this thread. People also don't realize there are several teams working on it: the 1st team on the initial release this year, the 2nd team on the refresh later, the 3rd and 4th and so on ~ all late.
It all trickles down.



I think way too much is said about Fermi being late. No, you don't like being late, but it's not like AMD has never been late.

If Fermi does its job, it won't matter that it's late. If it doesn't do its job, it wouldn't matter if it had been early.
 
I think way too much is said about Fermi being late. No, you don't like being late, but it's not like AMD has never been late.

If Fermi does its job, it won't matter that it's late. If it doesn't do its job, it wouldn't matter if it had been early.

Says you. Total tally thus far: AMD has shipped 2 million GPUs, Nvidia ~0.
Now add in another 3 months till the Fermi release, oh my. Each month is time bought for AMD to work on their counter. Yeah, I'm sure we end gamers don't care, as long as we get the video card when we need it, but to Nvidia and their shareholders it matters.
 
Says you. Total tally thus far: AMD has shipped 2 million GPUs, Nvidia ~0.
Now add in another 3 months till the Fermi release, oh my. Each month is time bought for AMD to work on their counter. Yeah, I'm sure we end gamers don't care, as long as we get the video card when we need it, but to Nvidia and their shareholders it matters.

And the same thing was said about AMD not having a DX10 part for months as well, and yet AMD survived.

All it takes is a good product.
 
And the same thing was said about AMD not having a DX10 part for months as well, and yet AMD survived.

All it takes is a good product.

Bad comparison. How about a card, just a video card? My point wasn't a DX11 part, but a video card, period.
 
Still happy with my 5850 after 4 months now.
I was hoping I'd get to compare it to a Fermi card, but none is out yet, go figure.

However, Fermi seems to be a good card, even despite the manufacturing issues that come with billions of transistors.
I won't get my hopes up about the numbers in stores, as the Tesla market might get the biggest share of cards.
There's just much more cash per card there.
 
What I want to see in 1 year:

GF100 as a main display with a GF100 based midrange card (probably half a GF100 die) as a physics/geometry processor all working off the same driver installation, and no bugs.

By the time this comes out, my 4870X2 might be completely outdated and I would be willing to upgrade. Currently, I won't be upgrading simply for this chip. I've had this 4870X2 for almost a year, and it has done quite well for me. The 5870 and 5970 are nice, but not worth spending money on right now. I got an 8800GTX just as the 9800GTX came out, and didn't upgrade until the 4870X2 got down below $600. If the GF100 comes out below $450 and a mainstream card based on it comes out at $150 and can be used as a physics engine only, then that might be my next step.
 
I wonder, when we start to see in-game OpenCL applications like Bullet physics coming out, whether we will have the facility to choose to run them on a separate/auxiliary card rather than on the primary graphics card, like you can with PhysX?
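OpenCL does expose that choice through its platform/device API, so something like it should be possible. In CUDA terms (the PhysX-style setup), the idea looks roughly like the sketch below; device index 1 is just an assumption for whatever the auxiliary card enumerates as.

```cpp
// Sketch: enumerate GPUs and point compute work (e.g. physics) at a
// secondary one, leaving device 0 for rendering.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("GPU %d: %s (%d SMs)\n", i, prop.name, prop.multiProcessorCount);
    }
    if (count > 1) {
        cudaSetDevice(1);  // route subsequent kernel launches to card 1
        // ... launch physics kernels here; rendering stays on device 0 ...
    }
    return 0;
}
```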
 
Isn't the 8800GTX, debatably, better than the 9800GTX?

Yeah, that's why I got mine. With the 9800GTX they cut the memory bus down to two-thirds the width and increased the core clock, strangling the chip for bandwidth. I've still got it in my secondary machine.
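The arithmetic behind that, using the reference specs as I remember them (worth double-checking): 384-bit at 900 MHz GDDR3 for the 8800 GTX versus 256-bit at 1100 MHz for the 9800 GTX. The higher memory clock claws some of it back, so the newer card lands around 81% of the old one's bandwidth despite the two-thirds bus.

```cpp
#include <cstdio>

// GB/s = (bus width / 8 bits per byte) * DDR effective rate / 1000
double bandwidth_gbs(int bus_bits, double mem_mhz) {
    return (bus_bits / 8.0) * (mem_mhz * 2.0) / 1000.0;  // GDDR3 is double data rate
}

int main() {
    double gtx8800 = bandwidth_gbs(384, 900.0);   // ~86.4 GB/s
    double gtx9800 = bandwidth_gbs(256, 1100.0);  // ~70.4 GB/s
    printf("8800 GTX: %.1f GB/s, 9800 GTX: %.1f GB/s (%.0f%% of the older card)\n",
           gtx8800, gtx9800, 100.0 * gtx9800 / gtx8800);  // ~81%
    return 0;
}
```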

I was out of work for 3 1/2 months, then low on income for another 4 months, but I'm back to my full income now, and that will even increase in 3-6 months as I go from contract to perm. As soon as I refill my savings (which will take me 6 months) and pay back the debt I incurred during my low income time (another 3 months) I will begin upgrading again. That means: Core i7 900 series upgrade with 12GB of memory first, then shortly after that a graphics upgrade. Whoever wins the race at the end of 2010 will likely have my business.

I've gone back and forth a few times between ATI and nVidia (starting with the Rage128, then GeForce DDR, then GeForce 3, then GeForce 4, then Radeon 9700, then on and on). I go with whatever is 3rd from the top, because that point is usually the best performance/price ratio, as long as the drivers are stable. I had stability issues with the 9700, and I'm not going back to that again.
 
What I want to see in 1 year:

GF100 as a main display with a GF100 based midrange card (probably half a GF100 die) as a physics/geometry processor all working off the same driver installation, and no bugs.

By the time this comes out, my 4870X2 might be completely outdated and I would be willing to upgrade. Currently, I won't be upgrading simply for this chip. I've had this 4870X2 for almost a year, and it has done quite well for me. The 5870 and 5970 are nice, but not worth spending money on right now. I got an 8800GTX just as the 9800GTX came out, and didn't upgrade until the 4870X2 got down below $600. If the GF100 comes out below $450 and a mainstream card based on it comes out at $150 and can be used as a physics engine only, then that might be my next step.
Until we see the next generation of consoles, I think anyone gaming on a single 24" monitor or below is going to be able to run lots of AA on even the most demanding titles for a very reasonable sum of money. However, when the next-gen consoles come online, I think we'll see DX11 come out in its glory. IMHO the reason we never saw real adoption of DX10 was the lack of console support.
 