NVIDIA's Fermi GF100 Facts & Opinions @ [H]

Sorry, but that's ridiculous. It was simply a supply issue brought about by underestimating demand for the holiday season in the run-up to the Fermi launch. Call it a supply drought "bubble".

It's not a supply issue where Nvidia was suddenly floored by demand; Nvidia cut down on stocks of GTX cards long before the holiday season hit. The GTX architecture cards came out around June of 2008, and Nvidia could have ramped up production if they had wanted to; it's not new tech. Nvidia is not stupid. They know how the market works, and if they could have flooded the retail channels with GTX cards they were selling for a profit, they would have.
 
These cards are really nice and all (GF100 and 5970 etc.), but for someone with the rig in my signature and a 1680x1050 screen, I see little or no reason to upgrade from my GTX 260. I have no bottlenecks, I can max out the settings in 95% of new games at high FPS, and my current card is quiet and runs relatively cool.
Nvidia and ATi are putting out great products, and the new generation seems to continue that trend, but unlike previous generations, I do not see a lot of press/advertising/spin about upgrading, and the benefits I would get.

In my mind, Eyefinity, the Nvidia analogue, and their 3D tech are all niche gimmicks. The price of entry is too high, and it's too complicated for most users (the folks who make up the majority of the red and green customer base). As such, they do not represent a value proposition that makes me lean toward upgrading.

I guess my question is... in three months, or six when supply improves, why should I upgrade?
 
I think if more games make use of tessellation in the future, then this card will really do well. Here's to seeing how retail cards and drivers perform, and what AMD does to counter this.
 
Great to see Nvidia redo their GPU architecture, hurray! .. in a way I didn't feel like hurraying at the launch of their last GPU. And from my perspective, it's awesome that they appear to have created a geometry monster. I can't stand seeing cylinders, arcs, etc. that aren't round in games! It ruins the immersion. This is good news. It is going to be very interesting to see how long it takes game devs to really make this tessellation tech shine in games (I am guessing quite a few months... Dirt 2 was underwhelming in this way). I don't understand why they waste time on ray tracing for a gaming card. It's not there yet, though it's nice for visualization workflows, sure.
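A rough way to see why low polygon counts make cylinders and arcs look faceted: the maximum gap between a true circle of radius r and an inscribed N-sided approximation is r·(1 − cos(π/N)), so the error falls off quickly as tessellation adds segments. A minimal sketch of that arithmetic; the 0.5 m radius and segment counts are just illustrative assumptions, not anything from the article:

```python
import math

def chord_error(radius, segments):
    """Maximum gap between a true circle of the given radius and an
    inscribed regular polygon with that many straight segments."""
    return radius * (1 - math.cos(math.pi / segments))

# Hypothetical 0.5 m barrel rendered at different tessellation levels.
for n in (8, 16, 32, 64, 128):
    print(f"{n:4d} segments -> max deviation {chord_error(0.5, n) * 1000:6.2f} mm")
```

Doubling the segment count roughly quarters the visible error, which is why hardware tessellation can smooth curved silhouettes without a huge fixed vertex budget.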

Deal breakers for me are price and TDP. Temps (though tied directly to TDP) and fan noise (if one doesn't want water cooling) are pretty important too, but not deal breakers in themselves. There is no way I am getting a card with a similar or higher idle wattage to replace my annoyingly high idle wattage (90W!) 4870. The power draw under full GPU utilization, on the other hand, isn't a deal breaker. When I game, I game, and who cares about the power draw at that point? But at all other times it makes no sense to have a video card drawing as much power as all my lamps and appliances combined.
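For anyone curious what a 90 W idle actually adds up to over a year, a quick sketch; the hours per day and electricity rate here are assumptions, not figures from the thread:

```python
def annual_idle_cost(idle_watts, hours_per_day=8, price_per_kwh=0.12):
    """Rough yearly electricity cost of a card sitting at desktop idle."""
    kwh_per_year = idle_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# Hypothetical comparison: a 90 W idle card vs. a 30 W idle card.
for watts in (90, 30):
    print(f"{watts} W idle -> roughly ${annual_idle_cost(watts):.2f} per year")
```

Not ruinous, but it is exactly the kind of always-on draw being objected to here.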

It's seriously "crazy" to trust Nvidia's own benchmarks at this point, IMHO. No matter the company, I never take their own findings seriously. I mean, come on... what are they omitting, how is it tweaked, and what is their agenda (making more money for themselves and the shareholders)?

Is there still no word on a better defined launch date for Fermi?

Anyway, like many here I think Nvidia is succeeding in pressing pause on enthusiasts' eagerness to buy for a little while. I'd like to give Nvidia the benefit of the doubt even though the omissions in this paper launch are harrowing. Don't let this go on for too long though, Nvidia; the slide show isn't going to have a lasting effect on my eagerness to buy. As soon as those great late January/February game releases start happening, I might just have to get what is available. Hurry up, Nvidia!!
 
I guess my question is... in three months, or six when supply improves, why should I upgrade?
Why should anyone convince you that you should? In fact, you probably shouldn't. People should upgrade when their requirements are not being met. Simple.
 
You do realize the GTX 285 is still over $300? So saying the GTX 380 will be $400 by summer is flat-out wrong.
It's not unrealistic to guess that the GTX 285 is currently being produced in very low quantities. As supply diminishes, demand rises relative to supply, resulting in higher prices for consumers.

And G80 sounded like rubbish due to the near silence from Nvidia and Charlie constantly thrashing the hardware on TheInq claiming that R600 was going to kick its ass.
Yeah, I recall how many people bought into the idea that G80 wouldn't even sport a unified architecture and would be terribly inefficient. Neither actually came true.
 
I said it a couple of weeks ago and I now repeat it since the information presented in the article bears it out: betting against nVidia's ability to execute on product design is a loser's game. They've only failed in that department once, with the FX 5000 series. Whether or not they execute in the marketplace is less certain, of course, but anyone who enjoys technological innovation should be drooling over the fresh engineering in Fermi.

Paper launch? You call it that when retail products are ballyhooed but not available for purchase. nVidia is not making any promises here.

Pre-emptive publicity? Of course. But so what, when you've got something so interesting to talk about?

What interests me is the potential simplicity in the scalability of this design. It has four major units. A die with one good unit makes an entry-level card, two for mid-level, three for enthusiast, and four for high-end. Look over the slides and ponder a budget card with 128 stream processors, or a $100-150 card with 256. Sounds tasty to me. And if binning proves to be that easy for them, it helps alleviate the potential wafer costs of the chip considerably.
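A back-of-the-envelope illustration of that binning idea; the 128-stream-processor-per-unit figure follows the post above, while the segment labels are just the poster's guesses restated:

```python
# Sketch of the binning idea: one die, four major units, SKUs defined by
# how many units survive. 128 SPs per unit is taken from the post above.
SP_PER_UNIT = 128

tiers = {1: "entry-level", 2: "mid-range", 3: "enthusiast", 4: "high-end"}

for good_units, segment in tiers.items():
    print(f"{good_units} good unit(s): {good_units * SP_PER_UNIT:3d} SPs -> {segment}")
```

If a partially defective die can still ship as a lower SKU, the effective yield per wafer goes up, which is what alleviating the wafer cost amounts to.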
 
So we have a paper launch and an "oh god, please don't leave me" press event.

We have "numbers" saying the new card is fantastic at tesselation, which seems to be a DX11 only feature, and no numbers for anything else.

Massively increased complexity, transistor count, power requirements, and heat output, and probably very low clock speeds so it doesn't melt into an ingot inside your case.

Great job, Nvidia, really inspiring.
 
[H]ydra;1035200397 said:
"If we see NVIDIA trump now-AMD in this department it will be a big slap in the face."

I'm wondering if this should have been "trump AMD now"


[Attached image: amd_k6-2_logo.jpg]
 
More BACK TO THE FUTURE stuff, which isn't always a bad thing.

All this talk of and emphasis on Tessellation...sounds rather like the KYRO on steroids but w/T&L capabilities this time around.
 
Why should anyone convince you that you should?

If they want to sell cards, red and green should.
It was a slightly rhetorical question; I didn't really mean 'me' per se, but rather the group of consumers representing folks like me:
those who want great performance but have a relatively limited budget.
 
Nah... Fermi = Fail

Fermi is the product of Nvidia's arrogance. It was a product conceived to prove Nvidia's bold claim that GPU > CPU, and to beat what Nvidia mockingly called "Laughabee". Their efforts to prove this have made them take their eye off the real market for their product. Pride really does come before a fall.

Intel could afford to experiment with Larrabee; after all, it is not their main business, and if it didn't work out they would just lose the R&D costs (albeit a large cost) while still having their main products in the marketplace.

Not so with Nvidia, which is now left with a late graphics product, the mainstay of their business, that cannot be manufactured at an affordable/competitive price and has real thermal and power consumption issues.

Seems to me that, putting this together with their failing chipset business, Nvidia may soon be partaking of their last meal before execution... a great big slice of humble pie.
 
More BACK TO THE FUTURE stuff, which isn't always a bad thing.

All this talk of and emphasis on Tessellation...sounds rather like the KYRO on steroids but w/T&L capabilities this time around.

Well, it seems DX11 finally makes actual use of tessellation. Although when we'll see games fully using it, I have no idea.
 
I'm not sure how this crap needed to be under an NDA. Nvidia said it can't give more out so as not to give away information to the competition? Doesn't that mean your current release is a media show and has little to do with the NDA? Instead, you're depending on the concept of an NDA to generate excitement, not on the information itself. I'd definitely have a few sharp words in the article about Nvidia too, as I'd be no fan of being a puppet for Nvidia's marketing.

The Nvidia Surround situation really is a bummer. When you buy a GF100 you'll have to account for SLI immediately, which kills the wallet. You can't just buy the best card and be done; you have to make a sacrifice. It's not like Nvidia is in the best-value business. Either you go SLI, take a hit from the scaling, and get Surround, or you buy a single card and lose Surround. I don't see this playing out well for Nvidia.

Sure, you can argue the validity of multi-monitor setups, but that argument should be left to the "value" crowd or the entry-level crowd. Your top-of-the-line card not being able to push 3 monitors is a problem. Why should I buy a $600 card that can't run Nvidia Surround?

In my opinion, Nvidia is setting itself up for failed expectations. Many consumers are brand-loyal for arbitrary reasons, and you've given them plenty of reasons to taste the red side. Then you follow up with an incomprehensible slide show touting a theoretical increase in performance in games that are already playable. Nice move.
 
Well, it seems DX11 finally makes actual use of tessellation. Although when we'll see games fully using it, I have no idea.

During this next year?
Dirt 2 has a small experiment, using it for the crowd.
No significant gain or reason to use it, imo, other than to make sure it works OK.
 
So we have a paper launch and an "oh god, please don't leave me" press event.

We have "numbers" saying the new card is fantastic at tesselation, which seems to be a DX11 only feature, and no numbers for anything else.

Massively increased complexity, transistor count, power requirements, and heat output, and probably very low clock speeds so it doesn't melt into an ingot inside your case.

Great job, Nvidia, really inspiring.

People have been spreading the "nVidia's parts are hot!" comment for some time. The same was said of the 8800 and the GTX 280, I believe. And sure, they were not cool-running parts, but nothing so freakishly bad as to require anything special other than a properly cooled rig, which isn't a problem for most people who buy this kind of hardware.
 
These cards are really nice and all (GF100 and 5970 etc.), but for someone with the rig in my signature and a 1680x1050 screen, I see little or no reason to upgrade from my GTX 260. I have no bottlenecks,

Your bottleneck is your low resolution. 1680x1050 is good for web browsing and emailing grandma, but any PC gamer holding on to such a low-resolution screen is just cheating themselves out of a much better experience.
 
People have been spreading the "nVidia's parts are hot!" comment for some time. The same was said of the 8800 and the GTX 280, I believe. And sure, they were not cool-running parts, but nothing so freakishly bad as to require anything special other than a properly cooled rig, which isn't a problem for most people who buy this kind of hardware.

The "parts are hot" comments usually come from the really high failure rates on the early gen GTX parts. Nobody would care or even blick at the temps if the card never died. A lot of card were running hot and failing. Its not always people's rigs that are at fault.
 
Not so with Nvidia, which is now left with a late graphics product, the mainstay of their business, that cannot be manufactured at an affordable/competitive price and has real thermal and power consumption issues.

Once again, this isn't the first time people have made that accusation about an nVidia card, and I know it won't be the last. If you're right, nVidia is going to lose a ton of money. I would hope that they have engineered a better process than one that's going to lead them into bankruptcy.

The heat and power issues will work themselves out; they usually do, for both nVidia and AMD. Maybe there won't be a lot of overclocking headroom, but the part will function just fine for most people with the proper environment for it: lots of power and plenty of cooling.
 
Games are indeed made with much higher-quality sources that are then scaled down to run at decent/good performance on modern hardware. It would be great if nV could get devs to release higher-quality options, for sure... though it's a risky move PR-wise, as your game can easily be slammed as "poorly coded" or "crappily optimized" like Crysis was, simply because it allowed for higher settings :(. I haven't seen the article you're mentioning, but if that part is correct, it has me more excited for PC gaming than "Eyefinity" or "3D Vision shutter glasses" ever have.

This should not be done at all. Shit like this is fucking killing the PC gaming market. No side should have a code advantage at all.
 
The "parts are hot" comments usually come from the really high failure rates on the early gen GTX parts. Nobody would care or even blick at the temps if the card never died. A lot of card were running hot and failing. Its not always people's rigs that are at fault.

I'm not saying that there aren't issues. I bought four GTX 280s and I did have to RMA one of them. Not sure if it was a heat issue or not, but there was some extreme jerkiness in some SLI games. I narrowed it down to the card I thought was causing the problem, RMA'ed it, and I've not looked back.

So my GTX 280 experience has been extremely positive.
 
Does the GF100 name refer to a single GPU? So, for instance, GF100 isn't a dual-GPU offering? It's just the name of the high-end GPU, right?
 
I'm repeating myself from last year, but what makes this a paper launch? Sega launched the Saturn like this in the 90s: a presentation followed by the words "it's in stores now," and it really was in the stores that very afternoon. These are just slides and words; I don't see them saying you can get one now.

(Yes, yes, a paper launch killed my grandparents.)


AvsP will be something to keep watch on regarding tessellation; the aliens look mighty movie-like when you crank it up. It might be a good tessellation-in-games benchmark, and it's a February or March title.



Does the GF100 name refer to a single GPU? So, for instance, GF100 isn't a dual-GPU offering? It's just the name of the high-end GPU, right?

Not sure, but GF100 and GF104 numbers have been thrown around for a while, and some of them might be inexpensive/performance parts. I thought they were model numbers, but then we're back to GTXs, Gs, and G92.
 
Your bottleneck is your low resolution. 1680x1050 is good for web browsing and emailing grandma, but any PC gamer holding on to such a low-resolution screen is just cheating themselves out of a much better experience.

I disagree with this statement. Considering that most HDTVs in the 36-inch-and-under range are 720p, with native resolutions ranging from 1366x768 up to 1680x1050, I really don't see why this "low resolution" is so bad. It's still quite a bit higher than an Xbox 360 can decently handle. Not to mention that at resolutions like 1440x900 and 1680x1050 you can usually use the remaining power to crank up the AA. I do admit that I love my 1920x1080 monitor, but that doesn't mean I've forgotten when I played Doom 3 at 640x480.

Just because it's not the maximum resolution on the market doesn't mean he's cheating himself out of a much better experience. Not to mention he'll have the cash left over to upgrade the next time it's necessary for him to.

[On Topic] I really like that GF100 is going the tessellation route. I've been a big fan of it since 2005 or so, and am glad to see them put a real effort behind it. I personally think it'll be as big a switch as normal mapping. The SLI requirement for Surround doesn't bother me either. I already have a 285, so if I want a good multi-monitor experience it won't be too hard for me to do. If someone wants it on a 380-series card, they will buy it there too. I've seen people here with graphics setups of well over $1k, and I see this as being similar.
 
Your bottleneck is your low resolution. 1680x1050 is good for web browsing and emailing grandma, but any PC gamer holding on to such a low-resolution screen is just cheating themselves out of a much better experience.

I disagree with this statement. Considering that most HDTVs in the 36-inch-and-under range are 720p, with native resolutions ranging from 1366x768 up to 1680x1050, I really don't see why this "low resolution" is so bad. It's still quite a bit higher than an Xbox 360 can decently handle. Not to mention that at resolutions like 1440x900 and 1680x1050 you can usually use the remaining power to crank up the AA. I do admit that I love my 1920x1080 monitor, but that doesn't mean I've forgotten when I played Doom 3 at 640x480.

Just because it's not the maximum resolution on the market doesn't mean he's cheating himself out of a much better experience. Not to mention he'll have the cash left over to upgrade the next time it's necessary for him to.
+1. Exactly. Plus, the fact remains that my system is very well balanced: the CPU, GPU, RAM and resolution are all approximately on par with one another. Sure, I COULD game at higher resolutions, but my main concern is, given that my specs (or more precisely the horsepower they represent) are pretty average for a midrange gaming machine, what value do these new cards have? The last generation brought great DX10 capabilities, MUCH better AA/AF capabilities (AF is essentially free these days), and killer backwards horsepower (older-game performance). For the ~$200 video card buyer, I don't see these offering much of an incentive.

I mean, put it this way: is there any reason, in March or later, for me to replace hardware that came out in June 2008? I don't see either company giving me a real one.
Given the nature of technology, that's a little surprising to me.
 
Does the GF100 name refer to a single GPU? So, for instance, GF100 isn't a dual-GPU offering? It's just the name of the high-end GPU, right?

GF100 is one GPU. GF100 is the highest-end GPU based on the Fermi architecture.

No other GPUs were discussed.
 
Not sure, but GF100 and GF104 numbers have been thrown around for a while, and some of them might be inexpensive/performance parts. I thought they were model numbers, but then we're back to GTXs, Gs, and G92.

The only GPU that has been officially discussed is GF100; anything else is FUD.
 
GF100 is one GPU. GF100 is the highest-end GPU based on the Fermi architecture.

No other GPUs were discussed.

So you're sure about that? I've read other reviews where they weren't so certain that this was indeed the highest-end part, though it does seem the most plausible thing.
 
Not to mention that at resolutions like 1440x900 and 1680x1050 you can usually use the remaining power to crank up the AA.
Exactly! Given identical pixel pitch, there's little difference in fidelity between a 1920x1200 image and a 1680x1050 image. The former is more "eye-filling," which is subjectively beneficial to some but not meaningful to others. The days when fidelity increased in step with resolution are basically over, if you ask me.

In fact, I'm of the opinion that as resolution increases (and pixel pitch stays the same), the need for AA becomes more pressing and it becomes harder to play without, given that the number of aliased pixels per rendered frame increases as resolution (and display size) increases. Given this, there are actually more opportunities for distracting aliasing on a larger display than on a smaller, lower-resolution one.

Given the choice between 1680x1050 with 4xAA/16xAF and 1920x1200 with 2xAA/8xAF (as an example), I'll pick 1680x1050. My next monitor is going to be a 1920x1200 or 1920x1080 display, but I find nothing to complain about when it comes to so-called "low-res" gaming.
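For context on the raw pixel load being traded against AA here, a simple comparison of the two resolutions (not a benchmark, just the arithmetic):

```python
# Relative per-frame pixel counts of the two resolutions being compared above.
resolutions = {"1680x1050": 1680 * 1050, "1920x1200": 1920 * 1200}

base = resolutions["1680x1050"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base:.2f}x the 1680x1050 load)")
```

That roughly 30% extra shading work at 1920x1200 is the headroom being spent on 4xAA/16xAF instead.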
 
No one mentioned the bezel correction paragraph on page 29 of the whitepaper?

If this makes it into GF100's shipping drivers, AMD will have to kick it into gear and get that figured out in their drivers.
 
Very informative read to the extent of currently known information.

I would speculate that nVidia themselves haven't been able to 'lock down', per se, the full power requirements and heat dissipation for the GF100. I could be wrong, yet this lack of detail is very disconcerting. Of course, as stated in the article, cautionary measures were *probably* put into place to keep nVidia's stock from getting hammered even more.

What makes me cringe the most is the fully unknown power requirements. Using all available power from the PCIe bus plus additional power from separate power supply feeds is, to me, cautionary. And I take this from the videos and pictures displayed online. If one card takes that much power, then imagine the power supply one would need to run 2 or 3 cards.

Still, this is an interesting approach. I hope nVidia succeeds, yet at the same time, without being able to see the equipment the comparisons were made with, I cannot in good conscience form a positive or negative opinion of the current information presented.

Still have my doubts. Real World performance may prove me wrong in the end.
 
I'll give you a fact: most enthusiasts and gamers have already voted with their wallets and gone for a 5800-series ATi card. nVidia screwed the pooch on this. I wanted two next-gen nVidia cards; oh well...
 
I'm ready to buy as soon as I have the money. I'm most likely going to get the ATI 5850. Nvidia is late to the game. PhysX (or however you spell it) and CUDA are completely worthless to a gamer; proprietary formats are a bad thing. I am proud to have a cell phone that charges through mini-USB. The only reason to wait would be for prices to drop slightly due to competition. The GF100 is rumored to be possibly 10% faster than a 5870. We don't know if this is true or not. We don't have pricing information. I'm going with AMD/ATI. They've got great performance at the right price. :D
 
I can't say I'm too impressed. Looking at the demo and synthetic benches, the card looks wicked fast and powerful. But look at the Unigine benchmark, and it's only 1.6x that of a 5870. Sounds like the 5970 will still be every bit as fast as (maybe even faster than?) the GF100. I know the 5970 isn't quite 2x a 5870, but 60% scaling is usually pretty easy to get with CF/SLI nowadays.
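The arithmetic behind that comparison, for what it's worth: the 1.6x figure is the one quoted above, while the CrossFire scaling range is an assumption, and the 5970's lower clocks are ignored for simplicity:

```python
# Everything normalized to a single HD 5870 = 1.0.
gf100_vs_5870 = 1.6              # the Unigine figure quoted above
assumed_cf_scaling = (0.6, 0.9)  # guessed CrossFire scaling range for a 5970

for scaling in assumed_cf_scaling:
    hd5970 = 1.0 + scaling       # second GPU adds 60-90% of a 5870's performance
    print(f"5970 at {scaling:.0%} scaling: {hd5970:.2f}x a 5870 "
          f"vs. GF100 at {gf100_vs_5870:.2f}x")
```

At 60% scaling the two are a wash; anything better and the dual-GPU card comes out ahead, which is the point being made.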
 
Very informative read to the extent of currently known information.

I would speculate that nVidia themselves haven't been able to 'lock down', per se, the full power requirements and heat dissipation for the GF100. I could be wrong, yet this lack of detail is very disconcerting. Of course, as stated in the article, cautionary measures were *probably* put into place to keep nVidia's stock from getting hammered even more.

What makes me cringe the most is the fully unknown power requirements. Using all available power from the PCIe bus plus additional power from separate power supply feeds is, to me, cautionary. And I take this from the videos and pictures displayed online. If one card takes that much power, then imagine the power supply one would need to run 2 or 3 cards.

I smell 2900 levels of heat and power... The die is going to be HUGE. Given the architecture we have seen, you can light up the whole die at once with work to be done. It is going to be a pig... likely a fast pig, but a pig in terms of heat and power nonetheless.

For the guys that have been poopin' on our 1KW PSU reviews for a while saying they are worthless....well, they will not be worthless anymore. Not if you are running NV Surround on a GF100 system. That all said, I think we have only reviewed possibly one PSU capable of supporting a Quad-SLI GF100 box...
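To put rough numbers on why the 1KW PSU remark isn't a joke: the PCIe spec allows 75 W from the slot, 75 W from a 6-pin connector, and 150 W from an 8-pin, so a card using all three can draw up to 300 W. A budget sketch under assumptions; the per-card GF100 draw and the rest-of-system figure are guesses, since no official TDP has been published:

```python
# Per-card ceiling under the PCIe spec: 75 W from the slot, 75 W from a
# six-pin plug, and 150 W from an eight-pin plug.
max_board_power = 75 + 75 + 150          # 300 W per card
assumed_gf100_w = 280                    # hypothetical draw, just under that ceiling
rest_of_system_w = 250                   # CPU, board, drives, fans; also an assumption

print(f"Spec ceiling per card: {max_board_power} W")
for cards in (1, 2, 4):
    total = cards * assumed_gf100_w + rest_of_system_w
    print(f"{cards} card(s): ~{total} W of DC load (more at the wall after PSU losses)")
```

Under those assumptions a quad-SLI box lands well past what a typical 1 kW unit can deliver, which is the point about having reviewed maybe one PSU that could handle it.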
 
Two lower-spec cards in SLI should be fine for 3D Surround.
I don't think too many people will be getting the top-end card in SLI!
 