Fermi is going to change gaming forever

"Yo, ATI, I'm really happy for you. I'm letchu finish. But Nvidia had one of the best GPUs of ALL TIME."

Nope. I really had nothing more to contribute to the discussion. Just like all of these threads about Fermi.
 
"Yo, ATI, I'm really happy for you. I'm letchu finish. But Nvidia had one of the best GPUs of ALL TIME."

Nope. I really had nothing more to contribute to the discussion. Just like all of these threads about Fermi.


True, but can you name anything Nvidia has ever produced that has the track record and longevity of this card?
radeon9800.jpg


The only thing I can even think of that comes close is the Ti4000 series. We can only hope that the new "Fermi" is even a fraction of what those two cards were in terms of value and long life.


...which raises the question: do hardware developers even want you to get a long life out of your products? Because, after all, the longer you have it, the less money you're giving them.
 
True, but can you name anything Nvidia has ever produced that has the track record and longevity of this card?
radeon9800.jpg


The only thing I can even think of that comes close is the Ti4000 series.

The 9700 Pro lasted even longer. It was the card I used the longest.
 
I don't think ATI even designed the 9700. Didn't they buy the company that did?
 
True, but can you name anything Nvidia has ever produced that has the track record and longevity of this card?

The only thing I can even think of that comes close is the Ti4000 series. We can only hope that the new "Fermi" is even a fraction of what those two cards were in terms of value and long life.

The 8800 GTX lasted longer than the 9800 Pro.
 
And ATI somehow doesn't want to make tessellation and geometry a standard thing, with the power to back it up? You're talking about ATI here, the company that has offered a tessellation unit in every card they've produced since the HD 2900 XT. Who do you think got tessellation into the DX11 spec in the first place?



Of course it is. You've evolved to make paragraphs, so now I don't have to reformat your post before I can read it. That's a PARADIGM SHIFT in my forum reading experience! It MUST have been due to the tessellation powerhouse that is Fermi.



However, a year later, Nvidia released their GeForce 3 with multi-sample AA, and it performed impressively. Meanwhile, the Radeon 8500 (released after the GF3) had "optimized" super-sample AA. I should know; I bought the 8500, and the AA performance was terrible.

See here: http://techreport.com/articles.x/2515/22

NOTE: I'm not talking about Quincunx. I mean straight-up MSAA, which made the GF3 50% faster than anything else on the market at the time.



The Radeon 9700 is exceptional for two reasons only:

1. It ushered in DX9.
2. It was the first time ATI released a more powerful gaming card than Nvidia.

The MSAA 6x mode was interesting, and the 4x mode was rotated-grid, but those were the only differences. In terms of AA features, the 9700 Pro simply caught up to what the GeForce 3 could do. It was only the massive performance of the 9700 Pro that made it a standout.


"Once we reach 1024x768, though, Quincunx is the only playable AA mode on any card."

Taken from your article. LOL. The 9700 Pro made playable MSAA at high resolutions (1600x1200) a common thing in all new games. The GeForce 3 was not even CLOSE to being an AA powerhouse like the 9700 Pro. Nice try!
 
The 8800 GTX lasted longer than the 9800 Pro.

QFT.

A lot of people did not get rid of their 8800 GTXs/GTSs until Nvidia released the G200 series, and others waited until the Radeon R800 series came. The 8800s started to phase out in Q3 2009 due to the R800s, but quite a few people still own and use one. The R300s started to phase out at the start of the G70 series. The push to PCI Express also helped a huge amount with the phasing out of the R300s.

R300 - Q3 2002 - Q4 2005 (estimate)

G80 - Q4 2006 - Present
 
8800 Ultra - 2007

GTX 260 / 280 - 2008

The original 8800 came out in '06, yes, but it had started getting superseded by 2007/2008. I mean, who is seriously trying to push the latest games on 19x12 or 25x monitors today besides the casual gamer?
 
Not only will it cost 50% more, but it will also put out 50% more heat... can anyone say 750-800 watt PSU minimum?

I thought CPU and GPU manufacturers were working on reducing heat/power requirements on all future products... Fermi is pushing the industry back 10 years.

10 years ago the GeForce 256 was out, and no way did that take a 750 watt PSU. :rolleyes: I know because I had one with a craptacular 300 watt power supply.

It's not power requirements that go down; it's performance per watt that is supposed to go up. In that regard Fermi will probably be worse than the 5xxx series, but still better than the GTX 2xx series.
 
Exactly. ^^^^ lol. A 750 watt PSU with a G256! lol!!!! I think I ran one fine on a 235 watt.
 
Yeah, played a fair bit of Crysis, and the trees are a lot better, but still need a lot of work.

I used a Ti4200 for maybe two years, then a 6600GT for a year. The 8800GTX lasted me the longest of any video card. Coming in second for me might actually be the GTX 260; I don't see myself upgrading until well after the Fermi release.
 
That's a good-looking tree? Look at the angular nature of it, the square look.

You realize Crysis is still going to make the GF100 sweat, right? You think it's going to revolutionize trees with a fucking tree benchmark or something? It's the job of the game developer to make the nice-looking trees, not Nvidia or ATI; it's just their job to make the ultimate tree-rendering card with super awesome TWIMTBP tree optimizations.
 
No, I know full well it's up to the devs to do better trees, lol. But the fact is that a lot of "things" like trees, cars, and characters are too angular in nature. They need refining, be it Fermi or an ATI 6800. The Fermi won't make a shitty tree look better... no, but it may give more power to take care of that so it doesn't happen again. ;)
 
crysis_demo_2.jpg


That's a good-looking tree? Look at the angular nature of it, the square look.

Look at trees from half a decade ago. If you can't see the massive leap, then I don't know what's wrong with you.


Also, Fermi isn't going to fix this. It may push more triangles than the new Radeons, but it won't be as massive a leap as you expect, and most of the triangles are going to go towards the ground and other things. Trees will still be modeled by hand, and, not for nothing, but if you want to create leaves out of polygons you will kill the Fermi. There are tens of thousands of leaves on those trees in Crysis; that, along with everything else getting tessellation, would kill the Fermi. It would kill 8 Fermis working together.
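Rough back-of-the-envelope numbers (mine, purely illustrative, not from any benchmark):

50,000 leaves per tree x 20 triangles per leaf x 100 trees ~ 100,000,000 triangles per frame

At 30 fps that's about 3 billion triangles a second, which is roughly the theoretical peak triangle rate of the fastest 2010-era GPU, before anything else in the scene gets drawn.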

The main reason geometry has lagged so far behind everything else is that polygons take a crapload of bandwidth, while textures, along with RAM, have been able to scale up over the last 10 years or so and fake more polygons.

I agree that DX11 is the way forward. Fermi is not the first DX11 card on the market. Devs have been using Cypress chips since last year, while Nvidia is just now getting a handful of Fermis into the hands of developers.

By the time devs really start using tessellation for more than just the ground and a few other choice things to speed up performance, we will all be buying next-gen GPUs, or the gen after that.
 
No, I know full well it's up to the devs to do better trees, lol. But the fact is that a lot of "things" like trees, cars, and characters are too angular in nature. They need refining, be it Fermi or an ATI 6800. The Fermi won't make a shitty tree look better... no, but it may give more power to take care of that so it doesn't happen again. ;)

Why not the Radeon 5x00s? You do know that ATI spent the fixed-function transistors and put the same tessellation unit in all of their chips from the 55x0 to the 58x0s? Performance of the ATI tech is going to scale very well through all the ranges. Nvidia doesn't have the same fixed-function hardware, and as shader power goes down, so will tessellation.
 
No, I know full well it's up to the devs to do better trees, lol. But the fact is that a lot of "things" like trees, cars, and characters are too angular in nature. They need refining, be it Fermi or an ATI 6800. The Fermi won't make a shitty tree look better... no, but it may give more power to take care of that so it doesn't happen again. ;)

Crysis was a game that could run on many different video cards. Despite that, it was still too demanding, and not enough people had the hardware needed to run the game adequately to make it as successful as Crytek would have liked. Given that, you think devs are going to spend years of development time making a game that can only be properly experienced on a VERY specific set of hardware? A set of hardware that the majority will not have? Not gonna happen.

Of course graphics will continue to get better: more polygons, better geometry, better physics. But it will have nothing to do with Fermi; it has to do with simple evolution. Hardware gets more powerful with time, and in turn, software is written to take advantage of it.
 
How is Fermi going to achieve this bold claim?

And why will Nvidia achieve it, and not ATI?

For any major changes to happen, the game developers have to be on board, and for that we have to have widespread adoption of hardware. The simple fact is that Fermi doesn't really bring anything new to the table; it's just more power at the end of the day.

The only game that stresses my 5970 is Crysis, and honestly that's just one game, and it's a couple of years old now. We've not really seen anything as impressive or demanding since then, despite constantly having more power available to us. So what do we even need all this extra power for?

It will be years before we see any significant graphical advancement in our games; we're going to have to wait until the consoles go through a hardware refresh and release their new generations before we see any significant movement. Sure, we might see the odd game use some DX11 effects, and maybe another game like Crysis which pushes the boundaries, but the great majority of games are going to be a blurry mess of console abortion.
 
How is Fermi going to achieve this bold claim?

And why will Nvidia achieve it, and not ATI?

For any major changes to happen, the game developers have to be on board, and for that we have to have widespread adoption of hardware. The simple fact is that Fermi doesn't really bring anything new to the table; it's just more power at the end of the day.

The only game that stresses my 5970 is Crysis, and honestly that's just one game, and it's a couple of years old now. We've not really seen anything as impressive or demanding since then, despite constantly having more power available to us. So what do we even need all this extra power for?

It will be years before we see any significant graphical advancement in our games; we're going to have to wait until the consoles go through a hardware refresh and release their new generations before we see any significant movement. Sure, we might see the odd game use some DX11 effects, and maybe another game like Crysis which pushes the boundaries, but the great majority of games are going to be a blurry mess of console abortion.

I disagree.

DX11 makes it very easy to downgrade to DX10; it takes days, from what I've been told. I think you're going to see a lot of games this year start to use DX11 for speed increases over the consoles' DX9 ports, and then you will see many games simply have DX11 paths because it will be much faster and bring a lot to the table, and you will see PC gaming pick up steam again.

Especially if rumors of the next-gen consoles getting delayed another couple of years are true. A lot of PC gamers who jumped to the consoles will jump back as they tire of the visuals there.
 
I disagree.

DX11 makes it very easy to downgrade to DX10; it takes days, from what I've been told.

Consoles can't run DX10 paths, so not sure why this is relevant.

I think you're going to see a lot of games this year start to use DX11 for speed increases over the consoles' DX9 ports, and then you will see many games simply have DX11 paths because it will be much faster and bring a lot to the table, and you will see PC gaming pick up steam again.

I'd like to think so; I'd really like PC gaming to accelerate and pull ahead of everything else, where it should naturally be, but everything I see in the industry now tells me that the PC market is just too small and insignificant in comparison to the console market to bother making games dedicated to the PC. Most of our big AAA games are at worst a console port, at best a multiplatform game with all the constraints of a console.

Especially if rumors of the next-gen consoles getting delayed another couple of years are true. A lot of PC gamers who jumped to the consoles will jump back as they tire of the visuals there.

The visuals of the current-gen consoles have been tired for a long time already, and that's changed nothing. It doesn't make me hopeful.
 
As much as I'm a hardcore gamer, PC gaming is a niche. It just is. Advances in GFX brought on by competition between ATI and NV can only serve to provide a more compelling argument for playing games on PCs.

I'm all for the competition, and I can't wait to see what NV will bring to the table. Fermi might not change gaming forever, but what it will do is provide something of a rebirth for PC gaming. Will it be the be-all and end-all? Probably not, but we can use all the advantages we can get.
 
Remember that the current Fermi vs. 58x0 battle is probably going to decide which company's DX11 GPUs will be used in the next-gen consoles once they come out somewhere in 2013-2014.
 
No, I know full well it's up to the devs to do better trees, lol. But the fact is that a lot of "things" like trees, cars, and characters are too angular in nature. They need refining, be it Fermi or an ATI 6800. The Fermi won't make a shitty tree look better... no, but it may give more power to take care of that so it doesn't happen again. ;)

So what does Fermi bring to the table that's new, again? How will it change gaming forever? Nobody's answered yet. And are we really being gouged by AMD right now? Inquiring minds want to know.
 
Crysis, anyone? Still successful, and very much THE game all nerds push to show off technology. Nerds are STILL trying to max this game, buying multiple $500 video cards to do so. You think if some game comes along that pushes tessellation and geometry, nerds won't eat it up?

You calling me a nerd? :p
 
Battle? Doesn't Nvidia need a card to show up on the field before they can fight? Oh wait, we have Nvidia-supplied benches...

just saying
 
I have to say, I think there were a few key moments that should have ended this thread within the first 3 pages as a ridiculous thread filled with avid supporters from both sides and few facts to argue with.

That being said, with the late release, limited information and constant jerking of our chains, does anyone else think this card was named after the wrong person? I mean, Fermi was a great physicist, but why didn't Nvidia name it after Kirchhoff? :D You know, the circuit-current laws that sound just like jerk-off? It seems oddly more appropriate.
 
Remember that the current Fermi vs. 58x0 battle is probably going to decide which company's DX11 GPUs will be used in the next-gen consoles once they come out somewhere in 2013-2014.

I read a few weeks ago that Microsoft has already chosen which of the two will design their next-generation graphics subsystem, and it will be ATI, again.
 
In my opinion Fermi will be a huge deal in the long run, because Nvidia was able to build it. As time goes on they will be able to make it smaller and better, much like what happened with the 8800GTX.

I personally hate AMD's direction with their GPUs; it reminds me of what AMD used to be like before the Athlon: low-cost parts that were sort of good bang for the buck but didn't push tech anywhere.
 
In my opinion Fermi will be a huge deal in the long run, because Nvidia was able to build it. As time goes on they will be able to make it smaller and better, much like what happened with the 8800GTX.

I personally hate AMD's direction with their GPUs; it reminds me of what AMD used to be like before the Athlon: low-cost parts that were sort of good bang for the buck but didn't push tech anywhere.

Another insightful post. :p
 
I personally hate AMD's direction with their GPUs; it reminds me of what AMD used to be like before the Athlon: low-cost parts that were sort of good bang for the buck but didn't push tech anywhere.

Eyefinity and DX11 over half a year before NV? ;)
 
In my opinion Fermi will be a huge deal in the long run, because Nvidia was able to build it. As time goes on they will be able to make it smaller and better, much like what happened with the 8800GTX.

I personally hate AMD's direction with their GPUs; it reminds me of what AMD used to be like before the Athlon: low-cost parts that were sort of good bang for the buck but didn't push tech anywhere.

You hate AMD because you're a fanboy, nothing more, nothing less. They've come out with powerful cards with new technology that use next to no power in 2D mode... and you're complaining?
 
Consoles can't run DX10 paths, so not sure why this is relevant.

No, consoles can't run DX10 paths. However, DX10 cards have been out now for years (what, 4 years now?). There is a huge install base of them that has largely taken over from the install base of DX9 cards, even among those who just want to play Peggle or WoW. DX11 cards are just getting kick-started; there are over 2 million DX11 parts sold or in the market thanks to AMD, less than half a year after DX11 launched.

So now what happens is a dev programs their DX9 path for consoles and a DX11 path for PCs. The DX11 path can dynamically shut off features so it runs properly on DX10/10.1 cards. So devs have very little extra coding to do to support all three major generations of graphics hardware across two big markets.
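(For what it's worth, that "dynamically shut off features" bit maps to what DirectX 11 calls feature levels. Here's a minimal sketch of the idea; the function name and structure are mine, not from any particular engine. The app asks the D3D11 runtime for the best feature level the installed card supports and branches on the result.)

#include <windows.h>
#include <d3d11.h>

// Ask the D3D11 runtime for the highest feature level the card supports.
// On a DX10 or DX10.1 class card we still get a working D3D11 device,
// just with the 11.0-only features (tessellation, compute) unavailable.
bool CreateDeviceWithFallback(ID3D11Device** device,
                              ID3D11DeviceContext** context,
                              D3D_FEATURE_LEVEL* got)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11 path
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,   // plain DX10 cards
    };

    HRESULT hr = D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]),
        D3D11_SDK_VERSION,
        device, got, context);

    return SUCCEEDED(hr);
}

// The engine then branches once, e.g. only enable the hull/domain shader
// (tessellation) path when *got == D3D_FEATURE_LEVEL_11_0.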



I'd like to think so; I'd really like PC gaming to accelerate and pull ahead of everything else, where it should naturally be, but everything I see in the industry now tells me that the PC market is just too small and insignificant in comparison to the console market to bother making games dedicated to the PC. Most of our big AAA games are at worst a console port, at best a multiplatform game with all the constraints of a console.

Don't forget: 2011 consoles mean some form of DX11 in those consoles (even Sony's, because they will likely go with AMD or Nvidia, or perhaps PowerVR), but all of those companies develop for PCs and have DX11-capable hardware, or will shortly. So devs will start experimenting with DX11 to get a jump on next-gen development. You see that now with AVP, and other games will follow. At the start of the year it will be the exception that targets DX11; at the end of the year it should be a half-and-half mix, and next year it will all be DX11 except for the oddball game that doesn't.

The visuals of the current-gen consoles have been tired for a long time already, and that's changed nothing. It doesn't make me hopeful.

What's changed is the footprint of DX10 and how cheap fast cards are. The 5670 is not a great card, but it's faster than the 3870, which came out in late 2007. That's a little over two years, and not only is raw performance greatly up (sometimes twice as fast), but there is DX11 support on top of it all, while using less power. The 55x0 parts should post similar performance to the 3870, cost even less, and use less power.

These are the parts that sell to the majority of PC owners who buy add-in cards, either through Dell/HP or whatever, or at Best Buy. Even IGPs are getting faster. The Radeon 4200 is faster than the 3200, DX11 IGPs are coming, and they should be faster than both of those. Laptop parts are getting faster and using less energy.

All these parts will see performance gains going with DX11/10 paths over DX9, and it's going to get to a point where DX9 in PCs is in the minority. You will see a tidal wave hit that moves game programming along.
 
You know what people miss is that Fermi WILL bring something to the table.

Just like all other GPUs, it will be a stepping stone to whatever lies in the future. Maybe it fails, maybe not. We don't know that right now, but do people really think the 5870 is so revolutionary? The question I have to ask is how many people have the kind of money to lay down on the 5870 and its "revolutionary" co-products, aka three screens. Let's say I go buy three decent 24" monitors... that's what, $450ish x 3, or $1,350.

Well, I can go to Dell and buy two computers for that cost, and not really crappy ones at that. So maybe 0.1% of users are going to run this technology. So let's presume that 2 million GPUs have Eyefinity; that means roughly 2,000 people are going to use multi-monitor. The only thing that is going to change in the near future is that companies are going to push cheaper and crappier monitors. TN panels are bad enough as it is, and I'd stick with 22-24" IPS screens. So what does the 5870 really bring to the table... 1600 shaders? The only thing I see AMD revolutionizing is ultra killer super duper parallel processing, which is precisely what Fermi is designed for, so this makes me question something: why does someone need that many shaders? You know, I would buy a 5870... but it's totally not worth it; the benefits I get are... none, really. I'll wait and see what happens, but I might buy another Nvidia card, or if the prices become low enough I might buy an ATI card (I doubt it, their drivers piss me off **hint hint, Linux power user). Anyway, whatever happens is whatever happens, and I doubt we can really change it by discussing it.
 
You know what people miss is that Fermi WILL bring something to the table.

So it won't change gaming forever? And it certainly doesn't bring anything new.

Just like all other GPUs, it will be a stepping stone to whatever lies in the future. Maybe it fails, maybe not. We don't know that right now, but do people really think the 5870 is so revolutionary? The question I have to ask is how many people have the kind of money to lay down on the 5870 and its "revolutionary" co-products, aka three screens. Let's say I go buy three decent 24" monitors... that's what, $450ish x 3, or $1,350.
Add in another $450-600 for another Nvidia card in SLI, buddy, because that's what it's going to take if you go the Nvidia route for "nfinity".
But at least give credit here where it's due: AMD has pushed gaming more with Eyefinity than Fermi has thus far. Just a checkbox feature, not required but nice to have. And you can easily buy a 24" TN panel, arguably decent, at $200.
But that's a bit off topic; I just wanted to point out your hyperbole.

Well, I can go to Dell and buy two computers for that cost, and not really crappy ones at that. So maybe 0.1% of users are going to run this technology. So let's presume that 2 million GPUs have Eyefinity; that means roughly 2,000 people are going to use multi-monitor. The only thing that is going to change in the near future is that companies are going to push cheaper and crappier monitors. TN panels are bad enough as it is, and I'd stick with 22-24" IPS screens. So what does the 5870 really bring to the table... 1600 shaders? The only thing I see AMD revolutionizing is ultra killer super duper parallel processing, which is precisely what Fermi is designed for, so this makes me question something: why does someone need that many shaders? You know, I would buy a 5870... but it's totally not worth it; the benefits I get are... none, really. I'll wait and see what happens, but I might buy another Nvidia card, or if the prices become low enough I might buy an ATI card (I doubt it, their drivers piss me off **hint hint, Linux power user). Anyway, whatever happens is whatever happens, and I doubt we can really change it by discussing it.

For whatever you think AMD lacks in what it brings to the table, you can pretty much say the same for Fermi, lol! They came 6 months late, and they came with "nfinity" and Nvidia Surround Vision. Second place doesn't cut it as it pertains to this thread title.
 