Fermi is going to change gaming forever

Wow, the AMD loyalists are really coming out of the woodwork as of late. All of the Fermi news has their panties in a bunch =P.

Funny stuff. AMD loyalists have been here since the 4 series. Were you in a coma? Fermi news hits and guess who's reeeeallly coming out of the woodwork? Hmmmmm. Actually, the green fans all disappeared in late Nov, ya know, when Fermi was supposed to be the second coming of the G80. So here we are, spoon-fed marketing from Nvidia with no solid numbers, no performance figures, one questionable benchmark from an Nvidia-supplied test machine, and no release date.
lol Just doing our jobs as your reality check. You're welcome.
 
DX11 does do tessellation (58XX series / 59XX series) as well, but they are purportedly gonna be doing it faster, AND using hardware made specifically for that job. Big difference there.

Apologies if someone hit on this before me, but ATI has had proprietary "TruForm" hardware tessellation in cards since late 2001. DX11 just creates a standard so that game developers will bother coding for it.
 
Funny stuff. AMD loyalists have been here since the 4 series. Were you in a coma? Fermi news hits and guess who's reeeeallly coming out of the woodwork? Hmmmmm. Actually, the green fans all disappeared in late Nov, ya know, when Fermi was supposed to be the second coming of the G80. So here we are, spoon-fed marketing from Nvidia with no solid numbers, no performance figures, one questionable benchmark from an Nvidia-supplied test machine, and no release date.
lol Just doing our jobs as your reality check. You're welcome.

http://www.youtube.com/watch?v=UHysmKGGLA8 Seriously! :p

I mentioned AMD loyalists since they've been out in full force ever since CES. You have to agree though, it's a much nicer word than the one that starts with fan & ends with boy :D. It's expected, though, as with any product from AMD, Nvidia, Intel, etc. I'm just poking fun at the fact =).
 
http://www.youtube.com/watch?v=UHysmKGGLA8 Seriously! :p

I mentioned AMD loyalists since they've been out in full force ever since CES. You have to agree though, it's a much nicer word than the one that starts with fan & ends with boy :D. It's expected, though, as with any product from AMD, Nvidia, Intel, etc. I'm just poking fun at the fact =).

At least link something funny :) Nice try though. When you can't attack the facts and the message, you gotta attack the poster. Again, we've been here since the 4 series; just passing that on to you for a second time. Have fun.
 
Fermi doesn't even have real independent hardware tessellation. Fermi is most likely a scrapped GPGPU (General-Purpose computation on Graphics Processing Units) project they had going that they took off the shelf to make into GT100 as a last-ditch effort not to fall behind too much. The tessellation it has is slapped onto the shader modules along with a bunch of other stuff, because it wasn't originally a graphics chip but a general-purpose chip for data processing, like Larrabee. Without real, independent tessellation hardware, this stuff might not scale as well when you bring the other parts of the pipeline into the mix.
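
For anyone wondering what "GPGPU" actually means in practice, here's a minimal, hypothetical CUDA sketch (not taken from any Nvidia material, and nothing Fermi-specific; just the general programming model): the same cores that normally push pixels get used for plain number crunching, in this case y = a*x + y over a big array.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// One GPU thread handles one array element: y[i] = a*x[i] + y[i].
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    // Copy the data to the GPU, run the kernel, copy the result back.
    float *dx, *dy;
    cudaMalloc((void **)&dx, n * sizeof(float));
    cudaMalloc((void **)&dy, n * sizeof(float));
    cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);   // 4096 blocks of 256 threads
    cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);   // expect 4.000000
    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```

If the speculation above is right, that style of workload (lots of identical, independent arithmetic) is what the chip was architected for first, with graphics features like tessellation mapped onto the same units rather than onto dedicated blocks.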

The Fermi is probably comparable to the 5970 in terms of tessellation performance. You have to wonder why, at CES, they compared it to the 5870 so much to show how awesome its tessellation performance was (in a synthetic Unigine benchmark that wouldn't expose weaknesses in other areas of the chip) instead of the 5970, which is its equal in price range... and yet you can't even do triple monitors with a single Fermi when you can with a single 5970 (or any 5700/5800 model!).

That's like bringing out your new 500HP car and going to a car show and claiming that it's so much faster than the cheaper 300HP model from another company instead of a car in the same power and price range as your own.

I'm no AMD fanboy; I'd prefer to stick with Nvidia, but what really changed gaming forever was triple-monitor gaming with Eyefinity (well, first thanks goes to Matrox). Nvidia has the problem where you can't do that unless you buy two cards. Wait for the next generation from Nvidia, when they may have their stuff together. Fermi is overpriced (low yields) and overly power-hungry, and more of a stop-gap, I believe.
 
Fermi doesn't even have real independent hardware tessellation. Fermi is most likely a scrapped GPGPU (General-Purpose computation on Graphics Processing Units) project they had going that they took off the shelf to make into GT100 as a last-ditch effort not to fall behind too much. The tessellation it has is slapped onto the shader modules along with a bunch of other stuff, because it wasn't originally a graphics chip but a general-purpose chip for data processing, like Larrabee. Without real, independent tessellation hardware, this stuff might not scale as well when you bring the other parts of the pipeline into the mix.

The Fermi is probably comparable to the 5970 in terms of tessellation performance. You have to wonder why, at CES, they compared it to the 5870 so much to show how awesome its tessellation performance was (in a synthetic Unigine benchmark that wouldn't expose weaknesses in other areas of the chip) instead of the 5970, which is its equal in price range... and yet you can't even do triple monitors with a single Fermi when you can with a single 5970 (or any 5700/5800 model!).

It may be due to the fact that the GF100 is a GTX 360, which people seem to be completely forgetting about. In a sense, the GTX 360 WILL be competing with the 5870. The GF104, which will be the GTX 380 when it releases, will be the card to compare to the 5970. That is, if AMD has nothing newer out by the time the GTX 380 releases. I don't think Nvidia would be stupid enough to compare what is considered a mid-to-high-range GTX 360 model to AMD's highest-end dual-GPU Goliath. Even if it did perform better "on paper", the margin would be so narrow that the comparison would be moot to most.
 
It may be due to the fact that the GF100 is a GTX 360, which people seem to be completely forgetting about. In a sense, the GTX 360 WILL be competing with the 5870.

That's if the prices line up. Aren't people saying that GT100 is going to be a $600 part? How does that compete with what will be a sub-$400 5870? The 5970's retail price is already down to $600, with the MSRP a little higher, and it will drop as well.
 
Fermi doesn't even have real independent hardware tessellation. Fermi is most likely a scrapped GPGPU (General-Purpose computation on Graphics Processing Units) project they had going that they took off the shelf to make into GT100 as a last-ditch effort not to fall behind too much. The tessellation it has is slapped onto the shader modules along with a bunch of other stuff, because it wasn't originally a graphics chip but a general-purpose chip for data processing, like Larrabee. Without real, independent tessellation hardware, this stuff might not scale as well when you bring the other parts of the pipeline into the mix.

The Fermi is probably comparable to the 5970 in terms of tessellation performance. You have to wonder why, at CES, they compared it to the 5870 so much to show how awesome its tessellation performance was (in a synthetic Unigine benchmark that wouldn't expose weaknesses in other areas of the chip) instead of the 5970, which is its equal in price range... and yet you can't even do triple monitors with a single Fermi when you can with a single 5970 (or any 5700/5800 model!).

That's like bringing out your new 500HP car and going to a car show and claiming that it's so much faster than the cheaper 300HP model from another company instead of a car in the same power and price range as your own. Wait for the next generation from Nvidia, when they may have their stuff together.

I'm no AMD fanboy; I'd prefer to stick with Nvidia, but what really changed gaming forever was triple-monitor gaming with Eyefinity (well, first thanks goes to Matrox). Nvidia has the problem where you can't do that unless you buy two cards.

Ya, that's the latest consensus I'm hearing too. Fermi was originally supposed to be a Tesla/GPGPU project. They were working on something that didn't pan out in design and used it as a fallback.

Also, I can't find anything else on their tessellation beyond what they posted here. The only close-up info they provide is here:
[Image: nvidia_fermi_slide_06.png]


The first link regarding tessellation, if you want to look at it objectively, has no further info than just red and green bars. The rest of the tests nobody else can reproduce, since only Nvidia has them. That alone stinks of BS. Are people this gullible? (Let's make that a rhetorical question.)
 
I have no doubt that Nvidia will shove a high price tag down consumers' throats, as that is the norm for them. They will want to suck up as many first-week buyers as possible before the price starts to fall to something more reasonable. I would honestly be shocked, however, to see a card that reads GTX 360 for $600+. I am suspecting no higher than $549.99, with my actual guess being $499.99. There has to be someone smart enough over at Nvidia to know that launching a mid-to-high-range part with a $600 tag will cause nothing but PR headaches with the hell they'll catch from consumers.
 
http://www.youtube.com/watch?v=UHysmKGGLA8 Seriously! :p

I mentioned AMD loyalists since they've been out in full force ever since CES. You have to agree though, it's a much nicer word than the one that starts with fan & ends with boy :D. It's expected, though, as with any product from AMD, Nvidia, Intel, etc. I'm just poking fun at the fact =).

You mentioned AMD loyalists because you're an nVidia loyalist and you've got nothing else to counter with except to "poke fun". I mean seriously, who's the one being unreasonable? The AMD camp, most of whom fully expect a new and yet-unreleased nVidia card to have a slight performance advantage over competing, existing AMD cards? Or the nVidia camp, who think Fermi is going to change the world of gaming?

What reliable sources do you have that share this sentiment? So far I've counted zero.
 
Bullies! WTF! :eek::cool:;) I know damn well ATI had TruForm with the Radeon 8500 way back in 2001. It was a good concept and a good idea, but marred by being much too early to the table, not enough power, and no standardization. Nvidia seems like it wants to make tessellation and geometry power a standard thing, with the power to back it up. That is good for ALL of us in the LONG run. Run along now, trolls... run. ;)

Evolution, my friends... it's key. The GeForce 2 GTS had working anti-aliasing, but did it perform optimally? Of course not. However, some 2-3 years later, the beast named the Radeon 9700 Pro came along and MADE THAT HAPPEN by giving us playable, smooth anti-aliasing. That is evolution. It's all good for us.
 
You mentioned AMD loyalists because you're an nVidia loyalist and you've got nothing else to counter with except to "poke fun". I mean seriously, who's the one being unreasonable? The AMD camp, most of whom fully expect a new and yet-unreleased nVidia card to have a slight performance advantage over competing, existing AMD cards? Or the nVidia camp, who think Fermi is going to change the world of gaming?

What reliable sources do you have that share this sentiment? So far I've counted zero.

That was the OP's thread topic, but nowhere did I mention that it would change the world of gaming. I shouldn't have any reliable sources because, as you mentioned, IT'S AN UNRELEASED PRODUCT!!! It's all in the wording, my friend, & that's why I use lines such as "if the early benchmarks shown hold up" when referring to the Nvidia slides shown. It doesn't confirm them to be accurate, but it's something to start with. Do not worry though; once official & PC-enthusiast-site reviews are up, I'll have plenty of reliable sources to share with you =).
 
That was the OP's thread topic, but nowhere did I mention that it would change the world of gaming. I shouldn't have any reliable sources because, as you mentioned, IT'S AN UNRELEASED PRODUCT!!! It's all in the wording, my friend, & that's why I use lines such as "if the early benchmarks shown hold up" when referring to the Nvidia slides shown. It doesn't confirm them to be accurate, but it's something to start with. Do not worry though; once official & PC-enthusiast-site reviews are up, I'll have plenty of reliable sources to share with you =).

You chimed into this thread when the "AMD loyalists" voiced their disagreement with the OP. Either you agree with him or you don't. If you agree, then my argument stands. If you don't, then your posts are hypocritical.

You also called people fools for using Charlie as a source while failing to come up with anything yourself, despite the fact that he's been pretty accurate with regard to Fermi. Over-dramatic, yes, but pretty accurate nevertheless. So who's foolish? Those using a source that has been pretty accurate to make an argument, or those who are making arguments armed with nothing more than their own personal opinions to back them?
 
Wow, the AMD loyalists are really coming out of the woodwork as of late.

I haven't seen any of that, actually. Would you please link me to some examples of AMD "loyalists" coming out of the "woodwork" lately? Thanks in advance.

I will say though that if the early benchmarks shown hold up when the GTX 360 is released, then AMD better have a response, and quick.

Reread that sentence and then go watch http://www.youtube.com/watch?v=UHysmKGGLA8 as you so kindly linked to kadozer. Also, you must admit it's a bit amusing that you say AMD better have a quick response to Fermi when Nvidia has completely failed to respond quickly to AMD for some time now, which was a disappointment to me, as I was really looking forward to that release. Hopefully Nvidia can turn things around soon and get back in the game.

If a mid-to-high-range GTX 360 can push out 20+ FPS in Far Cry 2 & Dark Void over AMD's top single-GPU 5870, then imagine what a 512SP GTX 380 will do to it =P.

I really wish you had taken the time to read the article I linked to, because it would have balanced out the Nvidia marketing you went through.

It's still too early to confirm those benchmarks, but they seem to be legit so far.

Why do they seem legit? What are you basing this on?

For the record, whoever believes in anything that Charlie says is just as much of a fool as he is.

I realize you've been on the forums for 5.1 years now, but I'd like to know if you've taken the time to read The [H]ard Rules?

"(1) Absolutely NO FLAMING, NAME CALLING OR PERSONAL ATTACKS. Mutual respect and civilized conversation is the required norm." Please keep this rule in mind during your future posts.

You have to agree though, it's a much nicer word than the one that starts with fan & ends with boy :D.

You couldn't come up with a nicer word for "fool" earlier? Why start being nice now?

It may be due to the fact that the GF100 is a GTX 360, which people seem to be completely forgetting about. In a sense, the GTX 360 WILL be competing with the 5870. The GF104, which will be the GTX 380 when it releases, will be the card to compare to the 5970.

Where did you get this information from? Would you please link me to the source? Thanks in advance.

I have no doubt that Nvidia will shove a high price tag down consumers' throats, as that is the norm for them. They will want to suck up as many first-week buyers as possible before the price starts to fall to something more reasonable. I would honestly be shocked, however, to see a card that reads GTX 360 for $600+. I am suspecting no higher than $549.99, with my actual guess being $499.99. There has to be someone smart enough over at Nvidia to know that launching a mid-to-high-range part with a $600 tag will cause nothing but PR headaches with the hell they'll catch from consumers.

Yet again I wish you had read that article for another perspective.

That was the OP's thread topic, but nowhere did I mention that it would change the world of gaming. I shouldn't have any reliable sources because, as you mentioned, IT'S AN UNRELEASED PRODUCT!!! It's all in the wording, my friend, & that's why I use lines such as "if the early benchmarks shown hold up" when referring to the Nvidia slides shown. It doesn't confirm them to be accurate, but it's something to start with. Do not worry though; once official & PC-enthusiast-site reviews are up, I'll have plenty of reliable sources to share with you =).

If you'd like to troll people for fun please find another forum. Your posts are more of a disruption than a contribution.
 
I think it's funny how Charlie's article that some are referring to (the one about Fermi being unmanufacturable) came out right after nVidia gave up some info about their new card, lol. Coincidence? I think not. Maybe he did it to slow down some of the Fermi hype? I think so :).
'Cause we all know where Charlie stands with his views.

But thinking that Fermi will change gaming forever, lol. The only things that will change gaming are developers actually making games for the PC, and DX11.
 
Bullies! WTF! :eek::cool:;) I know damn well ATI had TruForm with the Radeon 8500 way back in 2001. It was a good concept and a good idea, but marred by being much too early to the table, not enough power, and no standardization. Nvidia seems like it wants to make tessellation and geometry power a standard thing, with the power to back it up. That is good for ALL of us in the LONG run. Run along now, trolls... run. ;)

Evolution, my friends... it's key. The GeForce 2 GTS had working anti-aliasing, but did it perform optimally? Of course not. However, some 2-3 years later, the beast named the Radeon 9700 Pro came along and MADE THAT HAPPEN by giving us playable, smooth anti-aliasing. That is evolution. It's all good for us.

Umm, funny that the card you cite as giving us playable whatnot is an ATI card... It's pretty obvious that ATI and Nvidia have been playing the leapfrog game for years. Expect incremental power gains; that's the norm. I haven't heard of any feature that Fermi is supposed to have that is going to 'revolutionize' the gaming industry, except being a little bit faster because it came out later. Will Fermi even do triple monitors, or is Nvidia still 'locking' that feature to their much more expensive and profitable Quadro card line?
 
Fermi is the name of the architecture; not that anyone else has terribly creative names for these things anyway.

Yeah, I was just horsing around, giving the original premise of this thread the "serious" attention it deserves.

But having "Bolo" in your computer does sound mighty d_mn cool.
 
Crysis was "a" defining point in game development and in what the future of game development would look like. Sadly, Crysis was highly pirated, causing developers to abandon making PC-only titles. There are still some games made PC-only, but will end users see games made the way Crysis was? NO! They only have themselves to blame, too. Games should be made like Crysis, where the game outpaces the hardware technology and cripples the system, because as newer technology is developed, it makes the community as a whole want to go back, optimize, and replay that game.

That is how I look at this whole Nvidia and ATI card situation. Most games are console-focused nowadays from a profit standpoint. When we all wonder why we are behind the times in PC game development, look to the past if you can stand it. The hardware may be great, but like Brent says, if they don't develop the games with the technology, then we're all out of luck.
 
I think it's funny how Charlie's article that some are referring to (the one about Fermi being unmanufacturable) came out right after nVidia gave up some info about their new card, lol. Coincidence? I think not. Maybe he did it to slow down some of the Fermi hype? I think so :).
'Cause we all know where Charlie stands with his views.

But thinking that Fermi will change gaming forever, lol. The only things that will change gaming are developers actually making games for the PC, and DX11.

Did you really think Charlie wouldn't respond at all? :eek: :p I just like how it provides balance from one extreme to another. :cool:
 
[H]ydra;1035215218 said:
Did you really think Charlie wouldn't respond at all? :eek: :p I just like how it provides balance from one extreme to another. :cool:
Oh no, I knew he would; somebody has to try to do damage control for ATI, lol.
 
[H]ydra;1035215218 said:
Did you really think Charlie wouldn't respond at all? :eek: :p I just like how it provides balance from one extreme to another. :cool:

Oh no, I knew he would; somebody has to try to do damage control for ATI, lol.

Here's where Charlie talks about how this article came about. It's actually a pretty funny read, as some members egg him on about his thoughts on the recent Fermi tidbits. His response was the latest article.

http://forum.beyond3d.com/showthread.php?t=56155&page=19


BTW, to answer the OP's question, I would say Fermi ups the ante on tessellation but nothing else. Tessellation has been around for a long time. Hopefully we get some, if not a lot more, games now that we have this checkbox feature being pushed from both sides. I'm all for it, because I'm tired of the stuff they are using now.
 
Microsoft implemented hardware tessellation in DX11, the HD 5000 series is the first hardware that supports DX11, and yet Fermi is the one that's going to change gaming forever? What great logic to follow there :eek:
 
GF100/Fermi isn't going to change gaming.

Let's look back a little and see what (at least since I've been playing games) has actually changed gaming, or more specifically, what has become a "standard" in games (what has been adopted?):


1. 3dfx's Voodoo1 - pretty much started the PC gaming revolution
2. 3dfx's T-Buffer (V5) - introduced FSAA, pretty much a staple of today's hardware
3. Nvidia's TNT - 32-bit color; it sucked at it, but it was there
4. Nvidia's TnL (GeForce 256) - although the 256 was a horrible implementation of hardware TnL (my AMD Athlon at the time could process the TnL calculations faster than the 256), it has now become a staple of gaming
5. Microsoft's DX9 - self-explanatory :)


I'm sure the list is longer than this, but you get an idea of what it takes to actually change gaming or become a gaming standard.

Will tessellation become a standard? Who knows.
 
Hmmmm, well, by the looks of it, he may actually be right in saying that it is going to change gaming forever. Time will ultimately tell, though.

http://www.techreport.com/articles.x/18332

Did you read that article you just posted? Techreport isn't too thrilled with Fermi.
Obviously, the GF100 is a major architectural transition for Nvidia, which helps explain its rather difficult birth. The advances it promises in both GPU computing and geometry processing capabilities are pretty radical and could be well worth the pain Nvidia is now enduring, when all is said and done. The company has tackled problems in this generation of technology that its competition will have to address eventually.

In attempting to handicap the GF100's prospects, though, I'm struggling to find a successful analog to such a late and relatively large chip. GPUs like the NV30 and R600 come to mind, along with CPUs like Prescott and Barcelona.
 
This sounds like a troll post. The amount of hype Nvidia has put out about this, only to not show any benchmarks, makes me think otherwise.
 
Did you read that article you just posted? Techreport isn't too thrilled with Fermi.

Yea, I actually did. It may just be the sleep deprivation, as I haven't slept since Tuesday night, but the majority of the article seems to be focused on how Nvidia's ability to create a new architecture that allows more complexity in gaming environments is a good thing. Its ability to take advantage of DX11 entirely, as opposed to ATI's 5xxx's inability to completely utilize it, is going to be better for Fermi down the long road, not the short track. Did I read that wrong? How is this not a good thing for Nvidia? They've created something that enables programmers to fully utilize DX11's capabilities. Idk, I need sleep; I'll look at it again after I take a nap.
 
Yea, I actually did. It may just be the sleep deprivation, as I haven't slept since Tuesday night, but the majority of the article seems to be focused on how Nvidia's ability to create a new architecture that allows more complexity in gaming environments is a good thing. Its ability to take advantage of DX11 entirely, as opposed to ATI's 5xxx's inability to completely utilize it, is going to be better for Fermi down the long road, not the short track. Did I read that wrong? How is this not a good thing for Nvidia? They've created something that enables programmers to fully utilize DX11's capabilities. Idk, I need sleep; I'll look at it again after I take a nap.

It says it right there in what I quoted from the article. His analogy to NV30/R600 should say it all.
Cliff notes after the three pages of architecture discussion:
- There isn't enough solid info from Nvidia.
- Exercise caution and skepticism with what they've given us thus far.
- It's late, late, late (I counted 3x). lol
 
Yea, I actually did. It may just be the sleep deprivation, as I haven't slept since Tuesday night, but the majority of the article seems to be focused on how Nvidia's ability to create a new architecture that allows more complexity in gaming environments is a good thing. Its ability to take advantage of DX11 entirely, as opposed to ATI's 5xxx's inability to completely utilize it, is going to be better for Fermi down the long road, not the short track. Did I read that wrong? How is this not a good thing for Nvidia? They've created something that enables programmers to fully utilize DX11's capabilities. Idk, I need sleep; I'll look at it again after I take a nap.

It might not be a good thing if the card doesn't perform. It's definitely a good thing for gaming's future that they are pushing big upgrades in geometry performance, but that won't be good for GF100 unless it has the performance in games that exist now or are likely to exist in the near term.

Down the long road, Fermi will be superseded by a new gen.
 
Its ability to take advantage of DX11 entirely, as opposed to ATI's 5xxx's inability to completely utilize it, is going to be better for Fermi down the long road

The 5xxx supports every function of DX11. Yes, Fermi may do it faster, but that doesn't mean the 5xxx is unable to utilize those functions.

I believe Fermi will be a faster card than the 58XX series, but I already have a 5850, so I will wait for the next cycle for an upgrade.
 
I hope you're right, but my personal opinion is that the majority of games will continue to be designed for console hardware, with very few PC-centric titles actually taking advantage of the hardware.

My opinion is pretty much this ^^^^ too.

Games aren't going to look any different than they have for the last 5 years, until the next generation of consoles appears and developers have had a chance to use them for a few years.

This is why 3D and multimonitor are more important right now. We need technologies that can take console ports and enhance them automagically.
 
Crysis was "a" defining point in game development and in what the future of game development would look like. Sadly, Crysis was highly pirated, causing developers to abandon making PC-only titles. There are still some games made PC-only, but will end users see games made the way Crysis was? NO! They only have themselves to blame, too. Games should be made like Crysis, where the game outpaces the hardware technology and cripples the system, because as newer technology is developed, it makes the community as a whole want to go back, optimize, and replay that game.

That is how I look at this whole Nvidia and ATI card situation. Most games are console-focused nowadays from a profit standpoint. When we all wonder why we are behind the times in PC game development, look to the past if you can stand it. The hardware may be great, but like Brent says, if they don't develop the games with the technology, then we're all out of luck.

No, it's more like Crysis didn't sell Halo-like numbers like Crytek wanted, and then they decided to be whiny little bitches.
 
You can't blame them... the console market is simply very lucrative right now. Companies will always go where the money is.
 
No, it's more like Crysis didn't sell Halo-like numbers like Crytek wanted, and then they decided to be whiny little bitches.

Haha, it's true!

Here's the thing: Crysis was so demanding on people's machines that, instead of wasting 50 dollars, people downloaded it to see if it would run on their PCs. Once they had the game, there was no reason to run out and buy it, regardless of how it ran.

Crytek shot themselves in the foot with Crysis. They should have invested more time in marketing and story development.
 