Fermi is going to change gaming forever

Bullies! WTF! :eek::cool:;) I know damn well ATI had TruForm with the Radeon 8500 way back in 2001. It was a good concept and a good idea, but marred by being much too early to the table, without enough power and with no standardization. Nvidia seems like it wants to make tessellation and geometry power a standard thing, with the power to back it up. That is good for ALL of us in the LONG run. Run along now trolls....run. ;)

And ATI somehow does not want to make tessellation and geometry power a standard thing with power to back it up? You're talking about ATI here, the company that has offered a tessellation unit in every card they've produced since the HD 2900 XT. Who do you think got tessellation into the DX11 spec in the first place?

Evolution my friends....it's key.

Of course it is. You've evolved to make paragraphs, so now I don't have to reformat your post before I can read it. That's a PARADIGM SHIFT in my forum reading experience! It MUST have been due to the tessellation powerhouse that is Fermi.

The GeForce 2 GTS had working anti-aliasing, but did it perform optimally? Of course not.

However, a year later, Nvidia released their GeForce 3 with multi-sample AA, and it performed impressively. Meanwhile, the Radeon 8500 (released after the GF3) had "optimized" super-sample AA. I should know, I bought the 8500, and the AA performance was terrible.

See here: http://techreport.com/articles.x/2515/22

NOTE: I'm not talking about Quincunx. I mean straight-up MSAA, which made the GF3 50% faster than anything else on the market at the time.

However, some 2-3 years later, the beast named the Radeon 9700 Pro came along and MADE THAT HAPPEN by giving us playable, smooth anti-aliasing. That is evolution. It's all good for us.

The Radeon 9700 is exceptional for two reasons only:

1. It ushered in DX9.
2. It was the first time ATI released a more powerful gaming card than Nvidia.

The MSAA 6x mode was interesting, and the 4x mode was rotated-grid, but those were the only differences. Feature-wise, the 9700 Pro simply caught up to what the GeForce 3 could do. It was only the massive performance of the 9700 Pro that made it a standout.
 
What is Fermi really going to bring to the table that ATI hasn't already? I wanna know, other than faster clock speeds and more power use?
 
haha it's true!

Here's the thing: Crysis was so demanding on people's machines that instead of wasting 50 dollars, people downloaded it to see if it would run on their PCs. Once they had the game, there was no reason to run out and buy it, regardless of how it ran.

Crytek shot themselves in the foot with Crysis. They should have invested more time in marketing and story development. I bet they would have sold more copies of the game had they made a deal with a graphics card or system manufacturer and included the game with a new video card or motherboard.

Marketing was a big problem for them. It wasn't that they didn't market it; it was that every little piece of information they spilled about it said the only way to play the game was on a super high-end system with the best technology of the time, and even then you couldn't max it out because the game was "developed with future technology in mind" or whatever BS they said. Their marketing made people think the game was simply impossible to play on other systems.
 
Not only will it change gaming, Fermi is going to change the World as we know it :D
 
Like DX11.

That's not automatic at all. It requires a developer to start re-writing the game to take advantage of features that the consoles can't manage. Look at Dirt 2. That's supposed to be some flagship DX11 tessellation title that AMD supported to show off the 5XXX series' DX11 features. It was delayed by several months, and you ended up with a feature that you'd be hard-pressed to know is even switched on or not.
 
What is Fermi really going to bring to the table that ATI hasn't already? I wanna know, other than faster clock speeds and more power use?

The nVidia badge. You know how many people will turn a blind eye to things when they see that badge.
 
No, it's more like Crysis didn't sell Halo-like numbers like Crytek wanted, and then they decided to be whiny little bitches.

Yeah, and Crysis actually sold well for a PC title. Well over 1 million in RETAIL sales BY FEB 2008, not including the recent addition of Crysis to the digital download scene or the 2 years since those sales figures were posted.

Anyway, we are going to have to see who does tessellation better when games like AvP and beyond are released.
 
What is Fermi really going to bring to the table that ATI hasn't already? I wanna know, other than faster clock speeds and more power use?

A quick Google will answer this question. Basically, Fermi is going to go about handling graphics in a different way. It is still going to be a DX11 part, but some of the tech in the chip will benefit Nvidia's proprietary APIs, etc. The chip design is made to be a jack-of-all-trades GPGPU, vs. ATI's more elegant solution, which is designed to efficiently handle industry-standard APIs such as DX11, OpenGL, OpenCL, etc.
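
For anyone curious what that "proprietary API" side actually looks like from the programming end, here's a minimal CUDA sketch of the kind of general-purpose (non-graphics) work the chip is being pitched at. Purely illustrative; the vecAdd kernel is made up for this post, nothing here is Fermi-specific, and it runs on any CUDA-capable card:

// Minimal CUDA sketch: general-purpose (non-graphics) work on the GPU.
// Illustrative only; nothing here is Fermi-specific.
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// One GPU thread per array element.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;              // one million floats
    size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expect 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}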

/facepalm

This whole thread is kinda laughable.:rolleyes:

Right? Another locked graphics rant thread coming soon. Grand Master Sonda ain't doin' threads like this anymore, so someone had to step up to the plate. hehehe... :D
 
 
A quick Google will answer this question. Basically, Fermi is going to go about handling graphics in a different way. It is still going to be a DX11 part, but some of the tech in the chip will benefit Nvidia's proprietary APIs, etc. The chip design is made to be a jack-of-all-trades GPGPU, vs. ATI's more elegant solution, which is designed to efficiently handle industry-standard APIs such as DX11, OpenGL, OpenCL, etc. :D


How is Fermi going to handle graphics in a different way? Jack of all trades? Master of none.
 
How is Fermi going to handle graphics in a different way? Jack of all trades? Master of none.

The chip is architecturally different; here's some info for reference:

http://techreport.com/articles.x/17670/2

I'm not a fanboy of Nvidia; I went ATI because it was right for this upgrade cycle. I think Fermi is only going to get good after its second iteration, i.e. power/heat down, bugs worked out, etc. Which will probably be 1.5 to 2 years from now.
 
The chip is architecturally different; here's some info for reference:

As far as games are concerned Fermi is still *very* similar to the GTX 2xx architecture. Fermi isn't going to change how games are rendered or anything remotely similar to that. Fermi's architecture changes are far more subtle than, say, the shift to unified shaders (as far as games are concerned). Even as far as GPGPU goes Fermi is still very much an evolutionary architecture update, and not some drastic overhaul.

Also, the architecture is irrelevant to games, as they neither know nor care about the card's architecture. The 5xxx series is a vastly different architecture from both the GTX 2xx and Fermi architectures, for example, but it doesn't matter, because that is why we have DX and OpenGL. And as for that, Nvidia is very much playing catch-up to ATI here. All their talk of super-fast tessellation (which remains to be seen) and such is just to try and mask the fact that they are very late to this generation, and extremely late to the tessellation party. ATI has years of experience in this area whereas Nvidia is starting from scratch.

Personally, I just can't wait for all the people who said Eyefinity was stupid, too expensive, etc... to turn around and praise nFinity - despite it being the same thing as Eyefinity at an even higher cost. :rolleyes:
 
Just sitting on the fence and waiting. Price in the end will determine what I buy, that and decent performance.
 
As far as games are concerned Fermi is still *very* similar to the GTX 2xx architecture. Fermi isn't going to change how games are rendered or anything remotely similar to that. Fermi's architecture changes are far more subtle than, say, the shift to unified shaders (as far as games are concerned). Even as far as GPGPU goes Fermi is still very much an evolutionary architecture update, and not some drastic overhaul.

I wrote "Handled" differently, not that the rendering of graphics will change. The link I pointed to indicates that Fermi has better scheduling of tasks, which is somewhat similar to hyperthreading, which "should" help gaming and improve parallel computing.

Look dude, I'm not looking to get into an argument with you, just putting my 2 cents in. The thread title is a little over the top...
 
Besides the possibility of being faster, how will Fermi change gaming when ATI already have the features you named?

Oh, because Nvidia is coming out with it. I understand now.
Fermi will only push Devs to make more DX11 games, end of story.

While I know you're being sarcastic, it kind of makes sense. How many game devs do you see with ATI logos on their games? It's always that TWIMTBP shit that pops up. :(
 
lol @ GeForce 3 playable AA! Maybe semi-playable, but I think everyone agrees that the 9700 Pro is what got us gamers using playable full-screen anti-aliasing.
 
While I know you're being sarcastic, it kind of makes sense. How many game devs do you see with ATI logos on their games? It's always that TWIMTBP shit that pops up. :(

Ding Ding! That's why Nvidia hardware costs more too! They have to bribe devs into optimizing games for their hardware! j/k... well maybe not... :D
 
Ding Ding! That's why Nvidia hardware costs more too! They have to bribe devs into optimizing games for their hardware! j/k... well maybe not... :D

well maybe so, don't believe the marketing..

check out the ATI developer interview posted a few weeks ago:

http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1

Richard Huddy said:
Well the reason why you hear less on the AMD front is not because we do less, but we don't market our developer relations in the same way that Nvidia do
...

I mean, I've still got respect for Jen-Hsun because he knows his tech too, but Nvidia is clearly a marketing-led company. [Nvidia] has put this big program together with an equally big label that they shove in your face all the time, but I don't really think what they do is radically different from what we do - we just don't make as much noise about it.
 
It is up to the game devs to actually use these features... Fermi or 5870 is nothing without the game devs utilizing the features and performance available on the cards. Fermi won't change anything if the game devs don't change anything. Let's hope they do.

For now though, Eyefinity is certainly something that can be used right now on old and new games, without having to wait, and IMO (and many others') it does "change gaming forever". It is a feature that can utilize the awesome performance of the GPUs to push higher resolutions and more displays right now, giving even old games a whole new experience.
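
To put rough numbers on that (assuming three 1920x1080 panels, just as an example): a single panel is 1920 x 1080 = 2,073,600 pixels, while the Eyefinity spread is 3 x 2,073,600 = 6,220,800 pixels per frame. The card is shading roughly 3x the pixels every frame, which is exactly where that 5800-series horsepower goes.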

Look how many new games are still DX9-based... just sayin'. Maybe that will change, but not overnight; right now AMD does offer something that can improve your gameplay experience on those DX9 games.

Most games are DX9 based because the consoles are DX9 based. Next-gen consoles should be DX11 based. When they launch, DX11 will become commonplace. Until then we'll just get a trickle.
 
Most games are DX9 based because the consoles are DX9 based. Next-gen consoles should be DX11 based. When they launch, DX11 will become commonplace. Until then we'll just get a trickle.

Consoles aren't DX9 based. The PS3 and Wii use a variation of OpenGL. The 360 uses a customized variation of DX.
 
Pfft don't be suckers waiting for Nvidia, I gave my girlfriend a fermi last night :D

Got a much better return on investment also :p
 
Well, Nvidia is going to do the same with this architecture that ATI did with R600: just develop and develop on it. R600 was an efficient architecture and did quite well for them once they tweaked it to improve AA performance, and I believe they have now scaled it to its limits. And as the news goes, ATI has already been working on a next-gen architecture from the ground up that will be revealed with the HD 6000 series. I think AMD/ATI didn't have to spend nearly as much on R&D for the HD 5000 series as Nvidia had to on Fermi, but just like Fermi will pay off, the original R600 architecture paid off in a sense for AMD. I have no doubt that AMD might be heading down a similar route to Nvidia when it comes to the HD 6000.
 
Well, Nvidia is going to do the same with this architecture that ATI did with R600: just develop and develop on it. R600 was an efficient architecture and did quite well for them once they tweaked it to improve AA performance, and I believe they have now scaled it to its limits. And as the news goes, ATI has already been working on a next-gen architecture from the ground up that will be revealed with the HD 6000 series. I think AMD/ATI didn't have to spend nearly as much on R&D for the HD 5000 series as Nvidia had to on Fermi, but just like Fermi will pay off, the original R600 architecture paid off in a sense for AMD. I have no doubt that AMD might be heading down a similar route to Nvidia when it comes to the HD 6000.

This is a scrapped project resurrected as a videocard; it's not going to be an efficient architecture (ridiculous power consumption, huge chip surface area, low wafer yields). It's probably at its limits, with very little room for scaling, or even the ability to run in a dual-GPU configuration on one card, because it will exceed the PCI-E power requirements and result in features being locked off.

Fermi would have paid off if Nvidia had managed to make Tesla a success. Instead, they are taking an overpriced GPGPU architecture frankensteined into a videocard so as not to fall behind in the videocard game.
 
How do you know Fermi does what it claims?
I'm sure it does exactly what they claim. It probably is 50% faster than the 5870, but it's going to cost a hell of a lot more than 50% more. Nvidia always pulls that shit. Screw 'em.
 
Wow, the AMD loyalists are really coming out of the woodwork as of late.

I don't think most of the people you're generalizing about are AMD loyalists; they're just mindful consumers. For me, personally, it's all about horsepower per dollar, which is why I'm rocking a 4770 that I got a few days after release at original MSRP.

In my mind, GF100/Fergie is looking and sounding like a Voodoo 5 6000. Which isn't a good thing. Common sense tells me these extra features and the exclusivity of having a GPGPU are going to cost. Since consolitis is rampant, I don't need a Godzilla video card for most titles. Those $600 release-day top-of-the-line video card days are over.

Reduce the heat, power consumption, and size and you might get me interested.

For the record, whoever believes anything that Charlie says is just as much of a fool as he is. He clearly slams Nvidia GPUs just for the hits to his site.

Replace the word Charlie with Nvidia Press Corps and your statement is still true. Or ATI Press Corps. When [H]ardOCP gets their hands on "Fermi", we'll see. Until then, it is all speculation and FUD bullshit.
 
I'm sure it does exactly what they claim. It probably is 50% faster than the 5870, but it's going to cost a hell of a lot more than 50% more. Nvidia always pulls that shit. Screw 'em.

lol you're sure it does exactly what they claim??

Based on a bunch of slides from NV?

I got some waterfront property in Alaska that I need to sell you, let me just whip up some slides of the awesome deal you are getting!

'Cause if you saw it on the internet, it must be real, right?

This thread has been giving me laughs all day, keep it up guys.
 
You people want Nvidia to fail, I'm convinced of it. But you get screwed in the long run if they do. You think ATI wouldn't drive prices up? Look how they've gouged on the 58/59XX series lately because of no competitive part from NVDA.
 
You people want Nvidia to fail, I'm convinced of it. But you get screwed in the long run if they do. You think ATI wouldn't drive prices up? Look how they've gouged on the 58/59XX series lately because of no competitive part from NVDA.

There are probably a lot of people you can say that about on this site. Me personally, I don't give a shit; neither ATI nor NV signs my paycheques. I buy when the product has features I like and suits my budget. I leave all the pointless cock fighting over who is getting .3 more fps up to the children. The only time frame rate matters to me is when I'm going from unplayable to playable, and minimum fps. Most of the children act as if they sit there looking at the fraps counter all night instead of playing the actual games.

However....

It certainly makes for a fun read when I'm bored at work.

And in fact, I wish NV would just STFU already and release it. That way prices can drop, which is good for everyone.
 
I'm sure it does exactly what they claim. It probably is 50% faster than the 5870, but it's going to cost a hell of a lot more than 50% more. Nvidia always pulls that shit. Screw 'em.

Not only will it cost 50% more, but it will also put out 50% more heat... can anyone say 750-800 watt PSU minimum?

I thought CPU and GPU manufacturers were working on reducing heat/power requirements on all future products... Fermi is pushing the industry back 10 years.
 
Not only will it cost 50% more, but it will also put out 50% more heat... can anyone say 750-800 watt PSU minimum?

I thought CPU and GPU manufacturers were working on reducing heat/power requirements on all future products... Fermi is pushing the industry back 10 years.

I agree with you here. I would love to see a 5870 or a 295 that puts out only 150 watts of heat at max load, or lower. These graphics cards are so complex, I fear power consumption is only going to increase as they get faster. They will eventually hit a wall though, just like Intel did with the P4.
 
What's wrong with voting for the underdog?

I don't want Nvidia feeling they have a monopoly as consumers will suffer. I care about the consumer.

Fermi
+ High processing power

- High power consumption
- High heat

? Price
? Actual in-game performance
? Noise level

There are too many unknowns to peg Fermi as either a Cash Cow or a Dog; it is still very much a wild child. We don't know what it'll actually be capable of, when it'll actually be released, or for how much. We don't know if it'll be good for video editing, if it's overclockable, and more... no comparison can be made.

What we do know is that the ATI 5000s are good! Finally ATI is really putting on the pressure in a serious way. I hope ATI takes market share from Nvidia to the point they are dead even.

Why dead even? The CONSUMERS WIN! I don't care about companies that I don't own stock in... I care about my friends and me who game!

Now if we were talking Nintendo I would care. I bought thousands in Nintendo stocks when Wii was first displayed at E3, and everyone thought I was nuts!
 
This is silly.


If anyone is advancing PC gaming at the moment, it's ATI. ATI has had DX11 parts since the end of Sept 2009; they had sub-$400 and then sub-$200 parts in 2009, sub-$100 parts in January 2010, and will have sub-$50 parts in Feb 2010, all DX11 parts, of which they have shipped 2 million. Nvidia will have high-end $400+ DX11 parts in late March, if even then. At that point ATI will have shipped millions more, especially considering they are entering ever cheaper price segments.

Developers will make sure their DX11 titles run well on ATI hardware (the 5x00 series) for years to come, because there are so friggen many of them out there before Nvidia even gets one DX11 part out the door.

Nvidia's top-of-the-line card may be faster, but it will be more expensive, use more power, and create more heat, which will limit who buys it. And we don't know when Nvidia will get lower-end DX11 parts out or how competitive they will be with ATI's parts.

It's silly to think that Fermi itself will push gaming forward. If anything, it's the massive number of DX11 cards flooding the market, plus the fact that people actually like Windows 7 and that DX11 is available on Vista as well.

That's what hurt DX10: the consoles were DX9, Vista got a bad rap, and DX10 wasn't on the previous OS. But that's changed. Consoles are old now, and simple ports of their games will not sell well on the PC.
 
You people want Nvidia to fail, im convinced of it. But you get screwed in the long run if they would. You think Ati wouldn't drive prices up? Look how they've gouged on the 58/59XX series lately because of no competitive part from NVDA.

I would buy a Fermi, and I usually switch back and forth each generation. I want people to stop being sheep for marketing. Gouged? AMD raised their MSRP, I believe, by a whopping $20.
A 5850, even at its inflated $350 retail price, or a 5870, laughs at your statement when you look at what happened every time Nvidia had the competitive edge and where their prices sat. You think if the shoe were on the other foot and AMD were late, Nvidia would even sell their single high-end GPU below $500 MSRP for the first 6 months?

GTX 285 or 5850?
 