NVIDIA Reveals Fermi's Successor

This is an interesting slide that really says nothing about performance or power consumption. All it really says is that the next-gen cards will use less power than the previous cards at an equivalent performance point. It is implied that performance will increase and power consumption will go down, but the chart alone does not confirm either. The two next-gen cards listed could be faster GPUs that use the current wattage of Fermi, or they could be ones that are the same speed but use a fraction of the power. Most likely it will be somewhere in between (hopefully towards the performance end), but where the line is going to be drawn is as clear as mud.
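To put rough numbers on that ambiguity (made-up numbers, not anything from the slide), here's a quick Python sketch: take a hypothetical 250 W Fermi-class baseline and an assumed 2x perf-per-watt gain, and the next-gen point could just as easily be a faster card at the same wattage, an equally fast card at far lower wattage, or anything in between, all with identical perf/W.

```python
# Illustration only: a single perf-per-watt figure leaves the actual
# operating point wide open. All values below are hypothetical.

baseline_perf = 1.0       # normalized Fermi-class performance
baseline_power = 250.0    # assumed board power in watts
ppw_gain = 2.0            # assumed next-gen perf-per-watt improvement

next_ppw = (baseline_perf / baseline_power) * ppw_gain

candidates = {
    "same power, faster":     (next_ppw * baseline_power, baseline_power),
    "same speed, less power": (baseline_perf, baseline_perf / next_ppw),
    "somewhere in between":   (next_ppw * 187.5, 187.5),  # arbitrary midpoint
}

for label, (perf, power) in candidates.items():
    print(f"{label:24s} perf = {perf:.2f}x  power = {power:.0f} W  perf/W = {perf / power:.4f}")
```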
 
As far as I'm concerned gfx cards are like women. You get one and you use it as much as you can till it's worn out and you have to get a new one. ;)

Nah. As soon as you can't crank up the eye candy to the max in 3-d surround is when it's time to dump that model...
 
So Fermi isn't selling, it's hotter than hell, consumes more power than most cars, and they're cancelling it. I guess they're starting over again, kinda like IE6, 7, 8 and now 9 :rolleyes:
 
So Fermi isn't selling, it's hotter than hell, consumes more power than most cars, and they're cancelling it. I guess they're starting over again, kinda like IE6, 7, 8 and now 9 :rolleyes:

Fermi is not selling? Hmmm... Okay.;) Consumes power, sure, but put them in SLI and add three monitors and no one gives a shit.;) The fuckers are fast and powerful and they give the ultimate gaming experience right now.
 
So Fermi isn't selling, it's hotter than hell, consumes more power than most cars, and they're cancelling it. I guess they're starting over again, kinda like IE6, 7, 8 and now 9 :rolleyes:

...and yet they still have an increasing market share. Must be day-one purchasers taking 6 months to install them.
If you can get a car to run on less than 300 W then your business is going to soar.
But 38°C at idle and 90°C under load is really a PITA... but if you don't know how to cool something hot then you're in the wrong hobby/profession.
 
But 38°C at idle and 90°C under load is really a PITA... but if you don't know how to cool something hot then you're in the wrong hobby/profession.

+100. Really, 3 480s are far from the first hot and loud pieces of hardware I've ever run. And I don't care when I'm gaming in 3D bliss. Fast and powerful hardware is typically antithetical to cool and quiet.
 
+100. Really, 3 480s are far from the first hot and loud pieces of hardware I've ever run. And I don't care when I'm gaming in 3D bliss. Fast and powerful hardware is typically antithetical to cool and quiet.

I'm surprised you haven't gone for liquid yet. I mean, if I had an 800D, 3 480s, and a load of money, it would be worth it! I've seen people do it with a 3x120 rad too! Plus it looks extremely nice... Especially with pink UV tubing and some CCFLs!!! :D
 
I haven't seen a single mention of GlobalFoundries in this thread. Has anyone read http://semiaccurate.com/2010/09/21/nvidia-signs-global-foundries/?

Charlie speculates it's just for Tegra, but perhaps there's more to the story than we know. Is it possible this successor will be on a GloFo process?

"Charlie" is the first problem with it. Their just some whiny bitch with sand in his vagina about everything Nvidia does because they beat him as a child or killed his puppy/something. Theres so much exaggeration/fabrication mixed with any truth you might as well not know anything. Also if you guess everything you're going to be right eventually...
 
Really guys, when it comes down to it, we can only compare what is out at a given time against whatever else is out at that same time, at the same price points; it just makes sense. We will compare the next-gen AMD series to what is out at the time, and we'll compare Kepler to what is out at the time, and compare by price. I'm really looking forward to seeing how this all shapes up myself.
 
I'm surprised you haven't gone for liquid yet. I mean, if I had an 800D, 3 480s, and a load of money, it would be worth it! I've seen people do it with a 3x120 rad too! Plus it looks extremely nice... Especially with pink UV tubing and some CCFLs!!! :D

My office looks like a Best Buy blew up and all of a sudden had cool stuff. I'll have to try liquid cooling one day just to pad my geek resume, but no, I don't see a whole lot of worth in it personally. Not saying that it's not cool, pun intended, just that looking at the numbers hasn't ever impressed me overall. But until I do it first hand I won't know for myself.
 
My office looks like a Best Buy blew up and all of a sudden had cool stuff. I'll have to try liquid cooling one day just to pad my geek resume, but no, I don't see a whole lot of worth in it personally. Not saying that it's not cool, pun intended, just that looking at the numbers hasn't ever impressed me overall. But until I do it first hand I won't know for myself.

Well you'd gain 2 PCI slots straight away. They're useful if you ever need them (though fitting anything into them with WC tubes in front is often... unfun). Also, as heat rises, having the 3 hot 480s underneath causes the CPU and everything above them to heat up slightly. The water transports the heat away (a little-cited advantage), which means cooler case temperatures if you do it externally, and lower temperatures even if you do it internally, so a little more can be gotten out of your CPU/chipset in some situations, or cooler stuff lasts longer. The noise can be greatly reduced since low-RPM fans can be used. Liquid transports 25 times more heat than air. It looks rad to the power of sick squared to the hardest of cores (as long as it's pink and has CCFLs). Plus you can get much better cooling off your 480s than is possible on air; especially if you're OCing them it gives much better headroom than stock, all with lower case temperatures. Also, if you go for generic GPU coolers (like if DD ever update their Maze cooler to fit a 400 series) then you can reuse them on the next lot of GPUs. I've seen people with 50°C on 480s at load, which is pretty good. But you're looking at $100 a card plus rad and pump, so it would be maybe $500 or more.
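For anyone wondering where the heat-transport point comes from, here's a rough back-of-the-envelope sketch in Python. The specific heat and density values are textbook figures, and the 300 W load and 10 °C coolant temperature rise are my own assumptions, nothing measured from an actual 480; it just shows how little water flow it takes to carry heat that would otherwise need a lot of airflow.

```python
# Back-of-the-envelope: coolant flow needed to carry a GPU's heat away.
# Textbook property values and an assumed load/delta-T; purely illustrative.

heat_load_w = 300.0   # assumed heat output of one hot GPU, in watts
delta_t = 10.0        # allowed coolant temperature rise, in kelvin

coolants = {
    # name: (specific heat J/(kg*K), density kg/m^3)
    "water": (4186.0, 1000.0),
    "air":   (1005.0, 1.2),
}

for name, (cp, density) in coolants.items():
    mass_flow = heat_load_w / (cp * delta_t)   # kg/s, from Q = m_dot * cp * dT
    volume_flow = mass_flow / density          # m^3/s
    litres_per_min = volume_flow * 60_000      # m^3/s -> L/min
    cfm = volume_flow * 2118.88                # m^3/s -> cubic feet per minute
    print(f"{name:5s}: {litres_per_min:8.2f} L/min  ({cfm:7.2f} CFM)")
```

This only covers moving the heat from the block to the radiator; the radiator still has to dump it into room air. The 25x figure quoted above is roughly in line with water's thermal conductivity advantage over air, which is a separate property, but the broader point stands either way.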
 
Well you'd gain 2 PCI slots straight away.

Use one of those slots you saved and get an Intel PCIe NIC for $30. Intel NICs are miles ahead of the Realtek/Marvell crap put on most boards (except the Rampage III series and the Intel Smackover).
 
Kid? Nah. I'm 35 with a mortgage. "got 5870 in May...getting 6870 when it's released this year"...... Kid. :p

I ain't made of money and I'm not someone who shits a brick when I'm only getting 59 fps instead of 60. I've been running 20-30 fps for a while now, and the newer games were starting to make my old card beg for mercy, so I gave in and bought a new card. As far as I'm concerned gfx cards are like women. You get one and you use it as much as you can till it's worn out and you have to get a new one. ;)

Can't wait to meet the mother who raised you. :rolleyes:
 
That isn't logical at all. First off, there's a die shrink, which translates to less power and more performance. Second, if you've done any technical reading at all on Fermi, you'll know that the biggest problem with Fermi was the manufacturing process. Want proof? Just go look at the 460 and its performance per watt compared to a 470; there are huge improvements between the two. Going forward to a new generation that fully fixed the manufacturing problems and had the benefits of a die shrink, the numbers they are quoting ARE logical.

You need to remember that they removed some of the GPGPU crap; that was really the biggest failing there. That actually explains it better, as they are still selling the GF104 as a crippled chip.
 
Giving somewhat vague performance indicators for an unreleased product that may or may not come in 2011 (especially considering Nvidia has a funny way of counting releases, looking at their presentation) seems more like a plan to steal some of the thunder from AMD than anything else, IMO. Especially if you take into consideration that we may yet see two gens of AMD GPUs before that, and the first one could be here within a month.

Given that Nvidia hasn't really outperformed AMD in the important segments this gen, they'd better get Kepler out sooner than later, or they will really be on their heels.
 
The main thing on my mind is how unfortunately, the consoles have dominated the industry, and how so many PC games are console ports. We need to see more games that take advantage of the current GPUs out now, let alone these upcoming ones! Unfortunately, it looks like new consoles are not coming out anytime soon, especially with Move/Kinect keeping them alive even longer. :(
 
It's not surprising that NVIDIA is trying to build some hype for their next generation, but I wonder if it's a bit premature. It looks promising if they can actually deliver something other than figures on paper. Performance gains will probably be modest, but power efficiency and heat reduction look to be their main focus. I like healthy competition so I'm hoping for the best.

5 years for your next upgrade...kids.

got 5870 in May...getting 6870 when it's released this year and ebaying my 5870.

If you're going to do that... you're going to end up with the same performance as the 5870. The card you want is the 6970. (The 6870 is the upgrade path for the 5770; I know it's confusing.)
 
It's not surprising that NVIDIA is trying to build some hype for their next generation, but I wonder if it's a bit premature. It looks promising if they can actually deliver something other than figures on paper. Performance gains will probably be modest, but power efficiency and heat reduction look to be their main focus. I like healthy competition so I'm hoping for the best.

If you're going to do that... you're going to end up with the same performance as the 5870. The card you want is the 6970. (The 6870 is the upgrade path for the 5770; I know it's confusing.)

I would like to see that confirmed by AMD and reviews before I take it for granted. The rumors about the naming scheme for the 6xxx series have been floating around the web, but they seem to be little more than pre-release speculation based on some questionable articles from minor sites, AFAIK.
 
I would like to see that confirmed by AMD and reviews before I take it for granted. The rumors about the naming scheme for the 6xxx series have been floating around the web, but they seem to be little more than pre-release speculation based on some questionable articles from minor sites, AFAIK.

This. I have only seen this once, and it sounds more like intentional FUD. I am quite sure the naming-scheme rumors will be floating around for a while before we know for sure.
 
For those saying they want to skip Kepler, keep in mind that if NVidia had made that chart on a log scale the line would most likely be straight.
Using a linear scale is silly for computer technology that doubles in performance every 18 months.
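A quick sketch of that point, assuming the classic doubling every 18 months rather than anything read off Nvidia's actual chart: an exponential curve turns into a straight line once you take the log, because equal time steps add a constant amount to log2(performance).

```python
import math

# Performance doubling every 18 months, normalized to 1.0 at month 0.
# On a linear axis this shoots upward; log2(perf) grows by the same amount
# every step, i.e. it plots as a straight line on a log-scale chart.

doubling_months = 18.0

for months in range(0, 73, 12):
    perf = 2 ** (months / doubling_months)
    print(f"month {months:2d}: perf = {perf:6.2f}x   log2(perf) = {math.log2(perf):.2f}")
```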
 
Sounds like nVidia is doing what they were doing at the 5xxx launch: hyping up their next generation while ATi's is due so soon, to help dissuade early adopters.

I wonder if we'll see ATi mess with the 6770 on 28nm for practice, like the 4770, in order to better prep them for the 7xxx launch in the future. That being said, wasn't Fermi originally designed for 32nm? (before it was decided to skip it and go straight to 28nm) If that's true, we should see some good performance from Kepler.

Competition is good for everyone, but I'm seriously wondering if this is just nVidia's PR machine trying to unhype the 6xxx launch as it was doing continuously for the 5xxx series.
 
Competition is good for everyone, but I'm seriously wondering if this is just nVidia's PR machine trying to unhype the 6xxx launch as it was doing continuously for the 5xxx series.

Well, of course that's part of it, but at the same time nVidia, like any company, has to let people know what's going on a little bit. I'd rather have them, or anyone, say something about their road map than nothing.
 
Very true, they shouldn't have nearly the trouble getting Kepler onto 28nm that Fermi had on 40nm, since Fermi was never really designed for 40nm. Hopefully we get some sweet price wars out of both of them.
 
That being said, wasn't Fermi originally designed for 32nm? (before it was decided to skip it and go straight to 28nm)

Not unless Nvidia was *planning* on being a year behind ATI. It was designed for 40nm; there wasn't even anything else originally planned to be available then. The problem is that TSMC didn't hit the tolerances it claimed it would, and ATI was able to catch that and compensate early enough (thanks largely to the 4770 and a healthy dose of skepticism), and Nvidia wasn't.
 
So, they hyped up Fermi and it turned out to be a bust. Can't wait to see this one! :D
 
Fermi isn't a bust; I really can't believe so many here on [H] believe that. I will put my GTX 480 up against any other single-GPU card ATI has and never worry about the numbers. When you factor in SLI scaling, it isn't even a competition.
 
The 480 was a total joke to me like Vista was. Wasn't interested in the least and had no problems skipping it completely (gave me another reason to hang on to my 280). This new one looks interesting, if it can get the obnoxious power draw, heat, and noise down.
 
The 480 was a total joke to me like Vista was. Wasn't interested in the least and had no problems skipping it completely (gave me another reason to hang on to my 280). This new one looks interesting, if it can get the obnoxious power draw, heat, and noise down.

Yep, I laugh every time these 480s crush a game in Surround!:D
 
Yep, I laugh every time these 480s crush a game in Surround!

Yep and you can eat breakfast at the same time by placing a frying pan on it, have that great feeling of hearing the airplane engine in the background as you play, and contribute to power outages in your neighborhood! :cool:
 
Yep and you can eat breakfast at the same time by placing a frying pan on it, have that great feeling of hearing the airplane engine in the background as you play, and contribute to power outages in your neighborhood! :cool:

Yawn.... Same old shit from someone who doesn't have this type of rig.;) My color laser printer draws more power, BTW.:cool:
 
I'm surprised so many people are saying such bad things about Fermi architecture. Sure, it turned out that AMD came to the table with much more innovation, and I applaud them for their 5000 series; they did more on their end to stir up competition. Fermi is not a bust, it's a good GPU and has been selling a lot lately. I've been seeing a lot of threads from people who just bought their 460/470/480 SLI/tri-SLI setups, so the GPUs are selling. AMD has sold more, no question, but Fermi is not a bust. A little late to the game, but thanks to Fermi coming out AMD's prices have come down, and even if you purchased a 5870/5850 recently and saved $100 or more, you can thank Fermi for that.

I wouldn't be surprised if Nvidia releases the Fermi refresh ASAP, maybe in mid to late January, to make up for a few months; currently they are 7-9 months behind. Both are good GPUs. Let's not bash either side, as they both bring much to the table.
 
I'm surprised so many people are saying such bad things about Fermi architecture.

+1. Of course Fermi runs hot, duh! But on performance and feature set, especially in SLI and Surround, AMD is behind here, folks, at the moment. The 5000s are great cards, I like my little 5770 in my backup rig, but for top-line gaming overall Fermi is the best solution right now.
 
The 480 was a total joke to me like Vista was. Wasn't interested in the least and had no problems skipping it completely (gave me another reason to hang on to my 280). This new one looks interesting, if it can get the obnoxious power draw, heat, and noise down.

280s get to almost the same temperatures as 480s... one is 90°C, one is 95°C...
 