GK110 aka GTX 680 tapes out at last, projected release: Q3 2012 [SA]

xoleras

2[H]4U
Joined
Oct 11, 2011
Messages
3,551
per SA:

http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/

With the size of GK104 now pretty settled, what about the big one? Sources are now saying this chip might be called GK110, but we are still hearing some insiders say GK112. Let's stick with GK110 for the article though; numbers ending in 0 tend to soothe moles more than numbers ending in 2.
Recently, a Taiwanese dancing mole, hair dyed green and still a bit hung over from New Year parties, gave us the answer. He said that GK110 is basically reticle limited, about 23.5mm on a side. The math says that it is about 550mm^2, with the last two generations coming in at 529/550mm^2 (GF100/GF110 respectively) and 576mm^2 for GT200.
The reason for the vagueness is that the chip just taped out a few weeks ago. Current green-topped roadmaps, not from green-topped moles, have the release date slated for August/September. While this is quite possible, several of SemiAccurate's sources with prior experience bringing high-performance silicon to market scoffed at the timetables. Either way, late Q3 2012 is a decent mental placeholder.
One thing the mole said when he looked up with somewhat bleary eyes is that the chip definitely has a 384-bit memory bus. Other documents have the part burning close to 300W, and you can read two things into this number. First is that this part is very likely to be an HPC-oriented chip, with the CU count, and attendant power-hungry interconnect, being notably higher than GK104. Second is that the rumours of GK100 being cancelled early in the game are likely true; they said problems revolved around interconnects and power issues.
Will GK110 solve these, and therefore Denver's similar problems? Will it pull 'only' 300W, or will we get another Fermi-esque performance-per-watt waterfall? Will GK110 be skewed toward HPC like the early leaks suggested? Will it once again come at the cost of gaming performance? Will there even be a consumer variant, or will it be professional/compute only? Some of these questions can be answered when silicon comes back. Others are a little more subjective. No matter what happens, this will be fun to watch. S|A

Disappointed to say the least. I was hoping these cards would come out around the May time frame, just in time for my 6-month PC hardware refresh. Maybe GK104 will be a killer GPU; so far it looks like it should still be released around then.

edit: Also jesus christ @ the size. Bigger than Fermi?
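The die-size figure is easy to sanity-check yourself. A quick sketch (assuming a square die at the rumored 23.5mm edge, with the prior big-die areas taken from the article):

```python
# Back-of-the-envelope die-size check for the rumored GK110 numbers.
side_mm = 23.5                 # rumored edge length, per the article
gk110_area = side_mm ** 2      # assumes a roughly square die

# Prior NVIDIA big dies for comparison (mm^2, figures from the article)
prior = {"GF100": 529, "GF110": 550, "GT200": 576}

print(f"GK110 ~ {gk110_area:.0f} mm^2")   # ~552 mm^2, i.e. "about 550"
for chip, area in prior.items():
    print(f"{chip}: {area} mm^2")
```

So at the rumored dimensions it lands right between GF110 and GT200 - big, but not unprecedented for NVIDIA.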
 
Recently, a Taiwanese dancing mole, hair dyed green and still a bit hung over from New Year parties, gave us the answer. He said that GK110 is basically reticle limited, about 23.5mm on a side. The math says that it is about 550mm^2, with the last two generations coming in at 529/550mm^2 (GF100/GF110 respectively) and 576mm^2 for GT200.

lol

Going backwards? wtf is up. I have never seen a weirder development cycle. This can't be right.
 
Hopefully this is not true but if it is then AMD has pretty much won the GPU crown for the year. Let's just hope they get their drivers in line as they now have an opportunity to gain market share.
 
Uh-oh, not looking good at all...

Of course SA is the only website spitting out Kepler rumors at this point, so meh, whatever...
 
I'll be happy when they are finally here, so that I can make a purchasing decision... I'll be getting a new card this year, just not sure who is gonna make it yet
 
If this release window is true (which it sounds more and more to be), my next GPU purchase decision was just made for me. No way I'm waiting until Q3 for a new card(s) even if it is faster than Southern Islands.
 
Hopefully this is not true but if it is then AMD has pretty much won the GPU crown for the year. Let's just hope they get their drivers in line as they now have an opportunity to gain market share.

Can I get an Amen?
 
Entertaining article with more holes than Swiss cheese... Lol... Nvidia must have stepped up security and put a very strict no-talking policy in place to keep feet out of mouths.

I can only guess it is pretty accurate though (pun intended). The story seems to fit the known info and timetable. Makes AMD's decision to use the low-power 28nm process look good.
 
I'll be happy when they are finally here, so that I can make a purchasing decision... I'll be getting a new card this year, just not sure who is gonna make it yet

I wish I had patience like you... my tax return will be in the bank next Weds; I don't think I can wait any longer for NV... no matter how fast the GTX 680 is.
 
nah they might get too upset :) - grab a 7970 now and enjoy until then - who knows it might even be delayed till 2013 !!
 
I wish I had patience like you... my tax return will be in the bank next Weds; I don't think I can wait any longer for NV... no matter how fast the GTX 680 is.
haha... Trust me, it's not that I have patience, because I usually don't... but my current GTX 580 is still doing fine... so I can hold off until I know what's gonna be coming out from both camps... if I absolutely needed to upgrade now, I would have grabbed a 7970 by now... :D
 
384-bit bus should mean either 1.5GB or 3GB, so that would at least quiet the "Nvidia would be morons to release a 2GB card" crowd.
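The 1.5GB/3GB figures fall straight out of GDDR5 channel math. A sketch of the reasoning (assuming the 1Gb and 2Gb chip densities common in 2012, which is not anything confirmed for GK110 specifically):

```python
# Why a 384-bit bus implies 1.5GB or 3GB with 2012-era GDDR5.
BUS_WIDTH_BITS = 384
CHIP_WIDTH_BITS = 32    # each GDDR5 chip presents a 32-bit interface

chips = BUS_WIDTH_BITS // CHIP_WIDTH_BITS     # 12 chips on the board
for density_gbit in (1, 2):                   # common GDDR5 densities
    capacity_gb = chips * density_gbit / 8    # gigabits -> gigabytes
    print(f"{density_gbit}Gb chips -> {capacity_gb}GB")  # 1.5GB, 3.0GB
```

A 2GB card would need a 256-bit bus (8 chips) or clamshell tricks, which is why the 384-bit rumor quiets that particular complaint.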
 
I already decided to purchase the 7990 when it's released.
But I may be enticed to buy a 7970 Lightning before then. ;)

I'm only interested in Nvidia's top GPU. If it's Q3, then I'm going with AMD for now.
 
Charlie is picking select info to create drama. The card he's describing lines up with the rumored GK110-based GTX 660 Ti, a crippled chip to say the least. Look at this LINK... there are many others picking up this leaked info. here, here, here, etc...

All still rumor... of course
 
Charlie is picking select info to create drama. The card he's describing lines up with the rumored GK110-based GTX 660 Ti, a crippled chip to say the least. Look at this LINK... there are many others picking up this leaked info. here, here, here, etc...

All still rumor... of course

Knowing the previous Fermi fiasco and how Charlie called that one, his article sounds more credible than anything that has been posted lately.
 
If this is the case, I'll grab another GTX 580 used and just go SLI until around Christmas time. It would take a 400% AMD performance advantage to make me switch over. I'd rather get 25% of the FPS with Nvidia but actually be able to use my computer because the drivers don't completely fuck it up. My last experience with crossfire was a nightmare (6950's).
 
Knowing the previous Fermi fiasco and how Charlie called that one, his article sounds more credible than anything that has been posted lately.

You mean pointing out something was going to be late... when it already was, and then hinting "there are issues"? Seriously, a late product has issues!?! It's like he's psychic or something! :p

He's just a big troll trying to clickbait as usual. People stop visiting the site, it stops existing.
 
Knowing the previous Fermi fiasco and how Charlie called that one, his article sounds more credible than anything that has been posted lately.

I wasn't saying he isn't posting credible info... just cherry-picked info to be more dramatic... So that everyone understands how it compares to the rumors: the card he talks about is the modern-day equivalent of a GTX 465. Not the high end, but based on the high-end GPU.

Looking at the GK110 dates in those other articles, they line up with his information, but only for the GTX 660 Ti. What he doesn't say is that the higher-end parts (according to rumor) are coming out much earlier than this 384-bit part. So the drama is created by him only speaking about the crippled, late-to-the-game rumored part. People then infer what they want...
 
You mean pointing out something was going to be late... when it already was, and then hinting "there are issues"? Seriously, a late product has issues!?! It's like he's psychic or something! :p

He's just a big troll trying to clickbait as usual. People stop visiting the site, it stops existing.

Nope...he called Fermi not only late but hot and power hungry as well...and aren't all of these sites going for clicks? ;)
 
If this is the case, I'll grab another GTX 580 used and just go SLI until around Christmas time. It would take a 400% AMD performance advantage to make me switch over. I'd rather get 25% of the FPS with Nvidia but actually be able to use my computer because the drivers don't completely fuck it up. My last experience with crossfire was a nightmare (6950's).
This, except I'll stick with it until next summer.

My machine has been running smoothly/dependably since I replaced my 6790 with a 580, back in September. No way I'm going to risk screwing things up.
 
You mean pointing out something was going to be late... when it already was, and then hinting "there are issues"? Seriously, a late product has issues!?! It's like he's psychic or something! :p

You mean like when Jen held up a card put together with wood screws in October, well before Fermi was considered "late" and everyone bought into it?

Not to defend Charlie, but at that point Fermi wasn't late, and Nvidia was feeding us all a bunch of BS about a November release which clearly wasn't even possible when they were saying it. If anything, at least Nvidia has learned not to put its foot in its mouth.

A lot of people complain about his info just being rumors, but I have yet to see anyone take the challenge of doing a scorecard on him (how many times was his info right vs. wrong).

The only time in recent history I remember him maybe being wrong is that he may have gotten the "Sea Islands" codename wrong when the employee's resume first showed up (he said Sea Islands was just generic, but according to the slide recently released, Sea Islands may be the actual codename).
 
Knowing the previous Fermi fiasco and how Charlie called that one, his article sounds more credible than anything that has been posted lately.

It does, especially when you consider that Ageia story and how it's delayed yet again. If this is true, Nvidia was likely unimpressed with its performance or perf-per-watt, so they had to delay it.

I don't think this is going to be another Fermi and I hope Nvidia gets it right, but it does seem like AMD has quite a head start and even more growing room. Given that much time, I wouldn't be surprised if AMD throws another 2-3 cards out by then.
 
Here's the thing though - my single GTX 580 maxes everything with 4xAA at 1920x1200 on ultra, including Battlefield 3. Granted, I've got a beefy processor backing it up, but literally the only need for a faster card at present is for those running Eyefinity/Surround or a 30in display, which is a very small niche. I doubt this will even impact their bottom line.
 
Unless you have a 6950/560 or less and are looking to upgrade.
 
Here's the thing though - my single GTX 580 maxes everything with 4xAA at 1920x1200 on ultra, including Battlefield 3. Granted, I've got a beefy processor backing it up, but literally the only need for a faster card at present is for those running Eyefinity/Surround or a 30in display, which is a very small niche. I doubt this will even impact their bottom line.

A single GTX 580 will dip below 60 fps a lot in graphically demanding games such as Metro 2033, BF3, and Crysis 2. In 64-player MP, on large maps a single 580 goes down to 40-50 fps a lot unless you disable 4X MSAA (or use FXAA).

So your statement isn't really accurate. You can't max everything with a single card unless you want to compromise on your framerates. Does it matter? Not really; you can drop 1-2 settings and easily play at a fluid 60+ fps. But you cannot max them and keep a locked 60 fps framerate, not at all.
 
A single GTX 580 will dip below 60 fps a lot in graphically demanding games such as Metro 2033, BF3, and Crysis 2. In 64-player MP, on large maps a single 580 goes down to 40-50 fps a lot unless you disable 4X MSAA (or use FXAA).

So your statement isn't really accurate. You can't max everything with a single card unless you want to compromise on your framerates. Does it matter? Not really; you can drop 1-2 settings and easily play at a fluid 60+ fps. But you cannot max them and keep a locked 60 fps framerate, not at all.

Agreed... a single 580 can't hold 60fps in BF3 on ultra

http://www.guru3d.com/article/amd-radeon-hd-7970-review/22

it's around 43fps... which isn't BAD by any means... but it certainly can piss off some people...

you can either drop some of the graphical settings...or get more graphics horsepower
 