GTX 680 - 1536 CUDA / 190W?!

Trackr

[H]ard|Gawd
Joined
Feb 10, 2011
Messages
1,786
Okay, am I seeing this right?

We're getting confirmed reports of the GTX 680 having a single GK104 28nm core with... 1536 CUDA cores and a 190W TDP?!

That's 3x the performance of GTX 580 and 50% less heat.

Am I the only one asking WTF?

Has nVidia finally given in and lowered its performance-per-ALU just so it can boast about the number of SPs it has?
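A quick back-of-the-envelope check on those numbers, taking the rumors at face value and assuming the GTX 580's official 244W TDP (a sketch, not a benchmark):

```python
# Sanity-check the rumored GTX 680 figures against the GTX 580.
gtx580_tdp_w = 244   # official GTX 580 TDP
gk104_tdp_w = 190    # rumored GTX 680 (GK104) TDP

# The power drop is ~22%, nowhere near "50% less heat".
power_drop = 1 - gk104_tdp_w / gtx580_tdp_w
print(f"TDP reduction: {power_drop:.0%}")                 # ~22%

# If the 3x performance rumor were also true, perf-per-watt
# would have to jump almost 4x in a single generation.
perf_per_watt_gain = 3.0 * gtx580_tdp_w / gk104_tdp_w
print(f"Implied perf/W gain: {perf_per_watt_gain:.1f}x")  # ~3.9x
```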
 
Okay, am I seeing this right?

We're getting confirmed reports of the GTX 680 having a single GK104 28nm core with... 1536 CUDA cores and a 190W TDP?!

That's 3x the performance of GTX 580 and 50% less heat.

Am I the only one asking WTF?

Has nVidia finally given in and lowered its performance-per-ALU just so it can boast about the number of SPs it has?

It's hard to separate the facts from the real bullshit.

3x GTX 580 performance... seriously man... you can't tell that's bullshit?
 
It appears that either it's a different way of counting cores, or a different core architecture. It seems very unlikely that it would be 1536 Fermi cores.
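One way to see the "different way of counting cores" point: Fermi's shaders ran on a doubled hot clock, while the Kepler rumors have the cores running at the base clock. A rough paper-throughput sketch, assuming the GTX 580's 1544MHz shader clock and a rumored ~1GHz GK104 clock (both numbers are assumptions here, not confirmed specs):

```python
# Theoretical single-precision throughput: cores * 2 FLOPs per cycle (FMA) * clock.
def gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

gtx580 = gflops(512, 1.544)   # Fermi: 512 cores on the 1544 MHz hot clock
gk104 = gflops(1536, 1.0)     # rumored Kepler: 1536 cores at ~1 GHz

print(f"GTX 580: {gtx580:.0f} GFLOPS")    # ~1581
print(f"GK104:   {gk104:.0f} GFLOPS")     # ~3072
print(f"Ratio:   {gk104 / gtx580:.1f}x")  # ~1.9x on paper, not 3x
```

Taken at face value, tripling the core count buys roughly double the theoretical throughput once the lost hot clock is accounted for, not 3x.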

Pretty sure this was discussed in the other Kepler thread.
 
3x the performance? The card is going to compete with the 7970, so it's probably only around 30% or so faster than the GTX 580.
 
It's hard to separate the facts from the real bullshit.

3x GTX 580 performance... seriously man... you can't tell that's bullshit?
That might be the case in a certain scenario; it wasn't that long ago that turning on even 2x AA would kill the framerate. Anyway, in about 10 days' time there will be more benchmarks than you can count.
 
It's hard to separate the facts from the real bullshit.

3x GTX 580 performance... seriously man... you can't tell that's bullshit?

Take it as you will, but Mark Rein of Epic has stated "Kepler" can render things in real time that used to take three 580s. While he didn't give any specific details on what he was rendering, the comment was made in reference to a discussion of the Unreal 4 engine.

I doubt it will be 3x in most real-world scenarios; I'd even be extremely surprised by 2x performance... but hearing that comment from someone who has been working with the hardware, and is on a team that designs an engine a ton of modern games use... is very interesting.
 
Why did you make a new thread about the GTX 680 when there are multiple discussions already going on about this?
 
Take it as you will, but Mark Rein of Epic has stated "Kepler" can render things in real time that used to take three 580s. While he didn't give any specific details on what he was rendering, the comment was made in reference to a discussion of the Unreal 4 engine.

I doubt it will be 3x in most real-world scenarios; I'd even be extremely surprised by 2x performance... but hearing that comment from someone who has been working with the hardware, and is on a team that designs an engine a ton of modern games use... is very interesting.

No, he said the Samaritan demo was running on one Kepler, when last year they demoed it on three GTX 580s. They also changed the settings (FXAA instead of MSAA), and they never said how well it was running. So that info is misleading at best.

Again, this has all been discussed in the other Kepler threads.
 
The 3x BS was openly claimed to be for "certain applications," according to the original statement. Specifically fueling the much-heated debate over this claim was Epic's use of the Samaritan demo: it was previously run on 3x GTX 580s and recently run on 1x GTX 680.

However, according to Epic's own claims, the settings were vastly different from when they first previewed the demo over a year ago, including a switch from MSAA (3x GTX 580) to FXAA (1x GTX 680), where FXAA takes far less horsepower to achieve similar or better results. It was also noted that Epic "finely tuned" the engine between the first demo and the second.

There is a lot left unanswered; other than stirring the rumor mill and getting gamers excited for Kepler, that's about it.
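Some rough numbers on why the AA switch matters so much; the 1080p target and 4x sample count here are illustrative assumptions, not anything Epic stated:

```python
# Rough render-target memory at 1920x1080 with 4-byte color and 4-byte depth.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4

def target_mb(msaa_samples: int) -> float:
    # Color + depth buffers both scale with the MSAA sample count.
    return WIDTH * HEIGHT * msaa_samples * BYTES_PER_PIXEL * 2 / 2**20

print(f"No MSAA: {target_mb(1):.0f} MB")   # ~16 MB
print(f"4x MSAA: {target_mb(4):.0f} MB")   # ~63 MB

# FXAA instead runs one post-process pass over the resolved ~8 MB color
# buffer, trading a ~4x blow-up in fill and bandwidth for a single
# full-screen shader pass.
```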
 
FOR THE LAST TIME... it is not 3x a GTX 580. Last year the demo was completely unoptimized, and Epic itself said that after optimization it would easily run on half the graphics power. It is obvious that one GTX 680, which is believed to be 1.4 times a GTX 580, can run it.
I don't understand how reputable sites mislead people with stupid headlines like 'GTX 680 power revealed: as powerful as 3 GTX 580s.' That is BULLSHIT.
Again: unoptimized, and FXAA instead of MSAA.
PS: I am an Nvidia fanboy, but this is ridiculous.
 
FOR THE LAST TIME... it is not 3x a GTX 580. Last year the demo was completely unoptimized, and Epic itself said that after optimization it would easily run on half the graphics power. It is obvious that one GTX 680, which is believed to be 1.4 times a GTX 580, can run it.
I don't understand how reputable sites mislead people with stupid headlines like 'GTX 680 power revealed: as powerful as 3 GTX 580s.' That is BULLSHIT.
Again: unoptimized, and FXAA instead of MSAA.
PS: I am an Nvidia fanboy, but this is ridiculous.

I seriously doubt it's "FOR THE LAST TIME."

I find threads like this funny due to the sheer amount of speculation and mudslinging from self-proclaimed elitists. It's my Sunday comics replacement. :D

Oh, and to remain true to the thread: OMFG, GTX 680 is gonna rule or totally suck!
 
You also have to take into account that tri-SLI GTX 580 is not 3x the performance of one GTX 580. It's only about 2.3x faster in the benchmarks I've seen, because SLI scaling isn't perfect.
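A minimal sketch of that point, using the rough 2.3x tri-SLI figure above (an illustrative number from reviews, not a measurement):

```python
# "Replaces three GTX 580s" is a weaker claim than "3x one GTX 580"
# because SLI scaling is imperfect.
tri_sli_speedup = 2.3   # rough tri-SLI GTX 580 speedup seen in reviews

# Average scaling efficiency implied for each of the two extra cards:
efficiency = (tri_sli_speedup - 1) / 2
print(f"Per-extra-card scaling: {efficiency:.0%}")    # ~65%

# So matching the tri-SLI rig only requires ~2.3x a single GTX 580,
# not a true 3x.
print(f"Needed vs one GTX 580: {tri_sli_speedup:.1f}x")
```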
 
You also have to take into account that tri-SLI GTX 580 is not 3x the performance of one GTX 580. It's only about 2.3x faster in the benchmarks I've seen, because SLI scaling isn't perfect.

This.
 
Whatever the case with Kepler's performance, nVidia is setting up some very high expectations for Kepler, which I would imagine they wouldn't be doing if there weren't some reason to.
 
Whatever the case with Kepler's performance, nVidia is setting up some very high expectations for Kepler, which I would imagine they wouldn't be doing if there weren't some reason to.

Newsflash... marketing department for manufacturing company says "our shit is THE shit". :)

Do you expect Nvidia to say, "Well, after a few years of hard work, Kepler is no better than Tahiti, and it's a few months late, so instead of waiting just buy an HD 7970"? :D
 
Newsflash... marketing department for manufacturing company says "our shit is THE shit". :)

Do you expect Nvidia to say, "Well, after a few years of hard work, Kepler is no better than Tahiti, and it's a few months late, so instead of waiting just buy an HD 7970"? :D

But neither would I expect them to set expectations so high as to ensure the product completely fails to meet them. Marketing is more than setting expectations you can't possibly meet.
 
Newsflash... marketing department for manufacturing company says "our shit is THE shit". :)

Do you expect Nvidia to say, "Well, after a few years of hard work, Kepler is no better than Tahiti, and it's a few months late, so instead of waiting just buy an HD 7970"? :D

I agree with you, but you have to agree they have gone a little overboard. Things like:
1) We expected a lot more from AMD
2) 3x GTX 580
3) 1.4x the 7970 in BF3
 
I agree with you, but you have to agree they have gone a little overboard. Things like:
1) We expected a lot more from AMD
2) 3x GTX 580
3) 1.4x the 7970 in BF3

And this is all I am saying. They have set some very high expectations, and it would make no sense to set them only to miss every one. Maybe it is all blowing smoke, but why blow all that smoke if there isn't SOME fire?

Just plain common sense.
 
I agree with you, but you have to agree they have gone a little overboard. Things like:
1) We expected a lot more from AMD
2) 3x GTX 580
3) 1.4x the 7970 in BF3

As for #1, it appears that might be true if they are able to re-market their planned mid-range part to compete with AMD's top product. If that is true, I imagine Nvidia execs giggling with glee as they price what was probably planned as a $350ish part at $500+.

Pretty sure Nvidia never said #2; that was just some news-site sensationalism.

#3 is typical marketing fluff.
 
Pretty sure Nvidia never said #2; that was just some news-site sensationalism.

Samaritan has been regarded as a look into the distant future of gaming. The graphics are so advanced that it was recently regarded as a fringe demo – requiring no fewer than three GeForce GTX 580 graphics cards. That all changed with Epic’s demo at GDC, when it was unveiled that the Samaritan demo replaced the three GTX 580s with one Kepler GPU.
From the horse's mouth.
 
As for #1, it appears that might be true if they are able to re-market their planned mid-range part to compete with AMD's top product. If that is true, I imagine Nvidia execs giggling with glee as they price what was probably planned as a $350ish part at $500+.

Indeed, if that were true, AMD would have done nVidia a huge favor: rather than having to delay their high-end card, nVidia can now immediately jump into the high-end segment with the GK104 :eek:

There were also some rumors earlier that nVidia would have to do a bottom-up release of Kepler; now it looks like they can start at the top.
 
It can run it, but with FXAA and some tweaks Epic made to the demo to optimize it.
 
Whatever the case with Kepler's performance, nVidia is setting up some very high expectations for Kepler, which I would imagine they wouldn't be doing if there weren't some reason to.

Look for JF-AMD's posts in the AMD CPU section from before the Bulldozer release ;)
 
But neither would I expect them to set expectations so high as to ensure the product completely fails to meet them. Marketing is more than setting expectations you can't possibly meet.

Come on, man, these are the same guys that put wood screws in one of their demo GPUs and tried to pass it off...

The whole thing is bullshizzle until nVidia puts out some real working hardware.

Either way... it's done when it's done. People always lose their minds a few days or weeks before the release of new stuff, then argue about the benchmarks, then argue about the price, and finally the availability... and when the dust clears, they start arguing about drivers.

Same old, same old.
 
It's a fine point, but they never said that the Kepler card was 3x a GTX 580; they said it replaced three GTX 580s with a single Kepler. They never said it had the same performance (and it didn't; they changed the demo).

Yeah, it's clever marketing, but it doesn't hold up past the surface whatsoever.
 
And this is all I am saying. They have set some very high expectations, and it would make no sense to set them only to miss every one. Maybe it is all blowing smoke, but why blow all that smoke if there isn't SOME fire?

Just plain common sense.

You need to look at the past history of all PC hardware manufacturers to see that there have been many times when "expectations" were not met. AMD, ATI, Intel, and Nvidia have all released products in the past that came nowhere near the marketing bullshit.

Pentium 4
Bulldozer
Nvidia FX series (NV30)
ATI 2900XT
Early Fermi/GT300/GTX 480 (wood screws, etc.) - note that I am not including the GTX 570 and 580

For every one of the above, we received marketing that claimed performance way beyond what was delivered. Essentially, the marketers cherry-pick the extreme performance deltas that show their hardware in the very best light compared to the competitor's part. Usually these extremes are not representative of real-world scenarios; for example, the GTX 480 slides showing 2x the performance of the HD 5870 in extreme tessellation benchmarks (see the sketch at the end of this post). People latch on to these so-called facts and their expectations run out of control.

You also need to factor in the natural bias a lot of people have for one manufacturer over the other. They wear red- or green-tinted spectacles, immediately latch on to any information that confirms their bias, and totally dismiss any that doesn't. Very soon they have a very skewed idea of "facts" that don't stack up to scrutiny.

Nvidia, Microsoft and Intel are evil corporations.
AMD cares for their consumers.
AMD drivers suck.
Nvidia drivers are amazing and never cause problems.

Reality and marketing are rarely even in the same book, let alone the same page. :D
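
To put the cherry-picking into numbers, here's a small sketch with made-up per-benchmark speedups (all the figures below are hypothetical):

```python
from statistics import geometric_mean

# Hypothetical speedups of a new card over its predecessor across a suite,
# with one synthetic tessellation-style outlier.
speedups = [1.15, 1.25, 1.10, 1.30, 2.00]

print(f"Marketing slide:   up to {max(speedups):.1f}x")      # 2.0x
print(f"Typical (geomean): {geometric_mean(speedups):.2f}x") # ~1.33x
```

The slide quotes the outlier; the geometric mean is what you'd actually feel across your game library.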
 
Nvidia, Microsoft and Intel are evil corporations.
AMD cares for their consumers.
AMD drivers suck.
Nvidia drivers are amazing and never cause problems.

Reality and marketing are rarely even in the same book, let alone the same page. :D

Yes, AMD cares... they stopped making good products so consumers don't have to make a choice; life is easy that way.
 
It's hard to separate the facts from the real bullshit.

3x GTX 580 performance... seriously man... you can't tell that's bullshit?


Yeah, so we should believe you over a guy like Mark Rein, who has one in hand and runs programs on it? No?

LOL

Personally, I bet a tiny bit of it is overstated, as usual, but I'm betting this card really is at least a good 2x faster than a stock GTX 580, if the claims are to be believed.
 
Yeah, so we should believe you over a guy like Mark Rein, who has one in hand and runs programs on it? No?

LOL

Guess in a couple of weeks we will find out if it's real bullshit.

And in marketing, bullshit is a win!
 
Yeah, so we should believe you over a guy like Mark Rein, who has one in hand and runs programs on it? No?

LOL

Personally, I bet a tiny bit of it is overstated, as usual, but I'm betting this card really is at least a good 2x faster than a stock GTX 580, if the claims are to be believed.

It's hard to believe they will crush their own previous top-of-the-line card threefold. Unless it comes with a price to match. :D
 
You need to look at the past history of all PC hardware manufacturers to see that there have been many times when "expectations" were not met. AMD, ATI, Intel, and Nvidia have all released products in the past that came nowhere near the marketing bullshit.

Of course, and they often regret it. Again, all I'm saying is that nVidia seems to be saying a lot that it really doesn't need to, since all they would be doing is setting themselves up for failure.
 
It's hard to believe they will crush their own previous top-of-the-line card threefold. Unless it comes with a price to match. :D


You're not a programmer, though; they could've easily pulled 3x the power for that particular demo/engine. Remember, we're being held back by the Xbox 360/PS3 right now. But I'm sure a lot of it is marketing.
 
There is no point in releasing a card that is 2x or more powerful than your last card. Gradual upgrades, even when better technology is available, mean more $$ for less. AMD and Nvidia are not fools; they will milk us for as much as they possibly can.
You will never see a card 2x or more faster than last gen's top card if AMD and Nvidia are thinking with their money brains.
This alone should help you see which benchmarks are most likely true or false.
 