[Kepler/GTX 680] Info's in... $400-450? + benches + thermals (56k warning)

Guys, I'll only post this (I wanted to post something a little bigger that explained my thoughts on all this, but I want to wait until the real paper launch [if there is going to be one now] to speak up). The Euro prices mentioned in this post convert to the following in real USD (according to an online converter):

406 Euros = $561.29 USD
607 Euros = $799.78 USD

That is all for now. Just wanted to take the conversion confusion out of this. Out!

Yeah, that is true, but those prices never convert like that. If it launches at 400 euros in Europe (not this card, just an example), it will most likely launch at $400 in the US, or $450 in the worst case. Europeans seriously get fucked over.
 
Adaptive VSync looks pretty neat. It turns VSync off if you're below 60 fps, so your framerate isn't cut to 30, and turns it back on if FPS goes above 60. Best of both worlds: it gets rid of stuttering and tearing. There's also GPU Boost, which will overclock the GPU up to an adjustable TDP. So if you have better cooling overall, be it case cooling or a better GPU heatsink, it can clock itself higher. I guess you can set the thermal limit you want it to stay under and it will clock accordingly. I hope this is available on desktop and not just mobile.
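Purely as a sketch of those two ideas (this is not NVIDIA's actual implementation; the 95% power threshold, the 13MHz step, and the function names are all invented for illustration), the per-frame logic might look something like this:

```c
/* Minimal sketch of the two features described above. NOT NVIDIA's
 * implementation; thresholds, step size, and names are invented. */
#include <stdio.h>

#define REFRESH_HZ      60.0
#define FRAME_BUDGET_MS (1000.0 / REFRESH_HZ)   /* ~16.7 ms at 60 Hz */

/* Adaptive VSync: sync while we can hold the refresh rate; tear when we
 * can't, instead of letting VSync halve the framerate to 30 fps. */
static int adaptive_vsync_interval(double last_frame_ms)
{
    if (last_frame_ms > FRAME_BUDGET_MS)
        return 0;   /* below 60 fps: VSync off, no drop to 30 */
    return 1;       /* at/above 60 fps: VSync on, no tearing */
}

/* GPU Boost: raise the clock while there's headroom under the power/thermal
 * target, back off when it's exceeded. Better cooling means more headroom,
 * so the card sustains higher clocks. */
static int gpu_boost_mhz(int clock_mhz, double board_power_w, double target_w)
{
    const int step = 13;   /* hypothetical boost bin size, in MHz */
    if (board_power_w < 0.95 * target_w)
        return clock_mhz + step;
    if (board_power_w > target_w)
        return clock_mhz - step;
    return clock_mhz;
}

int main(void)
{
    printf("18.2 ms frame -> swap interval %d\n", adaptive_vsync_interval(18.2));
    printf("12.5 ms frame -> swap interval %d\n", adaptive_vsync_interval(12.5));
    printf("1006 MHz at 150 W of a 170 W target -> %d MHz\n",
           gpu_boost_mhz(1006, 150.0, 170.0));
    return 0;
}
```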
 
It would appear that my 6990 absolutely smokes this thing. I guess it will be a while before I need to get a new card haha!
 
It would appear that my 6990 absolutely smokes this thing. I guess it will be a while before I need to get a new card haha!

Right now, if I had to choose, I would choose to buy a GTX 680 rather than getting three free 6990s.
 
Right now, if I had to choose, I would choose to buy a GTX 680 rather than getting three free 6990s.

And zero microstutter. :)

Trifire virtually eliminates microstutter, especially at high framerates.

[Image: trifire-no-real-microstutter.png]



Unfortunately I have to just stick to high frame rates for the moment, until I upgrade mobo+cpu+ram (soon) and have multi-slot capability for tri. After upgrading I'm hoping to score a used/relatively cheap 6970 for trifire to hold me over. I'm keeping an eye on all the cards, but I'm willing to wait until Xmas 2012/Q1 2013 before I majorly upgrade the GPU portion of my setup again. I think the pricing is coming in high and it will take a while for the hierarchy to adjust properly. Lots of early-adoption gouging instead of letting the last gen's prices slide down more. Need all the cards dealt, lots of reviews, and competition in effect.
 
I'd say things look legit so far. They may have rebranded what would have been their high-end mainstream card in light of 79XX performance, but if it beats out the 580 handily and comes in under $500, then they have a winner.
IMO it needs to be slightly cheaper than a 7970 even if it beats it out overall, due to its lack of VRAM vs. its competitor in the same price bracket. For most this will not be an issue, but as a selling point, if all new cards from NV/AMD support 3+ screens, then many retailers/customers may have a hard time choosing a card with less RAM for the same price. Even if NVIDIA seems to handle memory usage better, many will just see "More GBs of RAM, I WANT IT!"
Now I'll go out on a limb and say that for those with 480s in surround, these "680s" should put a spanking on them: much less heat, ~30% more VRAM, NV's quality SLI support, and overall new card tricks = a silver award from the [H] to say the least. What's this about Adaptive VSync? Hmmmm.
I do hope I'm proved right Tuesday/Thursday morning.... :D
 
IMO it needs to be slightly cheaper than a 7970 even if it beats it out overall, due to its lack of VRAM vs. its competitor in the same price bracket.

Huh? Prices for GPUs and practically all compute hardware are almost always based on overall performance, not VRAM or other factors.
 
Just an update on that Battlefield chart:

[Image: Battlefield 3 benchmark chart]

A couple of oddities about that:

1) We're supposed to be seeing a ~40% improvement from Kepler over the 7970 in BF3.
2) In other reviews at those same settings (except with 4x AA, since 8x isn't even an option in BF3), the 7970 gets closer to 50 FPS than 70. If it truly were "8x AA", it wouldn't even be close to that.
 
Trifire virtually eliminates microstutter, especially at high framerates.

[Image: trifire-no-real-microstutter.png]

Not true.
Tom's ran into a CPU bottleneck, so no wonder there is less/no microstutter. Play this game at 3x1080p and/or with SSAA and it's back. No other site has confirmed this; Tom's is the only site I know of, and they presented only one graph, one.
It's astonishing that it is posted over and over without questioning its significance.
 
A couple of oddities about that:

1) We're supposed to be seeing a ~40% improvement from Kepler over the 7970 in BF3.
2) In other reviews at those same settings (except with 4x AA, since 8x isn't even an option in BF3), the 7970 gets closer to 50 FPS than 70. If it truly were "8x AA", it wouldn't even be close to that.

Makes no sense to me either, unless they forced 8x AA in CCC and NVCP (or some third-party software), which would screw up the results.
The only thing I believe with 90% certainty is that 3DMark 11 screenshot.
 
If NVIDIA has half a brain, they will not go higher than $399 on the price of this new card.


If I ran the show at NVIDIA, I would release this card at $329.99, watch my sales blast past AMD's, then watch AMD have a meltdown, and enjoy the moment while stuffing my pockets full of money and letting the praise of the world fill my soul with happiness.
 
If NVIDIA has half a brain, they will not go higher than $399 on the price of this new card.


If I ran the show at NVIDIA, I would release this card at $329.99, watch my sales blast past AMD's, then watch AMD have a meltdown, and enjoy the moment while stuffing my pockets full of money and letting the praise of the world fill my soul with happiness.

This card will be $500+. Mark my words.

I also have a feeling (now, this is just my personal opinion) that GK104 will not be able to overclock much more, because they already clocked it high. Personally I hope this isn't the case, because I plan to step up to 3 of these Keplers in 90 days....but that's what I think will happen.

Then AMD will counter with a 7980 or some bullshit that comes stock at 1100MHz core or some shit.

Either way, don't expect a $399 video card, LOL.
 
(except with 4x AA, since 8x isn't even an option in BF3)
Wrong!
If you have an NVIDIA card (GTX 4xx or 5xx), you will also see 32x AA in BF3.
 
(except with 4x AA, since 8x isn't even an option in BF3)
Wrong!
If you have an NVIDIA card (GTX 4xx or 5xx), you will also see 32x AA in BF3.

No.
You cannot force or enhance AA in a DX10/11 application. All the 8x AA benchmarks of BF3 on the web are wrong.
 
Do you own BF3? Just curious, because the in-game option maxes out at 4x. Just so you know.

Of course, you can try to override it, but the last time I tried that (with NVIDIA driver 285.79) it did not work. That is not unexpected, because override AA does not work with most DX11 games. I do not know if that has changed since...it's not a big deal really, because 4x MSAA pretty much looks great in BF3. There is also no "FXAA x8" setting in game, so I'm not sure what you're talking about. The two AA-related settings are MSAA (2x or 4x) and post-processing AA (on or off).

It didn't work because you're not supposed to override it... you're supposed to enable AA in game and then use the "Enhance" option in the NVIDIA control panel to get it to 8x.

Override works less and less these days, not because of an NVIDIA bug, but because it's a legacy AA control that naively forces AA on the backbuffer. If the game uses deferred rendering (and most games nowadays do, BF3 included), you can't just apply AA to the backbuffer: deferred engines render the image on other resources and then just copy the final picture (without depth information) to the backbuffer. So applying AA to the backbuffer does you no good.

Using the in-game AA coupled with the enhance mode in the control panel says, "hey, just take whatever resources the game applies AA to and change them to the AA level I chose in the control panel." This way the driver doesn't need special prior knowledge of where to apply AA for it to work properly... it just takes the app's existing behavior/decision and cranks it up (or down, as the case may be).
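To illustrate the point, here's a schematic sketch in plain C (not a real graphics API; every name in it is invented) of a deferred frame. The final copy to the backbuffer is where a naively forced backbuffer AA setting goes to waste:

```c
/* Schematic sketch of why forcing AA on the backbuffer does nothing for
 * a deferred renderer. Plain C, made-up names, purely illustrative. */
#include <stdio.h>

typedef struct {
    int msaa_samples;
    const char *name;
} RenderTarget;

static void render_deferred_frame(const RenderTarget *backbuffer)
{
    /* The game rasterizes geometry into offscreen targets it created
     * itself; these are where edges exist and where AA must happen. */
    RenderTarget gbuffer = { 1, "G-buffer (albedo/normals/depth)" };
    RenderTarget lit     = { 1, "lighting accumulation target" };

    printf("rasterize geometry -> %s (%dx MSAA)\n", gbuffer.name, gbuffer.msaa_samples);
    printf("shade lights       -> %s (%dx MSAA)\n", lit.name, lit.msaa_samples);

    /* Last step is a plain full-screen copy of the finished image. No
     * triangle edges are rasterized here, so the MSAA the driver forced
     * onto the backbuffer has nothing to resolve; it's wasted. */
    printf("copy final image   -> %s (%dx MSAA, wasted)\n",
           backbuffer->name, backbuffer->msaa_samples);
}

int main(void)
{
    /* "Override" naively bumps the backbuffer's sample count... */
    RenderTarget backbuffer = { 8, "backbuffer" };
    render_deferred_frame(&backbuffer);

    /* "Enhance" would instead raise msaa_samples on the targets the game
     * already chose to multisample, i.e. the offscreen ones above. */
    return 0;
}
```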
 
It's a shame NVIDIA isn't taking this opportunity to redefine price/performance tiers with this card; they could really give AMD a run for its money if they priced it around $400. Makes you wonder what their margins will be like, given this wasn't even intended to be their flagship GPU.
 
It's a shame NVIDIA isn't taking this opportunity to redefine price/performance tiers with this card; they could really give AMD a run for its money if they priced it around $400. Makes you wonder what their margins will be like, given this wasn't even intended to be their flagship GPU.

Seriously. They probably feel like they've won the lottery. Selling the 660 at the 680's price...
 
If NVIDIA has half a brain, they will not go higher than $399 on the price of this new card.


If I ran the show at NVIDIA, I would release this card at $329.99, watch my sales blast past AMD's, then watch AMD have a meltdown, and enjoy the moment while stuffing my pockets full of money and letting the praise of the world fill my soul with happiness.

If they had infinite supply, maybe they would. But given the limited supply, there's no reason to price it so low. AMD did them a big favor by pricing the 7970 at $550, and I think they'll take advantage of it (and the GTX 680 naming makes it pretty clear they will).

This card will be $500+. Mark my words.

My guess is $499, just like the GTX 480 and GTX 580. Undercuts AMD just enough to twist the knife, but should still make them big margins on the cards they can produce. I'd love to be wrong though.
 
It's a shame NVIDIA isn't taking this opportunity to redefine price/performance tiers with this card; they could really give AMD a run for its money if they priced it around $400. Makes you wonder what their margins will be like, given this wasn't even intended to be their flagship GPU.

Both AMD and nVidia are out to make as much profit as they can. If they can get away with ripping us off with these prices for the high end segment, they will.

If nVidia undercuts AMD, they will lose profit, and AMD will be forced to cut their price and lose profit too. It's great for us consumers, but both companies would end up losing profit for no reason. It's in their own interest to keep the price bracket that high as long as we continue buying their cards.

If everyone suddenly stops buying a high end card, then perhaps they would be forced to reconsider the prices for the high end segment. ;)
 
Lots of people hoping for a cheap brand new top tier video card :rolleyes:

If it's true NVIDIA has the performance crown and the performance-per-watt crown, they could price the cards at $600 and still have a lot of buyers. If they sell it for $400 and it sells out everywhere, that means they could have charged more. Those who barely have the cash to buy it need to weigh whether they really want the best.

Honestly, I think $550-600 will be NVIDIA's offering. They haven't had cheap prices on their cards for a while. In any case, I'm getting one :)
 
Two companies are the only players in the high-end consumer GPU business. They won't lower prices until they really have to. They'll charge what they want until their consumers stop buying. Sucks, but business is business for large corporations.
 
OP, thanks for posting this!

The 680 looks good, based on unsubstantiated preliminary reports. :)

Once [H] takes one to task and posts a review, I'll consider it substantiated.

In the meantime, remember all the fuss over OC'ed 580's being compared to stock 7970's? I think [H] will need to do 2 sets of comparisons. The first would be stock vs. stock, since that's how they're boxed and sold. The second, more interesting, comparison would be maximum overclock of a 680 vs. a max OC 7970. Then let the price wars commence.

Either way, both these new generation cards look sweet.

Ken
 
Looks like the rumors of a 1GHz 7970 might prove true then, as that would make them about even in performance, if these numbers are accurate.

I'm still curious to see how the 7970 holds up once it gets more mature drivers.
 
Not true.
Tom's ran into a CPU bottleneck, so no wonder there is less/no microstutter. Play this game at 3x1080p and/or with SSAA and it's back. No other site has confirmed this; Tom's is the only site I know of, and they presented only one graph, one.
It's astonishing that it is posted over and over without questioning its significance.

Well said. It is indeed something that floors me, too.
 
GTX 680 + i7 3960X @ 3.3GHz vs. 7970 + i7 950 @ 3.28GHz, the truth is out there

Well.....interesting.

http://www.xtremesystems.org/forums...is-out-there&p=5070091&viewfull=1#post5070091

0_o

Performance matches at the same clocks.

Let the price war begin. AMD, you first, since we know nV doesn't play nice. A nice $75-100 drop would make a few current 7970 owners a little regretful.
A $450 7970 ($400 AR on sale) would be great, and a $350 7950 ($300 AR on sale) would be even better.
 
Performance matches at the same clocks.

Let the price war begin.

What this tells me is that AMD clocked the 7970s way too low at stock.

I mean, 95% of people can get 1125MHz on stock volts.

If they drop prices, they should give owners refunds like EVGA did back in the day with the GTX 280 (I think it was them).
 
Is that it? :(

I was expecting them to take advantage of 28nm and cram in as many transistors as possible, e.g. 5B+. This is a mid-range part sold as high end (with prices to match, no doubt).

Hey, AMD, you have a new customer!
 
What this tells me is that AMD clocked the 7970s way too low at stock.

I mean, 95% of people can get 1125MHz on stock volts.

If they drop prices, they should give owners refunds like EVGA did back in the day with the GTX 280 (I think it was them).

AMD wanted to improve yields and power draw by underclocking the card. The 7970 was rumored to be a 1-1.1GHz card in the beginning.
Now that TSMC has corrected their flaw with 28nm in February, 1GHz+ cards could be coming from AMD AIBs very soon.
Funny though, many of the 7970s were produced with whatever process issue TSMC claimed they had.

nV is using smoke and mirrors, but comparing stock clocks, they come out ahead.
It's still a mid-range part going against a 7970, which is impressive, but the price will be high end.
 
Not true.
Tom's ran into a CPU bottleneck, so no wonder there is less/no microstutter. Play this game at 3x1080p and/or with SSAA and it's back. No other site has confirmed this; Tom's is the only site I know of, and they presented only one graph, one.
It's astonishing that it is posted over and over without questioning its significance.



I don't want to derail this thread any more than this by going on about microstutter.

What they are saying in Tom's article, and in other forums I was reading, is that when you throw the third card into the mix, not only do you get some framerate boost, you also virtually eliminate/break up the two-GPU alternating (binary) frame cadence, so you don't hit those low valleys/long processing gaps. High fps keeps the triple-GPU frame-time valleys, and even moderate single-card dips, well above the bottoming-out of low-fps valleys too, of course.
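As an invented illustration of that alternating two-GPU cadence: both frame-time traces in the C sketch below average the same ~60 fps, but the AFR-style one alternates short/long intervals, which is exactly the microstutter a third card supposedly breaks up. The numbers and the jitter metric are my own, just for demonstration:

```c
/* Invented numbers, just to show how microstutter hides inside an
 * "average fps" figure: both traces average ~16.7 ms (60 fps). */
#include <stdio.h>
#include <math.h>

/* Mean absolute difference between consecutive frame intervals (ms).
 * Near zero = even pacing; large = perceptible microstutter. */
static double jitter_ms(const double *interval_ms, int n)
{
    double sum = 0.0;
    for (int i = 1; i < n; i++)
        sum += fabs(interval_ms[i] - interval_ms[i - 1]);
    return sum / (n - 1);
}

int main(void)
{
    double dual_afr[6] = { 8.0, 25.4, 8.0, 25.4, 8.0, 25.4 }; /* alternating */
    double smooth[6]   = { 16.7, 16.7, 16.7, 16.7, 16.7, 16.7 };

    printf("2-GPU AFR-style pacing: %.1f ms jitter\n", jitter_ms(dual_afr, 6));
    printf("even pacing:            %.1f ms jitter\n", jitter_ms(smooth, 6));
    return 0;
}
```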

I'm interested in more feedback about this though so if you want to feel free to input on this thread:

http://hardforum.com/showthread.php?p=1038503773#post1038503773

I should be finding out for myself in the next month or so when I do tri after my mobo+cpu+ram upgrade anyway.

I will be following the newer cards over this year too; I'm just not in a rush, and I think they are coming in overpriced, milking early adopters and keeping the last-gen cards' prices from dropping as much as they should. Just my opinion.
 
Is that it? :(

I was expecting them to take advantage of 28nm and cram in as many transistors as possible, e.g. 5B+. This is a mid-range part sold as high end (with prices to match, no doubt).

Hey, AMD, you have a new customer!

I, too, buy my video cards based on transistor count! Neat! :rolleyes:
 
Yeah, and just like 90% of the posts there, it's a horrible comparison. The two cards are on different CPUs. Put them both on the 3960X platform, then compare. :mad:

So you're saying that being on a slower, older CPU isn't fair, and it should be on the faster one?
 
I, too, buy my video cards based on transistor count! Neat! :rolleyes:
Yes, the number of transistors they can fit (at the same die size and thermals) greatly increases when moving to a smaller process (going from 40nm to 28nm roughly halves the area per transistor: (28/40)² ≈ 0.49), and this is a step and a half. So yeah, I want a shitload of transistors!

Look at the proposed die size, the memory bandwidth, even the code name of the 680, and tell me it doesn't look suspiciously like the new 560.

So I can't say paying $500+ for a new version of the 560 greatly appeals to me. Had they kept the design about the same and fitted two 580s' worth on the same die (like they have with most previous generations), it would be a different story.
 
It didn't work because you're not supposed to override it... you're supposed to enable AA in game and then use the "Enhance" option in the NVIDIA control panel to get it to 8x.

Override works less and less these days, not because of an NVIDIA bug, but because it's a legacy AA control that naively forces AA on the backbuffer. If the game uses deferred rendering (and most games nowadays do, BF3 included), you can't just apply AA to the backbuffer: deferred engines render the image on other resources and then just copy the final picture (without depth information) to the backbuffer. So applying AA to the backbuffer does you no good.

Using the in-game AA coupled with the enhance mode in the control panel says, "hey, just take whatever resources the game applies AA to and change them to the AA level I chose in the control panel." This way the driver doesn't need special prior knowledge of where to apply AA for it to work properly... it just takes the app's existing behavior/decision and cranks it up (or down, as the case may be).

1. You cannot override or enhance MSAA in the driver for DX10/11 apps.
2. Using override AA in a game with native AA is poor benchmarking methodology, because override AA is substantially unoptimized. And using override AA in a game that has *native* AA? Please, that is just stupid, especially for a game with as many IQ options as BF3.
3. Anyone using override/enhance AA for BF3 needs to have their head examined. It is highly unlikely any review site would do such a thing.

So that leaves one of two conclusions:

1. The website is lying / the review is fake
2. It is a typo

Lastly, if you want to champion this benchmark: it is like 1 fps faster than a stock 7970. Is that something you REALLY want to brag about?
 