
NVIDIA Kepler GeForce GTX 680 Video Card Review @ [H]ardOCP

It would be rather bad for Epic if Unreal Engine 4 only worked on the highest-end Nvidia card. Just because they originally demoed the engine on 3x GTX 580 doesn't mean that's the minimum requirement, nor that a GTX 680 provides equal FPS. Different setup, updated engine, different settings. Just drop the resolution and leave AA off and it'll need only a fraction of the performance.

Looks like 2GB is enough for now, though it would be interesting to see some 5760x1080 AA tests when the EVGA 4GB version arrives to see if it makes a difference.

Personally I have no need to upgrade from my 7970; it's still rocking top-of-the-line 3x1080p performance overclocked. Nvidia brought some much-needed price competition, which is great for us consumers, though not so great for my 7970's resale value. :p I guess that's the price to pay for getting this level of performance 3 months early.
 
I can't wait for [H]'s review of GTX 680 SLI. Guru3D did a review, and they copped out by only comparing it against HD 7950 CF. No CF 7970s to be found in that review at all. Biased much?
 
Going to the bank in a bit to put the money on my card, hope it doesn't sell out in 20 mins!
 
Out of stock as of 5:59 PM Central, both Newegg and EVGA. You snooze, you lose.
 
EVGA's notify, not Newegg's, let me know. I saw it available there, checked Newegg, and they had it too at the same time. Seems their stock is synced. Never got notified by Newegg, though.
 
Ok, TechPowerUp posted their SLI 680 review. They didn't even throw results of 7970 CF up against it. I'm guessing this means 7970 CF beats SLI 680, but I don't really have any solid evidence to back my claim. I wish some review sites would grow a pair and actually compare it to its main competition. Can't wait for [H]'s review!
 
ROFL!

That was quick. :eek:

Guess having a browser open on a spare monitor and hitting refresh all day finally paid off. :D

Google Chrome extension:
Page Monitor

It refreshes a page at a specified interval (don't set it to something crazy like 1 second; 30 seconds is more reasonable) and, more importantly, tells you when that page has changed.
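If you'd rather script this than install an extension, here's a minimal sketch of the same idea in Python (the URL is a placeholder, and a plain content hash will also fire on ads and other dynamic page elements):

```python
import hashlib
import time
import urllib.request

URL = "https://www.example.com/product-page"  # placeholder: the page to watch
INTERVAL = 30  # seconds; see the later posts before setting this any lower

def page_hash(url: str) -> str:
    """Fetch the page and return a hash of its body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

last = page_hash(URL)
while True:
    time.sleep(INTERVAL)
    current = page_hash(URL)
    if current != last:
        print("Page changed -- go check it!")
        last = current
```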
 
EVGA's notify, not Newegg's, let me know. I saw it available there, checked Newegg, and they had it too at the same time. Seems their stock is synced. Never got notified by Newegg, though.

I was talking to Newegg about a large laptop order, and the rep helping me said they aren't sending out auto-notifies until their stock is larger. She might be full of it, but I didn't get my auto-notify this afternoon.
 
Google Chrome extension:
Page Monitor

It refreshes a page at a specified interval (don't set it to something crazy like 1 second; 30 seconds is more reasonable) and, more importantly, tells you when that page has changed.

I got my IP address blocked for doing that last month, and from what I read while trying to figure out why I couldn't reach any Newegg pages, so did a number of other people. I had Page Monitor checking every 30 seconds.

I had to spoof my MAC address to get a new IP address from my ISP, and then I could get back on Newegg's website.
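For the curious, here is roughly what that looks like on Linux with iproute2 and dhclient (assumed tooling; on Windows it's usually the NIC driver's advanced settings or the router's clone-MAC page, and some ISPs also want a modem power-cycle before handing out a new lease):

```python
import subprocess

# Must run as root. The interface name and MAC below are placeholders;
# the 02: prefix marks a locally administered address.
IFACE = "eth0"
NEW_MAC = "02:00:00:12:34:56"

for cmd in (
    ["ip", "link", "set", "dev", IFACE, "down"],
    ["ip", "link", "set", "dev", IFACE, "address", NEW_MAC],
    ["ip", "link", "set", "dev", IFACE, "up"],
    ["dhclient", "-r", IFACE],  # release the old DHCP lease
    ["dhclient", IFACE],        # request a new one; a new MAC typically
                                # gets a new IP from the ISP's DHCP server
):
    subprocess.run(cmd, check=True)
```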
 
I got my IP address blocked for doing that last month, and from what I read while trying to figure out why I couldn't reach any Newegg pages, so did a number of other people. I had Page Monitor checking every 30 seconds.

I had to spoof my MAC address to get a new IP address from my ISP, and then I could get back on Newegg's website.
Yikes. Well then, maybe 5 minutes. I doubt they'd do it for 5-minute refreshes.

Edit: a quick Google shows people reporting a 24-hour IP ban if that happens.
 
http://www.realworldtech.com/page.cfm?ArticleID=RWT032212172023&p=2

"The catch is that the Kepler core is a poor fit for compute applications. The excellent efficiency for graphics has undoubtedly come at the cost of general purpose workloads. As our analysis showed, Nvidia’s architects made a conscious choice to quadruple the FLOPs for each core, but only double the bandwidth for shared data. The result is that the older Fermi generation is substantially better suited to general purpose workloads and will continue to be preferred for many applications.
...
Given this situation, it seems highly likely that Nvidia’s upcoming compute products will use a core that is tuned for general purpose workloads. It will be a derivative of Kepler, to re-use as much of the engineering effort as possible, but with several significant changes."

Seems Dave also believes there's going to be a second line of products aimed at compute in the future, moving away from their original Fermi approach of an all-in-one solution.
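The trade-off in that quote is simple arithmetic: quadruple the FLOPs while only doubling the bandwidth, and the bytes available per FLOP are halved. Normalized to a Fermi-class core (illustrative ratios, not measured figures):

```python
# Illustrative ratios implied by the quoted article, not measured numbers.
fermi_flops, fermi_bw = 1.0, 1.0      # normalized GF104-class baseline
kepler_flops = fermi_flops * 4        # "quadruple the FLOPs for each core"
kepler_bw = fermi_bw * 2              # "only double the bandwidth"

print(fermi_bw / fermi_flops)         # 1.0 byte/FLOP, normalized
print(kepler_bw / kepler_flops)       # 0.5 -> half the data per FLOP, in line
                                      # with the 0.33 B/FLOP ("half of GF104")
                                      # figure cited later in the thread
```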
 
http://www.realworldtech.com/page.cfm?ArticleID=RWT032212172023&p=2

"The catch is that the Kepler core is a poor fit for compute applications. The excellent efficiency for graphics has undoubtedly come at the cost of general purpose workloads. As our analysis showed, Nvidia’s architects made a conscious choice to quadruple the FLOPs for each core, but only double the bandwidth for shared data. The result is that the older Fermi generation is substantially better suited to general purpose workloads and will continue to be preferred for many applications.
...
Given this situation, it seems highly likely that Nvidia’s upcoming compute products will use a core that is tuned for general purpose workloads. It will be a derivative of Kepler, to re-use as much of the engineering effort as possible, but with several significant changes."

Seems Dave also believes there's going to be a second line of products aimed at compute in the future, moving away from their original Fermi approach of an all-in-one solution.
I imagine that's because they don't want their target compute market buying $500 cards instead of the $2000 workstation ones.
 
Ok, TechPowerUp posted their SLI 680 review. They didn't even throw results of 7970 CF up against it. I'm guessing this means 7970 CF beats SLI 680, but I don't really have any solid evidence to back my claim. I wish some review sites would grow a pair and actually compare it to its main competition. Can't wait for [H]'s review!

Here. Happy? :rolleyes:
 
http://www.realworldtech.com/page.cfm?ArticleID=RWT032212172023&p=2

"The catch is that the Kepler core is a poor fit for compute applications. The excellent efficiency for graphics has undoubtedly come at the cost of general purpose workloads. As our analysis showed, Nvidia’s architects made a conscious choice to quadruple the FLOPs for each core, but only double the bandwidth for shared data. The result is that the older Fermi generation is substantially better suited to general purpose workloads and will continue to be preferred for many applications.
...
Given this situation, it seems highly likely that Nvidia’s upcoming compute products will use a core that is tuned for general purpose workloads. It will be a derivative of Kepler, to re-use as much of the engineering effort as possible, but with several significant changes."

Seems Dave also believes there's going to be a second line of products aimed at compute in the future, moving away from their original Fermi approach of an all-in-one solution.

Okay... does ANYONE have any PPD info for the 680? It's posts like this that make me wonder if this new architecture is going to fold better than the 5xx gen.

 
Don't see what all the hype is over reference cards.

We already know the non-reference cards are going to be much better. I guess if you want the rear exhaust, it's the way to go.

I just don't like the reference designs because they clog with dust very easily. When you're pushing 400+ CFM through your case, some of it filtered, some not, things get dusty fast and therefore clogged.
 
Had a stock GTX 260 running for 4 years in a case with barely any dust in it. However, all my cases are designed with intake air going through filters and 5% to 10% positive air pressure, to keep air from coming in through unfiltered openings and to help cooling. It will really come down to case design and preference, I guess.
 
Can you please show me in the review where the GTX 680 ran out of memory? I went through the review again and couldn't find it.

I can't show you in this review, but if you read other review articles on this very site, you'll find situations where [H] themselves attributed drops in frame rates for the GTX cards to textures exceeding the card's memory. I don't know why the tests they just did at Eyefinity/Surround resolutions didn't reveal a problem. Perhaps they didn't push the AA high enough? In any case, unless Nvidia is doing some on-the-fly texture compression and decompression that somehow allows more than 2GB of textures to be stuffed into 2GB, this IS a documented issue. And in the event that Nvidia HAS figured out how to squeeze larger textures into smaller amounts of video memory with no performance penalty or loss in quality, I'd love for [H] to tell us how this marvellous new technology works.

In the meantime, this isn't really a debatable point in the absence of such magical technology. It's like asking me to prove that 2.5 gallons of water won't fit into a 2-gallon container. The proof is self-evident.
 
I can't show you in this review, but if you read the reviews of the 1.5GB GTX 580 vs. the 3GB 7970 on this very site, you'll find situations where [H] themselves attributed drops in frame rates for the GTX cards to textures exceeding the card's memory. I don't know why the tests they just did at Eyefinity/Surround resolutions didn't reveal a problem. Perhaps they didn't push the AA high enough? In any case, unless Nvidia is doing some on-the-fly texture compression and decompression that somehow allows more than 2GB of textures to be stuffed into 2GB, this IS a documented issue.

This isn't really a debatable point. It's like asking me to prove that 2.5 gallons of water won't fit into a 2-gallon container. The proof is self-evident.

Maybe because 2GB is more than 1.5GB? Perhaps Nvidia didn't pick a number out of their hat, but actually did some research before deciding on 2GB for these cards. Or maybe they just got lucky in their testing. But just because a GTX 580 hits a VRAM limit with 1.5GB doesn't mean a GTX 680 is going to hit a limit with 2GB.
 
You'd think Nvidia would have learned from all the aftermarket 3GB GTX 580s what to do on their 6-series cards, or followed AMD's lead and matched the 7970 on framebuffer. Nvidia does what they want; they're like Robin Hood, so cool!
 
Maybe because 2GB is more than 1.5GB? Perhaps Nvidia didn't pick a number out of their hat, but actually did some research before deciding on 2GB for these cards. Or maybe they just got lucky in their testing. But just because a GTX 580 hits a VRAM limit with 1.5GB doesn't mean a GTX 680 is going to hit a limit with 2GB.

They didn't have many choices on RAM sizing, given the memory bus: a 256-bit bus is eight 32-bit channels, and with power-of-two chip densities that means 1GB, 2GB, or 4GB.

In the case of the 7970, its 384-bit bus means the choices were 1.5GB or 3GB.

2GB is not enough for extreme-resolution gaming + AA.
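For a sense of scale, here's a back-of-envelope tally of just the render targets at 5760x1200 with 4x MSAA; the buffer layout is an illustrative assumption, and real VRAM use is dominated by textures, so treat it as a lower bound:

```python
# Rough render-target math for 5760x1200 with 4x MSAA.
# Illustrative assumptions: 32-bit color, 32-bit depth/stencil, double
# buffering. Real usage varies per game/engine, and textures dominate.
width, height = 5760, 1200
pixels = width * height                 # 6,912,000 pixels

msaa = 4
color = pixels * 4 * msaa               # multisampled color, ~105 MB
depth = pixels * 4 * msaa               # multisampled depth/stencil, ~105 MB
present = pixels * 4 * 2                # resolved front + back buffers, ~53 MB

total_mb = (color + depth + present) / 2**20
print(f"{total_mb:.0f} MB")             # ~264 MB -- framebuffers alone don't
                                        # blow 2 GB; textures are what push
                                        # cards over the limit
```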
 
Don't see what all the hype is over reference cards.

We already know the non-reference cards are going to be much better. I guess if you want the rear exhaust, it's the way to go.

I just don't like the reference designs because they clog with dust very easily. When you're pushing 400+ CFM through your case, some of it filtered, some not, things get dusty fast and therefore clogged.

Non-reference cards look like poo. Poo might be too much of a compliment.
 
2GB is not enough for extreme-resolution gaming + AA.

Didn't seem to cause too many problems in the [H] testing, or with the 2GB 6970s. We'll have to wait and see how much the 2GB limits them. For now, the only reason people think 2GB isn't enough is because the 79xx cards come with 3GB, not because any testing has shown that 2GB isn't enough.
 
[attached image]


Someone on FB made me jelly.
 
Three 680 cards, and one 089 card, that's gonna be a monkey wrench ... :D
 
Okay... does ANYONE have any PPD info for the 680? It's posts like this that make me wonder if this new architecture is going to fold better than the 5xx gen.


Well, I don't have exact data, but the first page of that analysis had some hard numbers and some useful indicators:
http://www.realworldtech.com/page.cfm?ArticleID=RWT032212172023&p=1

"The shared data bandwidth for the Kepler core is 0.33B/FLOP with 32-bit accesses, just half of GF104
...
The significant regression in communication bandwidth is one of the clearest signs that Nvidia has backed away from compute workloads in favor of graphics for Kepler
...
The other architectural change that favors graphics is simplified scheduling. The JIT in Kepler’s graphics driver is now responsible for scheduling instructions that can execute without any register dependencies
...
general purpose workloads are far less predictable and benefit from more dynamic scheduling; there is a reason that Fermi had such hardware in the first place."

So I think Nvidia will be pushing GPU compute even harder toward its professional solutions; it's not making a "jack-of-all-trades" consumer chip like it tried to do with the 580.
 
Didn't seem to cause too many problems in the [H] testing, or with the 2GB 6970s. We'll have to wait and see how much the 2GB limits them. For now, the only reason people think 2GB isn't enough is because the 79xx cards come with 3GB, not because any testing has shown that 2GB isn't enough.

Then you haven't seen many CrossFire vs. SLI reviews of the new cards.

I'll wait for official [H] reviews, but from what I've seen elsewhere, 7970 CrossFire, when it works properly, spanks GTX 680 SLI at extreme resolutions.
 
Then you haven't seen many CrossFire vs. SLI reviews of the new cards.

I'll wait for official [H] reviews, but from what I've seen elsewhere, 7970 CrossFire, when it works properly, spanks GTX 680 SLI at extreme resolutions.

Actually I've looked at all the SLI reviews I could find, which wasn't too many, and in none of them did it appear that 2GB was a limiting factor, even at Eyefinity resolutions. The 7970 might be faster in those instances, but that doesn't mean that 2GB isn't enough. I do agree that the place you will likely find limitations is high settings with multiple monitors, I just haven't seen the tests that show it as a problem yet. You still might run out of GPU power before you run out of VRAM in those instances.

When it works properly. Nice caveat.

Edit: Looks like Brent ran into some problems at 5760x1200 with 4x AA. Although with only 50 FPS at 2x AA, I'm not sure how playable 4x would be anyway.
 
Edit: Looks like Brent ran into some problems at 5760x1200 with 4x AA. Although with only 50 FPS at 2x AA, I'm not sure how playable 4x would be anyway.

Hmmm. The plot thickens. The problem is that in 2-way SLI you'd probably be pulling closer to 100 FPS at that resolution with 2x AA (I'm being generous to Nvidia's driver optimization team), which means 4x AA should be perfectly playable as long as you don't run out of video memory.
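The arithmetic behind that estimate, with assumed factors for SLI scaling and the 2x-to-4x AA hit (ballpark guesses, not review data):

```python
# Assumed numbers, not measurements: ~1.9x SLI scaling and ~20% cost going
# from 2x to 4x MSAA are ballpark figures for illustration only.
single_2xaa = 50.0            # FPS, one card at 5760x1200 with 2x AA (review)
sli_scaling = 1.9
aa_factor = 0.80              # 4x AA keeps ~80% of the 2x AA frame rate

sli_2xaa = single_2xaa * sli_scaling    # ~95 FPS with two cards at 2x AA
sli_4xaa = sli_2xaa * aa_factor         # ~76 FPS at 4x AA -- playable, if
                                        # the 2 GB framebuffer holds out
print(round(sli_2xaa), round(sli_4xaa)) # 95 76
```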

So many quick-fingered buyers... there may be a fly in that ointment yet. It's almost as if all anybody wanted to hear was that the new Nvidia card was 'faster', and that was enough for them to lose their marbles and load up their Newegg carts without taking an extra couple of minutes to consider the implications of a 2GB frame buffer. Honestly, who would plunk down $500, or even $1000 for two cards, before seeing any meaningful investigation of whether 2GB is adequate for multi-monitor? I could understand this kind of enthusiasm for the competitor to the 7850 or 7870 cards (whatever they'll be called) in the $200-$300 range, if it provided a similar degree of improvement over the competing AMD card, but this is a $500 card, for Christ's sake. You're MUCH better off buying two $250 cards from either the red or the green team and running them in CrossFire/SLI for 1920x1080 gaming. But a $500 card implies you're at least THINKING of going multi-monitor, and almost certainly that you're eventually hoping to throw in a second card. Do you really want to plunk down a thousand bucks and THEN hear that your 2GB frame buffers are inadequate? Unless you're sure you'll never run anything higher than 1920x1080, that just seems kinda... impulsive. I'm waiting to hear the final word on this before I buy anything.

(Oliver Twist-style London accent on) Brent, please sir, may we have a multi-monitor texture utilization review, please! Before hundreds of thousands of lemmings run right off a cliff! (Accent off)

Thanks in advance, [H].
 
Honestly, who would plunk down $500, or even $1000 for two cards, before seeing any meaningful investigation of whether 2GB is adequate for multi-monitor? ... Do you really want to plunk down a thousand bucks and THEN hear that your 2GB frame buffers are inadequate? Unless you're sure you'll never run anything higher than 1920x1080, that just seems kinda... impulsive. I'm waiting to hear the final word on this before I buy anything.

Have you heard of many people with over 1GB of VRAM having frame buffer issues in NV Surround or Eyefinity? I haven't. I'm not really worried about it.
 
Hmmm. The plot thickens. The problem is that in 2-way SLI you'd probably be pulling closer to 100 FPS at that resolution with 2x AA (I'm being generous to Nvidia's driver optimization team), which means 4x AA should be perfectly playable as long as you don't run out of video memory.

Huh? Did you misread the resolution that was at? It's only getting 50 FPS at 2x AA, and 60 with no AA, so how would they be getting 100 with 4x AA? You'd hardly even get 100 FPS at 1080p with those settings, forget about 5760x1200.
 