NVIDIA Kepler GeForce GTX 680 Video Card Review @ [H]ardOCP

Amazing results. I would never have thought it possible from nVidia. Thanks for the article.
 
Three 680 cards, and one 089 card, that's gonna be a monkey wrench ... :D

lol. I am not OCD or anything. But how does someone put those in the boxes, see the three and the fourth sticking out, and not get the twinge to turn it over?
 
Would it be a good idea for me to SLI 680s on a single 1200 res monitor? I play BF3 and I would like it to never dip below 60fps while playing @ ultra 4xAA.
 
Would it be a good idea for me to SLI 680s on a single 1200 res monitor? I play BF3 and I would like it to never dip below 60fps while playing @ ultra 4xAA.

Why would you waste $1k worth of GPU on a cheap monitor?

Get a 2560 monitor or go surround. And don't waste money if your monitor is low resolution - just IMO of course :)
 
lol. I am not OCD or anything. But how does someone put those in the boxes, see the three and the fourth sticking out, and not get the twinge to turn it over?
I'm one of those people who must always keep his money in his wallet facing the same way, ordered from the smallest denomination to the largest. Yeah, the one box drives me nuts.
 
I'm one of those people who must always keep his money in his wallet facing the same way, ordered from the smallest denomination to the largest. Yeah, the one box drives me nuts.

I do the same to keep track of how much I'm spending. That's normal I think.
The funny thing is they only used packing peanuts on one side of the box.

A bigger box would have been better and more secure.
 
I am going to assume that one box with four GPUs is courtesy of NewEgg's crappy packing department.
 
Delete post, I changed out the SLI connector with a new one and it's working properly now. Yay for random failures!
 
Comparing 680 SLI to 7970 CrossFire, there are areas that are 10-20 fps slower in Crysis 1/2; this is at 2560 resolution with ultra textures / DX11. Same for Metro 2033 and The Witcher 2 at the same resolution.

Those are all games I want to play on my new 30" LCD at 2560. Is one 680 enough to play them MAXED OUT?
 
Excellent review, as always! I really appreciate the effort you guys put into getting everything right, and getting it out so fast for us nitpicking readers! I'm really tempted to buy one of these cards right now. :) But I'm going to wait for a few weeks. There is no real emergency here, my aging GPU will still play all my games. I always thought that buying a new high-end GPU on the first week of release was kinda asking for it anyways (and not only the early-adopter premium).

As impressed with the GTX 680 as I am, I'm still trying to figure out where it best fits considering the choices NV made for this part. The [H] review says that at 1080p/1200p, the 680 is the clear winner over the 7970. And the 2GB frame buffer will certainly pose no problems at those resolutions. But looking at the numbers I think it's pretty clear that both Kepler and Tahiti are overkill on a single monitor @ 1080/1200p. Yet at the higher resolutions that can really use the power of these beastly GPUs, the Tahiti part runs pretty even with Kepler, not even taking Tahiti's OC headroom into consideration.

Here is my thought experiment: under what conditions would a GTX 680 improve real-world gaming experience over the HD 7970 at 1080p? I wasn't aware of any title that could make the 7970 cry for help at that resolution. I mean, if the 680 can deliver 145fps in one title while the 7970 can only manage 120fps with the same title, is there a real-world gameplay difference? Even on a 120Hz HD display? If someone currently has a single 1080/1200p monitor, they may certainly be considering an eventual triple-display config as they plunk down for a top-of-the-line GPU, which only makes sense. But once you get to 5760x1080+ resolutions the performance delta between the two cards vanishes, or even reverses. So, again, where does the 680 fit into the high-end gaming ecosystem? Has Nvidia re-taken the flat-out performance crown, or has it taken the efficiency crown from AMD (a real switch)? Is the top performer at 1080p the benchmark for high-end gaming? Or would that be 1600p? Or is it NV Surround/Eyefinity?

Obviously, until AMD drops their prices the GTX 680 is a better choice at the resolutions that it beats or matches the 7970, if for no other reason than being $50 cheaper. But at those resolutions, wouldn't an even more appropriate choice be a non-flagship GPU? Are there titles that bring a GTX 570 or 560 Ti or Radeon 6970 or 7800 series to their knees at 1080p? (I'm curious.)

---------------------------------------------------------------------------------------------------------------------------
Of course it is not only frames/sec that improves the gameplay experience, and NV has come to the table this time with a passel of new features: Adaptive V-sync, SB-style auto-clocking for max efficiency, a new AA mode, a lifted limit on simultaneous texture processing, a lower TDP and power draw than AMD (!), and other goodies. These considerations, along with my personal frustration over AMD drivers of late, are currently leading my list of reasons to upgrade. But it is a tough choice for me as I ponder my future gaming situation. If I know I am staying with my single 24" display, then I won't worry about the 'performance-midrange' memory sub-system on the 680 (compared to the 580 or the HD 7900 series). If I think there is a good chance I will go to a higher resolution for gaming within the next year, then I think the Tahiti part would serve me better. Maybe. Damn...
 
Not a chance. Even 680 SLI won't max out Witcher 2 and Metro 2033 at 1600p

OK, I must assume that those games are not typical because last Aug I was rebuilding my computer and I was told this

"It really comes down to if you're going to go multimonitor or not. Because the build I posted above is overkill for a single screen. Well, the GPUs anyways. "

About these GPUs

MSI N580GTX Lightning Xtreme Edition GeForce GTX 580 (Fermi) 3072MB x 2

So what DOES it take to play games maxed out these days on one 30" LCD?

PS Games I am waiting to play include Crysis 1/2, Far Cry 1/2/3, ME 1/2/3, Deus Ex HR (all bought at Steam's Christmas Sale)

I played Metro 2033 with the best settings I could with 8800 Ultras in SLI, and was frankly not that impressed with the graphics. What would max settings get me? Crysis is MUCH more impressive.
 
But looking at the numbers I think it's pretty clear that both Kepler and Tahiti are overkill on a single monitor @ 1080/1200p.

Are there titles that bring a GTX 570 or 560 Ti or Radeon 6970 or 7800 series to their knees at 1080p? (I'm curious.)

Great post; let me address a couple of points. Battlefield 3, especially with 4x MSAA, is where it matters. I'm going from a 580 to a 680. Looking at the review, a min of 40 fps vs 29 fps and an average of 59.4 vs 45.6. This is at 1080p. To me, especially in a fast-paced multiplayer shooter, this makes a huge difference. The min fps is especially where it comes into play. Deus Ex also shows an improvement against a 7970. While the average fps is the same, the min fps is far higher, which smooths out the experience. I also hate tearing, so Adaptive V-sync is something I'm going to spend a lot of time with, and I hope it will work out very well. Some of the new tech in this generation is amazing. Add in the lower temps/power usage and it's a real winner.
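
To put those minimums in perspective, a frame time is just 1000 ms divided by the frame rate, so the gap is easy to eyeball. A quick sketch, using the review numbers quoted above:

```python
# Frame time (ms) for the minimum and average fps figures quoted above.
for label, fps in [("580 min", 29), ("680 min", 40),
                   ("580 avg", 45.6), ("680 avg", 59.4)]:
    print(f"{label}: {fps:5.1f} fps -> {1000 / fps:5.1f} ms per frame")

# 29 fps means ~34 ms frames versus ~25 ms at 40 fps; those worst-case
# spikes are what you actually feel in a fast multiplayer shooter.
```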

I want to address what I think is a VERY common misconception or maybe even elitist attitude on the forums: that these cards are "overkill" for 1080p. In many of the games tested these cards are not overkill. A lot of it has to do with personal preference, so I will state mine. I have a 27" 1080p monitor and a 50" TV, both of which are connected to the PC. Depending on the game I am playing I use the monitor or TV (BF3 vs Street Fighter, for example). However, I like to CRANK things all the way up. Having a GTX 580 (soon a GTX 680) allows me to turn on all the eye candy, including MSAA etc., to have the visuals really shine. I also expect 60fps as often as possible. Combine these settings and the GTX 680 is a perfect fit for max eye candy 1080p gaming.

I have also noticed that people who game on multi-monitors have to frequently turn down effects because the cards cannot handle everything maxed out. They also run into issues with proper multi-monitor support in some games as well as SLI issues if they are using a 2nd or 3rd card. Don't get me wrong, there is nothing wrong with multi monitor setups. If that is what you prefer then please by all means enjoy it and ignore my rant about 1080p single monitor. I just simply prefer to game from 1 display and wanted to raise awareness that high end cards are needed for even 1080p to drive the newest games with all settings maxed out.
 
I max everything out at 2560x1600 no problem on a single 7970... not 60FPS in every case but that doesn't bother me. As long as it doesn't dip below 30 and plays without any noticeable hiccups, I'm good...

Also I find that sometimes, maxing everything out doesn't make a game look better, subjectively. Metro 2033 isn't very demanding at all if you turn off the Depth of Field effect and I think the IQ is worse with it on. The way it blurs objects looks very unrealistic and overdone.

Crysis 2 has way better DoF, and it doesn't kill your performance either.
 
Most games, to be "maxed out," have one inane setting that makes FPS plummet but does little for overall visual quality. In Metro 2033 it's "Depth of Field," which is just some overzealous bokeh filter; in The Witcher 2 it's "Uber Sampling," which is just some crappy super-sampling method. Either can be turned off with no noticeable visual quality loss, and then those games will play fine on a single high-end card (7970, GTX 680, etc.).
 
Most games, to be "maxed out," have one inane setting that makes FPS plummet but does little for overall visual quality. In Metro 2033 it's "Depth of Field," which is just some overzealous bokeh filter; in The Witcher 2 it's "Uber Sampling," which is just some crappy super-sampling method. Either can be turned off with no noticeable visual quality loss, and then those games will play fine on a single high-end card (7970, GTX 680, etc.).

YES! LOGICAL! THANK YOU!

The term MAXED OUT gets overused and, in the hands of an anal gamer, can be very misleading :)
 
YES! LOGICAL! THANK YOU!

The term MAXED OUT gets overused and, in the hands of an anal gamer, can be very misleading :)

Says the guy who used it in the question in the first place :rolleyes: If you don't define maxed out as all settings to the highest at close to 60 FPS then what is it?
 
Nvidia brought some much needed competition in prices, which is great for us consumers, though not so great for my 7970 resale value. :p I guess that's the price to pay for getting this level of performance 3 months early.

nah, if you're planning on keeping it, then it doesn't hurt your resale value at all imo as it would be a ways down the road anyway.

Cheers

Google Chrome extension:
Page Monitor

It refreshes a page at a specified interval (don't set it to something crazy like 1 sec; 30 secs is more reasonable) and, more importantly, it tells you when that page has changed.
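
If you'd rather roll your own than install an extension, the same idea is only a few lines of Python. A minimal sketch: the URL is just a placeholder, and it assumes the third-party requests package is installed:

```python
import hashlib
import time

import requests  # third-party: pip install requests

URL = "https://www.example.com/product-page"  # placeholder; put the page you care about here
INTERVAL_SECONDS = 30  # poll every 30 seconds; don't hammer the site

def page_fingerprint(url: str) -> str:
    """Fetch the page and return a hash of its body."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return hashlib.sha256(response.content).hexdigest()

last_seen = page_fingerprint(URL)
while True:
    time.sleep(INTERVAL_SECONDS)
    try:
        current = page_fingerprint(URL)
    except requests.RequestException:
        continue  # transient network error; try again next cycle
    if current != last_seen:
        print("Page changed!")  # swap in whatever notification you like
        last_seen = current
```

Hashing the whole body will false-positive on pages with rotating ads or timestamps, which is why the extension's smarter diffing is the easier route.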


but this would take the fun out of "do you feel lucky punk, well do ya" F5 trigger pulls :)
 
Says the guy who used it in the question in the first place :rolleyes: If you don't define maxed out as all settings to the highest at close to 60 FPS then what is it?

OK, fair enough, maxed out IS maxed out, but then at least say afterwards that at these lesser settings there is no loss in image quality. I'm sure you realize not all games are coded well, so it does not seem fair that some badly coded effect that does not enhance IQ should make a GPU look weak because it can be said it cannot handle game X maxed out.
 
YES! LOGICAL! THANK YOU!

The term MAXED OUT gets overused and, in the hands of an anal gamer, can be very misleading :)

In general, maxed out, to me, means highest settings enabled and an average fps beyond your monitor's refresh rate.
So 30fps with maxed settings is great and all, but it's below your monitor's refresh rate, thus a subpar experience.
I mean a person can say this and that about 30fps, but once your rated 60/75/120 refresh rate is reached, it feels like a different game.
 
I think "maxed out" means maxed out. My point was, why bother? You'll really end up paying $100's of dollars more for little more just because of poor or unnecessary programming.
 
Great post; let me address a couple of points. Battlefield 3, especially with 4x MSAA, is where it matters. [...] I want to address what I think is a VERY common misconception or maybe even elitist attitude on the forums: that these cards are "overkill" for 1080p. In many of the games tested these cards are not overkill. [...]

Agreed that 7970 or 680 may or may not be overkill for 1080p depending on personal preferences. Minimum frame rates are the key and the newer cards do a much better job keeping the minimum 60 fps goal. Obviously some of us do not mind the lower frame rates as much.
 
Agreed that 7970 or 680 may or may not be overkill for 1080p depending on personal preferences. Minimum frame rates are the key and the newer cards do a much better job keeping the minimum 60 fps goal. Obviously some of us do not mind the lower frame rates as much.

I have friends who tolerate fps below 60, but I cannot. I even have a friend who can't stand not being at 120fps+, and to achieve that he just lowers settings; it seems fps is more important than texture resolution for him.

it all looks so stuttery below 60 to me :(
 
I even have a friend who can't stand not being at 120fps+, and to achieve that he just lowers settings; it seems fps is more important than texture resolution for him.

I'm part of that group, for multiplayer at least.
 
I have friends who tolerate fps below 60, but I cannot. I even have a friend who can't stand not being at 120fps+, and to achieve that he just lowers settings; it seems fps is more important than texture resolution for him.

it all looks so stuttery below 60 to me :(

Yup, it's all about preference, and I'm also in the same group of people who will not tolerate anything under 60 fps, hence my username. ;) Once a game dips under 60 fps, it's just not smooth.
 
I remember the days of 120Hz CRT monitors, and what a huge relief for eye strain it was to raise the ever-present default of 60Hz (and its flicker) to 120Hz. It was a serious improvement: no flickering.

The same applied to my games over a decade ago. I demanded nothing less than 120Hz on my monitors, and easily saw the difference at anything less than 60 or 70 FPS. Hell, some would argue that you can tell the difference below 120 FPS.

"But the human eye can only read up to 30 FPS." they would argue when I would complain about a game that purposely limited it. Does everyone think ur eyes are in sync with the exact 30 FPS the monitor is at? Hell no, and that's why we have 60 Hz. Bit perhaps we have involved, I can see a difference in 60Hz and 120Hz.

Hell, even Batman: Arkham City still limits the FPS to a ridiculous 34 FPS or something like that! It was horrible! Thank social influence that at least the game developers allow someone to go in and edit the settings file manually.

I've been swapping out a lot of mobos lately and have installed games 3 to 4 times over now (almost used up my credits for the BAC installs, how lame) because of the formatting. Every damn time I have to go in and raise all the limits to beyond 130 FPS, in almost every game.

I have 3x 24" 120 Hz monitors. It is required for 3D Vision and I use all three with NV Surround. I can tell a difference at 60Hz vs 120Hz when switching from laptop (that only supports 60Hz, damn Dell) to my gaming machine.

I had 3-way EVGA 580 3GB Hydros that kept most games around 120 FPS, except BF3 under ultra settings with 4x MSAA.

I just sold my prized 580s and dipped into two EVGA GTX 680s, though only 2GB. I suspect disabling MSAA lowers VRAM usage enough to keep it under 2GB. They will be for sale, though, when the 4GB models come out in a few months.

Sorry, my point being: I am putting my faith in the new FXAA, and I think I can lower my home temps and power bills with the much smaller power draw while still getting 120 FPS in games. May have to go 3-way 680s, but we'll see.
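
To put a very rough number on the MSAA / 2GB concern above, here's a crude back-of-the-envelope estimate of just the color and depth render targets at Surround resolution. It ignores textures, shadow maps, extra G-buffer passes, and driver overhead, so treat it as a floor rather than a measurement; the helper function and the 4-byte-per-pixel figures are just illustrative assumptions:

```python
def render_target_mb(width, height, msaa_samples, color_bytes=4, depth_bytes=4):
    """Rough size of MSAA color + depth targets plus a resolved backbuffer, in MB."""
    pixels = width * height
    msaa_targets = pixels * msaa_samples * (color_bytes + depth_bytes)
    resolve_target = pixels * color_bytes
    return (msaa_targets + resolve_target) / (1024 ** 2)

for samples in (1, 4):
    mb = render_target_mb(5760, 1080, samples)
    print(f"5760x1080 @ {samples}x MSAA: ~{mb:.0f} MB just for basic render targets")

# Deferred renderers like BF3's keep several full-size targets around, so the
# 1x-vs-4x gap multiplies; that's a real chunk of a 2GB card before textures.
```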
 
Sorry if this has been mentioned before but...

The 1920x1200 graph says FXAA is ENABLED, while the text before says it was disabled, and only MSAA 4X was applied.
 
That's a very good point.

I've just got my POV GTX 680 today but I'm a little disappointed that there isn't at least one tech demo from NVIDIA showcasing TXAA. That means we'll have to wait until games include it to see just how good it really is. Hopefully, NVIDIA will eventually be able to add it as a selectable AA setting in the driver profiles like they've done for FXAA (finally!!!) in the v301.10 drivers. If they don't then I fear it'll be underused and as much of a letdown as PhysX was.

TXAA can't be enabled in an override fashion. It's based on some level of MSAA (meaning you have to either enable MSAA and then this special TXAA mode, or you need to make a call into NVAPI that will enable both for you). Either way it requires the app to opt in.
 
Zarathustra[H] said:
Really? How come this bug doesn't prevent my 7970 from running at PCIe 3.0 speeds?



It is if you have plenty of PCIe lanes or a single GPU.

Where it becomes relevant is when you are trying to run multiple GPUs on boards with fewer PCIe lanes, or on poorly laid out motherboards that drop the x16 lanes down to x8 or even x4 over nothing.

If you are running SLI and are forced into an x16/x4 config due to the layout, running at 3.0 still means the card in the x4 slot gets the equivalent of x8 2.0 speeds, so there is less of an impact. If you are running at 2.0 speeds, there is a good chance of a performance impact from the card running at x4.
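
For rough context on why x4 at 3.0 lands about where x8 at 2.0 does, here's a quick back-of-the-envelope sketch using the commonly quoted per-lane figures (~500 MB/s for PCIe 2.0 after 8b/10b encoding, ~985 MB/s for PCIe 3.0 after 128b/130b):

```python
# Approximate usable bandwidth per lane, in MB/s, after encoding overhead.
PER_LANE_MBPS = {
    "PCIe 2.0": 500,   # 5 GT/s with 8b/10b encoding
    "PCIe 3.0": 985,   # 8 GT/s with 128b/130b encoding
}

for gen, per_lane in PER_LANE_MBPS.items():
    for lanes in (16, 8, 4):
        print(f"{gen} x{lanes}: ~{per_lane * lanes / 1000:.1f} GB/s")

# PCIe 3.0 x4 (~3.9 GB/s) sits right around PCIe 2.0 x8 (~4.0 GB/s), which is
# why a 3.0 card stuck in an x4 slot takes less of a hit than a 2.0 card would.
```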

1. Because they have different hardware on board. By your logic a game dev would only ever need to test one GPU to ensure that all GPUs work well for the game. The game could still have application bugs via the exact same APIs, but some generations of hardware will show it and some won't. And it's also possible that AMD has played it more fast and loose with some clocking or spec details to work around it. You're oversimplifying the technology.

2. No motherboard forces x16 and x4 for an SLI config. They drop to x8/x8 at worst because x8 vs. x16 has no perf implications in real world gaming scenarios. Look up any article on GPU PCIE scaling. You're manufacturing a use case here.
 
Screenshots show what?
I meant camera shots. They show the blurring on 120 Hz LCDs compared to CRTs.

It does work that way. Non-instantaneous pixel response times yield an after-image, e.g. ghosting, making each frame appear individually less discrete.
Yes, but it is not a plus, nor does it make for a smoother and more fluid picture.
 
2. No motherboard forces x16 and x4 for an SLI config. They drop to x8/x8 at worst because x8 vs. x16 has no perf implications in real world gaming scenarios. Look up any article on GPU PCIE scaling. You're manufacturing a use case here.

While I'm not going to go digging for a source right now, I am almost positive that this statement is false and there are boards that will not swap to x8/x8. Of course these would typically be lower-end and not bought primarily for SLI/CrossFire, but I believe your blanket statement is incorrect.
 
2. No motherboard forces x16 and x4 for an SLI config. They drop to x8/x8 at worst because x8 vs. x16 has no perf implications in real world gaming scenarios. Look up any article on GPU PCIE scaling. You're manufacturing a use case here.

While I'm not going to go digging for a source right now, I am almost positive that this statement is false and there are boards that will not swap to x8/x8. Of course these would typically be lower-end and not bought primarily for SLI/CrossFire, but I believe your blanket statement is incorrect.

You are correct, Tiporaro - there are boards that only support x16x4 CrossFire and SLI because they lack the PCIe switches to split the lanes to x8x8. The Asus P8Z68-V LE is one such motherboard, and there are quite a few others. They are lower-end boards though.
 
You are correct, Tiporaro - there are boards that only support x16x4 CrossFire and SLI because they lack the PCIe switches to split the lanes to x8x8. The Asus P8Z68-V LE is one such motherboard, and there are quite a few others. They are lower-end boards though.

I was going to say Asus is known for doing this in their cheaper SLI boards. Just about every board these days is multi-GPU capable, but the big difference between the cheaper ones and the more expensive ones (other than BIOS options and power options) is their lanes and how they manage them.
 