ASUS GeForce GTX 680 DirectCU II TOP GPU Review @ [H]

Clocked at the same frequency, there is a gap of almost 10% in performance between the GTX 670 and the GTX 680 in theoretical benchmarks such as 3DMark 11. In games it is another story: we noticed only a 3% difference there, nothing to worry about I would say. The reason for these differences can be found in the specs of the card. The GTX 680 simply has 192 more shaders than the GTX 670.

http://www.ocaholic.ch/xoops/html/modules/smartsection/item.php?itemid=742&page=0
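
The arithmetic behind that quoted claim is easy to check. A minimal sketch, assuming NVIDIA's published CUDA core counts for the reference cards (1536 for the GTX 680, 1344 for the GTX 670):

```python
# Sanity check of the shader-count argument from the quoted review.
# Core counts are the published specs for the reference cards.
GTX_680_CORES = 1536
GTX_670_CORES = 1344

extra_cores = GTX_680_CORES - GTX_670_CORES
theoretical_gain = GTX_680_CORES / GTX_670_CORES - 1

print(f"Extra shaders: {extra_cores}")                              # 192
print(f"Theoretical clock-for-clock gain: {theoretical_gain:.1%}")  # 14.3%
```

The ~14% theoretical ceiling is consistent with the review's numbers: a shader-heavy synthetic like 3DMark 11 gets closer to it (~10%), while real games hit other bottlenecks and show only ~3%.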

Now keep going on with your 680 nonsense, but most people here know better. :rolleyes:
 
Nice work comparing to a 680 that isn't unlocked and is intentionally downclocked :rolleyes: You can't overclock a 670 to the same levels because of the Kepler boost throttle and overcurrent throttle. Again, how convenient that you ignore this.

As I said, the UNLOCKED 680 with voltage will completely destroy the 670, which I showed you on the previous page. If it makes you feel better about your cheap purchase, though, it's all good.
 
Learn to read. I said there was about a 5% difference CLOCK for CLOCK. Keep making a fool of yourself, though, by changing arguments over and over.
 

Just give up - if they want to argue that the 680 is better overclocked (as if the 670 can't also be overclocked) let them. If they want to argue that the extreme editions of the 680 are better than the 670, well that's true, but not many people want to pay the premium for that kind of performance. It's like arguing that the 690 is better than the 670 - no shit, but the fact of the matter is that, for most people, the 680 isn't worth it over the 670. Anyone who can read a review and do some simple math knows the real story.
 

That's exactly what I thought, but wanted to get some other opinions so I asked the initial question.
 
Does the VGA Hotwire from the Maximus V Extreme work with any 680, or only with certain cards like the ASUS DirectCU II TOP? Will it work with an EVGA Classified (instead of EVBot)? How about a 670?
 

It will, but you need to mod the card and it's complicated, so in short: only DCII/ROG cards.
 
Looking at details for my first foray into modding, and I want to watercool one of these. Does anyone know of a waterblock that will fit this card? I was looking at EK initially, but I understand they may not be the best way to go for a waterblock, considering recent quality issues.
 

I believe EK is the only one who currently makes a block for it. I wish someone would WC this card and post some shots. ;) I own two of these cards and plan to WC them, but not for another month or more.
 
Guys, please mention your ambient room temps and the temps of your TOP at idle and full load. I have an NZXT Phantom 410 with two AF140s at the top for exhaust and two SP120s on the front for intake. I have also installed the stock NZXT 140 on the side panel, right on top of the card, as an exhaust. The thing is, I am still not able to get the temps below 67 degrees on the card. What am I doing wrong? Or am I just too thick to expect lower temps from my card? At idle the temp seems OK at 33 degrees. My average room temperature is around 25 degrees.

I am also having massive problems with the VSync settings on the card. I am using the 301.2 driver at the moment and I can't seem to get the Adaptive VSync option to work properly. When I enable Adaptive VSync in the control panel, the game just does not seem to recognize it: I still get tearing on my 1920x1080 monitor and the frame rate still goes above 60 fps. If I switch to the recently released 306.xx beta driver, Adaptive VSync works, but I get massive fps drops in games. Now I don't know which driver and settings to use with the card. Completely lost... :(
 

Download MSI Afterburner and modify the fan curve. Also, make sure there's good case airflow to feed the GPUs. The stock coolers on these cards are good, but they dump their heat into the case, which is bad, so keep that in mind. See my sig for hardware. I run both GPUs at 1300 MHz boost and at max power target for games, etc.

I ran a very quick test for you as I'm about to crash for the night...

Idle: Room 23C, GPU1 31C, GPU2 36C
After FurMark 1.10.1 Benchmark Preset 1080: GPU1 64C, GPU2 58C
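
Snapshots like the ones above can also be logged programmatically instead of being read off the Afterburner overlay. A minimal sketch that parses the CSV output of `nvidia-smi` (the query flags are the standard ones; the canned sample string stands in for a live reading so it runs without a GPU):

```python
import subprocess

def read_gpu_temps(sample=None):
    """Return a list of (gpu_index, temp_C) tuples from nvidia-smi CSV output.

    Pass `sample` to parse canned output instead of invoking nvidia-smi,
    e.g. for testing on a machine without an NVIDIA GPU.
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=index,temperature.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    temps = []
    for line in sample.strip().splitlines():
        idx, temp = (field.strip() for field in line.split(","))
        temps.append((int(idx), int(temp)))
    return temps

# Canned reading matching the idle numbers reported above.
print(read_gpu_temps(sample="0, 31\n1, 36"))  # [(0, 31), (1, 36)]
```

Running this in a loop during a FurMark pass gives a proper idle-to-load delta per card instead of a single eyeballed number.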
 
That is pretty good considering you have two in SLI... but I wonder how you keep the cards so cool, especially when the top card blows hot air right onto the bottom one; logically it should get pretty toasty, but yeah, as you said, case airflow also matters. I fiddled with the case fans a bit after posting and found they had been set to the lowest possible speed, which is why I was getting high temps. Now I am idling at 31 and loading at 59 max during intense sessions of Crysis 2, which I think is much better; it gives me more room for overclocking, and temps are even lower in less demanding games. The 140 I recently installed on the side panel is doing a marvelous job: it took nearly 8 degrees off the GFX card. lol

I am not sure about this, but the above temps are with VSync set in-game and not in the control panel. When I disable VSync in the control panel, I ramp up to 65 degrees... weird :confused:
 