NVIDIA GeForce GTX 690 Dual GPU Video Card Review @ [H]

I'd love to have one of these but I just can't afford it.

I also worry that by the time they drop in price (if they ever do), they won't be making them anymore.

I guess it's a 680 now and then another 680 down the road.
 
A point worth mentioning about the 690's potential effect on your HDD temps, in case no one has brought it up (I didn't see it in the HardOCP review; apologies if you covered it). This comes from the AnandTech review:

"Of course this design means that you absolutely need an airy case – you’re effectively dissipating 150W to 170W not just into your case, but straight towards the front of your case. As we saw with the GTX 590 and the Radeon HD 6990 this has a detrimental effect on anything that may be directly behind the video card, which for most cases is going to be the HDD cage. As we did with the GTX 590, we took some quick temperature readings with a hard drive positioned directly behind the GTX 690 in order to get an idea of the impact of exhausting hot air in this fashion.

The temperature increase is reduced some thanks to the lower TDP of the card, but we’re still driving up the temperature of our HDD by over 10C. This is still well within the safety range of a HDD and in principle should work, but our best advice to GTX 690 buyers is to keep any drive bays directly behind the GTX 690 clear, just in case. That’s the tradeoff for making it quieter and capable of dissipating more heat than older blower designs."
 
Not sure what you're looking at, but this review shows 7970 CF beating the 690 only in BF3.

I didn't mean to put the 7970s in there; I wasn't reading while typing. My point was: yes, it smashed the 7970s, which is great, because that's the way it should be, but having SLI 680s even touch it sucks. Yes, it runs cooler, quieter and everything, but I think Nvidia should have made it a bit more powerful. I may still end up picking up a 690, even on my single 1080p monitor.

IMO, having 680s in SLI come out 2% faster than the flagship 690 is a slight fail in my book, as the 690 was supposed to be the top dog here.
 
IMO, having 680s in SLI come out 2% faster than the flagship 690 is a slight fail in my book, as the 690 was supposed to be the top dog here.

This makes zero sense. One card based on the same GPUs as the single cards is supposed to be faster when it's in a more thermally and power constrained package?
 
This makes zero sense. One card based on the same GPUs as the single cards is supposed to be faster when it's in a more thermally and power constrained package?

Not to mention that a dual-GPU card has NEVER been faster than the SLI "equivalent", so I don't get the comments about how it's a failure because it can't outperform 680 SLI. Besides, it can match or beat 680 SLI if overclocked.
 
Are you sure that isn't a Crossfire problem? (Which AMD has lots of.)

Sometimes people mistake driver problems for microstutter.

If a game stutters, they instantly think, "God, this is that horrible microstutter people talk about."

Oh, believe me, I've been trying to fix this for months; I've tried a ridiculous number of CAP and driver combos, and nothing works. I'm as certain as one can be without some sort of hardware probe that it's microstutter.
 
Maybe I'm just hoping for too much in this industry. My opinion is this: why waste the man hours and money to make a card that is equal to two cards you already have on the market? Okay, it runs quieter and uses less power, which is a plus, but (and yes, I have a "but" here):

Why not just make a card that straight-up murders everything on the market? I wouldn't even waste my man hours developing a card that equals what you can already do. If I ran or owned Nvidia, I'd make a card that just destroyed anything on the market for some time to come. Maybe I'm just wishing for too much, but it makes no sense to me. I'm more than likely going to buy it to replace my 580 (yes, even on a single 1080p monitor). I guess I can dream, right?
 
Maybe I'm just hoping for too much in this industry. My opinion is this: why waste the man hours and money to make a card that is equal to two cards you already have on the market? Okay, it runs quieter and uses less power, which is a plus, but (and yes, I have a "but" here):

Why not just make a card that straight-up murders everything on the market? I wouldn't even waste my man hours developing a card that equals what you can already do. If I ran or owned Nvidia, I'd make a card that just destroyed anything on the market for some time to come. Maybe I'm just wishing for too much, but it makes no sense to me. I'm more than likely going to buy it to replace my 580 (yes, even on a single 1080p monitor). I guess I can dream, right?

Well, there are things like the limits of technology, keeping it from costing $10k, that sort of thing. As a single card, the 690 does destroy everything.
 
It IS a card that murders everything on the market, because everything else you're comparing it to that (barely) beats it is two cards.

Practically the same performance (esp. if you crank up the PF)
Less power
Less noise
Fewer PCIe slots

So yeah, you're totally hoping for too much. If they had a chip they could stick two of on a card and beat the 680, then they could just as easily stick one on a card, and it would be just as good, if not better. Expecting two on one card to be better than two on two cards is completely irrational. Equal performance is the best you can hope for, and the 690 delivers.
 
Maybe I'm just hoping for too much in this industry. My opinion is this: why waste the man hours and money to make a card that is equal to two cards you already have on the market? Okay, it runs quieter and uses less power, which is a plus, but (and yes, I have a "but" here):

Not everyone wants or has the capability for two cards.

Argument invalidated.
 
I really want to replace my GTX 580 with one of these, but I think I'll wait until the 790 =p, and until I get a job again. Have to say, though, this card looks amazing for those who don't want to use more than one PCIe slot.
 
Not to bash this review, but I find them worthless, as I'm sure 90% of this site does. Who cares about 5760x1200 or whatever? Most people here don't have that and don't care to. However, most people here do have a nice 30" 2560x1600 monitor and want those benchmarks. I'm sure they also want 1080p benchies as well, just to see what they really want and need.
 
Not to bash this review, but I find them worthless, as I'm sure 90% of this site does. Who cares about 5760x1200 or whatever? Most people here don't have that and don't care to. However, most people here do have a nice 30" 2560x1600 monitor and want those benchmarks. I'm sure they also want 1080p benchies as well, just to see what they really want and need.

If it can do 5760x1200, it can do 2560x1600. A single GTX 680 can do 1080p.
 
What I hate is this subjective "smoother" crap.
It doesn't matter what FPS AMD gets nowadays; when they achieve higher frames at a lower cost, it's quickly dismissed with the "smoother" gameplay shit.

It's hardly subjective; it's a very real thing and a major difference between CFX and SLI. Pound for pound, my 560 Ti SLI rig is slower in raw frames than my 5870 CFX setup, yet the 560 setup plays and feels significantly better every single time. With multi-GPU, raw frames mean diddly squat when 60 fps looks like 35.
 
Not to bash this review, but I find them worthless, as I'm sure 90% of this site does. Who cares about 5760x1200 or whatever? Most people here don't have that and don't care to. However, most people here do have a nice 30" 2560x1600 monitor and want those benchmarks. I'm sure they also want 1080p benchies as well, just to see what they really want and need.

TPU spends more time on more resolutions than most sites, but I don't hold it against [H] or AT or anywhere else. The general consensus is: if the card doesn't shit the bed at a resolution a few steps above what you run, isn't it clearly capable of running your resolution of choice just fine? In that regard, why bother spending hours and hours re-benching CoD at 15 different resolutions just to appease the few people still on 1280x1024?

No one needing something adequate for 1080p is buying a GTX 690.
That's like buying a Lambo just because you want to go 70.
 
TPU spends more time on more resolutions than most sites, but I don't hold it against [H] or AT or anywhere else. The general consensus is: if the card doesn't shit the bed at a resolution a few steps above what you run, isn't it clearly capable of running your resolution of choice just fine? In that regard, why bother spending hours and hours re-benching CoD at 15 different resolutions just to appease the few people still on 1280x1024?

No one needing something adequate for 1080p is buying a GTX 690.
That's like buying a Lambo just because you want to go 70.

120fps @ 120Hz 1080p requires more GPU power than 60fps @ 60Hz 2560x1600. Same for 1080p 3D. Hell, I need tri 580s (and CPU speed to match, a 2700K @ 5GHz) just to run BF3 MP at 120+ fps/Hz at 1080p.
 
When someone says "1080p", I read that (and I would argue that most other people do as well) as 1080p60, i.e. 1920x1080 at 60fps.
If you want to run higher than that by not running Vsync, that's your prerogative. ;)
 
Why can't you run 120Hz with Vsync? Or 3D with Vsync? Both of those require more GPU horsepower than 2560x1600 @ 60Hz. Point is, 690, 680 SLI and tri-SLI results at 1080p are useful for many people; from them I can gauge 3D and 120Hz performance.
 
Are these going to be sold straight from Nvidia, like the cards you find in Best Buy with just Nvidia branding, or are they going to be sold through their AIB partners?
 
Um, ever heard of supply and demand?

No, troll.... What's that?

Ever heard of voting with your wallet? By expressing my distaste for a vendor's actions, I bring crap like this to others' attention, the hope being to negate any gains Newegg, in this case, might have made through price gouging by convincing others to boycott them and opt for a different vendor.
 
So the 690 partially vents into the case? What does that do for CPU temperatures?
 
When someone says "1080p", I read that (and I would argue that most other people do as well) as 1080p60, i.e. 1920x1080 at 60fps.
If you want to run higher than that by not running Vsync, that's your prerogative. ;)

V-sync adds input lag. I used to be a competitive Quake 3 player, so I'm extremely sensitive to input lag and can't stand playing anything with V-sync enabled. 120Hz monitor + high fps FTW :)
 
So the 690 partially vents into the case? What does that do for CPU temperatures?

You would probably want to put it in a case with an exhaust fan directly in front of the GPU, like the In Win Dragon Slayer. There are probably lots of better cases out there with the same functionality, but you get the idea...
 
120fps @ 120Hz 1080p requires more GPU power than 60fps @ 60Hz 2560x1600. Same for 1080p 3D. Hell, I need tri 580s (and CPU speed to match, a 2700K @ 5GHz) just to run BF3 MP at 120+ fps/Hz at 1080p.

This is just my opinion, but I would take a 2560x1600 PLS screen over 1080p 120Hz any day. Doing normal work on a 1080p 27" is painful; everything is big and blocky (1080p is too low a resolution for a 27"), not to mention it's a TN panel, which is garbage: terrible colors and image shifting when you change viewing angles. Further, 2560 resolution has 3.7 million pixels while 1080p has 2.1; 2560 requires far more horsepower than 120Hz 1080p.

Anyway, that's my personal comment on the matter, although I agree that 1080p benchmarks aren't a bad thing. I'm still confused as to why TPU has 1280x1024 benchmarks, though? That makes no sense at all. Nobody buys a $999 video card for 1280x1024, lol.
 
This is just my opinion, but I would take a 2560x1600 PLS screen over 1080p 120Hz any day. Doing normal work on a 1080p 27" is painful; everything is big and blocky (1080p is too low a resolution for a 27"), not to mention it's a TN panel, which is garbage: terrible colors and image shifting when you change viewing angles. Further, 2560 resolution has 3.7 million pixels while 1080p has 2.1; 2560 requires far more horsepower than 120Hz 1080p.

Anyway, that's my personal comment on the matter, although I agree that 1080p benchmarks aren't a bad thing. I'm still confused as to why TPU has 1280x1024 benchmarks, though? That makes no sense at all. Nobody buys a $999 video card for 1280x1024, lol.

That's why I like to have both, though I prefer 1440p over 1600p. There are some very nice TN panels these days, and not all TN is equal; there's a reason high-end gaming displays like the VG278H cost more than a U2711 despite being 1080p TN vs. 1440p IPS. 60Hz IPS/PLS displays just don't come close for online gaming, nor do they support 3D. But each has its place, IMO: I prefer some games at 1440p 60Hz with Vsync, others at 1080p 120Hz without Vsync, and others in 3D. On a 1080p 27", SSAA is your friend :)

Further, 2560 resolution has 3.7 million pixels while 1080p has 2.1; 2560 requires far more horsepower than 120Hz 1080p.

Yes, I'm aware of pixel count, but it requires more GPU horsepower to run 120fps at 1080p/120Hz than 60fps at 1600p/60Hz, which is how I like to run my games: with fps at or above my refresh rate. 1600p at 60Hz is 3.7MP x 60 = 222 MP/s; 1080p at 120Hz is 2.1MP x 120 = 252 MP/s.

Yeah, 1280x1024 is a bit low for the 690, I'll agree with you there, lol, but it's also good for seeing where CPU limitation ends and GPU limitation begins, I guess.
 
Yes, I'm aware of pixel count, but it requires more GPU horsepower to run 120fps at 1080p/120Hz than 60fps at 1600p/60Hz, which is how I like to run my games: with fps at or above my refresh rate. 1600p at 60Hz is 3.7MP x 60 = 222 MP/s; 1080p at 120Hz is 2.1MP x 120 = 252 MP/s.

I think your calculator is a bit off: 2560x1600 = 4.1MP, so at 60Hz it's 246MP/s.
2560x1600 is almost exactly twice the pixel count of 1920x1080; that's a pretty useful rule to know.
 
I was going from xoleras's figures, but thanks. It's actually 246MP/s for 1600p vs. 249MP/s for 1080p.

2560x1440 is 3.7MP, btw, for 222MP/s.
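Since these MP/s figures keep getting juggled and corrected, here's a quick back-of-the-envelope sketch (purely illustrative, not from any poster) that just multiplies pixel count by refresh rate:

```python
# Compare raw pixel throughput (megapixels per second) for the
# resolution/refresh-rate combos discussed in this thread.

def mp_per_s(width, height, hz):
    """Pixels pushed per second, in megapixels (1 MP = 1,000,000 pixels)."""
    return width * height * hz / 1_000_000

combos = {
    "1080p @ 120 Hz": mp_per_s(1920, 1080, 120),  # ~249 MP/s
    "1600p @  60 Hz": mp_per_s(2560, 1600, 60),   # ~246 MP/s
    "1440p @  60 Hz": mp_per_s(2560, 1440, 60),   # ~221 MP/s
}

for name, mps in combos.items():
    print(f"{name}: {mps:.0f} MP/s")
```

By this crude raw-throughput measure, 1080p at 120Hz does narrowly edge out 1600p at 60Hz, though of course it ignores per-frame overheads that don't scale with pixel count.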
 
It's hardly subjective; it's a very real thing and a major difference between CFX and SLI. Pound for pound, my 560 Ti SLI rig is slower in raw frames than my 5870 CFX setup, yet the 560 setup plays and feels significantly better every single time. With multi-GPU, raw frames mean diddly squat when 60 fps looks like 35.

I ran mGPU with 6870s and it never bothered me once.
It is very subjective, which is why it was hardly mentioned in any reviews on this site until a generation or so ago. mGPU has been around a lot longer, yet there was never any mention of frame metering before. Never before did I see Nvidia cards get mentioned for having higher frames but less smoothness. It's just odd that suddenly it's a deal breaker.
 
I ran mGPU with 6870s and it never bothered me once.
It is very subjective, which is why it was hardly mentioned in any reviews on this site until a generation or so ago. mGPU has been around a lot longer, yet there was never any mention of frame metering before. Never before did I see Nvidia cards get mentioned for having higher frames but less smoothness. It's just odd that suddenly it's a deal breaker.

Yes, it's subjective, not only to the person viewing but also to the hardware setup. It doesn't occur for everyone.

And there has been talk of microstuttering for a long time now. In fact, here is a post on [H] from 2008:

http://hardforum.com/showthread.php?t=1317582

It's only relatively recently, with both AMD and nVidia offering multi-GPU capability, that it has become a subject of contention. Before 2004-2005 there was no SLI/CFX.
 
No 690 vs. 6990? You still need to pit dual GPU against dual GPU, no matter the generation. AMD's latest dual-GPU card right now is the 6990, at least until the 7990 comes out, if it ever does.

I'm still waiting on Big K, which is rumored to stomp this thing into the ground.
 
Yes, it's subjective, not only to the person viewing but also to the hardware setup. It doesn't occur for everyone.

And there has been talk of microstuttering for a long time now. In fact, here is a post on [H] from 2008:

http://hardforum.com/showthread.php?t=1317582

It's only relatively recently, with both AMD and nVidia offering multi-GPU capability, that it has become a subject of contention. Before 2004-2005 there was no SLI/CFX.

Yeah, there was. There was Scan-Line Interleave from 3dfx and others, and it worked very, very well.

Okay, so maybe there wasn't Scalable Link Interface, which is a different thing, but SLI before there was SLI was epic and elite. I had it, and it was a head-turner for sure, with dual 3dfx Voodoo 2s in my Pentium MMX 166.
 
Yeah, there was. There was Scan-Line Interleave from 3dfx and others, and it worked very, very well.

Okay, so maybe there wasn't Scalable Link Interface, which is a different thing, but SLI before there was SLI was epic and elite. I had it, and it was a head-turner for sure, with dual 3dfx Voodoo 2s in my Pentium MMX 166.

Lol. Yeah, not the same technology, really. :p

However, that does leave me wondering whether microstutter, or some form of it, existed back then on Voodoo2 SLI...
 
No 690 vs. 6990? You still need to pit dual GPU against dual GPU, no matter the generation. AMD's latest dual-GPU card right now is the 6990, at least until the 7990 comes out, if it ever does.

I'm still waiting on Big K, which is rumored to stomp this thing into the ground.

It's pretty safe to say the 690 would utterly destroy the 6990; it's not worth testing. A 7990 would most likely be slower than 7970 CF, which is slower than a 690. Not to mention how broken AMD's drivers are right now.

It's amazing to think this could have been a dual 660. :cool:
 
It's amazing to think this could have been a dual 660. :cool:

It makes me wonder, though, whether they would have even made a 690 if the 680 had ended up being a mid-range product.

I think they may have initially expected the 680 to be mid-range, but my guess is they changed that thinking long before the AMD 7000 series was released, rather than releasing it as a top-end card after seeing the 7000 series' performance, as many people believe.
 