GeForce GTX 690 - Perfection Inside and Out @ [H]

If you ask me, this is a card that would need PCIe 3.0... that's a lotta info being processed on this beast!


Actually, where PCIe 3.0 would/should/might be important is already there... on the card, between the GPUs. As dual-GPU cards go, I would suggest this card benefits less from PCIe 3.0 than traditional two-card SLI would.
 
TPU got theirs earlier, it appears, so thanks to them we at least have one general piece of information--size:

16.jpg


Dare I say it, I'm digging the industrial design the more I see of it...
 
Anyone notice the card in the last pic looks like it has an H in the design?

Makes me think of the new paradigm in GPU design. Engineers, ask yourselves: "WWKD?"

What would Kyle do ;)
 
I'm really intrigued by the cooler, and the most exciting thing for me is to see how it overclocks, since at stock it will be clocked lower than stock 680s. Perhaps other NVIDIA partners like EVGA will, with cherry-picked GPUs, be able to offer higher-clocked 690s.

From the article, sounds unnecessary to wait for factory OCed cards. Seems like they only underclocked it a little to keep it within the PCI-E spec, which quality PSUs and motherboards can exceed without worry. And if that's the reason for the lower clock, OEMs won't be able to exceed it either. Otherwise they don't get to put the little PCI-E logo on the box.

When nVidia themselves are touting 345W and 1300+ clocks, why worry?

Edit: Kyle also said that all 690s will be nVidia-made regardless of brand, so there won't be any with different specs.
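For anyone wanting to sanity-check the "within the PCI-E spec" point, the in-spec power budget is simple arithmetic. A back-of-the-envelope sketch using the standard PCIe power figures and the 690's two 8-pin connectors (spec values, illustrative only):

```python
# Rough PCIe power-budget math (spec values; connector layout assumes
# the GTX 690's two 8-pin auxiliary plugs).
slot_w = 75            # PCIe x16 slot can supply up to 75 W
eight_pin_w = 150      # each 8-pin PCIe connector is rated for 150 W
budget_w = slot_w + 2 * eight_pin_w

print(budget_w)        # -> 375 (total in-spec power available)
print(budget_w - 345)  # -> 30  (headroom vs. the 345 W figure quoted above)
```

So even the touted 345W overclocked draw still leaves some margin under the 375W a fully-connected card can pull in-spec.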
 
Zarathustra[H];1038669985 said:
I can't even get above 1200 on my regular 680 without instability. :(

The difference in power phases and VRMs is probably the reason, unfortunately. Still, I'm guessing that at that speed it's a very powerful single card?
 
With the improved cooler, better-quality components, and better chips, it should do 1300.
 
The difference in power phases and VRMs is probably the reason, unfortunately. Still, I'm guessing that at that speed it's a very powerful single card?

It is, but I am still ~10% shy of never dropping below 60fps at 2560x1600 in Red Orchestra 2.

So close I can taste it, and I don't want to go SLI as that would introduce input lag :(
 
I'd like to propose 2 new rules to posting in the videocard forums:

1. You must remember your financial situation is not everyone else's. $1000 might be to someone else what $200 is to you. If debating the value of a card, debate its performance/features/price and avoid general statements such as "$1000 is insane for a card."

2. You do not know Nvidia's, AMD's or anyone else's costs, profit margins, R&D time, etc. Unless you work for the specific company you cannot assume how much they are "ripping you off". You are simply making an asshole out of yourself by guessing. Price is determined by competition and supply/demand, not your idea of what is "fair".

Oh goodness, QF the F'n Truth.

I was expecting the clocks wouldn't take much of a hit, given the 680's lower TDP, and they made a work of art to boot. No doubt the most powerful single card coming up, imo. Can't wait to see the numbers.
 
It's certainly clear to me that I want this card. What's less clear to me is why I want it: nothing I play in any way demands what this thing offers.

Still...
 
Yeah, I guess DVI is still the port of choice for most PC display connections... not sure when HDMI will take that over.
 
There are too few native DP displays to consider switching from DVI to DP anytime soon. I'm guessing HDMI would be the port to switch to, but the HDMI handshake can be fussy with some equipment. DVI is more hassle-free, I reckon, and offers the same quality (sans audio).

Anyways... this card looks badass... the metal housing is a very nice touch. For sure it will need some LED spot lighting on it. :)
 
That IS a great-looking card. I'd still take dual single-GPU cards for the cooling, and for the fact that dual-GPU cards are normally a bit harder to sell, but damn if that card doesn't look hot. Looks like something that should be on the inside of a Cylon :)
 
I understand this, I'm just questioning the use of DVI these days is all.

I enjoy DVI.

Monitors last a lot longer than video cards do, so if people have a bunch of monitors, they are VERY likely to have DVI ports, and a lot less likely to have DisplayPort compatibility.

In my house, we have 6 monitors and 3 HDTVs. Of these, only 1 has a DisplayPort, and that's my Dell U3011.
 
When I consider the GTX 590, $1000 seems like a lot, but this is two full GK104s with a mere ~30MHz drop in clock. And considering I spent $1,200 on my dual MSI 580 Extremes, it's not as bad. I guess I simply have less disposable income recently too :p

And why are people saying no card is worth $1000? This is two GK104s, and you get like 99% of SLI performance in a single card. It's fantastic.

Wish it had 2x4GB of RAM in case I ever go Surround, but I might never do that, and according to your SLI and Tri-SLI reviews, 2GB seems to work fine on triple 1080p.

If I could sell my cards, I'd grab one.

Zarathustra[H];1038670053 said:
It is, but I am still ~10% shy of never dropping below 60fps at 2560x1600 in Red Orchestra 2.

So close I can taste it, and I don't want to go SLI as that would introduce input lag :(

Input lag... SLI? What? I've never experienced anything like that on my system, and I returned a couple of monitors for their lag.
 
These price complaints are fucking absurd. If you go buy two individual GTX680s, you are going to spend over $1000, and you won't have the improved PCB + VRM solutions, the integrated PCI-E 3.0 bridge chip between GPUs, and you won't have the improved cooler on the 690.

None of you bitching about the cost were likely to invest in two 680s anyways, so just drop it. If AMD does come out with a competitive dual-GPU solution for cheaper, great - I hope they do. But until they do, this is still an excellently positioned video card in the market today.
 
Input lag... SLI? What? I've never experienced anything like that on my system, and I returned a couple of monitors for their lag.

Best illustrated in a not-to-be-named site's 1999 comparison of the dual-GPU Rage Fury MAXX to the single-GPU GeForce 256.

At the same frame rate, an AFR-mode multi-GPU solution will inherently have more delay before keyboard/mouse input is displayed on the screen than a comparable single-GPU solution.

7125601965_776096cd0b_o.gif
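To put a rough number on that inherent delay, here's a minimal sketch under a simplifying assumption (ignoring driver buffering and render-ahead): in AFR, each GPU works on its own frame in parallel, so the pipeline is roughly one frame-time deeper per GPU at the same frame rate.

```python
# Toy model of AFR input lag: at equal fps, an N-way AFR pipeline holds
# roughly N frames in flight before a frame reaches the screen.
# Simplified on purpose -- real drivers add buffering on top of this.

def frame_time_ms(fps):
    return 1000.0 / fps

def display_delay_ms(fps, gpus_in_afr=1):
    # Single GPU: input sampled for a frame shows up ~1 frame-time later.
    # N-way AFR: pipeline is ~N frames deep at the same frame rate.
    return gpus_in_afr * frame_time_ms(fps)

print(display_delay_ms(60, 1))  # single GPU at 60 fps -> ~16.7 ms
print(display_delay_ms(60, 2))  # 2-way AFR at 60 fps  -> ~33.3 ms
```

So at the same 60 fps, 2-way AFR shows your input roughly one extra frame-time (~17 ms) later than a single GPU would, which is the effect being described.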
 
There are too few native DP displays to consider switching from DVI to DP anytime soon. I'm guessing HDMI would be the port to switch to, but the HDMI handshake can be fussy with some equipment. DVI is more hassle-free, I reckon, and offers the same quality (sans audio).

Anyways... this card looks badass... the metal housing is a very nice touch. For sure it will need some LED spot lighting on it. :)

Kyle is l33t; he knows the secret HDMI handshake :eek:
 
It's like complaining that a McLaren MP4-12C is too expensive... most of us aren't going to buy it anyways! :p Feel free to send either over for my personal testing though... :D

If they really are going to hit 1300, then they must be stockpiling the best chips, because I thought the average 680 wasn't even reliably hitting 1200. That, or the voltage control is unlocked or something.

Lol @ the crowbar and crate
 
Lol @ the box... that is cool. You weren't kidding about the crowbar. :p It's very impressive how small they managed to keep this card.
 
Based on the pic with the 680 and 690 together, I wonder how the hell nVidia figured out a way to cool a dual-680 GPU card and keep the physical size about the same as the regular 680?
 
Please, please, please have this for step-up, EVGA. LOL at the crowbar and crate; that crowbar had me wondering what the hell its purpose was until I saw the unboxing post by Kyle.
 
Am I the only one that is bummed at the lack of a back plate? I know it's only for aesthetics, but when you drop a grand on a video card it would be nice.
 
Based on the pic with the 680 and 690 together, I wonder how the hell nVidia figured out a way to cool a dual-680 GPU card and keep the physical size about the same as the regular 680?
It's about the same size as the GTX 590, which used way more power, so I don't see why you'd be surprised. If anything it looks over-engineered for what it is, which is a smart move.
 
It is odd how some of you refuse to figure the cost of the cooling solution into the equation. :) The fin stacks alone are much more expensive, in terms of the metals, than what you find on "stock" cooling solutions. The fan is more expensive as well, and a lot of engineering went into that solution.
 
Well, some of us appreciate original cooling designs. It's the main reason I went with the MSI Lightning 6970. Cost a little more than stock, but it is very quiet and has never annoyed me or, more importantly, the wife.
 
Please, please, please have this for step-up EVGA.

Pretty sure Kyle already said that nVidia was putting these out themselves and there will not be an AIB partner option.

EDIT: Er, maybe he meant that they wouldn't be making non-reference designs but would sell the nVidia-designed boards. Not sure.
 
I'd like to get a pair of 690s, but it'll be a while before I can start my new build, and I'd rather wait to see if there are eventually 690s with more memory.
 
Should have gone 8 GB on the VRAM.....

Two of these, overclocked, with 8gb of VRAM..........

That surely would have made it more expensive with no benefit whatsoever.

Pretty sure Kyle already said that nVidia was putting these out themselves and there will not be an AIB partner option.

EDIT: Er, maybe he meant that they wouldn't be making non-reference designs but would sell the nVidia-designed boards. Not sure.

These will only be built by NVIDIA.

I literally had to use the crowbar to open the box. Had to pry the top off, it was nailed shut.

That is pretty effin cool... for some reason. I call dibs on the box and the crowbar. I want to put it with my 5800 film can.
 