GeForce GTX 690 - Perfection Inside and Out @ [H]

What retailer will likely be sporting the GTX 690? Or is this something we need to order direct from nvidia?
 
Given the specs, it isn't hard to predict where the card will end up. Take a 680 SLI benchmark and subtract just a hair, and you're very much in the ballpark.
 
What retailer will likely be sporting the GTX 690? Or is this something we need to order direct from nvidia?

At the end of Kyle's article he said nVidia's normal AIBs will be shipping it, so you won't have to order direct from nVidia. I'd keep an eye on all the major sites though: Newegg, TigerDirect, Amazon, and maybe check some B&Ms, etc.
 
I can think of 49857845893453745873498578934759834589 better ways to spend $1000.
 
edit: I'm guessing partners will put a sticker on the card, but other than that it'll be stock. I wonder if they get to brand the firmware; probably so.
 
Kyle: After you finish testing the 690, is it going into a personal rig, shipped on to the next reviewer, or donated to charity?
 
The thing that shocks me is how everyone expected a lower price than 2x 680 GPUs.

The last two generations had the same pricing when you compare them to their single-GPU counterparts.

I expected over $999, and I think $999 is a damn good deal considering how fucking badass the cooler is.

Also, let's remember early reports of nV wanting to price the 680 at $549 originally, but dropping the MSRP to $499 near launch.
The 690 was probably aimed at $999 from the beginning, but the pricing shift on the 680 made it equal to 2x 680s.

The 680 could have been $600 and people would have paid it.

Without the 7990 on the market, nV can charge a premium for the fastest single card on the planet.
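
For what it's worth, the launch math lines up almost to the dollar. A trivial sanity check (assuming the widely reported $499/$999 launch MSRPs):

```python
# Back-of-the-envelope check: GTX 690 MSRP vs. two GTX 680s.
gtx_680_msrp = 499   # reported launch MSRP, USD
gtx_690_msrp = 999   # reported launch MSRP, USD

two_680s = 2 * gtx_680_msrp
premium = gtx_690_msrp - two_680s
print(f"2x GTX 680: ${two_680s} | GTX 690: ${gtx_690_msrp} | premium: ${premium}")
# -> 2x GTX 680: $998 | GTX 690: $999 | premium: $1
```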
 
As far as I understand, two GTX 690s would be a perfect setup for 3x1080p. Is this what these cards are for?
 
Kyle, will you guys have the ability to look at PCI-E 2.0 vs. PCI-E 3.0 performance in the review?

Especially at the multiple-monitor resolutions. I'd like to know if putting this card on an SB or older platform (like my i7-950) would hinder it at all.
 
Probably once he gets a high end Z77 setup with IVB. For now nVidia has disabled PCIe 3.0 support on the X79 platform (something about some sort of incompatibility with X79 that I'm guessing is Intel's fault) until they have a driver level workaround.

My guess (please don't flame me) is that they want to rule out any sort of processor bottleneck and want to use the top dog processor out there with PCIe 3.0 support. Not sure if we're going to see a comparison using older processors, or less powerful ones.
 
For the record... I am disgraced to say that as an AMD fanboy I got an erection reading the article about this card...
 
Looks to me like instead of just throwing out a dual card they went ass over tit into creating something absolutely incredible, paying extra attention to each and every detail. They've done the hardware, and the performance is exceptional, so now they get to make the 'products' to put on shelves. This is the flagship, and they've cut no corners. I wouldn't be surprised if they don't make money from it at all. If all you're looking at is numbers, you're missing the appeal, and wouldn't have bought one anyway. With this they're giving the high-end users a whole bunch of sugar on top.

Will I buy one? Hell no. Do I want one? Hell yeah!!... that's the point of it.
 
<golf clap> Well said old chap, well said.</golf clap>
 
If it really does come in at MSRP, I'm buying one. Sadly I think it will be marked up by a few hundred...
 
So I did have a legitimate question in my post - can you pair up two 680s with a 690 in SLI?
 
That would be.......no.

NVidia took the approach that high-end car companies do. Put a beast under the hood, and make the outside absolutely slick looking using the best components possible. :D

Sure, you don't NEED a car like that, but you'll damn sure WANT a car like that.
 
I don't think it is entirely clear whether or not nVidia made any architectural enhancements to Kepler for this sort of thing. Have they released any public statements about whether or not this is possible?

It would be cool if it were possible but yeah, unlikely, given that the volume of people that would buy $1500 in GPUs and not have three PCIe x16 slots is rather low...
 
So I did have a legitimate question in my post - can you pair up two 680s with a 690 in SLI?

From what I've read, I think you have to have similar cards for nVidia, and that means 690 + 690. No idea if it's arbitrary or technical.
 
For now nVidia has disabled PCIe 3.0 support on the X79 platform (something about some sort of incompatibility with X79 that I'm guessing is Intel's fault) until they have a driver level workaround.

Well, two things I want to mention.

1.) X79 didn't officially list PCIe 3.0 support, as it predated there being any PCIe 3.0 cards to test compatibility with. It is designed to PCIe 3.0 specs, but PCIe 3.0 is not officially supported. It's how they cover their asses when people complain.

2.) My Radeon 7970 worked perfectly in PCIe 3.0 x16 mode on my X79 platform. How come AMD can design a solution that works with PCIe 3.0 on X79 and Nvidia can't?
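
If anyone wants to check what link their own card actually negotiated, on Linux the kernel exposes it through sysfs; something like this works (a minimal sketch, assuming a kernel recent enough to publish the current_link_speed / current_link_width attributes):

```python
# Minimal sketch: report the negotiated PCIe link speed/width for display
# devices on Linux via sysfs. Assumes a kernel recent enough to expose
# current_link_speed / current_link_width; purely illustrative.
import glob, os

def read_attr(dev, attr):
    try:
        with open(os.path.join(dev, attr)) as f:
            return f.read().strip()
    except OSError:
        return "n/a"

for dev in glob.glob("/sys/bus/pci/devices/*"):
    if not read_attr(dev, "class").startswith("0x03"):  # 0x03xxxx = display
        continue
    print(os.path.basename(dev),
          "speed:", read_attr(dev, "current_link_speed"),
          "width:", read_attr(dev, "current_link_width"))
```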
 
From what I've read, I think you have to have similar cards for nVidia, and that means 690 + 690. No idea if it's arbitrary or technical.

In the past, yes, I believe they only supported similar cards. Was wondering if that changed. Two 680s are functionally the same as a 690, in terms of the GPU anyway.
 
Zarathustra[H];1038672305 said:
2.) My Radeon 7970 worked perfectly in PCIe 3.0 x16 mode on my X79 platform. How come AMD can design a solution that works with PCIe 3.0 on X79 and Nvidia can't?

I would guess simply because it's not worth the R&D dollars. From what we've seen, especially in single-card setups, the PCIe 2.0 bus is still more than enough to feed the card. So instead of wasting time on PCIe 3.0 with no benefit, I would rather their driver team work on Adaptive V-Sync, GPU Boost, dynamic core volt/clock... oh wait, they did :D

In all seriousness, I would guess it's simply because PCIe 3.0 support has no real-world benefit at this time, since PCIe 2.0 still provides enough bandwidth for these cards.
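
To put rough numbers on that: PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, while PCIe 3.0 runs at 8 GT/s with 128b/130b, so the per-direction x16 bandwidth works out roughly like this (standard spec figures, back-of-the-envelope only):

```python
# Rough per-direction bandwidth of an x16 slot for each PCIe generation.
LANES = 16

def x16_bandwidth_gb_s(transfer_rate_gt, encoding_efficiency):
    # GT/s * encoding efficiency = usable Gb/s per lane; x16 lanes, /8 for bytes
    return transfer_rate_gt * encoding_efficiency * LANES / 8

pcie2 = x16_bandwidth_gb_s(5.0, 8 / 10)      # 8b/10b encoding
pcie3 = x16_bandwidth_gb_s(8.0, 128 / 130)   # 128b/130b encoding
print(f"PCIe 2.0 x16: ~{pcie2:.1f} GB/s")    # ~8.0 GB/s
print(f"PCIe 3.0 x16: ~{pcie3:.2f} GB/s")    # ~15.75 GB/s
```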
 
Also, let's remember early reports of nV wanting to price the 680 at $549 originally, but dropping the MSRP to $499 near launch.
The 690 was probably aimed at $999 from the beginning, but the pricing shift on the 680 made it equal to 2x 680s.

The 680 could have been $600 and people would have paid it.

Without the 7990 on the market, nV can charge a premium for the fastest single card on the planet.

Originally it was meant to be under 400, or 350, 180, 600, whatever... I don't know and neither do you.

And the 680 could be $300 and people would buy tons of them.
 
Zarathustra[H];1038672305 said:
Well, two things I want to mention.

1.) X79 didn't officially list PCIe 3.0 support, as it predated there being any PCIe 3.0 cards to test compatibility with. It is designed to PCIe 3.0 specs, but PCIe 3.0 is not officially supported. It's how they cover their asses when people complain.

2.) My Radeon 7970 worked perfectly in PCIe 3.0 x16 mode on my X79 platform. How come AMD can design a solution that works with PCIe 3.0 on X79 and Nvidia can't?

I didn't know it didn't officially list PCIe 3.0 support. Thought it did, wow. As for AMD, I can't explain that, and it sure could still be Intel's problem. It could be nVidia's too. I fondly remember the debacle that was 802.11n... when you combine two products that claim to support the same standard yet neither is "official," bad things can happen.

Anyway, just saying that they're probably waiting for the best high end setup that will actually provide working PCIe 3.0 out of the box before pulling the trigger on a review.
 
So I did have a legitimate question in my post - can you pair up two 680s with a 690 in SLI?

As others have stated, nVidia drivers have only worked in SLI configurations with cards of the same designation. You can mix cards from different manufacturers and even clock speeds, but not designations. Technically it's probably possible, as AMD allows these kinds of configurations, but it's not a technical challenge that nVidia seems to deem worth its time.
 
Wasn't there a driver a while ago that let you SLI cards with different chips?
 
I don't think it is entirely clear whether or not nVidia made any architectural enhancements to Kepler for this sort of thing. Have they released any public statements about whether or not this is possible?

It would be cool if it were possible but yeah, unlikely, given that the volume of people that would buy $1500 in GPUs and not have three PCIe x16 slots is rather low...

Each 690 takes up two expansion slots, as indicated by the 690 SLI picture.
 
Thanks for the contribution to the thread. I am sure we gained a lot of insight from your post.

Thanks!

Buying a card like that is, first, for people who can afford it, and second, for people who like computers a lot. It's like a tradition, IMO.
 
I've never seen [H] get so excited about a video card ever. This thing has to be awesome.

Well, it is big and shiny.

Frankly, without a 4GB-per-GPU frame buffer, I can't really see $1000-card buyers drooling over it. It's like it's already obsolete before it's even left the store. I really find it hard to believe that anybody who is willing to drop $1000 on graphics cards without blinking an eye would not even be thinking of multi-monitor gameplay. That and the lack of more than one Mini DisplayPort and no HDMI whatsoever seem like fairly serious oversights in a 'perfect' graphics card.
 
More memory isn't needed if it uses what it has more efficiently, which [H] seemed to indicate in their 680 review. No reason to add more cost/complexity to the card if it doesn't need it, even for triple-screen gaming.

Mini DisplayPort is still uncommon on most screens out there. DVI is still the majority, and if you need HDMI just buy a DVI-to-HDMI cable and you're set. I've done that on several cards and it works fine.
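
Quick back-of-the-envelope on why raw resolution alone doesn't blow the frame buffer (illustrative only; real games keep textures, shadow maps, and extra render targets resident on top of this, and MSAA multiplies it):

```python
# Crude estimate of raw frame buffer footprint at triple-screen resolution
# (32-bit color, no MSAA, no extra render targets).
width, height = 5760, 1200
bytes_per_pixel = 4                       # 32-bit color

surface_mb = width * height * bytes_per_pixel / 2**20
raw_mb = 3 * surface_mb                   # double-buffered color + depth/stencil
vram_mb = 2048                            # per-GPU frame buffer on the 690

print(f"one surface: ~{surface_mb:.1f} MB")
print(f"raw buffers: ~{raw_mb:.0f} MB of {vram_mb} MB "
      f"(~{100 * raw_mb / vram_mb:.0f}%)")
# -> one surface: ~26.4 MB; raw buffers: ~79 MB of 2048 MB (~4%)
```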
 
Well, the fact that the price for this card (before even factoring in scarcity and markup) is at LEAST 5 (FIVE) car payments for me... I think I'll stick with my 6970+6950 CrossFire rig.

Beautiful card though -- I'd love to own one.
 
More memory isn't needed if it uses what it has more efficiently, which [H] seemed to indicate in their 680 review. No reason to add more cost/complexity to the card if it doesn't need it, even for triple-screen gaming.

Mini DisplayPort is still uncommon on most screens out there. DVI is still the majority, and if you need HDMI just buy a DVI-to-HDMI cable and you're set. I've done that on several cards and it works fine.

Leaving the memory issue aside, I've read elsewhere on the [H] forums about AMD 3D requiring DisplayPort/Mini DisplayPort monitors to work properly in three-monitor Eyefinity. Does nVidia's 3D technology not have the same requirements? If not, then the port configuration makes more sense to me.
 
Last I checked, if you used a single AMD card you had to use 2x DVI + 1x DP (with or without a DisplayPort-to-DVI adapter) in order to drive three displays. Granted, my knowledge is based off of the 5870, but I don't think that's changed unless AMD has come out with cards that can drive 3 DP monitors.

As far as I know, nVidia is the first with native support for three DVI monitors (if you count the fact that HDMI is backwards compatible with DVI).
 
I've not read the thread, but I hope that [H] tests the 690 against both 1x and 2x 680 for triple and quad GPU [H]ardness.

And, if possible, can we please have some 5-monitor testing / gaming? Not that I'm actually going to do it myself, of course.
 
There's a whole host of reviews by Kyle and team about 1x, 2x, 3x, Adaptive V-Sync, etc, etc, etc.
http://hardocp.com/reviews/gpu_video_cards/1/nvidia

Not sure if Surround supports 5 monitors. nVidia only advertises it as a three-monitor solution.
 
So when are we going to see an [H] review of the 690 vs. 2GB 680 SLI vs. 4GB 680 SLI at 5760x1200 to settle this?!

Wondering if the 4GB 680 SLI setup is worth the $200 premium...

I have been holding out for the 4GB 680s, but I am worried that when the 690 drops I won't be able to hold back. :)
 