Why won't Nvidia release dual GPU cards?

lee0539
AMD had the 295X2 and is now prepping the Fury X2, but Nvidia has yet to make any dual GPU cards. Why is that? An architecture issue, or do they feel there is no market for them? I'd rather buy a dual GPU card than buy two cards and SLI them. At least with a dual GPU card I'd only have to buy one waterblock, if it isn't already water cooled like the Fury X2. Do you think Pascal may come with a dual GPU option? A Titan V2?
 
Nvidia has released a few dual GPU cards.

GTX 590
GTX 690
Titan Z

That's about all I can think of.
 
The GeForce GX2 and the 295 as well, if I recall the names correctly. That card cost less at release than a 980 Ti costs today.
But nothing lately. Probably because Nvidia is the performance king now. Back then ATI was far more competitive, at times even faster than Nvidia. That's probably why Nvidia was building dual GPU cards back then.
And honestly, they do get really hot.
 
As has been said, they have had a couple over the years. I don't think they advertise them as prominently as AMD does when they have a dual GPU card. Also, I don't think they release as many because they probably just don't feel a need. The last couple of cycles, they have been on point with single-GPU performance, especially with the Titan line. Not trying to sound like a fanboy here, but I think many of AMD's dual-GPU cards are a result of necessity, in order to have a single PCB that matches or bests Nvidia's best single-GPU offerings.
 
Here is the answer, based on a detailed look at this issue:

Nvidia does release dual GPU cards. Traditionally they were positioned mainly for the professional graphics and HPC (high-performance computing) markets, with consumer cards made as a side product. That led to a contradiction, since professional cards are several times more expensive than consumer cards; for quite a few professional applications consumer cards are sufficient, so the consumer segment was eating into the professional segment. Trying to deal with this, NV put an outrageously high price on its last dual GPU card, the Titan Z, but that of course drastically reduced its consumer market.

The situation took another twist with the Maxwell line of cards. The Maxwell chip dropped fast double-precision (FP64) floating-point support. This made Maxwell cards of much less interest to the HPC market, so a dual GPU card positioned for both markets was no longer viable. Instead NV made a dual GPU K80 card (similar to the Titan Z) based on the older Kepler chip, addressing the professional market exclusively.

Now there is yet another twist in the making. The new Pascal chip will be 'universal', with flexible support for at least three types of floating-point arithmetic (FP64, FP32, FP16). The chip will thus be attractive to both the professional and consumer markets. In the meantime a new professional market has emerged: artificial intelligence, aka 'deep learning'. This market is much bigger than HPC and demands huge amounts of specialized processing power, so a dual GPU card is much desired there. One can thus expect a Pascal-based dual GPU card for professional applications, and in parallel a consumer dual GPU card could also be made. But here again we'd hit the old contradiction between professional and consumer prices. The open question is how NV will position the new professional and consumer cards.
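To put rough numbers on the FP64 point, here is a back-of-the-envelope sketch. The formula is the standard peak estimate (cores x 2 ops per FMA x clock); the core counts, clocks, and FP64:FP32 rate ratios are approximate public figures, and the helper function is purely illustrative:

```python
# Peak throughput estimate: TFLOPS = shader cores * 2 (ops per FMA) * clock in GHz / 1000.
# Core counts, clocks, and FP64:FP32 rate ratios below are approximate public figures.

def peak_tflops(cores, clock_ghz, fp64_ratio):
    fp32 = cores * 2 * clock_ghz / 1000.0
    return fp32, fp32 * fp64_ratio

chips = {
    "Kepler GK110 (Titan Black)": (2880, 0.89, 1 / 3),   # strong FP64: HPC-friendly
    "Maxwell GM200 (Titan X)":    (3072, 1.00, 1 / 32),  # FP64 gutted: gaming only
    "Pascal GP100 (Tesla P100)":  (3584, 1.48, 1 / 2),   # FP64 back, plus fast FP16
}

for name, (cores, clock, ratio) in chips.items():
    fp32, fp64 = peak_tflops(cores, clock, ratio)
    print(f"{name}: ~{fp32:.1f} TFLOPS FP32, ~{fp64:.1f} TFLOPS FP64")
```

Roughly 0.2 TFLOPS of FP64 on GM200 versus 1.7 on the older GK110 is the whole Maxwell-HPC story in one number.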
 
AMD had the 295X2 and is now prepping the Fury X2, but Nvidia has yet to make any dual GPU cards. Why is that? An architecture issue, or do they feel there is no market for them? I'd rather buy a dual GPU card than buy two cards and SLI them. At least with a dual GPU card I'd only have to buy one waterblock, if it isn't already water cooled like the Fury X2. Do you think Pascal may come with a dual GPU option? A Titan V2?

There were rumors of a dual GPU card from Nvidia before the holidays, and it was supposedly shown to the press, but apparently it has been delayed or maybe even cancelled.
 
They have both released a comparable number of desktop dual GPU cards. Here is a list (there may be a few dual cards missing, and companies all over the world made/make different versions of the cards below).

Nvidia will put out a new dual GPU. If it is another $3000 USD card then they are even more .........

AMD Dual Cards
ATI's Rage Fury Maxx
Radeon X1600 Dual
Radeon X1650 DUAL
Radeon X1950 PRO DUAL
Radeon HD 2600 XT X2
Radeon HD 3850 X2
Radeon HD 3870 X2
Radeon HD 4850 X2
Radeon HD 4670 X2
Radeon HD 4870 X2
Radeon HD 5770 X2
Radeon HD 5870 X2
Radeon HD 5950
Radeon HD 5970
Radeon HD 6850 X2
Radeon HD 6870 X2
Radeon HD 7970 X2
Radeon HD 7990
Radeon HD 6990
R9 290X X2
R9 295X2
R9 390 X2
R9 Fury X2

Nvidia Dual Cards
GeForce 6600 GT Dual
GeForce 6800 GT Dual
ASUS Extreme Dual N7800GT
GeForce 7900 GX2
GeForce 7950 GX2
9800 GX2
GeForce GTX 460 2Win
GeForce GTX 560 Ti 2Win
GTX 275 Co-Op Physx
GTX 285 Mars 1
GTX 295 V1
GTX 295 V2
GTX 580 Mars 2
GTX 590
GTX 690
GeForce GTX 760 X2
GTX Titan Z
 
You probably mean the 590 with its exploding VRMs.

Bahahahaha.

"Now the pretty looking, backlit GeForce logo was blinking helplessly and the fan did not spin, both indicate an error with the card's 12V supply."

I was never a fan of dual GPU cards. It has to be a pretty niche market. I did like the execution of the 295X2 though, since it eliminated the thermal throttling at stock. You still have the massive price premiums, though.
 
I was never a fan of dual GPU cards. It has to be a pretty niche market. I did like the execution of the 295X2 though, since it eliminated the thermal throttling at stock. You still have the massive price premiums, though.

Dual GPU cards have to be the cream of the crop, since they only make sense if they roughly double the top-of-the-range single GPU card. As such they are of interest only to those looking at or beyond a pair of top single GPU cards, and among those folks there will be some fans ready to pay the premium. The companies probably know this market very well, which is why they are not lowering the premium: doing so could cause some people to migrate from two cards to a dual GPU card, resulting in less overall profit.

There is a perception that a dual GPU card has less potential performance (including overclocking) than two cards. It seems that with proper watercooling and power supply this is not the case.
 
AMD has stated that it is delaying their dual GPU card to better match up with VR technology releases later this year. Nvidia is simply not talking about a dual GPU card - there is nothing in the rumor mill about whether they will do one or not.

I can only observe that Pascal, which by all reports has huge improvements in performance per watt and greatly reduced thermals, would be so much better for a dual GPU card that it would be well worth the delay. With Maxwell, Nvidia would be restricted to a pair of GPUs similar to the GTX 970, while with Pascal, they might be able to put GPUs faster than the 980 Ti on a dual-GPU card.
 
Dual GPU cards have to be the cream of the crop, since they only make sense if they roughly double the top-of-the-range single GPU card. As such they are of interest only to those looking at or beyond a pair of top single GPU cards, and among those folks there will be some fans ready to pay the premium. The companies probably know this market very well, which is why they are not lowering the premium: doing so could cause some people to migrate from two cards to a dual GPU card, resulting in less overall profit.

There is a perception that a dual GPU card has less potential performance (including overclocking) than two cards. It seems that with proper watercooling and power supply this is not the case.

Thermal management is only one issue; the other big issue is power delivery. The PCB can only be so long and so tall, and cramming in two massive power-hungry dies means the power circuitry will necessarily suffer. For example, the 290X Lightning has an extremely beefy 12+3 phase setup; there's just no way you're going to be able to cram two sets of those onto the same PCB unless you want to make it hilariously oversized. Thus power circuitry on dual GPU cards is invariably weaker (since they have to stick close to reference for the most part), and can be a limiting factor in overclocking.
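To illustrate why phase count matters, here is a minimal sketch of per-phase VRM load. The 300 W core power and 1.2 V vcore are assumed round numbers for the sake of the arithmetic, not measured values:

```python
# Back-of-the-envelope per-phase load on a GPU core VRM.
# Assumed round numbers: ~300 W GPU core power at ~1.2 V vcore.

def amps_per_phase(core_watts: float, vcore: float, phases: int) -> float:
    return core_watts / vcore / phases  # total current split across the phases

for label, phases in [("beefy 12-phase (290X Lightning-style)", 12),
                      ("reference-style 6-phase", 6)]:
    print(f"{label}: ~{amps_per_phase(300, 1.2, phases):.0f} A per phase")
```

Halving the phase count doubles the current (and heat) each phase has to handle, which is exactly the compromise a dual GPU board is forced into.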
 
I wonder where OP is..... Hopefully looking up his next question...
 
Okay, I think people misunderstood when I asked why Nvidia isn't making a dual GPU. I meant recently, or an announcement of one like AMD's, and for the gaming market, not history going back to the Voodoo era. I forgot about the Titan Z. I definitely hope they bring another one of those for Pascal, but price it more reasonably.
 
Okay, I think people misunderstood when I asked why Nvidia isn't making a dual GPU. I meant recently, or an announcement of one like AMD's, and for the gaming market, not history going back to the Voodoo era. I forgot about the Titan Z. I definitely hope they bring another one of those for Pascal, but price it more reasonably.

"Titan" and "reasonably priced" are words that do not belong in the same sentence. They are luxury cards.
 
The Titan Z is recent, considering it's a pretty current architecture....

Apparently OP is itching for Nvidia to come out with a dual GPU Pascal tomorrow. Lol.
 
The Titan Z is recent, considering it's a pretty current architecture....

Apparently OP is itching for Nvidia to come out with a dual GPU Pascal tomorrow. Lol.

Nope. Bought my 980 Ti, won't be buying a card for another 3-4 years. I just find it interesting to see how rumors change over time.

Titan Z is two years old, and it's not even Maxwell - it's Kepler.
 
The Titan Z was an abject failure at the $3K launch price, simple as that. 50% more money than Titan Black SLI, less performance, and the 3-slot cooler means you barely save any space vs. an SLI setup.

They only started moving volume once they slashed the price by 50%. Obviously Nvidia never announced this publicly, but the Titan Z went for $1500 a pop on Newegg right around the time Maxwell was released, and that's when it quickly sold out.
 
Okay, I think people misunderstood when I asked why Nvidia isn't making a dual GPU. I meant recently, or an announcement of one like AMD's, and for the gaming market, not history going back to the Voodoo era. I forgot about the Titan Z. I definitely hope they bring another one of those for Pascal, but price it more reasonably.

I wouldn't say it's people misunderstanding with that title you put on it. :)
 
Nope. Bought my 980 Ti, won't be buying a card for another 3-4 years. I just find it interesting to see how rumors change over time.

Titan Z is two years old, and it's not even Maxwell - it's Kepler.

If I were you, I'd sell when a new architecture's high-end part releases. I do this every release just to stay on the high end. Taking a $200 hit on a card you got at least a year of use out of to purchase the newest king of the hill isn't a bad way to stay current without breaking the bank!
 
Thermal management is only one issue; the other big issue is power delivery. The PCB can only be so long and so tall, and cramming in two massive power-hungry dies means the power circuitry will necessarily suffer. For example, the 290X Lightning has an extremely beefy 12+3 phase setup; there's just no way you're going to be able to cram two sets of those onto the same PCB unless you want to make it hilariously oversized. Thus power circuitry on dual GPU cards is invariably weaker (since they have to stick close to reference for the most part), and can be a limiting factor in overclocking.

This is a serious issue indeed, but I am not sure if e.g. the Titan Z power circuitry is substantially weaker than 2x Titan Black?
 
Dual GPU cards are a great way to sell off old chip stocks.

Nvidia in no way needs to rush to sell off Maxwell chips, which are happily selling themselves.

AMD, on the other hand, probably expected to sell more Fiji chips than they did, and so is desperate to sell them off as fast as possible, because the value of these chips is dropping daily.

Once Pascal arrives, AMD will not be able to sell Fiji chips in the high end. Period. They're barely in a position to do it now. So you can see they necessarily need to be selling a dual GPU Fiji solution asap. It's their best, if not only, move with Fiji at this point in time in the high end market.
 
This is a serious issue indeed, but I am not sure if e.g. the Titan Z power circuitry is substantially weaker than 2x Titan Black?

Just looking at the PCBs themselves, both seem to use 6+2 phase setups, so unless the quality of the components used is different, the power circuitry at least seems comparable.

That said, the Titan Z only has 8+8 pin for a maximum of 375W (if you stay within PCIe spec), while the Titan Black is 6+8 pin for 300W. So if we were to give the Titan Z the "proper overclocking headroom," so to speak, it would need to be spec'd for 600W! :eek:

You can already see there's a huge compromise in power in order to cram two power-hungry GK110 chips onto the same board. You could BIOS-mod away the power limit and push the 8-pins beyond PCIe spec, I suppose (which is what I'm doing with my 980 Ti), but you're not going to be pulling 600W through an 8+8 setup, period, unless you want to literally start a fire.
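For reference, the PCIe-spec budgets behind that math are 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin; here is a quick sketch reproducing the figures above:

```python
# PCIe power budget per source (spec values): slot 75 W, 6-pin 75 W, 8-pin 150 W.
BUDGET_W = {"slot": 75, "6pin": 75, "8pin": 150}

def board_budget(*connectors: str) -> int:
    return BUDGET_W["slot"] + sum(BUDGET_W[c] for c in connectors)

print("Titan Black (6+8 pin):", board_budget("6pin", "8pin"), "W")        # 300 W
print("Titan Z     (8+8 pin):", board_budget("8pin", "8pin"), "W")        # 375 W
print("Two Titan Blacks' worth:", 2 * board_budget("6pin", "8pin"), "W")  # 600 W
```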
 
OP, here is the reason they suck at it: they are afraid they will get creamed again and again. If you want dual GPU, Radeon tech is your best bet.
 
OP, here is the reason they suck at it: they are afraid they will get creamed again and again. If you want dual GPU, Radeon tech is your best bet.

There is very good reason to think there will be dual GPU Pascal cards. Nvidia has to make such cards for professional applications, which are insatiable for computing power, and Pascal has exquisite support for all kinds of floating-point arithmetic. A dual GPU Pascal card would have 8 teraflops of FP64, and possibly up to 10 teraflops with boosting, which is fantastic performance. Since they have to make dual cards for professionals anyway, dual consumer cards would just be an offshoot that increases volumes. The usual problem here is that Nvidia needs to differentiate so that consumer cards don't cannibalize the professional market. This can be done in many ways: make the professional cards first and the consumer cards e.g. one year later, make the consumer cards very expensive (the Titan Z case), or put less/different memory into the consumer cards.
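A rough sanity check on those teraflops figures, using the Tesla P100's public GP100 numbers as the single-GPU building block; the dual card here is hypothetical, and a real one would likely be downclocked:

```python
# GP100 (Tesla P100): 3584 FP32 cores, FP64 at 1/2 rate, ~1.3 GHz base / ~1.48 GHz boost.
CORES, FP64_RATE = 3584, 0.5

def fp64_tflops(clock_ghz: float) -> float:
    return CORES * 2 * clock_ghz / 1000.0 * FP64_RATE  # 2 ops per FMA

for label, clock_ghz in [("base", 1.30), ("boost", 1.48)]:
    single = fp64_tflops(clock_ghz)
    print(f"{label}: single ~{single:.1f}, hypothetical dual ~{2 * single:.1f} TFLOPS FP64")
```

Two full GP100s land around 9 to 10.6 TFLOPS FP64, so the 8 TFLOPS figure is plausible for a downclocked dual board.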
 
If an all-in-one water cooled dual GPU fit in a mini-ITX build, it might be worth it for me, even if I'd have to mod the radiator to mount through the side of the case. In larger builds there is no reason to run two downclocked GPUs with less modularity and higher heat for more money.
 
Oooh I had one of those. I had a 9800GX2 and GTX 295 as well. Dual PCB cards FTW.
My 9800 GX2 ran at 105 C under load. I don't miss it. The x58 mobo I had it in has actually seen quite a progression of GPUs, 8800gt => 9800GX2 => gtx460 => gtx660ti => gtx970 => gtx1070. I'm glad it survived the inferno of that 9800GX2.
 
I think Nvidia is slowly moving away from SLI (the 10 series cards now only officially support two-way SLI and no longer support Tri or Quad SLI). It seems like they are looking to provide more powerful single GPU cards instead, which I'm entirely for.
 
My 9800 GX2 ran at 105 C under load. I don't miss it. The x58 mobo I had it in has actually seen quite a progression of GPUs, 8800gt => 9800GX2 => gtx460 => gtx660ti => gtx970 => gtx1070. I'm glad it survived the inferno of that 9800GX2.

That board was tempered in the fires of hell and will continue to survive anything you can throw at it.
 
I think Nvidia is slowly moving away from SLI (the 10 series cards now only officially support two-way SLI and no longer support Tri or Quad SLI). It seems like they are looking to provide more powerful single GPU cards instead, which I'm entirely for.
For some of us enthusiasts who like SLI and dual GPU cards, though, this kind of sucks, and it is a direct effect of no competition.
 
If an all-in-one water cooled dual GPU fit in a mini-ITX build, it might be worth it for me, even if I'd have to mod the radiator to mount through the side of the case. In larger builds there is no reason to run two downclocked GPUs with less modularity and higher heat for more money.

You bumped an 8 month old thread just to post this?
 
I was looking around for any rumor of a dual GPU card and stumbled on this thread. Considering the replies after mine, it doesn't seem like a dead conversation topic.
 
I think Nvidia is slowly moving away from SLI (the 10 series cards now only officially support two-way SLI and no longer support Tri or Quad SLI). It seems like they are looking to provide more powerful single GPU cards instead, which I'm entirely for.

Conversely, resolutions are getting higher, and the most advanced gaming monitors have high Hz ceilings in order to reduce blur and provide greater motion articulation and pathing. There will be 165 Hz and 200 Hz 3440 x 1440 21:9 and 2560 x 1440 16:9 monitors, as well as 120 Hz 4K monitors, by end of 2016 or into 2017. VR also needs high Hz, and would run higher than 90 Hz if they could do it. It also has a 2160 x 1200 resolution that the GPU has to render to. That is the demand, but it is only 1080 x 1200 per eye, so future generations will also require a much higher resolution relative to the perceived PPI.

That is, unless graphics ceilings are going down, or you are willing to play in a low frame rate band (well above and, more importantly, well below your average frame rate) on a high resolution monitor and get nothing out of the high Hz, since you aren't feeding it 100 fps or more consistently. Having a high Hz gaming monitor without the main band of your frame rate graph being high enough to feed that Hz gets you nothing out of those appreciable and aesthetic blur reduction and motion definition advancements. The other option is a single high-end card and dropping back to a 1080p high Hz gaming monitor, I guess, where you are guaranteed to get the most out of the high Hz.

A 100 to 110 fps-Hz average frame rate graph compared to a 60 fps-Hz average frame rate graph (link) shows the real ranges you'd be playing in. You could use your imagination to move the average bar line to where you think your frame rate would be on your single card in a demanding game in the next year.

edit:
Guru3d uses Very High settings in their Far Cry Primal benchmarks:
"We use Very High quality settings as it offers an acceptable level of performance versus quality and gives the mid-range cards some breathing room as well (for objective comparison)."
A GTX 1080 gets 80 fps average at 2560 x 1440 there, so the low end of the graph is probably in the 60s and down to 50 fps-Hz.

Their Witcher 3 bench at 2560 x 1440 shows the GTX 1080 about the same, at an 82 fps-Hz average:
"Our settings are Ultra quality with AA enabled. Please find an overview of the exact settings here. Hairworks is DISABLED to objectively compare between AMD and Nvidia cards."

That's pretty good for a single card, I suppose, and a Titan or the rumored 1080 Ti could get 20 or more fps on top of that. It's already dialed down, but with some more reductions in still-image quality/fx you could get back to a 100 fps-Hz average and get some of the motion clarity and motion definition increases out of a high Hz gaming monitor. Far Cry is already down to Very High settings though, so that doesn't bode well for where the graphics ceilings might be in the next year or more, and anyone interested in a 3440 x 1440 (let alone 4K) high Hz gaming monitor is going to have an even more demanding setup.
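To make the "feeding the Hz" point concrete, a small sketch; the fps and Hz values are just the examples from this post, and the refreshes-fed fraction is a simplification that ignores frame pacing and sync details:

```python
# Fraction of a monitor's refresh slots that receive a new frame, ignoring
# frame pacing and sync details. fps/Hz values are the examples from the post.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def refreshes_fed(fps: float, hz: float) -> float:
    return min(fps, hz) / hz

for hz in (60, 100, 165):
    for fps in (50, 82, 110):
        print(f"{fps:>3} fps on a {hz:>3} Hz panel: {frame_time_ms(fps):4.1f} ms frames, "
              f"{refreshes_fed(fps, hz):.0%} of refreshes fed")
```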
 
I was looking around for any rumor of a dual GPU card and stumbled on this thread. Considering the replies after mine, it doesn't seem like a dead conversation topic.

Even after all this time since the thread began, the rumor mill is still dead silent about a dual GPU card from Nvidia.
 
Nvidia desperately wants to get rid of the multi-GPU segment. They are doing everything they can to make SURE people have a reason to upgrade regularly. People who buy multi-card setups are less likely to upgrade every generation, and now that Nvidia is basically the only show in town, they don't want a situation where their biggest competitor is themselves from two years ago.


"Nobody supports SLI anyway"

Well, if Nvidia can hand out millions upon millions of dollars to 'suggest' that developers use GameWorks, PhysX, etc., they could sure as hell have had every major Nvidia-sponsored title support SLI if they wanted.
 
From what I have read about dual-GPU cards, they are almost never worth the premium price put on them. They are typically a significantly downclocked version of the single GPU card, and typically cost more than two of the single cards in the first place.
 
Nvidia desperately wants to get rid of the multi-GPU segment. They are doing everything they can to make SURE people have a reason to upgrade regularly. People who buy multi-card setups are less likely to upgrade every generation, and now that Nvidia is basically the only show in town, they don't want a situation where their biggest competitor is themselves from two years ago.


"Nobody supports SLI anyway"

Well, if Nvidia can hand out millions upon millions of dollars to 'suggest' that developers use GameWorks, PhysX, etc., they could sure as hell have had every major Nvidia-sponsored title support SLI if they wanted.

The problem is that multi-GPU is an epic mess. And it's an ultra-niche. In short, multi-GPU should never have been brought back to life after the 3dfx SLI. And with DX12 it's even more of a hopeless child that only the most sponsored developers will even touch.

It's a waste of resources.

NVLink is Nvidia's new "SLI", and it's only on the enterprise products. SLI was limited or removed on more SKUs with Pascal, and I wouldn't be surprised if Volta, or the one after Volta, puts in the final nail. SLI is dead.

I also wonder what SLI/CF people think of their purchase all the times their extra GPU is simply dead weight due to lack of support. Not to mention the other issues involved in multi-GPU.

On the CPU front, multi-socket died for the consumer as well. And I don't miss it.
 