Why did SLI and CrossFire die off?

XacTactX
Supreme [H]ardness · Joined Dec 13, 2010 · Messages: 4,134
I was just watching this video about the Radeon HD 6990, 10 years after launch, and I realized that neither nVidia nor AMD has released a dual GPU card in almost 10 years. On top of that, the marketing for SLI and CrossFire has disappeared and hardly anyone discusses these techniques. When I started keeping up with computers and tech in 2005, nVidia had just released SLI for the GeForce 6800 Ultra and ATi was scrambling to answer this product with the Radeon X850 XT CrossFire Edition. Why has the landscape shifted so much to favor >$1,000 GPUs instead of SLI and CrossFire?

EDIT thank you to everyone who contributed to the discussion, I learned a ton from this 😃
 
instead of SLI and CrossFire?
I don't know enough, but one possible reason is the way rendering changed. Effects that use previous-frame information, and anti-aliasing that needs the whole image, make the easy, generic ways of scaling with SLI/CrossFire (alternate frame rendering, or rendering half the image on each GPU) much less obvious to do. They would require a custom solution for each game engine, with work on the game developers' side to make it scale (at least for the games that use those effects), and considering how niche dual GPU is and would have been, not many would have put a lot of work into it.

As for dual GPUs on the same card, I imagine that with the power current single-chip cards draw, it would be rough.
 
In the past, SLI or CrossFire support in games was primarily up to NVIDIA and AMD to implement in their drivers via profiles. With changes to DirectX 12, it's no longer up to the GPU makers. The game developers have to implement the feature in the games themselves. Given that most games on PC are ports of their console versions, and given how few people utilize multi-GPU configurations, the developers never bother to implement it. It's to the point now where AMD and NVIDIA cards don't even have the physical capability in hardware anymore. I think the only current high-end card that does is the RTX 3090. It does so via an expensive and proprietary bridge, but all the same limitations apply to it as well. Basically, it works in a few games but the scaling isn't what you'd want or hope for. It's just not worth doing.
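To make that concrete, here's roughly what the DX12 "explicit multi-adapter" path asks of a developer before any frame-splitting work even starts. This is only a sketch against the public DXGI/D3D12 headers, not code from any engine, with error handling stripped down:

```cpp
// Minimal sketch: enumerating GPUs for DX12 explicit multi-adapter.
// Assumes Windows 10+ with the D3D12/DXGI headers; error handling trimmed.
#include <dxgi1_6.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software adapter

        // One D3D12 device per physical GPU. From here on the application owns
        // everything the driver used to hide: cross-adapter heaps, fences
        // between queues, and deciding how to split each frame.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"Usable GPU %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    // Under SLI/CrossFire the driver presented one logical device and did the
    // splitting itself; here, two devices are just the starting point.
    return devices.size() > 1 ? 0 : 1;
}
```

Getting two ID3D12Device objects is the trivial part; the scheduling and copying that has to follow is exactly the per-game work developers decided wasn't worth doing.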

The reason you don't see dual-GPU graphics cards anymore is simple. These cards all relied on some form of internal CrossFire or SLI between the GPUs. How well these GPUs worked was entirely dependent on SLI/CrossFire profiles in the drivers. Some game engines and some architectures also benefited more or less from this than others. To say it was inconsistent is putting it mildly.

I don't know enough, but one possible reason is the way rendering changed. Effects that use previous-frame information, and anti-aliasing that needs the whole image, make the easy, generic ways of scaling with SLI/CrossFire (alternate frame rendering, or rendering half the image on each GPU) much less obvious to do. They would require a custom solution for each game engine, with work on the game developers' side to make it scale (at least for the games that use those effects), and considering how niche dual GPU is and would have been, not many would have put a lot of work into it.

As for dual GPUs on the same card, I imagine that with the power current single-chip cards draw, it would be rough.
Most dual-GPU cards downclocked their GPUs to keep TDPs in a reasonable place, allowing two to exist on the same board without drawing too much power. You could functionally do the same thing today so long as you never drew more than 75 W from the PCIe slot; you would simply have double the number of power connectors going to the card. However, the big problem here isn't power so much as the cooling solution. Something like a dual RTX 3090 card would almost have to be watercooled, or the clocks would have to be nerfed considerably.
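To put rough numbers on that (spec values only, plus my assumption of ~350 W stock board power for a single RTX 3090 FE, not measurements from any real dual-GPU board):

```cpp
// Back-of-the-envelope power budget for a hypothetical dual-GA102 card.
// Assumptions: 75 W from the PCIe slot, 150 W per 8-pin connector,
// ~350 W stock board power per RTX 3090 FE class GPU.
#include <cstdio>

int main() {
    const int slot_w      = 75;    // PCIe x16 slot limit
    const int eight_pin_w = 150;   // per 8-pin PCIe power connector
    const int stock_gpu_w = 350;   // one RTX 3090 FE at stock clocks

    const int demand_w = 2 * stock_gpu_w;  // two GPUs, no downclocking
    for (int connectors = 2; connectors <= 4; ++connectors) {
        const int budget_w = slot_w + connectors * eight_pin_w;
        std::printf("slot + %d x 8-pin = %d W budget vs %d W demand (margin %+d W)\n",
                    connectors, budget_w, demand_w, budget_w - demand_w);
    }
    return 0;
}
```

Even four 8-pin connectors plus the slot only gets you 675 W against roughly 700 W of stock demand, before the cooler is even considered, which is why every dual-GPU card shipped with reduced clocks.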
 
It had its day, but it's over. There are two main reasons:

- DX12 and Vulkan came out, allowing far more control for utilizing mGPU, but requiring developer programming for it to work. As game engines became more complex and customized for each studio (with the rise of shaders and the death of the fixed-function pipeline), it became more difficult for Nvidia/AMD to use generic solutions that would work with all games. Though they did supply custom code in the drivers to facilitate this, it became more effort for little benefit.

- Game developers are reluctant to spend the effort to support mGPU in their engines when the economics aren't there. The market for multi-GPU owners is small, and the ROI isn't there for what is probably about 1% of users. Additionally, 10 years ago GPUs weren't as expensive, so it made sense financially to buy two (or buy another one a year or two after building a rig). When you have a $400 top GPU, spending $800 for SLI is reasonable. When the top video card can cost $1,200 or $1,500, the number of people buying two is minuscule, a fraction of 1%; it just doesn't make sense anymore.
 
It only ever made sense to have dual GPUs if you were doing it with the fastest/most expensive GPU available at the time and it generally only netted you maybe 50% better performance on average, with some cases better or worse depending on the game. Otherwise it was generally cheaper to either buy or upgrade (after selling towards a new card) to the faster card rather than adding another GPU of the same model to your PC. Not to mention having the additional power and heat penalty, as well as the frame pacing (microstutter) that plagued mGPU setups. It just never really worked well or made sense for anyone besides the .1% of gamers.
 
A few inter-related reasons come to mind, but in summary, the economics no longer make supporting mGPU a rational endeavour:
  • mGPU is hard/expensive to support (irrespective of its ever-present barriers to entry and issues, e.g. power requirements, mobo support, microstutter). Support can be done either by the GPU designer in drivers (and hardware), or in software where the onus is on developers. Once nVidia and AMD found the benefit was nil on their side, and the ability to support mGPU resided in an API (DX12, Vulkan), they put mGPU support out to pasture
  • If the GPU designer is no longer motivated to support mGPU, the developer -- on whom the onus now lies -- is exponentially less motivated to support mGPU per application/game/engine. Do you think developers care about putting even a single story point into supporting mGPU for a single game, if it will only be attractive to less than 0.05% of end users?
  • Single, enthusiast GPUs have been closing the gap on being able to perform at reasonable levels at high-res/high IQ. In 2010, you absolutely needed dual GPUs to reach 40-60 FPS at 1600p (what most enthusiasts were using then); in 2021, you only need a single top-tier card to reach 40-60 FPS at 4k in most games
  • As nV (and AIBs) have been driving up card prices, the market segment for mGPU (two more and more expensive cards) has gotten smaller and smaller.
 
Multi-GPU support is implemented at the game engine level. Unreal Engine 4 supports multi-GPU SLI and CryEngine supports both CrossFire and SLI -- thus all games using these game engines support multi-GPU in DX12 out of the box.
 
My one and only experience with multi-GPU was rather poor with CrossFired 5770s. Worse performance in some games than a single card due to microstuttering, and driver issues where it would often stop detecting that I even had a second card, which usually required me to shut down, reseat the card, and start up again. It was a big PITA with a questionable return on the benefit. Since then, I've always adopted a "get the highest-end model possible within your budget" philosophy.

Now, video cards are the most expensive they have ever been so the idea of buying two is rather ludicrous. I don't even get why gamers are buying 3090s, except for the lack of supply of other models, I guess. But really, $2k on a video card and you want to get two of them? Heck, $700 (in a perfect world without the current issues) for a 3080 and you want two of them?

Support has also fallen to game developers instead of AMD/Nvidia profiles in their drivers since DX12 and Vulkan and the developers aren't supporting it.
 
Microstutter and DX12.
Not everyone was sensitive to microstutter and some of us would take it as a trade-off to get overall better performance. This was especially important at higher resolutions. Even today, an RTX 3090 FE can't handle everything at 4K the way I'd like it to. DX12, on the other hand, is a huge reason why multi-GPU died.
SLI and Crossfire died off when it became almost impossible to buy one card, much less two.
This is simply not true. SLI and CrossFire waned in popularity back when GPU availability wasn't a problem. Pascal and later architectures simply showed worse multi-GPU scaling than some earlier architectures. You combine that with DX12 and developers that wouldn't put in the time to support it, and you have the reason why multi-GPU died out. I literally bought GPUs in pairs on launch day or near launch day all the way up through the GTX 1080 Ti without issues. It was never an availability thing.
It only ever made sense to have dual GPUs if you were doing it with the fastest/most expensive GPU available at the time and it generally only netted you maybe 50% better performance on average, with some cases better or worse depending on the game. Otherwise it was generally cheaper to either buy or upgrade (after selling towards a new card) to the faster card rather than adding another GPU of the same model to your PC. Not to mention having the additional power and heat penalty, as well as the frame pacing (microstutter) that plagued mGPU setups. It just never really worked well or made sense for anyone besides the .1% of gamers.
As I said, I had multi-GPU solutions for each generation all the way from the 6800 Ultra days through the GTX 1080 Ti. Microstutter wasn't the problem. In fact, no one had heard of it until several generations after it had been on the market. Those of us who had it and also used single-GPU setups were often aware of it (but had no name for it), but you take the good with the bad when you are trying to run a high-resolution / multi-monitor setup with the latest games. It was the only way to do it. The 50% boost from the second card wasn't the issue, though. It was really a matter of fewer titles supporting it. I would agree with you that it was only ever worth it on the ultra-high end, as two GPUs were able to do what one GPU could not.
A few inter-related reasons come to mind, but in summary, the economics no longer make supporting mGPU a rational endeavour:
  • mGPU is hard/expensive to support (irrespective of its ever-present barriers to entry and issues, e.g. power requirements, mobo support, microstutter). Support can be done either by the GPU designer in drivers (and hardware), or in software where the onus is on developers. Once nVidia and AMD found the benefit was nil on their side, and the ability to support mGPU resided in an API (DX12, Vulkan), they put mGPU support out to pasture
  • If the GPU designer is no longer motivated to support mGPU, the developer -- on whom the onus now lies -- is exponentially less motivated to support mGPU per application/game/engine. Do you think developers care about putting even a single story point into supporting mGPU for a single game, if it will only be attractive to less than 0.05% of end users?
  • Single, enthusiast GPUs have been closing the gap on being able to perform at reasonable levels at high-res/high IQ. In 2010, you absolutely needed dual GPUs to reach 40-60 FPS at 1600p (what most enthusiasts were using then); in 2021, you only need a single top-tier card to reach 40-60 FPS at 4k in most games
  • As nV (and AIBs) have been driving up card prices, the market segment for mGPU (two more and more expensive cards) has gotten smaller and smaller.
Multi-GPU is not hard to support from a hardware perspective. Motherboards today still support it at various price points. You literally only need to have two PCIe x8 slots (NVIDIA) or an x8/x16 + an x4 slot (AMD). Power requirements aren't really a barrier either, since most enthusiasts who would consider running such a thing would make sure to have a power supply that could handle it first. I size my PSUs to do just about anything and could add a second RTX 3090 FE without even thinking about that. Support can't be done by the GPU designers in drivers. That's the problem. With DX12 and Vulkan, it was left 100% up to the developers to implement in their titles. They don't do it for obvious reasons. The market for it is almost non-existent, it takes time and testing to implement, and most games are multi-platform these days. They simply have other priorities.
Multi-GPU support is implemented at the game engine level. Unreal Engine 4 supports multi-GPU SLI and CryEngine supports both CrossFire and SLI -- thus all games using these game engines support multi-GPU in DX12 out of the box.
Indeed. There are still games out there that do support multi-GPU today. They are few and far between. It isn't just a matter of the engines supporting it; you can often create a profile yourself (NVIDIA only) and get it to work reliably, although it's not perfect. The game developer still always had to do some work to make multi-GPU behave correctly. That's simply not done, even in cases where the engine would still support it.
 
I was a long time multi-GPU user, starting with the 5970. Eventually had 2x 1070s and switched to a lone 1080 ti.

While I liked the performance gains and felt it added really good bang for your buck, it was just too much hassle. I didn't notice microstutter all that much, honestly, but I dealt with many games not scaling performance properly, having flickering issues, or outright not supporting it at all. It got tiring having to wait for a driver update; it seemed like in most cases the driver had to hack in support. There were also a few games that outright did not work even with SLI, so I effectively only had 1 GPU.

Don't regret swapping out for the 1080 ti - even if theoretically I had 20% performance loss. Hassle free gaming wins. I also luckily came out ahead because I sold my 1070s for more than the 1080 ti cost me.
 
I was a long time multi-GPU user, starting with the 5970. Eventually had 2x 1070s and switched to a lone 1080 ti.

While I liked the performance gains and felt it added really good bang for your buck, it was just too much hassle. I didn't notice microstutter all that much, honestly, but I dealt with many games not scaling performance properly, having flickering issues, or outright not supporting it at all. It got tiring having to wait for a driver update; it seemed like in most cases the driver had to hack in support. There were also a few games that outright did not work even with SLI, so I effectively only had 1 GPU.

Don't regret swapping out for the 1080 ti - even if theoretically I had 20% performance loss. Hassle free gaming wins. I also luckily came out ahead because I sold my 1070s for more than the 1080 ti cost me.

I switched from two GTX 1080 Ti's to a single RTX 2080 Ti. There was a performance loss in some cases, such as Destiny 2, but overall I got a performance boost in most games given SLI wasn't doing anything.
 
These talking points get old and repetitive:
  • If you have two cards, you do not get worse performance than with just one card -> you always have the option to use just one card for games that do not scale, giving you the same performance; the 2nd card, unless put to some other use, will just idle
  • With two cards, you can have greater to much greater performance for games that support rendering on both cards
    • Rise and Shadow of the Tomb Raider are very good examples of excellent scaling; I've seen and used 3080-level performance years ago
    • Most games do not benefit significantly from two cards, but a hell of a lot more games play better with two cards than there are RT games currently available
  • There are other reasons for having two cards besides just games:
    • Folding at home
    • Mining on the second card or both when not gaming or during gaming -> very lucrative at the moment
    • 3d Rendering, professional work
I don't recommend a second card for gaming but if you do other stuff frequently it might be a good addition and at times be able to be used for gaming as well.

Really it comes down to SLI/CFX being DX11-type technology, where AMD/Nvidia made profiles for games to get it to work. DX12 requires more developer work, which is where it should be placed, in my opinion. Also, various techniques now in use do not play well with multiple cards when you have to analyze data from two or more frames, such as TAA: you don't have the data locally if the previous frame was rendered on the other card, and communication between the cards is costly and slow.
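Here's a toy model of that AFR-vs-temporal-effects conflict. Nothing in it is real engine code; it just spells out the data dependency with two GPUs alternating frames:

```cpp
// Toy model: alternate frame rendering with a temporal effect (e.g. TAA).
// GPU0 renders even frames, GPU1 renders odd frames, and every frame needs
// the previous frame's output (the "history buffer") as an input.
#include <cstdio>

int main() {
    const int gpu_count = 2;
    for (int frame = 1; frame <= 8; ++frame) {
        int renders_this  = frame % gpu_count;        // AFR assignment
        int holds_history = (frame - 1) % gpu_count;  // who rendered frame-1

        // With two GPUs the history buffer is never local, so every frame
        // needs a copy over the bridge/PCIe before the temporal pass can run,
        // which eats into the scaling AFR was supposed to provide.
        std::printf("frame %d: rendered on GPU%d, history on GPU%d -> %s\n",
                    frame, renders_this, holds_history,
                    renders_this == holds_history ? "local" : "cross-GPU copy");
    }
    return 0;
}
```

Run it and every single line says "cross-GPU copy", which is the whole problem in one sentence.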
 
If you want a gaming card, the Titan Z and 295X2 were released in 2014.

I think the Pro Duo is pretty much the same class as the Titan Z was. I had a 295X2 for a bit in the mining days and it was a pretty cool card.

My first GPU was a 5970 and that card was great in anything that could actually take advantage of the two chips (not much :p ).
 
Now, video cards are the most expensive they have ever been so the idea of buying two is rather ludicrous.

When top-end video cards get expensive, that opens up tons of opportunity for multi-GPU. Multi-GPU is not, nor has it ever been, just about getting two of the fastest cards available at the same time. It's just that the people who did do that also tended to have the biggest mouths and the largest egos, and thus tended to dominate the conversation.

Normal people would, for example, buy one card and then buy a 2nd identical card a year later (likely cheaper at that point). Someone currently using something like a GTX980, looking to upgrade, but dumbstruck by the ridiculous prices on new GPUs, could simply buy a 2nd, or even a 3rd used GTX980 instead. That's exactly what I did during the first cryptocurrency boom. I had used 2x GTX680 for a few years already and was looking to upgrade, but was not going to pay the stupid prices. I took advantage of the PCIe lanes that my X99+5820k had and bought a 3rd GTX680 used for cheap. Those 3x GTX680 cards gave me more combined GPU power than a 980ti in games with good SLI support (which included all the games I cared about) and was enough to carry me through the BS. Fast forward to today, where I'm currently using a 2080 RTX. I would LOVE to do the same thing again and simply pick up another 2080 for SLI instead of dealing with the BS going on with the new cards.
 
All these long-winded comments of why. Developers are lazy, period. We have 6+ CPU cores that don't get fully utilized in any game. Even games like Ultimate Epic Battle Simulator, which would benefit from all that extra CPU power, underutilize it. Lazy programming. Consoles just make it even more of an excuse. Don't get me wrong, there are a few that push our systems, but they are few and far between. Horizon Zero Dawn did a decent job and is a beautiful game. But take MS Flight Simulator. More could have been done to utilize CPU power than was, and it doesn't matter if you have 6 or 24 cores. LAZY. This is why I am downgrading to a 5600X from a 3900. I don't do any video editing anymore, gaming doesn't use more cores, so why use more power and create more heat?
 
I had some fun with SLI over the years. Had a GTX 470 and bought a second one later for SLI and that had a huge performance improvement. Also allowed me to play games like Alice Madness Returns and Mirror's Edge with full hardware PhysX. Very nice.

Also went all out and got three GTX 980 cards for a Surround build. That was 7680x1440 resolution 144Hz and 3D. It needed a LOT of power and one card couldn't cut it. Played Bioshock Infinite, Dirt 3, and a bunch of games and it did have the performance.

Sadly things went downhill. New games started coming out with poor SLI support or negative scaling (like Watch Dogs) and I spent more time trying to get games to work than actually playing. Did GTX 1080 SLI and that was okay I guess, but I could see things getting worse.

Finally, I tried RTX 2080 Ti and honestly it barely did anything. FPS did appear high, but the games were not actually smoother. Like Far Cry 5, was actually much choppier on SLI even with a higher FPS. It was bogus. So I sold the extra 2080 Ti and didn't even notice any difference in most of the games I play.

Also had a Vega 64 Crossfire setup around the same time, and it was the same thing but worse. Microstutter, while it was supposed to be improved, was a huge problem. Again, I got a Radeon VII and everything was butter smooth. So much better even with less FPS.

It's a scam and after everyone realized, AMD/Nvidia just pulled the plug.
 
I was CrossFiring around 2019 on my X58 platform using an RX 580 and RX 570. They worked pretty well in a lot of older games; there was a list somewhere of the games that supported it.
 
It's a scam

Yeah... no.

I still use my 3x GTX680 in my backup computer to this day. In World of Warcraft, using DX11, one card gives me ~45-50fps while 3 cards gives me ~110fps+. 110fps even with microstutter is still better than 45fps.

Finally, I tried RTX 2080 Ti and honestly it barely did anything. FPS did appear high, but the games were not actually smoother. Like Far Cry 5, was actually much choppier on SLI even with a higher FPS. It was bogus. So I sold the extra 2080 Ti and didn't even notice any difference in most of the games I play.

If it "barely did anything", and you "didn't even notice any difference", that's probably because SLI doesn't work with DirectX12. With a card like a 2080Ti you would want to play most games using DirectX12. In some games you have a choice between using DirectX12 (with potential API performance improvements) or DirectX11 (with SLI support) in which case it's not just about SLI scaling but also weighing that against the loss of DX12 improvements (including increased multi-threading, etc). Did you ever monitor your GPU usage to see if it was even using more than one card?
 
Multi-GPU support on the 3090 is still there for pro use cases such as AI. If I had to guess, lack of consumer multi-GPU support comes down to technical difficulties with getting better performance on dual GPU setups nowadays compared to in the past when GPUs and game engines were smaller and simpler.

Another issue is that games' hardware demands seem lower nowadays than in years past. I'm still using a long-outdated Titan X Pascal while I wait for the 3090 to become possible to buy without scalper pricing. For workstation use, I'm taking a performance hit of, if I had to guess, 2-3x compared to said 3090; there, at least half the reason to upgrade is to double total RAM. But I can still run most games at 4K close to maxed out as long as RTX is off. That type of hardware longevity was not there in years past.
 
I also think the Nvidia "Titan" and Ti line of cards became a thing, and were popular. It used to be a little more simplified. I don't recall the GTX 580 or 680 having Ti/Ultra versions. Seems like the Ti and Titan (3090 this year) have taken that spot.

For lower end cards, why bother with 2x **60s when you can get an **80 for around the same price with more consistent performance, better thermals and lower noise? I understand the option to upgrade was there, but again, for most bargain upgraders I'd question your case cooling/PSU. And simply buying a new **60/70 would likely cost roughly the same if you sold your old card.

So the drop in an extra card for a cheap/quick upgrade wasn't that great in practice.
 
For lower end cards, why bother with 2x **60s when you can get an **80 for around the same price with more consistent performance, better thermals and lower noise?

Maybe the lower-end cards are actually available? "Just buy the faster card" sounds nice in theory, if you have the money right then and there and the more expensive card is actually for sale.

I understand the option to upgrade was there, but again, for most bargain upgraders I'd question your case cooling/PSU.

I wouldn't say that someone is a "bargain upgrader" just because they don't splurge on a 3090 or similar. Case and PSU are always a concern, but not everyone upgrades their case and PSU every time they upgrade their GPU. My current case (Corsair 800D) and PSU (Thermaltake 1000W) date back to when I was running a Q9650 and 2x 4870X2 in Quad CrossFire, and they have hosted numerous GPUs over the years. There is nothing about either that would prevent me from sticking a 2nd card in there. Also, if a computer couldn't handle 2x 3060 or similar, it would probably have issues with a 3090 as well, since it's not exactly an "efficient" card in terms of power consumption.

And simply buying a new **60/70 would likely cost roughly the same if you sold your old card.

When people eventually upgrade, there is usually a generational gap at that point. It wouldn't be someone getting another 3060 instead of upgrading to a 3090, it would be someone getting a 2nd 1070 or 2nd 980Ti. Older cards don't generally hold their value as well as the newest cards, making it more of a buyer's market.
 
Maybe the lower-end cards are actually available? "Just buy the faster card" sounds nice in theory, if you have the money right then and there and the more expensive card is actually for sale.



I wouldn't say that someone is a "bargain upgrader" just because they don't splurge on a 3090 or similar. Case and PSU are always a concern, but not everyone upgrades their case and PSU every time they upgrade their GPU. My current case (Corsair 800D) and PSU (Thermaltake 1000W) date back to when I was running a Q9650 and 2x 4870X2 in Quad CrossFire, and they have hosted numerous GPUs over the years. There is nothing about either that would prevent me from sticking a 2nd card in there. Also, if a computer couldn't handle 2x 3060 or similar, it would probably have issues with a 3090 as well, since it's not exactly an "efficient" card in terms of power consumption.



When people eventually upgrade, there is usually a generational gap at that point. It wouldn't be someone getting another 3060 instead of upgrading to a 3090, it would be someone getting a 2nd 1070 or 2nd 980Ti. Older cards don't generally hold their value as well as the newest cards, making it more of a buyer's market.
Don't you get worried about the longevity of a PSU when it's 13-14 years old?!?! :sick:
 
Don't you get worried about the longevity of a PSU when it's 13-14 years old?!?! :sick:

No. I've made my views on using older PSUs clear in many other threads so no need to derail this one, but suffice it to say that it was a good PSU to begin with (Good design, doesn't use faulty capacitors, etc). I am the original owner, it has been well cleaned/maintained, it has not been abused, and it's still working great.
 
I wouldn't say that someone is a "bargain upgrader" just because they don't splurge on a 3090 or similar.

SLI used to be positioned as an option for the budget gamer to upgrade cheaply. For example, your PC is getting a bit old. Double your GPU performance by buying another identical GPU, which you can find for $150 used or $200 new as they were being cleared out. Which sounds great, but again, you can sell your old GPU and then buy one new one and have a better experience overall. It wasn't long ago that the mid-range cards were $250-300.

In general I think people realized it wasn't worth the trouble. This really only left the high-end market for those pushing the highest resolutions and settings. But with the Titans and now the **90s that are coming along, I think that market has died too.
 
I had CrossFire RX 480s for a while, and in the games it worked well in it was awesome, as at the time the rendering power was unmatched. Some games, on the other hand, hardly got a boost even when it worked, while others were flat-out broken or had microstuttering. So besides the extra power draw and heat, it was really those issues that pushed me back to single-card only, unless I need a ton of outputs or compute power.
 
No. I've made my views on using older PSUs clear in many other threads so no need to derail this one, but suffice it to say that it was a good PSU to begin with (Good design, doesn't use faulty capacitors, etc). I am the original owner, it has been well cleaned/maintained, it has not been abused, and it's still working great.

Capacitor aging is a thing. After enough years of use they'll eventually fail to provide their max power output. I've seen it happen to many PSUs over the years.

Paul has done the testing and written at least one article on the subject.
 
The fact of the matter is that microstutter is real, and seeing higher FPS in SLI/CF is useless if the experience is worse.

I was in denial for a while, and the driver frame pacing improvements did help, but it just seems to be a flaw in the design.

It's good that reviewers are using better measurements now, like frametime graphs and 1% lows, because this tends to show why multi-GPU is actually worse while having more FPS.

That said, if you are building a really exotic PC, let's say a triple 4K display racing cockpit, well, maybe you'll still go for SLI. It might still be worth it despite the problems. But for like 99% of people it's bogus.
 
I'm glad it died; you shouldn't have to run two video cards in your rig to be able to run a game. I tried it once a long time ago and the microstutter pissed me off. I sold both cards and bought the highest-end card I could at the time.
 
SLI used to be positioned as an option for the budget gamer to upgrade cheaply. For example, your PC is getting a bit old. Double your GPU performance by buying another identical GPU, which you can find for $150 used or $200 new as they were being cleared out.

It wasn't ever exclusively marketed this way. In fact, much of the time two of the highest-end cards were shown paired up far more frequently than two mid-range cards.

I'm glad it died; you shouldn't have to run two video cards in your rig to be able to run a game. I tried it once a long time ago and the microstutter pissed me off. I sold both cards and bought the highest-end card I could at the time.

There was never a case where you had to run two cards just to run a game. Either you used mid-range cards and equaled the performance of a high end card, but purchased over time or you went with two high end cards for the fastest possible framerates you could achieve at the time. There were one or two people that did buy two midrange cards instead of a single high end card, but this was rare and most regretted doing it that way.
 
It wasn't ever exclusively marketed this way. In fact, much of the time two of the highest-end cards were shown paired up far more frequently than two mid-range cards.



There was never a case where you had to run two cards just to run a game. Either you used mid-range cards and equaled the performance of a high end card, but purchased over time or you went with two high end cards for the fastest possible framerates you could achieve at the time. There were one or two people that did buy two midrange cards instead of a single high end card, but this was rare and most regretted doing it that way.
Sure, you didn't need two cards to run the game, but Crysis...
 
I'm glad it died; you shouldn't have to run two video cards in your rig to be able to run a game. I tried it once a long time ago and the microstutter pissed me off. I sold both cards and bought the highest-end card I could at the time.
Wow. First time I saw 1280 resolution.
 
There were one or two people that did buy two midrange cards instead of a single high end card, but this was rare and most regretted doing it that way.
I had dual GTX660s, and the only ones with regret were the people on forums who bought 680s, whose records I smashed in benchmarks. A single GTX680 cost more than two 660s.
 
Capacitor aging is a thing. After enough years of use they'll eventually fail to provide their max power output. I've seen it happen to many PSUs over the years.
It is a thing, but that bell curve is pretty wide. I can give lots of anecdotal instances of old computers still chugging along with original equipment.
 
When people eventually upgrade, there is usually a generational gap at that point. It wouldn't be someone getting another 3060 instead of upgrading to a 3090, it would be someone getting a 2nd 1070 or 2nd 980Ti. Older cards don't generally hold their value as well as the newest cards, making it more of a buyer's market
Have you checked eBay lately?
[screenshot: eBay GTX 1070 listings, Feb 2021]

A 5 year old card selling for up to $130 above MSRP? Fucking insane. Let's see what cards like my 980 Ti are going for...
[screenshot: eBay GTX 980 Ti listings, Feb 2021]

Not much better here, either. A quick search for 780s and 780 Tis didn't turn up much at all. Slim pickings at batshit crazy prices. First shortages+scalpers, and now throw Ethereum/Litecoin mining back into the mix again and you get this shit sandwich. It's a wonderful time to be selling GPUs and a waking nightmare to be in the market for one.
 