Screw current GPU pricing... bring on the old-school Quad GPU setups!

cyclone3d

[H]F Junkie
Joined
Aug 16, 2004
Messages
16,244
So this is more of a "the older GPUs have come down in price enough for me to buy" thread than anything else.

And besides that, my R9 390 is still handling everything I play pretty well.

I found these cards for sale for what I deemed a good price so I bought them for some older systems.

2x Nvidia GTX 295 - $42 apiece ($500 MSRP per card when new)

2x ATI Radeon HD 5970 - $70 apiece ($600 MSRP per card when new)

I wanted those setups back when they were new, but not only did I not have the money, I wouldn't have paid that much for a setup anyway.

Gonna have me some quad-GPU fun. The closest I ever came before this was a couple of refurb 4870 X2 cards I bought, and one of them ended up being DOA :-(
 
If you waited a couple of months you could probably get RX 4xx or RX 5xx cards, as they will begin to flood the used market (unless crypto takes off again).
 
Yeah, I think if you wait you could get 580s really cheap. I see where you are coming from, but when I saw how poorly my brother-in-law's 6970 2GB played The Division on low settings, I have to agree it would be quite a waste unless you're going for benchmarks alone. Nine months ago I got a Fury X for $200; in the near future they might be really cheap.
 
Yeah, these are not for playing anything at all recent. I wouldn't even dream of trying to use them for newer games.

These are just for fun with older games and systems.
 
lol. Ahhh, those were the days. I'm just waiting waiting waiting.... I've noticed prices are starting to come down. I can basically pay MSRP for a used card now.
 
Ya, you overpaid for those 5970s. I bought one for like $50 a few years ago for a dual Xeon rig I had at the time. The first card to make me realize Crossfire fucking sucks. On a side note, the 5970 can play Ark :p Not well, but it will run. (A 750 Ti is much faster on Ark.)
 
SLI and CF are aging technologies; I wouldn't be at all surprised to see them fade into obscurity in the foreseeable future, especially considering the cost of hardware outside the US. However, the ASUS Mars 2 x 295s in SLI looked the bomb back in the day.

 
SLI and CF are aging technologies; I wouldn't be at all surprised to see them fade into obscurity in the foreseeable future, especially considering the cost of hardware outside the US. However, the ASUS Mars 2 x 295s in SLI looked the bomb back in the day.


I would love to see it evolve into a hardware-level-only thing where it would help any game regardless of support, where I can shove 8 Titan Vs into a build if I want and reap the benefits. I don't think that would ever happen though :(
 
Ya, you overpaid for those 5970s. I bought one for like $50 a few years ago for a dual Xeon rig I had at the time. The first card to make me realize Crossfire fucking sucks. On a side note, the 5970 can play Ark :p Not well, but it will run. (A 750 Ti is much faster on Ark.)
You remind me of when I got an HD 5770 and thought it was awesome. I wanted a bit more power and went CF 5770s. I learned quite quickly why CF was a bad idea overall, and I largely sidegraded to a single HD 5870. Because I didn't learn my lesson, I went to an HD 5870 CF setup soon thereafter, and that didn't end well. When I got the chance to upgrade again, I went for the single fastest card available, the GTX 580.

I've never even tried to do multiGPU since then.
 
I would love to see it evolve into a hardware-level-only thing where it would help any game regardless of support, where I can shove 8 Titan Vs into a build if I want and reap the benefits. I don't think that would ever happen though :(

This 100%!

Sadly, I don't see it happening. Even mGPU is struggling to make any real traction under either DX12 or Vulkan; developers just don't appear to be interested.
 
My single HD6850 struggles a lot with modern games, even at sub-1080p resolutions. For me these cards aren't yet old or cheap enough for lab/nostalgia setups, but aren't useful enough for modern gaming. If I can pick up another HD6850 for $20 in a few years, I might give Crossfire a try just to see what it's like...if my PSU has the watts to spare. My nostalgia rigs use whatever old low-wattage PSUs I have laying around.
 
Ya, you overpaid for those 5970s. I bought one for like $50 a few years ago for a dual Xeon rig I had at the time. The first card to make me realize Crossfire fucking sucks. On a side note, the 5970 can play Ark :p Not well, but it will run. (A 750 Ti is much faster on Ark.)

It is the cheapest I have seen them in a long time... well, since I started looking at older stuff over the past couple of years.

My CF setups other than this have been:

Dual 6870s - worked well for almost every game I was playing at the time. Upgraded to a single 7970.

Then I went to Dual 7970s - Very nice.

Then when I got my R9 390, I sold one of the 7970s to a family member... still want to get it back at some point once they upgrade that computer.
 
I imagine this is more for nostalgia than anything else.

Those 5970s at $600 new would've paid for themselves many, many times over if you had gotten into Bitcoin mining early on. Sadly, I had a 5850 and didn't realize the potential there.
 
Using CPUs and GPUs for cryptocurrency is a waste of resources because those devices are designed to do so much more than mine.

I really hope the dedicated ASIC market takes off, because then people will move to that and stop buying GPUs.
 
My experience with crossfire on a pair of Fury X was nearly perfect.

When it worked there was about 80% scaling. When it didn't, you never knew, because it didn't cause any problems. I don't think I ever had to toggle Crossfire on/off to try to fix some random issue in the six-plus months that I had that pair of cards. I ran the Crossfire Fury X setup through the first half of 2017, after owning just a single Fury X for nearly six months prior.

I highly recommend it for Fury X owners.

I tried it with Vega 56 launch cards for the first few months and it was terrible. Sold and went to nvidia.
 
lol, so true.

Why? The GTX 295s are only rated at 289W TDP each.
The HD 5970s have a TDP of 294W each.

My main rig GPU, an R9 390, has a TDP of 275W at stock speed.

To reach the TDP, you have to run the cards at full tilt... which almost never happens unless you are running something like FurMark.

I live in a place where electricity is fairly cheap and the older rigs are not going to be running all that much. Just for older games when I feel like playing them and for benching.

I'm really not worried about the electricity expense.
 
Not sure what this means in terms of dollars and cents for your power bill, but some food for thought: the idle power draw for a single GTX 295 is in the neighborhood of 55W, which is around 10x Pascal's idle and 5x Maxwell's. So two idling GTX 295s is like running a GTX 1070 at 70% around the clock.
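
Rough math, if you take the GTX 1070's board power as roughly 150W: 2 x 55W is about 110W of idle draw, and 110 / 150 is roughly 73%, which is where that "GTX 1070 at 70% around the clock" comparison comes from.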
 
i would love to see it evolve to a hardware level only thing where it would help any game regardless of support. where i can shove 8 titan v's into a build if i want and reap the benifits. i dont think that would ever happen though :(

In theory DX12/Vulkan can handle it, but it's up to the developer to figure out the implementation. And that's the real problem: figuring out how to load-balance between different GPU combinations. Mixing and matching doesn't play nice with Alternate Frame Rendering (latency becomes a MAJOR problem), but Split Frame Rendering has its own issues.

*Maybe* when VRR becomes a thing and you aren't tied to a 60Hz refresh cycle you might see dual GPUs become a thing again, but right now they just come with too many headaches attached.
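
For illustration only, here's a minimal sketch of what "up to the developer" means in practice, assuming Vulkan 1.1 headers and a loader are installed (everything below is illustrative, not any engine's actual code). The API will happily report device groups of linked GPUs, but deciding how to split work between them - AFR, SFR, or something else - is still entirely the engine's problem:

// Enumerate Vulkan device groups (linked multi-GPU) under Vulkan 1.1.
// Compile/link against the Vulkan loader, e.g. g++ mgpu.cpp -lvulkan
#include <vulkan/vulkan.h>
#include <vector>
#include <cstdio>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    // Ask the loader how many device groups exist, then fetch their properties.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto& g : groups) g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    // Each group with physicalDeviceCount > 1 is a set of GPUs the engine could
    // drive together; how to balance frames across them is not the API's job.
    for (uint32_t i = 0; i < groupCount; ++i)
        printf("Device group %u: %u physical GPU(s)\n", i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}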
 
I dabbled with several triple and quad setups from AMD and Nvidia in the past, mostly because I either had a batch of used cards to sell and was bored, or I was stepping up to another config and the old card(s) were lying around. The ones that come to mind were:
  • dual 4870 x2s
  • 4870 x2 + 4870
  • triple 5870s
  • quad 480s
  • GTX 690 + 680
My experience with 2+ GPU setups has been extremely consistent: with every extra GPU, you get a lower performance return and a higher likelihood of issues -- completely not worth the trouble. The only one I recall working somewhat without issue in many games was the dual 4870 X2 setup, but even that one delivered significantly diminished actual performance for the amount of potential GPU power.

The most ridiculous setup I had, in terms of power consumption, was the quad GTX 480s, where I had to run a supplemental 300W supply jury-rigged to a 1200W supply.
 
Did you forget the 3dfx Voodoo2? Pulled my 3rd R920; it wasn't doing anything other than the occasional App.HashcatGUI.exe run - just throwing heat and blowing hot air. DoS is easier anyways. Lazy.
 
Meh... SLI / Crossfire are pretty much dying technologies. You'll notice that Nvidia has all but abandoned SLI and AMD doesn't really make a big deal of Crossfire anymore.

Two 295s in SLI might look wicked as hell, but they're just going to chew through electricity doing what would be done better and faster by a current gen single card.

And you'll be booting in legacy mode, since the 2xx series didn't support UEFI, if memory serves...
 
Disagree about multiple cards. With increased usage for other stuff - professional work and the great potential with VR - it is very much applicable and useful. CFX and SLI are DX11 technology, which is now obsolete from a new-development standpoint; DX12 and using multiple cards is what will take over, with no need for CFX/SLI tech and the driver implementation each application requires.
 
What indications from development studios have you gotten thus far that show they are competent at developing multi-GPU support without assistance from Nvidia/AMD?
 
With DX12, mGPU support has moved from the drivers to the game coders, so good point.
 
What indications from development studios have you gotten thus far that show they are competent at developing multi-GPU support without assistance from Nvidia/AMD?

I certainly won't be holding my breath. I have very little confidence in developers putting much, if any, input into mGPU without funding from either AMD or Nvidia.
 
I used to have 4x 290Xs... I had to run dual PSUs on a 240V circuit to keep them fed. It was close to 1500 watts.
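
(Rough math, assuming ~290W TDP per 290X: 4 x 290W is about 1160W for the cards alone, and CPU plus platform overhead pushes that toward 1500W - more than a single standard 120V/15A circuit is meant to carry continuously, hence the 240V circuit.)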
 
I certainly won't be holding my breath. I have very little confidence in developers putting much, if any, input into mGPU without funding from either AMD or Nvidia.

On Mantle it was as easy as seeing another device to do tasks on, rather than the AFR stuff most are used to. For stuff like the Oxide engine, once it is implemented in the engine, anything using that engine can make use of it.
Not as much work as you make it out to be.
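
For anyone curious what that "just another device" model looks like under DX12's unlinked explicit multi-adapter, here's a rough illustrative sketch (assuming Windows 10 SDK headers; the structure is mine, not from any shipping engine). The engine simply enumerates every adapter, creates a device on each one, and then decides what work to hand each GPU, with no SLI/CF profile or driver-side AFR involved:

// Enumerate all GPUs and create an independent D3D12 device on each one.
// Link against d3d12.lib and dxgi.lib.
#include <windows.h>
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %s\n", i, desc.Description);
            // Each device gets its own command queues and allocators; the engine
            // decides which rendering or compute tasks go to which GPU.
            devices.push_back(device);
        }
    }
    return 0;
}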
 