AMD Distances Itself from CrossFire with RX Vega

Given that CrossFire went unmentioned during AMD's media briefing and in RX Vega marketing materials, GamersNexus supposes that the company is giving up on its multi-GPU technology. While RX Vega 64 and RX Vega 56 will support CrossFire, AMD did note that the industry is moving away from these configurations and that developer support is limited.

...AMD has minimized its marketing focus on multi-GPU for RX Vega and, although the cards can technically function in multi-card arrays, AMD noted that the value is rough when considering limited developer support. This aligns with NVIDIA’s decision to begin slowly winding-down SLI support during the Pascal 10-series launch event, where discussion of keyed 3-way SLI would be required (something later changed, though there’s no official support of >2-way SLI in games on 10-series cards).
 
If AMD and NVIDIA don't want to support multi-GPU solutions, then "fans" of the technology will simply not buy two or more cards when they upgrade. This isn't entirely on AMD's and NVIDIA's shoulders, either, as the gaming industry seems to have moved away from ensuring games support the technology. Without that, what are NVIDIA and AMD to do? There is only one thing they could do, but the current architectures of AMD's and NVIDIA's offerings may simply not lend themselves to doing it well, if at all. What AMD and NVIDIA need to do is build multi-GPU solutions that work independently of the software. The old Voodoo2 cards in SLI didn't require software support beyond the Glide API. They simply worked, and that's what AMD and NVIDIA need right now. I hoped that Vulkan gaining multi-GPU support would help, but unless Vulkan adoption becomes more widespread, there is little hope of that.
 
Interesting that they would stop that support, since that kind of takes away from selling cards. I know there aren't a ton of people who buy multiple GPUs, but say 1,000 people worldwide buy three-card configs; that's essentially 2,000 lost card sales, which is a decent amount of revenue. Kind of surprising they want to move away from that. Maybe there are other reasons.
 
I'm surprised this didn't come sooner. Most saw SLI/CF as an upgrade path, only to find out it's a poor one. By the time you're ready for the second card, you'd be better off with next-gen hardware.

I tried it once with two NVIDIA 7900 GTOs and said never again. I expected to be able to run them longer before upgrading, and that was not the case. New features were being utilized that those cards couldn't handle. In the end, they were horribly slow anyway.

SLI is a niche market that has probably gotten small enough that it isn't worthwhile to bother with anymore.
 
Multi-GPU has been dogshit for ages now; it's always a fucking waiting game for drivers regardless of what "side" you have cards from. I've had both, and SLI is easily as ropey as CrossFire, despite what some chimps would have you believe. I think we're going to see it fade away in the next couple of years almost entirely, at least in multiple-card form. That multi-core-on-a-single-die tech seems interesting, though.
 
Of course ideally one card would be powerful enough to acceptably run whatever is thrown at it, but we aren't there yet.
 
Interesting that they would stop that support, since that kind of takes away from selling cards. I know there aren't a ton of people who buy multiple GPUs, but say 1,000 people worldwide buy three-card configs; that's essentially 2,000 lost card sales, which is a decent amount of revenue. Kind of surprising they want to move away from that. Maybe there are other reasons.

Like I said, I don't think it's something AMD and NVIDIA want to get away from. The way current GPU architectures work, support for the feature has to be provided in software. DX12 took control totally out of the hands of graphics card makers and put it on developers to implement. Vulkan is the same way. Developers of anything that will be offered on PC tend to code for the widest range of system configurations, leaving niche features and options off the table, as development cycles rarely allow spare time to work on them. If something like 5% or less of all PC gamers use multi-GPU systems, then it doesn't make sense to code for that. This is especially true if the gains are minimal. Most games today can be maxed out at 1080p on mid-range cards. You only need that kind of GPU power at 4K, which is again a small subset of today's market.

Again, we need a multi-GPU solution that works at a hardware level. The problem is that the way these architectures are built, that's probably not possible. AMD and NVIDIA will need to build new architectures from the ground up that don't require software implementation to function. We are probably two or three years away from that being a reality, even if AMD and NVIDIA have started working towards it. I wouldn't be at all surprised if they were. The simple fact of the matter is, they don't want to lose sales, and if you can make an architecture that works well by scaling the number of graphics resources available, then they'll do it. That would not only help sell more GPUs, but it would also enable them to scale up future versions of a given graphics chip on a single card and double performance. Then again, achieving that goal and actually building an architecture that can do it is easier said than done. It may not come to fruition. That is, if AMD or NVIDIA has bothered to explore this path at all.
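
For what it's worth, here is roughly what "developers have to implement it" looks like under the new explicit APIs. This is only an illustrative sketch of Vulkan 1.1 device groups that I'm adding for context (the helper name and the assumption that queue family 0 does graphics are mine, not anything AMD or NVIDIA ships); none of it happens unless the engine writes it:

```cpp
#include <vulkan/vulkan.h>
#include <vector>

// Hypothetical helper: create a logical device spanning every GPU in the
// first device group the driver reports. Assumes `instance` was created
// for Vulkan 1.1+; error handling and queue-family selection are omitted.
VkDevice createGroupedDevice(VkInstance instance)
{
    // Ask the driver which physical GPUs can be linked together.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    if (groupCount == 0)
        return VK_NULL_HANDLE;
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto &g : groups)
        g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    const VkPhysicalDeviceGroupProperties &grp = groups[0];

    // Without this chained struct the app only ever drives one GPU,
    // which is why games have to opt in explicitly.
    VkDeviceGroupDeviceCreateInfo groupInfo{};
    groupInfo.sType = VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO;
    groupInfo.physicalDeviceCount = grp.physicalDeviceCount;
    groupInfo.pPhysicalDevices = grp.physicalDevices;

    float priority = 1.0f;
    VkDeviceQueueCreateInfo queueInfo{};
    queueInfo.sType = VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO;
    queueInfo.queueFamilyIndex = 0;   // assumes family 0 supports graphics
    queueInfo.queueCount = 1;
    queueInfo.pQueuePriorities = &priority;

    VkDeviceCreateInfo deviceInfo{};
    deviceInfo.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
    deviceInfo.pNext = &groupInfo;
    deviceInfo.queueCreateInfoCount = 1;
    deviceInfo.pQueueCreateInfos = &queueInfo;

    VkDevice device = VK_NULL_HANDLE;
    vkCreateDevice(grp.physicalDevices[0], &deviceInfo, nullptr, &device);
    return device;
}
```

And even that only gets you a logical device that spans the GPUs; the engine still has to split the actual rendering work per frame with device masks (vkCmdSetDeviceMask and friends), which is exactly the part studios rarely budget time for.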
 
Of course ideally one card would be powerful enough to acceptably run whatever is thrown at it, but we aren't there yet.

Outside of 4K and the odd resolutions created by multi-monitor setups, we pretty much are there with the 1080 Ti. At 1440p, about the only game I currently play that can't stay above 60 FPS at all times is Crysis 3, and at this point that probably has more to do with game engine inefficiencies than with pure horsepower.
 
My guess is that this is the motherboard manufacturers going broke providing SLI support that few actually use, due to the cost and the questionable benefit of it. NVIDIA doesn't care, but the people who support SLI do. I did SLI back with my GeForce II, when it actually meant Scan Line Interleave; I did it with my 580s and 670s, but I'm done with the tech now. If you have a dual-slot board and buy a cheap version of the card you already own to get an extra year or two of life, that's just about the only scenario that makes sense to me now where SLI is concerned, and I know that logic is debatable.

Seems to me the smarter move these days is to take a single GPU and pair it up with a FreeSync/G-Sync monitor; that's the modern ask.
 
I'm so mad I bought 2x 1080s. I thought DirectX 12 was going to support dual cards better, as I game at 4K. Then a few weeks later the 1080 Ti comes out. I really don't feel like pulling my water-cooled rig apart to sell my cards, but the second one is just sitting there doing nothing. If they're not going to support it anymore, they are really going to have to kick it up a notch on performance so a single card covers it all.
 
SLI, if it really worked as "double the cards, double the fun," would be pretty awesome. Sometimes it really does: Crysis 3, Rise of the Tomb Raider. However, there are games like Hitman (DX11) that only yield a paltry 6 FPS gain, with drops and issues. Oh, and DX12 didn't work.

But the fact is that it's pretty niche. When it is supported, it doesn't yield the performance gain one would expect. Sometimes SLI can even cause games to crash.


I think multi-GPU setups would work well with VR. NVIDIA's VRWorks seems to have integrated its own "VR SLI," which would allow one eye per card. This is probably the only reason I'd consider an SLI setup these days, besides the bling factor (for a year or so).

 
Oh, they better fucking not kill off CrossFire. Same goes for SLI. With DX12 and Vulkan supposedly being more multi-GPU capable than any API that came before them, it would be utterly ironic to cut multi-GPU support now.
 
Not much you can do when not many people use it, so not many devs support it. Not to mention people are getting smaller and smaller cases now, so there isn't much room for that second GPU anyway.
 
Maybe AMD should have gotten their act together and had more than one guy responsible for CrossFire support.
 
I see by the GPUs in your sig that you have much experience with CrossFire.


From using:

2x 7970
2x R9 290
2x RX 480

CrossFire was great... on games that supported it. There were a few games that had a 0% increase.
 
Yeah, it is what it is, I guess. Dan, is there room for maybe Intel to poop out a standalone card with a hardware solution to CrossFire/SLI?
 
I've had two SLI setups in the past, and to me the juice was not worth the squeeze. From a hardware standpoint: increased power draw (which of course requires a better PSU), increased heat, increased cable clutter. I had two GTX 570s in a small room, and that thing would raise the ambient temp by 10-15 degrees. From a software standpoint: does the game support it? Maybe. If it does support it, does it actually work without crashing? Possibly. Does it work without editing config files or faffing with settings? Almost never.

In retrospect it has the appearance of a scam just to get people to buy twice as many video cards, but I don't think it was anything nefarious. The software support just wasn't there.
 
1080Ti says hi.

Runs everything I throw at it in glorious 4K60 with Ultra settings.

There are times when a single 1080 Ti is insufficient. While you can use a single GPU and max out the settings of most games, I found it lacking in Mass Effect: Andromeda and one or two other titles I play. That said, if shit doesn't turn around, this may be the last multi-GPU setup I have until something changes on that front.

Oh, they better fucking not kill off CrossFire. Same goes for SLI. With DX12 and Vulkan supposedly being more multi-GPU capable than any API that came before them, it would be utterly ironic to cut multi-GPU support now.

Again, the problem isn't the APIs. It's the game developers that are at fault here. Game developers not supporting it is a multi-part problem. Developers are often at the mercy of their publishers, who demand results in a given time frame. Game developers rarely, if ever, get the time they need to implement every feature they might like, and fluff like multi-GPU support that affects a small percentage of the market is one of the first things to get cut. Multi-platform gaming is another nail in the coffin for SLI. When consoles see much higher sales volume than PCs do on the same titles, those platforms get priority. Again, the limited time frame for development comes into play when considering multi-platform titles. Lastly, small development studios and even large ones may outsource PC porting to companies that do the job with varying degrees of success and skill. The company that did Arkham Knight's PC port shouldn't be allowed to touch the PC platform ever again, but they've done a handful of AAA titles, and badly at that.
 
Speaking as a lifelong geek who was never in a position to afford an SLI/CrossFire rig, now that things have started to improve for me, it seems sad that the two major manufacturers seem to be killing it off. :-(

I guess that in five or ten years someone will have an "Aha!" moment and start scaling out again instead of scaling up.
 
Speaking as a lifelong geek who was never in a position to afford an SLI/CrossFire rig, now that things have started to improve for me, it seems sad that the two major manufacturers seem to be killing it off. :-(

I guess that in five or ten years someone will have an "Aha!" moment and start scaling out again instead of scaling up.

Technology isn't a straight line path. Unfortunately, the direction AMD and NVIDIA have gone makes certain architectures perform very well in multi-GPU rendering modes and not so well on others. If you look at the history of SLI and Crossfire, you will see certain generations that did it very well and some that didn't. Maxwell did it better than Pascal, etc. The changes to DirectX 12 vs. DX11 and earlier versions are also to blame. DirectX 12 supports a whole lot of methods for getting the most out of multiple GPUs, but the developers have to support them. We probably won't see widespread DX12 games for another two or three years. If those developers didn't bother to support multiple GPUs during development, it won't make any difference. The same is true of Vulkan.

I really think that hardware independent support is the only way forward with multiple GPUs. That said, it will take a future architecture and probably something very different than SLI or Crossfire as we know them to achieve that.
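
To give a sense of how much DirectX 12 pushes onto the engine, here is a rough sketch of the explicit multi-adapter side of it. Treat it as an illustration only (device discovery and nothing more, with a hypothetical helper name); even this much is code that DX11's driver-managed AFR never asked a developer to write:

```cpp
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: create one D3D12 device per hardware adapter.
// Anything beyond discovery -- cross-adapter heaps, copy queues, deciding
// which GPU renders what -- is entirely the engine's job.
std::vector<ComPtr<ID3D12Device>> CreateDevicesPerAdapter()
{
    ComPtr<IDXGIFactory1> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```

Everything interesting, like sharing results between adapters through cross-adapter heaps and deciding which GPU renders which part of the frame, comes after this, and that is the work nobody schedules time for.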
 
Again, the problem isn't the APIs. It's the game developers that are at fault here. Game developers not supporting it is a multi-part problem. Developers are often at the mercy of their publishers, who demand results in a given time frame. Game developers rarely, if ever, get the time they need to implement every feature they might like, and fluff like multi-GPU support that affects a small percentage of the market is one of the first things to get cut. Multi-platform gaming is another nail in the coffin for SLI. When consoles see much higher sales volume than PCs do on the same titles, those platforms get priority. Again, the limited time frame for development comes into play when considering multi-platform titles. Lastly, small development studios and even large ones may outsource PC porting to companies that do the job with varying degrees of success and skill. The company that did Arkham Knight's PC port shouldn't be allowed to touch the PC platform ever again, but they've done a handful of AAA titles, and badly at that.

I agree. The famous Ubisoft "PC only, so who cares" programming note really underscores that.
 
In another one or two generations of GPUs, 4K at 120 FPS should be doable with one card. A 1080 Ti handles 60 FPS at 4K now for the most part. There isn't a need to spend resources supporting an option that game developers as a whole are not pushing.

It does seem that, as wonderful as 64 PCIe lanes are on X399, they're not really needed; how many NVMe drives will you be running in addition to your one GPU?
 
I see by the GPUs in your sig that you have much experience with CrossFire.


From using:

2x 7970
2x R9 290
2x RX 480

CrossFire was great... on games that supported it. There were a few games that had a 0% increase.
I've used CrossFire for the last seven months on a pair of Fury X cards in my primary rig and haven't had a single issue. I attended a couple of LAN parties during that time, too. No fiddling with settings, no nothing; it just worked. (Or when it didn't work, it at least didn't degrade performance over using a single card.)

I've used SLI for the last two weeks on my primary rig, attended a LAN party this past weekend, and had four occasions where I needed to disable SLI because of performance degradation. It's also not as smooth at 60 Hz V-Sync versus 75 Hz FreeSync.

I had just made up my mind to sell the 1080 Ti cards and go back to AMD for CrossFire.

Shame reading this... Now I'm not sure what I'll do.

I have three 32" FreeSync monitors.
 
I ran CrossFire over the years and even had a dual-GPU ATI card, and I never got the expected performance out of it. Now I'll just get the fastest single card I can afford in my next build. No fuss, no muss.
 
First NVIDIA, now AMD. The future is simply single GPU unless you do compute tasks.
 
How sad.

GTA V, possibly the greatest PC game of all time, works flawlessly with SLI and CrossFire.

That's the difference between a great developer and a shit developer.

And I for one do not blame NVIDIA or AMD for the death of multi-GPU; it's developers, developers, developers.
 
To those complaining about not being able to run Ultra settings: why not just run High? I watched a video on this recently, and it showed that you gain almost nothing with Ultra unless you take screenshots, while it costs a ton of performance. I'd rather run 4K at High than 1080p at Ultra with 100x MSAA.
 
How sad.

GTA V, possibly the greatest PC game of all time, works flawlessly with SLI and CrossFire.

That's the difference between a great developer and a shit developer.

And I for one do not blame NVIDIA or AMD for the death of multi-GPU; it's developers, developers, developers.

"Works flawlessly" and "has good scaling" are the two big factors with SLI/CF. It's rare to have one or the other, and extremely rare to have BOTH in a single title.

It takes a lot of developer time and effort to make SLI/CF work well, and that's a huge waste of money for most devs. Why spend that many man-hours of development time to support 5% of your customer base?
 
Technology isn't a straight line path. Unfortunately, the direction AMD and NVIDIA have gone makes certain architectures perform very well in multi-GPU rendering modes and not so well on others. If you look at the history of SLI and Crossfire, you will see certain generations that did it very well and some that didn't. Maxwell did it better than Pascal, etc. The changes to DirectX 12 vs. DX11 and earlier versions are also to blame. DirectX 12 supports a whole lot of methods for getting the most out of multiple GPUs, but the developers have to support them. We probably won't see widespread DX12 games for another two or three years. If those developers didn't bother to support multiple GPUs during development, it won't make any difference. The same is true of Vulkan.

I really think that hardware independent support is the only way forward with multiple GPUs. That said, it will take a future architecture and probably something very different than SLI or Crossfire as we know them to achieve that.

Or they could just drop multi-GPU support. Miners probably have more cards than SLI/CrossFire users combined. I mean, why bother?
 
Didn't AMD show off systems with 2x Vega cards at Computex not that long ago? Or was it around E3... foggy. Anyway, this seems like a bad move, but I guess with NVIDIA moving away from multi-GPU, it's even harder for AMD to convince developers to support it alone... sucks.
 
DX12 with mGPU was essentially the final death blow to CF/SLI, if it wasn't dead already from most people not having two cards and the ever-increasing number of gamers moving to mobile. The work developers needed to do for an already niche group was multiplied several times over compared to the APIs they had before. You simply can't justify increasing development time and cost for that.
 
To those complaining about not being able to run Ultra settings: why not just run High? I watched a video on this recently, and it showed that you gain almost nothing with Ultra unless you take screenshots, while it costs a ton of performance. I'd rather run 4K at High than 1080p at Ultra with 100x MSAA.

Because turning settings down in games isn't fucking good enough. That's why.
 
The concept of SLI is completely niche. I love it, but I could never afford to move down that path. I have friends who do and did. The problem is really market forces, but NVIDIA did it right, I think, out of the two; I just never saw a performance improvement worth the amount of money that needed to be spent to make it happen.
 
I would be interested in seeing historical numbers on SLI/CF support. I just don't feel like it's ever been great, but there do still seem to be a good number of games with decent support for it. I'd only recommend it in certain situations where you have a lot of games and want to be able to squeeze all you can out of some of them. Sure, it's niche and doesn't get the attention, but when it does, and that's not rare, it can make a huge difference.
 
Like I said, I don't think it's something AMD and NVIDIA want to get away from. The way current GPU architectures work, support for the feature has to be provided in software. DX12 took control totally out of the hands of graphics card makers and put it on developers to implement. Vulkan is the same way. Developers of anything that will be offered on PC tend to code for the widest range of system configurations, leaving niche features and options off the table, as development cycles rarely allow spare time to work on them. If something like 5% or less of all PC gamers use multi-GPU systems, then it doesn't make sense to code for that. This is especially true if the gains are minimal. Most games today can be maxed out at 1080p on mid-range cards. You only need that kind of GPU power at 4K, which is again a small subset of today's market.

Again, we need a multi-GPU solution that works at a hardware level. The problem is that the way these architectures are built, that's probably not possible. AMD and NVIDIA will need to build new architectures from the ground up that don't require software implementation to function. We are probably two or three years away from that being a reality, even if AMD and NVIDIA have started working towards it. I wouldn't be at all surprised if they were. The simple fact of the matter is, they don't want to lose sales, and if you can make an architecture that works well by scaling the number of graphics resources available, then they'll do it. That would not only help sell more GPUs, but it would also enable them to scale up future versions of a given graphics chip on a single card and double performance. Then again, achieving that goal and actually building an architecture that can do it is easier said than done. It may not come to fruition. That is, if AMD or NVIDIA has bothered to explore this path at all.

So you think it might be more of a hiatus from this config until they iron out a hardware-level solution, rather than sticking with the software-reliant setup of the current generations? I can see that happening; the old Voodoos ran dual GPUs on one board, if I recall correctly. I remember when NVIDIA bought them, I thought for sure they would incorporate a lot of their patents into new boards, but AMD was the only one to try the single-card multi-GPU solution, which had some decent success. I have always faulted AMD for their drivers more than their hardware. I know a number of people who had issues with crashing when their cards were brand new; I never really ran into that with NVIDIA. I just hope the competition between these two companies really drives some innovation in a rather stagnant field. I would like to see the hardware change for the good rather than the status quo we have been getting as of late. Bring back the new "old" designs! ;)
 
Not much you can do when not many people use it, so not many devs support it. Not to mention people are getting smaller and smaller cases now, so there isn't much room for that second GPU anyway.

That is rather a self-fulfilling prophecy, though. Users don't use it because devs don't develop for it because users don't use it. ;)
 