Near-Perfect mGPU Scaling in Strange Brigade

FrgMstr

While we have certainly seen and heard a lot of talk about the "death of SLI and CrossFire," it is surely good to see AMD reporting on the usage of mGPU via Vulkan in this new Strange Brigade title. With the latest Radeon Software Adrenalin Edition 18.8.2, AMD is reporting a scaling rate of 1.9X with mGPU enabled using two RX Vega 56 cards.

Certainly it would be interesting to know just how much work went into making Strange Brigade mGPU compatible, and we have asked AMD as much. We are hoping to see more Vulkan titles with mGPU enabled.

On Day-0, AMD Radeon Graphics offers multi-GPU scaling in DirectX 12 and Vulkan. Strange Brigade is the first game to support multi-GPU capabilities in Vulkan®.
 
I never understood why mGPU is so hard...

DirectX 12 explicit mGPU is a matter of developers not wanting to foot the bill. But engines have been consolidating for a few years now and everyone uses the same few, so why don't those support it natively?

And I'm pretty sure Nvidia/AMD don't like selling two mid-range cards instead of one high-end card, but limiting the functionality to select models is easy enough.

So why is mGPU mostly dead?
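
As I understand it, "explicit" means the engine itself has to find the linked cards and build the renderer around them, instead of the driver hiding the second GPU behind a profile. A minimal sketch of just the discovery side in Vulkan 1.1 (assuming the Vulkan SDK is installed and the driver exposes the cards as a device group; not taken from any shipping engine) looks something like this:

```c
/* Minimal sketch: discovering a Vulkan 1.1 device group (explicit mGPU).
   Assumes the Vulkan SDK/loader is installed; error handling trimmed. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .apiVersion = VK_API_VERSION_1_1,      /* device groups are core in 1.1 */
    };
    VkInstanceCreateInfo ici = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS)
        return 1;

    /* Ask the loader which physical devices the driver has linked into groups
       (e.g. two Vega 56 cards behind one link). */
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, NULL);
    if (groupCount > 8) groupCount = 8;        /* keep the sketch fixed-size */

    VkPhysicalDeviceGroupProperties groups[8];
    for (uint32_t i = 0; i < groupCount; ++i) {
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        groups[i].pNext = NULL;
    }
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups);

    for (uint32_t i = 0; i < groupCount; ++i)
        printf("group %u: %u physical device(s)\n", i, groups[i].physicalDeviceCount);

    /* To actually use a group, the engine chains this into vkCreateDevice
       (via VkDeviceCreateInfo.pNext) so one logical device spans both GPUs. */
    if (groupCount > 0 && groups[0].physicalDeviceCount > 1) {
        VkDeviceGroupDeviceCreateInfo groupInfo = {
            .sType = VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO,
            .physicalDeviceCount = groups[0].physicalDeviceCount,
            .pPhysicalDevices = groups[0].physicalDevices,
        };
        (void)groupInfo;                       /* device creation omitted here */
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

And that's the easy part; the real bill is deciding every frame which GPU renders what and shuffling the results between the cards, which is exactly the work nobody seems to want to pay for.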
 
It is a niche market regardless so that doesn't help with dev support.
It shouldn't be thought of as niche, as the idea of monolithic dies is going away (see Ryzen). We need mGPU support to enable the future of graphics processing.
 
It is a niche market regardless so that doesn't help with dev support.
Niche like 4K? With the new pricing structure suggested by Nvidia, and the fact that no single card can offer 60+ FPS minimums in all games, it should be obvious enough that there's a market for mGPU. Devs just need to do a better job of supporting it.
 
Wake me when they do proper split frame rendering.

AFR will never be good enough, regardless of how well it scales.
 
sli and crossfire aren't dead.

they're just breathing funny.

hopefully this puts a fire under devs ass and causes a trend.

doubtful though.

I think that's a good way to put it...but I wouldn't invest in it either. It's basically on GoFundMe.
 
sli and crossfire aren't dead.

they're just breathing funny.

hopefully this puts a fire under devs ass and causes a trend.

doubtful though.

It won't because there are just not many people that own multiple GPUs. Since it literally doubles the price over a single GPU, it is always going to be a niche market. So you are just never going to see widespread developer support, since there's other things they could spend time on (with a big project like a game, you always have more things you can do). So it'll never be something that is great unless the GPU vendors can find a way to make it "just work" in hardware/drivers and not require special support by software. So long as the games themselves have to do shit to support it, we'll have a situation where a scant few will do great, a bunch will do meh, and some will just not work at all.
 
It won't because there are just not many people that own multiple GPUs. Since it literally doubles the price over a single GPU, it is always going to be a niche market. So you are just never going to see widespread developer support, since there's other things they could spend time on (with a big project like a game, you always have more things you can do). So it'll never be something that is great unless the GPU vendors can find a way to make it "just work" in hardware/drivers and not require special support by software. So long as the games themselves have to do shit to support it, we'll have a situation where a scant few will do great, a bunch will do meh, and some will just not work at all.

there is no widespread use because support sucks, and support sucks because there is no widespread use.

something has to give.
 
there is no widespread use because support sucks, and support sucks because there is no widespread use.

something has to give.

No, there's no widespread usage because it is expensive. You need to remember that the Hardforum types represent a small segment of the overall gaming public. Most people aren't going to buy 2 GPUs, even if support is perfect. As I said, it is literally twice the cost of buying one GPU, never mind that plenty of boards don't support it. So it'll always be niche, which means it'll never be high priority. It isn't like if games all had perfect support for multiple GPUs all gamers would suddenly rush out and buy a second one. A few would, but most would not.
 
You would have to look at the frame pacing as well, as stutter was one of the many reasons I stopped using multiple cards.
 
Any idea why they don't design a 2nd card that only provides more shaders/ROPs so they don't have to deal with the memory headaches?
 
Any idea why they don't design a 2nd card that only provides more shaders/ROPs so they don't have to deal with the memory headaches?

Pushing that much data over the PCI-E bus is suicide.
 
No, there's no widespread usage because it is expensive. You need to remember that the Hardforum types represent a small segment of the overall gaming public. Most people aren't going to buy 2 GPUs, even if support is perfect. As I said, it is literally twice the cost of buying one GPU, never mind that plenty of boards don't support it. So it'll always be niche, which means it'll never be high priority. It isn't like if games all had perfect support for multiple GPUs all gamers would suddenly rush out and buy a second one. A few would, but most would not.

the only motherboards that don't support sli or crossfire are budget boards these days.

also expensive is debatable.

if you need 60fps at 4k then you need 60fps at 4k. price is irrelevant.

there are no cards right now that can do that, the 2080 Ti excluded because, well, go buy one (you can't).

if you could guarantee people 100 percent scaling, hell, even 80, you'd see a lot more SLI setups where 2 mid-range cards can do what the halo card can do.
 
This is interesting. It's the first example I have seen of multiple GPUs working with Vulkan since I heard of their support in version 1.1, which was announced, I think, last year in March. Despite various misgivings, I suspect this is the ultimate way forward. Unless someone comes up with a novel way for us to push *PUs way over 5GHz, maybe 10GHz, 50GHz, or 100GHz, serial computation isn't going to get much faster, so various ways of computing in parallel need to be used to continue with performance increases.

Maybe GPUs are expensive now, but if we get into a slump where GPUs two generations out are not much faster than the current crop, buying a cheap GPU from this generation to add to your existing system wouldn't be so expensive. It's just a hypothetical situation where this feature could prove useful.
 
Can you elaborate? Or point me to an article or something that explains why AFR is terrible.
(honest question)

Sure thing. It's been well documented that AFR necessarily results in added input lag at the same framerate.

Tom's Hardware probably best illustrated it back in 1999 in their preview of the ATi Rage Fury Maxx (a dual-GPU board):

[Image: Tom's Hardware diagram from the Rage Fury Maxx preview illustrating the extra lag introduced by AFR]


People often get really picky about input lag caused by their monitors, but then completely ignore it when it happens in their render pipeline for some reason.

This one is unavoidable. It happens because of how AFR works, and is why you should never use an AFR setup with two weaker GPUs to try to keep up with one stronger GPU. (If, however, you already have one of the fastest GPUs and it isn't fast enough, adding another and going AFR may be your only option, in which case it may be better than nothing.)
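
To put rough numbers on it (purely hypothetical timings, assuming a 60 fps target and ignoring CPU time), the back-of-envelope version of that diagram looks like this:

```c
/* Back-of-envelope, hypothetical numbers: why AFR adds lag at the SAME framerate.
   Compare a single GPU and a 2-card AFR setup that both present 60 fps. */
#include <stdio.h>

int main(void) {
    double target_fps = 60.0;
    double frame_ms   = 1000.0 / target_fps;      /* ~16.7 ms between presents */

    /* Single GPU at 60 fps: the frame on screen started rendering about
       one frame interval ago. */
    double lag_single = frame_ms;

    /* 2x AFR at 60 fps: each card only finishes every other frame, so the
       frame on screen took about two frame intervals to render, and the CPU
       typically queues roughly one more frame ahead to keep both cards fed. */
    double lag_afr = 2.0 * frame_ms + frame_ms;

    printf("single GPU @ %.0f fps: ~%.0f ms render-to-present lag\n", target_fps, lag_single);
    printf("2x AFR     @ %.0f fps: ~%.0f ms render-to-present lag\n", target_fps, lag_afr);
    return 0;
}
```

Same framerate, but roughly two to three times the render-to-present lag under those assumptions.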

AFR implementations in Crossfire and SLI also have a host of other problems, including:

  • High average FPS, but minimum FPS (where it matters the most) only sees a marginal benefit, if any, turning them into dyno queens that put out great numbers, but fail on you in the most intense scenes when you need the performance the most.
  • Stutter
  • Compatibility issues

The above may be solvable depending on implementation. Maybe the mGPU solution in this article solves them; I don't know. The input lag issue, however, is inherent in how AFR works, and unless someone can figure out how to do SFR properly, it is going to be there no matter what.

The problem with SFR, however, is that it tends to scale very poorly and is not compatible with many games. This is why both Nvidia and AMD have dropped it in recent drivers. They don't want people complaining about shitty scaling.

Presumably the shitty scaling is due to the need to transfer large amounts of data between the GPUs and coordinate which GPU renders what, etc. Hopefully this can be improved in future architectures. AMD's Infinity Fabric looks promising here to me, as it allows massive amounts of data to move back and forth between dies.

If SFR comes back with a vengeance, I'll definitely consider going mGPU again. If not, I'll probably keep buying a single very expensive GPU. I just don't want to deal with AFR.
 
I'd like to see the frame pacing on the scaling. You can't believe marketing.
 
Very thoughtful of you to ask them. I am also wondering just how much latency there is using mGPU. Frames mean shit if there's too much latency.
 
Sure thing. It's been well documented that AFR necessarily results in added input lag at the same framerate.

[...]

If SFR comes back with a vengeance, I'll definitely consider going mGPU again. If not, I'll probably keep buying a single very expensive GPU. I just don't want to deal with AFR.
Yeah, Vulkan mGPU isn't anything like CrossFire. Everything is handled in the game engine; the driver just helps translate it into something the GPU understands. You'd have to ask the devs how they did mGPU to know what's going on in each and every game that uses it.
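
"Handled in the game engine" concretely means the engine creates one logical device over the linked GPUs and then tags its own command buffers with a device mask saying which card runs what. A rough sketch of that per-frame routing (core Vulkan 1.1 calls; the top/bottom split here is just for illustration and says nothing about how Rebellion actually divides the work):

```c
/* Sketch of the per-frame side of Vulkan explicit mGPU (core in 1.1).
   "cmd" is a command buffer on a logical device created from a 2-GPU group;
   the ENGINE, not the driver, decides which GPU executes each command.
   The top/bottom split below is purely illustrative. */
#include <vulkan/vulkan.h>

void record_split_frame(VkCommandBuffer cmd, VkRenderPass pass,
                        VkFramebuffer fb, VkExtent2D extent)
{
    VkCommandBufferBeginInfo begin = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
    };
    vkBeginCommandBuffer(cmd, &begin);

    VkRenderPassBeginInfo rp = {
        .sType       = VK_STRUCTURE_TYPE_RENDER_PASS_BEGIN_INFO,
        .renderPass  = pass,
        .framebuffer = fb,
        .renderArea  = { {0, 0}, extent },
    };
    vkCmdBeginRenderPass(cmd, &rp, VK_SUBPASS_CONTENTS_INLINE);

    vkCmdSetDeviceMask(cmd, 0x1);   /* commands below run on GPU 0 only */
    /* ... draw calls for GPU 0's share of the frame ... */

    vkCmdSetDeviceMask(cmd, 0x2);   /* commands below run on GPU 1 only */
    /* ... draw calls for GPU 1's share of the frame ... */

    vkCmdSetDeviceMask(cmd, 0x3);   /* back to broadcasting to both GPUs */
    vkCmdEndRenderPass(cmd);
    vkEndCommandBuffer(cmd);
}
```

Whether that ends up as AFR, SFR, or something stranger is entirely the engine's call, which is exactly why you'd have to ask the devs.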
 
Yeah, Vulkan mGPU isn't anything like CrossFire. Everything is handled in the game engine; the driver just helps translate it into something the GPU understands. You'd have to ask the devs how they did mGPU to know what's going on in each and every game that uses it.

Which means probably AFR, because it is easier :/
 
CrossFire has always had fantastic scaling when it's implemented, around 90 to 95% for quite a while now. The problem is that the xfire profiles, if they ever get released, usually show up weeks after a game launches. If AMD made this a thing and pushed xfire out to most every title on or near release day, I would be back on the red team so fast...

What I don't get is why high end monitors don't utilize dual inputs to handle AFR that way. Then implementation would be easy to make universal.
 
I've been saying all through Vega that if AMD could somehow increase mGPU usage, many would get into it for 4K/60Hz gaming. Don't know about the 56, but surely two 64s could rock it. This game was already on my radar, and good for them.
 
I never understood why mGPU is so hard...

DirectX 12 explicit mGPU is a matter of developers not wanting to foot the bill. But engines have been consolidating for a few years now and everyone uses the same few, so why don't those support it natively?

And I'm pretty sure Nvidia/AMD don't like selling two mid-range cards instead of one high-end card, but limiting the functionality to select models is easy enough.

So why is mGPU mostly dead?

Because scenes can vary wildly between games, and a generic one-size-fits-all mGPU profile doesn't work that well, even when implemented in the engine.

SFR wasn't that good either.

I would take higher minimums and lower averages over lower minimums and higher averages.
 
Pushing that much data over the PCI-E bus is suicide.

I used to believe that for a while, before 4K. I was truly concerned about it. It's part of the reason I picked the mobo/CPU combo I did: the 4930K has 40 PCIe lanes and the board has two x16 (3.0) slots and an x8 (3.0), so I was able to use all the lanes when I was running 2x 970s in SLI plus a 780 for PhysX. Rocked like a champ. When I switched to the two 1080s in SLI I have now and started regularly gaming in Cinema 4K (4096x2160 @ 60Hz), I used Afterburner to monitor the bus. It rarely hit above 25%. At the time this was with much of the same games [H]ard has been using for their recent performance analysis. Gaming just doesn't seem capable of saturating it. I am curious about PCIe drives and especially RAIDs; I figure they've somehow gotta be able to hit a ceiling.
 
the only motherboards that don't support sli or crossfire are budget boards these days.

also expensive is debatable.

if you need 60fps at 4k then you need 60fps at 4k. price is irrelevant.

there are no cards right now that can do that, the 2080 Ti excluded because, well, go buy one (you can't).

I hate this generalization. There are so many modern games that are easily maxed out at 4K on current GPUs.

I've been gaming at 4K since 2014!
 
This is no surprise. Rebellion's in-house engine, used for the Sniper Elite series and now Strange Brigade, is one of the few that seems to be programmed correctly and actually runs better with async compute, among other features, on both AMD and Nvidia hardware.
 
I hate this generalization. There are so many modern games that are easily maxed out at 4K on current GPUs.

I've been gaming at 4K since 2014!

I've been running 4K since the summer of 2015. At that point I was running dual 980 Tis in SLI and hating it.

Switched to a Pascal Titan X and it is just BARELY enough for most games. Big titles like Fallout 4 and Deus Ex: Mankind Divided struggle, constantly falling to the upper 40s.

Some games work, but I'd argue that my Pascal Titan overclocked on water is nowhere near enough. I'd need about 50% more power to be happy with all of today's titles.

Probably more to be happy with titles in the near future.
 
I've been running 4K since the summer of 2015. At that point I was running dual 980 Tis in SLI and hating it.

Switched to a Pascal Titan X and it is just BARELY enough for most games. Big titles like Fallout 4 and Deus Ex: Mankind Divided struggle, constantly falling to the upper 40s.

Some games work, but I'd argue that my Pascal Titan overclocked on water is nowhere near enough. I'd need about 50% more power to be happy with all of today's titles.

Probably more to be happy with titles in the near future.

I am CPU-limited in Fallout 4. Do you run any mods? I run 10x more AI and I get about 80 fps when GPU-limited, which is overkill at 60Hz.
 
I am CPU-limited in Fallout 4. Do you run any mods? I run 10x more AI and I get about 80 fps when GPU-limited, which is overkill at 60Hz.

Are you running all settings at low or something? My GPU is maxed out going between a high of ~60fps and a low of ~48fps.

And that's when running a custom ultrawide resolution of 3840x1646, letterboxed, in order to increase framerates.

I'm not willing to turn down the settings.
 