The future of multi-GPU gaming?

The thing with EMA is that you will supposedly be able to use the CPU's onboard graphics alongside the dedicated GPU. So there are a lot of people who would get more performance out of this. How much performance, and how well it all works, remains to be seen.
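One way an EMA-style engine could use an iGPU alongside a dGPU is to split each frame's work proportionally to the two GPUs' throughput. This is a minimal sketch of that idea only; the function name and the performance ratio are illustrative assumptions, not a real D3D12 API.

```python
# Hypothetical sketch of splitting a frame's scanlines between a discrete
# and an integrated GPU, proportional to assumed relative throughput.
# split_scanlines and the 4:1 perf ratio are made-up for illustration.

def split_scanlines(height, dgpu_perf, igpu_perf):
    """Divide a frame's scanlines proportionally to each GPU's throughput."""
    total = dgpu_perf + igpu_perf
    dgpu_rows = round(height * dgpu_perf / total)
    return dgpu_rows, height - dgpu_rows

# e.g. a dGPU assumed to be roughly 4x faster than the iGPU, 1080p frame:
dgpu_rows, igpu_rows = split_scanlines(1080, dgpu_perf=4.0, igpu_perf=1.0)
print(dgpu_rows, igpu_rows)  # 864 216
```

The catch, as the posts below note, is that someone still has to measure those ratios and handle the composition step per game, which is exactly the work nobody wants to pay for.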

That's great, if you like input latency plus whatever other issues may come with it, all for a small to tiny increase. But again, as far as I know only 3DMark even supports DX12 on Intel at this point, because nobody else bothered with a path. And the APUs already throttle more than enough.

But again, someone has to pay to implement it.
 

Like I said, it remains to be seen if and how well this will work. I don't have expectations, but I do have hopes that EMA will be better than the current Crossfire and SLI solutions. Time will tell.
 
Who said SLI is on its way out lol.
Nvidia officially doesn't recommend 3 or 4 way SLI anymore. The scaling issues are just too bad.

There needs to be an easier way for developers to optimize games for SLI/Crossfire, otherwise it's not worth the time investment, especially since relatively few people actually have SLI/Crossfire setups because of the prohibitive cost.
 
I was talking about vanilla SLI. Two cards is still a thing. They even released new bridges for that stuff.
 
I think the future of multi-GPU setups is DX12 EMA! But now the weight of the support will be on the developer to program for specific architectures. See the AotS (Ashes of the Singularity) testing with different GPUs.
 
I don't think mGPU has a future, to be honest, especially if the bulk of the development effort gets transferred from the IHVs to the developers. I mean no offense, but just look at the state of console ports, sorry, PC games these days; it's a fucking farce.
 
two words. "NOT GOOD"! lol

I think Nvidia and AMD have to find a way to get multiple GPU modules working on the same die as one. I think AMD might be going that route with Navi when they talk about scalability. The only way this is going to work in the future is if they can pull this off, so the software never sees two GPUs. Otherwise multi-GPU is as good as dead, because I doubt developers will take the time to support it on their end, and games that do will be few and far between.
 

Dude, multi-GPU was pretty standard back in the day. Hell, developing games on multiple APIs was standard back in the day too. You could choose between Glide, DX, OGL, etc. But today, devs have gotten stupid lazy even when the tools and access have been handed to them, polished and all. We should expect them to deliver more, or at least to the same level they did before they got lazy in the pre-APU console days. Post-APU consoles, porting has gotten rather easy for them, so I don't get their issues.
 

Yeah, but to be honest, the sad truth is consoles lead PC gaming and everything is prioritized accordingly. So if consoles had multi-GPU in them, I bet every game would support it lol. If AMD can push Microsoft and Sony to put two efficient dies together in a few years to drive 4K gaming on consoles, what a treat that would be, no?

Multi-GPU consoles are the only other way multi-GPU support ends up present in all games. So it's all up in the air right now, but that is the only way I see developers making the effort. They usually tend to squeeze every bit of performance out of consoles.
 
They just might need time to learn the new process. It takes years for devs to get to the point where they can maximize the hardware, like they eventually did with the pre-APU consoles.
 
MultiGPU in the past was much easier than today. MultiGPU has been broken ever since the scissor method stopped working.

The fundamental issue today is economics; the technical issues are secondary.

You have to convince the CFO of every single developer/publisher that it's good business to spend the time and resources on MultiGPU so that a tiny fraction of gamers can use it, despite those gamers not wanting to pay more for it. See the issue? Ye, I thought so. MultiGPU is dead as long as this exists.

AMD/Nvidia had a financial interest in MultiGPU, and that was fine when it was mainly driver-oriented and they had to pay the bill. The only thing they can do now is sponsor the MultiGPU development for every single game. We haven't had a working DX12 MultiGPU case yet without an IHV sponsorship, have we?
 

They still do. The whole point of MultiGPU is to sell more cards. The problem is cyclical: by not actively promoting/supporting it in new games there is no reason to buy more than one card (and vice versa). Once again, it looks like another AAA title, No Man's Sky, is being released without SLI support. We'll find out in an hour, but I'm not counting on it. And if games don't have SLI support at release, there's a good chance they never will.
 
Based on my experiences right now, I can assure you, mGPU is not going to be commonly supported for a long time.

SLI and Crossfire "works" because the developer mostly does not have to do anything to make it work. It's the IHVs who make it work. There are very very few developers out there who actively design their games to work perfectly with SLI/Crossfire.

Now, once you accept that, think of the current progress the GPU industry is making. Almost every two years, GPUs double in speed. This isn't like CPUs, where we have more or less hit a brick wall in speed increases, which is why software is increasingly multithreaded now. GPUs are still increasing in speed, and so long as that happens, developers are not going to bother optimizing their code for mGPU.
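The "SLI/Crossfire works because the IHVs make it work" point above usually comes down to alternate-frame rendering (AFR), where the driver deals whole frames out round-robin. A minimal sketch of that assignment, with made-up names (real drivers do this below the API):

```python
# Minimal sketch of alternate-frame rendering (AFR), the scheme SLI/Crossfire
# drivers typically use: whole frames are dealt round-robin to the GPUs.
# afr_assign is an illustrative helper, not a driver API.

def afr_assign(frame_index, gpu_count):
    """Return which GPU renders a given frame under AFR."""
    return frame_index % gpu_count

# Frames 0..5 on a 2-GPU setup alternate between GPU 0 and GPU 1:
schedule = [afr_assign(i, 2) for i in range(6)]
print(schedule)  # [0, 1, 0, 1, 0, 1]
```

AFR only scales cleanly when frames are independent; the moment frame N+1 reads results from frame N (temporal effects, reprojection), the GPUs must synchronize, which is why per-game driver profiles exist at all.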

Besides, how many people have multiple GPUs to actually take advantage of mGPU? If the market is tiny, where is the financial gain for spending development time optimizing for mGPU? As I mentioned before, developers are under extreme pressure to deliver games on insanely short timelines and small budgets. All developers worth their salt are concerned with rendering it right and moving on. Very few studios have the resources to allow their developers free time for experimentation like mGPU.

Not only that, do you think DX12 experienced developers grow on trees? My studio has about 40 developers and programmers. Out of these 40, I would reckon only about 3-4 are trained sufficiently to work with DX12 APIs. Developers with DX12 experience are in crazy demand everywhere. It's the diamond dust among the gold dust.

The only time mGPU will be relevant is when GPUs start to hit a brick wall in getting faster.
 
Hmm... so multi-GPU owners are probably the ones with a little more disposable income. I wonder if developers could make a $20 SLI/Crossfire DLC package or something along those lines. I would pay $20 if I could get double performance in a favorite game.
 
Well, Star Citizen :whistle: and BF1, also in DX12 mode.

Otherwise, zero. Multi-GPU is now talked about mostly for VR, where each GPU renders one eye's screen at 100% scale.

PC support has been dampened since 2007.
 
Games like The Witcher 3 struggle to get a consistent 50FPS @ 4k maxed out on a high end setup with 2x 980Ti SLI. I have no idea about the 1080.

Future games that push the graphical envelope should have similar problems on a SLI 1080 setup.

980 Ti SLI was much, much faster in Witcher 3 at launch than it is now. Witcher 3 is kind of a rare occurrence where SLI got broken along the way during patches--specifically after the 1.06 patch.

For some reason since 1.06 980 Ti SLI is unable to scale AFR properly whenever there are water effects on screen if Water Detail is set to High or Ultra.

Most of the SLI problems in Witcher 3 can be fixed by setting Water Detail to Medium.

Running AA is for some reason a huge performance hit in Witcher 3 in SLI. It's about a 15-25% performance hit on SLI, but only about an 8-10% hit with a single card.

Anyway SLI is just broken in general in Witcher 3. A single Titan X (P) is faster than 980 Ti SLI, and a single 980 Ti is faster than 980 SLI. The game just doesn't play well with multi-GPU; scaling isn't great on Crossfire either and has the same water issues.
 
I'm surprised SLI/Crossfire has lasted as long as it has... it's a niche market with more problems than it's worth for developers...
 

SLI/Crossfire drives 4K gaming, which is supposedly one of the advantages of owning a PC.
 
MultiGPU in the future might end up being just a VR thing. That's where you need all the horsepower you can get if you want to push visuals and rendering for each eye is probably easier to handle than two GPUs rendering a single screen.
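The reason the "one GPU per eye" VR split is considered easy is that the two eye views are independent renders of the same scene state, so neither GPU needs the other's output mid-frame. A tiny sketch of the idea; the IPD constant and helper name are illustrative assumptions, not a VR runtime API:

```python
# Sketch of the "one GPU per eye" VR split: each eye is the same scene
# rendered from a camera shifted by half the interpupillary distance (IPD).
# EYE_SEPARATION and eye_offsets are made-up for illustration.

EYE_SEPARATION = 0.064  # an often-cited average human IPD in metres

def eye_offsets(separation=EYE_SEPARATION):
    """Camera x-offsets for the left and right eye, one per GPU."""
    half = separation / 2.0
    return {"gpu0_left_eye": -half, "gpu1_right_eye": +half}

print(eye_offsets())  # {'gpu0_left_eye': -0.032, 'gpu1_right_eye': 0.032}
```

Because the two renders share no intermediate data, this split avoids the synchronization problems that plague AFR, which is why VR is the one place people still expect mGPU to pay off.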
 
Even for VR it looks dead. Just look at Pascal's VR features. Not much gain with two cards vs one.
 
Actually, for the only VR SLI title (VR Funhouse), the thing only enables '1 GPU per eye' if you have 3 GPUs - 2 for rendering and 1 for PhysX. Anything other than that and it disables VR SLI.
 
I have never found SLI to be a good option, unless you're talking about the Voodoo 2: 8MB of VRAM for 1024x768!!
 
Multi-GPU has too many issues.

Incompatibility with many display (3+) setups
Microstuttering
An extra layer of driver support that is at the bottom of the driver writers priority list
Double/Triple/Quadruple investment into a piece that can be replaced later for the largest performance increase

A single GPU is all you need, and all you really want. Running an aging 290X Lightning, I get 144+ FPS in the games I care about with a bit of tweaking, and 60+ in everything at 1080p. I'd imagine a 1080 can do similar things at 4K. There's no reason for more GPUs that will empty your wallet (so you have less to spend on future GPUs) and cause problems.
 
Nvidia is pulling back on SLI because of Explicit Multi-Adapter. It's just a better way of implementing support for multiple GPUs. Now whether or not EMA support becomes common is really based on effort to implement it vs demand.
 

That's why I ditched SLI. I'd try to hop into BF4 for a quick match: stuttering. Fixed it with a cfg file. A few weeks later, trying to fit in a quick match, massive dips from smoke effects.

If you can't enjoy your game, where you otherwise would have, it's pointless.
 
Scaling beyond the first TWO GPUs in a multi-GPU setup has always been bad because of diminishing returns. I've run both ATI Crossfire and Nvidia SLI.
The only people who really bother with tri-SLI or tri-fire and above now tend to be the overclockers trying to set speed records.
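The diminishing returns above can be roughed out by treating every extra GPU as adding a fixed coordination cost per frame. This is a back-of-the-envelope model only; the 10% overhead figure is an assumption for illustration, not a measured number.

```python
# Rough model of why scaling past two GPUs falls off: ideal linear speedup
# divided by an accumulating per-GPU synchronization/driver overhead.
# The 0.10 (10%) overhead per extra GPU is an illustrative assumption.

def mgpu_speedup(gpus, overhead_per_gpu=0.10):
    """Ideal linear speedup reduced by a fixed per-extra-GPU coordination cost."""
    return gpus / (1.0 + overhead_per_gpu * (gpus - 1))

for n in (1, 2, 3, 4):
    print(n, round(mgpu_speedup(n), 2))
# 1 1.0 / 2 1.82 / 3 2.5 / 4 3.08, i.e. each extra card buys less
```

Under this toy model the second card adds ~0.82x of a GPU, the third ~0.68x, the fourth ~0.58x, which matches the thread's experience that two cards can be worth it while three or four rarely are.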

For practical use, it's not really worth it. The future of multi-GPU is really the improvement of the single big GPU, so that at most you run two powerful cards with relatively decent efficiency.
That's just my opinion.

In any case, you'll be pleased to know that if anything, Nvidia has improved the SLI bridge for the GTX 1080. And you can run anywhere from 1 to 4 GPUs.

They claim to have "doubled" the bandwidth on their new generation SLI bridge.

Now that SLI seems to be on its way out, I wonder if the shift will be towards AMD for 4k gaming? You still need more than 2 cards for bleeding edge games @ 4k. Eventually that will change, but right now (and maybe for 1 or 2 more generations) this seems to be the way it is.

My 3 way 980Ti/4k setup has been a disappointment as SLI support from Nvidia seems to be nonexistent (especially 3-way).

What does the future of 4k (and above) multi-GPU gaming look like what with the shift to DX12 gaming on the horizon?
 
Man, how did 3dfx's SLI get this messed up?

I remember the scaling back in the Voodoo 5000 days (2000-ish); it worked great. Now we run into these issues nearly 16 years after the fact.

Guess there was no R&D on SLI advancement.
 

Back then you could use the scissor method without a performance penalty. But as graphics complexity rose, it was no longer possible.
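The scissor (split-frame) method mentioned above just cut the screen into bands, one per GPU. A minimal sketch of that division, with an illustrative helper name; the reason it "stopped working" is that modern screen-space effects (post-processing, screen-space reflections) read the whole framebuffer, so each GPU suddenly needs the other's band.

```python
# Sketch of the old "scissor" / split-frame method: the screen is divided
# into horizontal bands, one per GPU. scissor_rects is a made-up helper;
# rects are (left, top, right, bottom).

def scissor_rects(width, height, gpu_count):
    """Divide the screen into equal-height horizontal bands, one per GPU."""
    band = height // gpu_count
    rects = []
    for i in range(gpu_count):
        top = i * band
        bottom = height if i == gpu_count - 1 else top + band
        rects.append((0, top, width, bottom))
    return rects

print(scissor_rects(1920, 1080, 2))  # [(0, 0, 1920, 540), (0, 540, 1920, 1080)]
```

In the fixed-function era each band really was independent, which is why scaling was nearly free; once shaders started sampling arbitrary screen positions, the bands stopped being independent and the method died.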
 

It was a lot easier back then, when they had direct access to the hardware; hell, games supported 3-4 APIs and it was not weird to expect that. Things have grown too complicated today, and DirectX has gotten in the way and convoluted things, creating an even bigger gap between the game devs and the hardware.
 

Back then you were often vendor-locked. And support was a nightmare, with APIs being one pile of crap after another. They didn't have direct access to hardware either; those were all high-level APIs.
 

3dfx's SLI (Scan-Line Interleave) was NOT the same as Nvidia's SLI (Scalable Link Interface).
Comparing the two is not very bright.
 