Will we ever need more than 1 top end card from now on? Will SLI ever be seamless hardware wise?

Dutt1113

With SLI / NVLink so unsupported by developers, and the way scaling goes, will we ever need more than one top-end card to run the latest games at any given time in 4k at >60fps? Look at the 2080 Ti and games now. Do we really need to worry about having more than one, water cooling them both, and having a PSU big enough? I used to have 1080 Ti SLI, and in a lot of the games I played SLI wasn't supported at all, so the 2nd card was basically a paperweight.

Will multi-GPU technology ever combine the cards on the hardware side, so developers don't have to blow it off like they do 75%+ of the time?
 
Well, I think it partially has to do with the fact that high-end cards do so much these days and are also insanely expensive now compared to the past. How many people even have $2,000+ for graphics cards in SLI? In the past, the top-tier card of a generation was like $300-400. Then mining happened, and new-gen cards got released. Now the high end is $1,000+.
 
There are already cases where a single top-end GPU is not enough. The only card that can make a passable attempt at 4K gaming at max settings in newer games is the RTX 2080 Ti. Even then, you won't always get the frame rates you're looking for. With RTX enabled, in some instances the hit is simply too much for even that card to handle.
 
There are already cases where a single top-end GPU is not enough. The only card that can make a passable attempt at 4K gaming at max settings in newer games is the RTX 2080 Ti. Even then, you won't always get the frame rates you're looking for. With RTX enabled, in some instances the hit is simply too much for even that card to handle.

I think the ray tracing tech was brought out way too early. The tech is not nearly optimized enough for mainstream use. I think they should have waited several more years to optimize it more. With RT turned off, does a single overclocked 2080 Ti get AAA games to 4k @ 100 fps?
 
Hopefully PCIe 4.0 and beyond will open up doors for Crossfire and AMD cards. Also, dual GPU support has just been abandoned by game developers.
 
I think the ray tracing tech was brought out way too early. The tech is not nearly optimized enough for mainstream use. I think they should have waited several more years to optimize it more. With RT turned off, does a single overclocked 2080 Ti get AAA games to 4k @ 100 fps?


When did 4k at 100 fps suddenly become a necessary feature in current gaming setups? The number of monitors/TVs that can hit that is pretty small.

4k 120 Ultra is the FUTURE, not today. If you're willing to compromise down to high, you can have that 100 fps today from a single RTX 2080 Ti in multiplayer games (and 60-75 hz in more demanding offline games).

And it seems DLSS 2.0 means you don't have to render games at native resolution anymore, so RTX at 1440p upscaled to 4k is fairly viable.

Ampere is going to make things even faster, but for new RTX games, performance is viable on a 2080 Ti on upscaled 4k.
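If anyone wants the raw numbers behind that, here's a quick back-of-the-envelope in plain Python. It only counts pixels and ignores whatever DLSS itself costs per frame, so treat it as illustrative:

Code:
# Rough pixel-count comparison: native 4k vs. rendering at 1440p and upscaling.
# Ignores DLSS's own (small) per-frame cost; purely illustrative.
native_4k = 3840 * 2160       # 8,294,400 pixels
internal_1440p = 2560 * 1440  # 3,686,400 pixels

ratio = internal_1440p / native_4k
print(f"1440p renders {ratio:.0%} of the pixels of native 4k")  # roughly 44%

Shading well under half the pixels per frame is where most of the headroom for RTX-at-4k comes from.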
 
I think top-end GPUs are progressing faster than game graphics, so single high-end cards can top out games more easily than back when SLI was more needed. Look at smartphones: features and speed are topping out, and they're finding new, ever-smaller features to differentiate, so they stand out less and less every year.
 
Will multi-GPU technology ever combine the cards on the hardware side, so developers don't have to blow it off like they do 75%+ of the time?
Probably with a 'chiplet' solution, similar to how AMD is building most Ryzen 3000 CPUs. Current multi-GPU solutions tie fully discrete GPUs together externally, which presents issues that have been hard to overcome, as we've seen. Frame timing is one issue that bit AMD (and many of us) in the rear, but even when that's not a problem, alternate-frame rendering, the most 'compatible' way of using more than one GPU, necessarily introduces lag. Split-frame rendering is harder to do and usually results in worse scaling, but perhaps a better user experience. And we still haven't seen an implementation of VR rendering where one (or more) GPUs are used for each eye, despite that being the near-perfect application of multi-GPU technology: VR is essentially two slightly different perspectives of the same scene, so the GPUs could render concurrently and stay synchronized.

To make any or all of that transparent to applications, it will almost certainly have to be done at the driver level or later. That means that the hardware needs to scale.
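To make the AFR-vs-SFR tradeoff concrete, here's a toy sketch in Python. Nothing here touches a real GPU or driver; the function names and the two-GPU split are just mine for illustration:

Code:
# Toy illustration of alternate-frame vs. split-frame rendering.
# Nothing here talks to real hardware; it only shows how work gets handed out.

def alternate_frame_rendering(frames, gpu_count=2):
    """Each GPU renders a whole frame, round-robin.
    Throughput scales well, but a frame is only shown once 'its' GPU
    finishes, so there's extra queued latency."""
    return [(frame, f"GPU{frame % gpu_count}") for frame in frames]

def split_frame_rendering(frame, gpu_count=2, height=2160):
    """Each GPU renders a horizontal slice of the SAME frame.
    No added latency, but the slices rarely cost the same amount,
    so scaling is usually worse."""
    slice_h = height // gpu_count
    return [(frame, f"GPU{i}", (i * slice_h, (i + 1) * slice_h))
            for i in range(gpu_count)]

print(alternate_frame_rendering(range(4)))
# [(0, 'GPU0'), (1, 'GPU1'), (2, 'GPU0'), (3, 'GPU1')]
print(split_frame_rendering(0))
# [(0, 'GPU0', (0, 1080)), (0, 'GPU1', (1080, 2160))]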

I think the ray tracing tech was brought out way too early. The tech is not nearly optimized enough for mainstream use. I think they should have waited several more years to optimize it more. With RT turned off, does a single overclocked 2080 Ti get AAA games to 4k @ 100 fps?

Ray tracing isn't DX8 --> DX9, DX9 --> DX10, or DX11 --> DX12 -- these are natural progressions of rasterization that represent the increasing performance capabilities afforded by improvements in hardware.

Ray tracing is a complete shift, or will be at least, when the hardware exists to do it fully. Right now it's a 'hybrid' solution.

Making that work is why Nvidia released it, uh, 'early'. Working hardware is necessary for something like ray tracing.
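For anyone unclear on what 'hybrid' means in practice, here's a rough outline of a frame the way current RTX titles structure it, written as Python stubs. The function names are placeholders I made up, not any real engine or API; the point is only the ordering:

Code:
# Sketch of a hybrid raster + ray tracing frame. All functions are stubs;
# the idea is: rasterize primary visibility, spend the limited ray budget
# on the effects rasterization handles poorly, then denoise and composite.

def rasterize_gbuffer(scene): return {"scene": scene}
def trace_shadow_rays(scene, gb): return "noisy shadows"
def trace_reflection_rays(scene, gb): return "noisy reflections"
def denoise(result): return f"denoised {result}"
def composite(gb, *layers): return (gb, layers)

def render_hybrid_frame(scene):
    gbuffer = rasterize_gbuffer(scene)           # cheap, on ordinary shaders
    shadows = denoise(trace_shadow_rays(scene, gbuffer))      # few rays/pixel
    reflect = denoise(trace_reflection_rays(scene, gbuffer))  # few rays/pixel
    return composite(gbuffer, shadows, reflect)  # mix with rasterized lighting

print(render_hybrid_frame("demo scene"))

Full path tracing would replace the rasterized stage entirely, which is the 'complete shift' that today's hardware can't yet sustain.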

Hopefully PCIe 4.0 and beyond will open up doors for Crossfire and AMD cards.

AMD's record has been spotty -- see the threads on just getting single-GPU configurations of their latest architectures up to par. If they wanted to do it, I believe they could, but generally speaking it's unlikely they'd choose to try to lead here. Most people would simply buy a faster GPU, since those exist from other manufacturers.

Also, dual GPU support has just been abandoned by game developers.

Abandoned by the developers of games, yes. Not at all by the engine developers, who have upgraded to support at least DX12 and probably Vulkan too. Right now, developers are likely more focused on the common denominators -- say 1080p120 and the upcoming consoles -- but there's no reason they won't swing back to pushing 4k120 and VR and the like.
 
The best scaling I've ever seen was Shadow of the Tomb Raider with two 1080 Tis: it was DX12 mGPU and it worked great, over 90% scaling at 1440p, and rendered at a higher resolution close to 4K it was like 95%+. AMD? Nope, would not work with two Vega FE. As for SLI, the games that did it in the past still do it now, but newer titles seem more and more scarce. The two Vega FEs did scale to like 60% in Far Cry 5, the last game I played using CFX; later it stopped working.
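For reference, here's what those scaling figures work out to, assuming a made-up 60 fps single-card baseline (plain Python, purely illustrative):

Code:
# What "X% scaling" means for the second GPU, using the figures above.
def two_gpu_fps(single_gpu_fps, scaling):
    """scaling = 1.0 would be a perfect 2x; 0.9 means the second card adds 90%."""
    return single_gpu_fps * (1 + scaling)

base = 60  # hypothetical single-card frame rate
for scaling in (0.95, 0.90, 0.60):
    print(f"{scaling:.0%} scaling: {two_gpu_fps(base, scaling):.0f} fps")
# 95% -> 117 fps, 90% -> 114 fps, 60% (the Far Cry 5 CFX case) -> 96 fps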
 
Yeah, that's the thing. SotTR implemented DX12 mGPU and it works great. You can also do SFR with DX12 (I believe AotS did this) and that also works and solves frame pacing issues.

The problem isn't that the technology isn't there; it's 100% there and it works. The problem is that the ROI isn't there.

Developers would rather (and are probably better off) optimize their games for the vast majority of users, who are probably on something like a GTX 1060-class card, than cater to what is probably a fraction of a percent of people with multi-GPU systems.
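Just to put the ROI point in crude numbers, here's a toy comparison. Every figure below is invented purely for illustration, not real survey data:

Code:
# Toy ROI comparison: who benefits from the same engineering budget.
# Every number here is made up; plug in your own estimates.
players          = 1_000_000
mgpu_share       = 0.005   # hypothetical: half a percent of players run two GPUs
mainstream_share = 0.60    # hypothetical: majority on midrange (1060-class) cards

print(f"players helped by adding DX12 mGPU:        {players * mgpu_share:9,.0f}")
print(f"players helped by mainstream optimization: {players * mainstream_share:9,.0f}")
# Roughly two orders of magnitude apart for the same engineering spend.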
 
When did 4k at 100 fps suddenly become a necessary feature in current gaming setups? The number of monitors/TVs that can hit that is pretty small.

4k 120 Ultra is the FUTURE, not today. If you're willing to compromise down to high, you can have that 100 fps today from a single RTX 2080 Ti in multiplayer games (and 60-75 hz in more demanding offline games).

And it seems DLSS 2.0 means you don't have to render games at native resolution anymore, so RTX at 1440p upscaled to 4k is fairly viable.

Ampere is going to make things even faster, but for new RTX games, performance is viable on a 2080 Ti on upscaled 4k.
If I can't do 4k120, how am I ever gonna do 8k240? 😜
 
Well, there are several 4K120 displays you can buy today, for around $1,000, not that expensive all things considered.

People buying triple-screen setups or nice big TVs could easily spend the same amount. I did, as did many others here.

Now actually hitting 120fps in 4K? That is less likely, even with a 2080 Ti (unless we're talking old games). So SLI/Crossfire would be good here if it actually worked.
 
Well, there are several 4K120 displays you can buy today, for around $1,000, not that expensive all things considered.

People buying triple-screen setups or nice big TVs could easily spend the same amount. I did, as did many others here.

Now actually hitting 120fps in 4K? That is less likely, even with a 2080 Ti (unless we're talking old games). So SLI/Crossfire would be good here if it actually worked.


So are you telling us that, after both Samsung and LG offer a 1440p 120hz hardware upscaler on their TVs plus FreeSync, we somehow have a need to go 4k 120hz anytime soon?
Use a TV. Run full-screen games at 1440p, and desktop at 4k 120.

At this point, 4k for gaming is mostly dick-waving (for monitor sizes 35" and smaller). If you decide you want to paint yourself into a corner like that one, then that's on you.

And DLSS 2.0 also allows for indistinguishable upscaling. Who said we needed more than 1440p native res for 4k output?
 
defaultluser, maybe you're confusing me with someone else. I wasn't the OP; I was just saying that 4K120 is available today, no need to wait for the future.

Personally, I ran 4K for about 2 years and it looked nice, but it was too much of a struggle to hit even 60fps, and I like high refresh, so I game at 1080p now.

I would rather run 1080p native than upscale. That just looks bad no matter how you slice it (I know image sharpening helps, I've tried it, but nothing beats native).
 
The main reason I would want a high-frequency future 4K+ monitor is the increased G-Sync/FreeSync range, which I've found to be invaluable for smooth gameplay. It would need DP 2.0 preferably, at least HDMI 2.1, and it would be best if the monitor has both.
 
The main reason I would want a high-frequency future 4K+ monitor is the increased G-Sync/FreeSync range, which I've found to be invaluable for smooth gameplay. It would need DP 2.0 preferably, at least HDMI 2.1, and it would be best if the monitor has both.

That is the most reasonable thing anyone has said about high-refresh 4K. We may not be able to drive it consistently yet, but in the meantime we can at least be smooth in games that don't need low latency.
 
The main reason I would want a high-frequency future 4K+ monitor is the increased G-Sync/FreeSync range, which I've found to be invaluable for smooth gameplay. It would need DP 2.0 preferably, at least HDMI 2.1, and it would be best if the monitor has both.
Correct. With Adaptive Sync, even around 90 fps feels butter smooth. I mean, higher is better, but I feel like around 90 - 100 fps is the sweet spot.

I have FreeSync on my 4K TV, but it only works at 48 - 60 Hz, which makes it almost useless. With a higher range, you could see a big benefit, even if you are not pushing the full 120 fps.
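A quick way to see why that 48 - 60 Hz window is so limiting (simple Python; it ignores tricks like low-framerate compensation that some displays use):

Code:
# Why a narrow VRR window (like 48-60 Hz) is nearly useless: the moment fps
# dips below the floor or climbs above the ceiling, you're outside the range.
def in_vrr_range(fps, low, high):
    return low <= fps <= high

for fps in (40, 48, 55, 75, 90, 110):
    narrow = in_vrr_range(fps, 48, 60)    # typical 4k60 TV with FreeSync
    wide   = in_vrr_range(fps, 40, 120)   # hypothetical 4k120 panel, wide range
    print(f"{fps:3d} fps  48-60 Hz range: {narrow}   40-120 Hz range: {wide}")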
 
The last SLI I ran was 2x 9800 GT, and alternate frame rendering seemed to work for everything back then. Shouldn't the game not care? Please enlighten, taco.

Also, with these heavy-duty prices, I'll be upgrading less often (personal choice). I can understand the next person might want ultra settings and decent frames, so he or she might be inclined to upgrade more often anyways.

On a side note, I discovered a new trick on my phone keyboard! Dragging up and down lets the cursor scroll up or down! w00t!

The shift to DX12 and Vulkan means the game developers and game engines are in control of the GPUs now. Even back in the DX11 and older days, though, Nvidia and AMD had to work with game developers to optimize SLI/Crossfire profiles in their drivers to ensure maximum scaling.
 
So are you telling us that, after both Samsung and LG offer a 1440p 120hz hardware upscaler on their TVs plus FreeSync, we somehow have a need to go 4k 120hz anytime soon?
Use a TV. Run full-screen games at 1440p, and desktop at 4k 120.

Just FYI, LG's current OLED TVs (B9 and up) actually support 4k 120 Hz (with VRR/G-Sync/FreeSync as well). Same deal with Samsung's Q90R, just with no G-Sync, only FreeSync. There are just no graphics cards out now with HDMI 2.1 on them to test them with. Dunno why they don't just put DisplayPort on high-end TVs at this point, too.
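Rough math on why 4k 120 Hz needs HDMI 2.1 (or DisplayPort) in the first place. This only counts raw pixel data and ignores blanking and link-encoding overhead, so the real requirement is a bit higher; the link figures are the advertised raw rates:

Code:
# Ballpark signal-rate check for 4k 120 Hz, 8-bit RGB (no chroma subsampling).
# Ignores blanking and encoding overhead, so real requirements are higher still.
bits_per_pixel = 3 * 8                       # 8 bits per R/G/B channel
pixel_rate_gbps = 3840 * 2160 * 120 * bits_per_pixel / 1e9
print(f"raw pixel data: ~{pixel_rate_gbps:.1f} Gbps")   # ~23.9 Gbps

links = {"HDMI 2.0 (18 Gbps raw)": 18.0,
         "DP 1.4 (32.4 Gbps raw)": 32.4,
         "HDMI 2.1 (48 Gbps raw)": 48.0}
for name, raw in links.items():
    verdict = "clears it" if raw > pixel_rate_gbps else "can't carry it"
    print(f"{name}: {verdict} (before overhead)")

Which is why HDMI 2.0-era TVs top out at 4k 60 Hz.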
 
I think the ray tracing tech was brought out way too early. The tech is not nearly optimized enough for mainstream use. I think they should have waited several more years to optimize it more. With RT turned off, does a single overclocked 2080 Ti get AAA games to 4k @ 100 fps?

I think the RTX 2080 Ti can do 100+ fps @ 4k in Wolfenstein: Youngblood.
 
I think the ray tracing tech was brought out way too early. The tech is not nearly optimized enough for mainstream use. I think they should have waited several more years to optimize it more. With RT turned off, does a single overclocked 2080 Ti get AAA games to 4k @ 100 fps?

This is nothing new really, and is pretty much the norm. When tessellation first came out the top card at the time (Radeon 5870 iirc) was barely able to pull it off. Two generations later you could barely tell it was running.
 
I gave up on SLI with the 2080s I originally purchased with an NVLink bridge. Not enough games support it. SLI/CrossFireX was handled at the driver level under DX11, but since DX12 leaves it to the game developers...good luck getting those lazy bastards to support it moving forward.

Most modern game engines fully support MGPU.
 
Nvidia has an interest in actively reducing the functionality of consumer multi-GPU implementation. This forces more frequent upgrades and fewer options to buy pre-owned hardware.

Essentially, Nvidia is competing with itself for sales when it comes to the pre-owned market, and giving someone the option of adding another card to boost performance cuts into their current sales.
 
One 2080 Ti has been plenty of card for me at 4K.
You have some pretty low standards then, as a 2080 Ti cannot even average, much less maintain, 60fps in several of the more demanding games on max settings at 4k. And that's not even getting into any of the demanding anti-aliasing settings. But I'm guessing you're one of those people who thinks anti-aliasing is never needed at the magical 4K resolution. Also, part of that extremely high price was for ray tracing, so good luck with that in most games at 4K...
 
When did 4k at 100 fps suddenly become a necessary feature in current gaming setups? The number of monitors/TVs that can hit that is pretty small.

4k 120 Ultra is the FUTURE, not today. If you're willing to compromise down to high, you can have that 100 fps today from a single RTX 2080 Ti in multiplayer games (and 60-75 hz in more demanding offline games).

And it seems DLSS 2.0 means you don't have to render games at native resolution anymore, so RTX at 1440p upscaled to 4k is fairly viable.

Ampere is going to make things even faster, but for new RTX games, performance is viable on a 2080 Ti on upscaled 4k.

Concur completely with this. People's expectations for performance have gotten far out of whack with what's currently possible (or even in the mid-term pipeline). I keep seeing posts from people indignant that they can't hit 4k 120fps in the latest single-player games on a $500 card, and expectations that the new consoles -- which look strongly like they'll have 5700 XT-level performance -- are going to push 4k 60fps (hint: they're not).

If people are expecting 4k 120fps ultra performance, what's going to happen is the industry taking the path of least resistance to get there. And that path is certainly not in hardware, where performance gains over time are generally slow and linear -- the easiest path is in software. Devs are going to let graphical complexity/requirements stagnate (and in many games already have). Think, for example, about the most graphically intense PC games from 2015 vs. now... say, Deus Ex: Mankind Divided vs. Metro Exodus.

Bringing it back to the original question, SLI/X-fire is still the only way to hit these respective levels of performance in most games. There is still, therefore, a hypothetical place for mGPU, although as some have mentioned, it's really too cost-prohibitive at this point. In essence, the market size for mGPU users is so small, it doesn't support the cost for nV/AMD. Unless top-end GPU prices come down considerably back to relative levels from 5-6 years ago, mGPU is effectively dead.
 
Nvidia has an interest in actively reducing the functionality of consumer multi-GPU implementation. This forces more frequent upgrades and fewer options to buy pre-owned hardware.

Essentially, Nvidia is competing with itself for sales when it comes to the pre-owned market, and giving someone the option of adding another card to boost performance cuts into their current sales.
Does that also apply to AMD? Or is it just evil ngreedia?
 
I was hoping that SLI would gain traction again with RT.

It seems RT would benefit a lot from mGPU, but there's no support at all.

After seeing the SW reflections demo running on 4 Volta GPUs, I thought it was the natural next step.

Maybe someday
 
When did 4k at 100 fps suddenly become a necessary feature in current gaming setups? The number of monitors/TVs that can hit that is pretty small.

4k 120 Ultra is the FUTURE, not today. If you're willing to compromise down to high, you can have that 100 fps today from a single RTX 2080 Ti in multiplayer games (and 60-75 hz in more demanding offline games).

And it seems DLSS 2.0 means you don't have to render games at native resolution anymore, so RTX at 1440p upscaled to 4k is fairly viable.

Ampere is going to make things even faster, but for new RTX games, performance is viable on a 2080 Ti on upscaled 4k.

What kind of fucking bullshit talk is this? Resolutions have been going NOWHERE for years.

Let's put this shit into perspective. I was gaming at 2560x1600 in 2009.

2009. 11 years ago. Hardware is supposed to advance. 4k should be ubiquitous by now and the only reason it isn't is because video card companies have dropped the ball hard and tried to distract people from that fact by pushing snake oil like ray tracing cores or, like AMD, just straight up failing.
 
With SLI / NVLink so unsupported by developers, and the way scaling goes, will we ever need more than one top-end card to run the latest games at any given time in 4k at >60fps? Look at the 2080 Ti and games now. Do we really need to worry about having more than one, water cooling them both, and having a PSU big enough? I used to have 1080 Ti SLI, and in a lot of the games I played SLI wasn't supported at all, so the 2nd card was basically a paperweight.

Will multi-GPU technology ever combine the cards on the hardware side, so developers don't have to blow it off like they do 75%+ of the time?
While it has heavily died off for SLI/Crossfire use in games, it's bigger than ever for machine learning and data science. My last Crossfire setup was dual RX 480s, which, when it did work, actually allowed decent 4k gaming, but most of the time it was a waste of power and heat. Right now my main machine has a weird mix of an RTX 2060, a GTX 1080, and the 2 RX 480s. All of them are used for TensorFlow/GPU-dependent development, which is a specialty situation. Gaming-wise it can be nice to have one GPU running a data set while another is used for gaming. If it were pure gaming, though, I think dual GPUs are dead.
 
I think the ray tracing tech was brought out way too early. The tech is not nearly optimized enough for mainstream use. I think they should have waited several more years to optimize it more. With RT turned off, does a single overclocked 2080 Ti get AAA games to 4k @ 100 fps?

Ray Tracing isn't too early and it isn't about optimization. Something not being smooth or as fast as we'd like sometimes comes down to the hardware not being fast enough. As for your question, the answer is "sometimes."

Hopefully PCIe 4.0 and beyond will open up doors for Crossfire and AMD cards. Also, dual GPU support has just been abandoned by game developers.

That won't have an impact on Crossfire for the reason you stated.

When did 4k at 100 fps suddenly become a necessary feature in current gaming setups? The number of monitors/TVs that can hit that is pretty small.

4k 120 Ultra is the FUTURE, not today. If you're willing to compromise down to high, you can have that 100 fps today from a single RTX 2080 Ti in multiplayer games (and 60-75 hz in more demanding offline games).

And it seems DLSS 2.0 means you don't have to render games at native resolution anymore, so RTX at 1440p upscaled to 4k is fairly viable.

Ampere is going to make things even faster, but for new RTX games, performance is viable on a 2080 Ti on upscaled 4k.

Getting 120FPS at 4K is what guys like me currently want. I want the speed of a high end 34" Ultra-wide display without having to compromise a bit on resolution or eye candy to achieve it. After that, I'm going to want 8K at 120Hz and so on. Until this shit looks like reality or we transition to holodecks, people are always going to want more.

I can see pixels at 1440p on a 27". Doesn't stop me from using it, also doesn't mean I don't want or couldn't use better.

Exactly.

One 2080 Ti has been plenty of card for me at 4K.

I'd argue it's still not enough in all games. It's just the only card available today that can do a decent job at it.
 
What kind of fucking bullshit talk is this? Resolutions have been going NOWHERE for years.

Let's put this shit into perspective. I was gaming at 2560x1600 in 2009.

2009. 11 years ago. Hardware is supposed to advance. 4k should be ubiquitous by now and the only reason it isn't is because video card companies have dropped the ball hard and tried to distract people from that fact by pushing snake oil like ray tracing cores or, like AMD, just straight up failing.

And? I'm assuming that you don't mind playing at 60 hz then (and prefer eye candy over framerate?). I mean, those old LCDs had some of the highest input lag we've ever seen! You can buy your new 120 hz 4k monitor, then turn on adaptive sync (to make your 60hz experience smooth).

You can get that kind of framerate from an RTX 2080 Ti, on Ultra, in the majority of games. Unless you suddenly grew a hard-on for 120 hz, I don't see what the problem is.

And all of you whiny bitches will be satisfied in another two years (when Ampere's successor ships). It should have more than enough horsepower for 120 hz 4k (in every game), and 75hz RTX.

BUT IT WON'T BE CHEAP. Because demanding gamers (like yourselves) are a corner case of a corner case.
 
Just to clarify, when you say Hz, you really mean FPS.
While it's technically correct (I guess), Hz is more commonly used for monitor refresh rate, while FPS is used for frame rate.

But I really don't want to nitpick cause there is variable refresh rate and then it just gets more confusing.

just my $.02
 
Just to clarify, when you say Hz, you really mean FPS.
While it's technically correct (I guess), Hz is more commonly used for monitor refresh rate, while FPS is used for frame rate.

But I really don't want to nitpick cause there is variable refresh rate and then it just gets more confusing.

just my $.02


I think of your monitor's native refresh rate as your "target, but not required" fps in most games. If you turn off vsync, or play with adaptive sync, then you've always been able to get away with less (without sacrificing smoothness)

How high you want it is on you - that is why I specify your refresh rate, and not your game frame rate.
 
I was hoping that SLI would gain traction again with RT.

It seems RT would benefit a lot from mGPU, but there's no support at all.
Tie RT support in with DX12, and multi-GPU becomes an easily identifiable 'step too far', it would seem.

I still think that the industry will circle back for a number of reasons, but it's obvious that it will be on the back-burner for a bit first.
 
My thoughts are simply summed up when it comes to SLI or Crossfire

(image attachment: f6c.jpg)
 