> Only the 3090.

Do the new cards even support bridged SLI?
Those bridges are so goddamn overpriced.

It's called NVLink now. AFAIK they didn't change the format, just look for "NVLink Bridge".
This one might work: https://www.amazon.com/dp/B07HZD8P2R/
However, unless you are doing scientific/AI applications, I would highly advise not wasting money on SLI.
> For games, not really.

Yes, I know that both cards need to have the same clock and BIOS settings. I am still deciding whether I need a second card, but for now I am sure that two cards should be much better than one.
Hey guys, how big a market do you think there is for fake GPUs? Shroud, fans, blank PCB with no cutouts, bridge with no fingers, RGB powered and controlled through the PCI slot. Get the look of an SLI system and all the performance benefit, too.

The numbers on this are pretty clear. SLI hasn't been worth it since the Maxwell days. I ran two 1080 Ti's in SLI and found the experience lackluster more often than not. There were a few games where it worked OK, but it didn't scale worth a crap. Gamers Nexus tested SLI with 3090s and found that it either did nothing or only increased performance by a modest amount. It certainly never gained enough performance to justify buying two cards that are each only 15% faster than a 3080 on its best day and twice the price of that card.
Let me illustrate what I mean:
Take the 3080 as your baseline. We'll call that 100% performance. Those cards are $699.99. So here is how things stack up:
3080 = 100% @ $700
3090 = 115% @ $1,500
2x 3090 = 115% @ $3,000 (sometimes 145% in one or two titles)
Does that make sense to you? I'm the guy who used to buy NVIDIA or ATi/AMD GPUs in pairs. I've purchased Titans in pairs. I'm currently running a single RTX 2080 Ti. SLI has been less worthwhile with each generation. As of January 2021, NVIDIA will no longer include SLI profiles in its drivers, which means what little support there is today is going to absolutely crater after that. Buying two 3090s is the equivalent of lighting $1,500 on fire, or using it to buy a 3090 as a desk ornament. Sure, you've got a second 3090, but it won't be doing anything at all.
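For what it's worth, the value math above is easy to sanity-check. A quick sketch (prices are the launch MSRPs quoted above; the ~115% scaling figure is from the post, not a measurement of mine):

```python
# Launch MSRPs and relative performance figures quoted in the post above.
cards = {
    "3080":    {"perf": 1.00, "price": 700},
    "3090":    {"perf": 1.15, "price": 1500},
    "2x 3090": {"perf": 1.15, "price": 3000},  # SLI rarely scales, per the testing cited
}

for name, c in cards.items():
    # Performance points per $1,000 spent -- higher means better value.
    value = c["perf"] / c["price"] * 1000
    print(f"{name:8s} {c['perf']:>4.0%} @ ${c['price']:>5,} -> {value:.2f} perf/$1k")
```

By that measure, two 3090s deliver roughly a quarter of the performance-per-dollar of a single 3080 in the typical game.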
> Hey guys, how big a market do you think there is for fake GPUs? Shroud, fans, blank PCB with no cutouts, bridge with no fingers, RGB powered and controlled through the PCI slot. Get the look of an SLI system and all the performance benefit, too.
Or the people that put after-market spoilers on a Yaris.
> Buying two 3090's is the equivalent of lighting $1,500 on fire, or using that to buy a 3090 as a desk ornament. Sure, you've got a second 3090 but it won't be doing anything at all.

No, no, I am sure it also makes an excellent space heater in the winter months. Just run 3DMark in the background.
> No, no, I am sure it also makes an excellent space heater in the winter months. Just run 3DMark in the background.

No doubt two in SLI would keep a room warm in the winter.
I can't hate the guy for wanting the best, but he has more money than sense based on all his posts throughout the board.
> Lol true I felt like he was full of shit myself but you never know. I haven't had SLI since the 370s. I didn't like it then since the micro-stutters drove me nuts. It just never felt as smooth as one card.

I understand wanting the best. I've been guilty of overspending on small gains just to achieve that. But at some point, you have to recognize that certain expenses offer such a poor return on investment that they aren't worthwhile for anyone. Even if I won the lottery tomorrow, I don't think I'd buy two 3090s. Or if I did, I wouldn't be putting them in the same rig. I'm the guy who rode the SLI train longer than it made sense to do so. Come January 1st, it will make even less sense than it does now.
As for the original poster and his spending habits, I'm not sure he's ever actually purchased any of the stuff he asks about.
> Does that mean that a chiplet design like Ryzen's is out of the question for GPUs?

The next generation of NVIDIA GPUs will reportedly be chiplet-based.
> Does that mean that a chiplet design like Ryzen's is out of the question for GPUs?

The key to a chiplet design working well would be a super-fast interconnect like Infinity Fabric to connect the chiplets and let them share memory. In the past, multi-GPU has relied on PCIe bandwidth and the SLI/CrossFire bridge to communicate, and the lack of bandwidth can cause issues.
> No, no, I am sure it also makes an excellent space heater in the winter months. Just run 3DMark in the background.

As someone who already has a 3080 FE and has just used it to game: it's seriously uncomfortable to be near it. It's that hot. The entire room gets flat-out hot, not just warm. I'm getting a waterblock to try to somewhat tame this inferno, but that's just making the heat radiate in a different way. All jokes aside, I'm glad we're going into winter, because this will be brutal in the summer with no AC on. I really don't want to imagine 2x 3090s with 3DMark pegging them. I did consider getting two myself, and still somewhat am, only because I have over a thousand dollars in company credit with one of the AIBs.
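The space-heater jokes are basically physics: nearly all the board power a GPU draws ends up as heat in the room. A rough back-of-the-envelope, using the official 320 W TGP of the 3080 FE and 350 W per 3090:

```python
# Virtually all electrical power a GPU draws is dissipated as heat.
WATTS_TO_BTU_PER_HOUR = 3.412  # 1 W = 3.412 BTU/h

def heat_btu_per_hour(watts):
    """Heat output in BTU/h for a card (or cards) running at full power."""
    return watts * WATTS_TO_BTU_PER_HOUR

for label, watts in [("3080 FE", 320), ("2x 3090", 2 * 350)]:
    print(f"{label}: {watts} W ~= {heat_btu_per_hour(watts):.0f} BTU/h")
```

For comparison, a typical portable space heater on its low setting runs around 750 W, so two fully loaded 3090s really are in space-heater territory.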
> The key to a chiplet design working well would be a super-fast interconnect like Infinity Fabric to connect the chiplets and let them share memory. In the past, multi-GPU has relied on PCIe bandwidth and the SLI/CrossFire bridge to communicate, and the lack of bandwidth can cause issues.
If they did it right, games wouldn't see it as a multi-GPU setup but as a single card.
Do you remember the ATI Radeon HD 7990 and the GeForce GTX 590? It should be the same, but inside the GPU chip. I think NVIDIA and Intel will have their own technologies like AMD's Infinity Fabric.

Infinity Fabric isn't that fast. It's around PCIe speeds. GPUs need an order of magnitude more speed for "seamless".
> Infinity Fabric isn't that fast. It's around PCIe speeds. GPUs need an order of magnitude more speed for "seamless".

No, it isn't. I couldn't find numbers to compare Infinity Fabric 2 against PCIe 4.0, but a x16 PCIe 3.0 slot has 15.75 GB/s, while IF with 2666 MHz memory and PCIe 3.0 has 42.667 GB/s of bandwidth on-package and 37.926 GB/s inter-socket. Obviously PCIe 4.0 increases that, but so does IF2, which is based on PCIe 4.0, as does higher RAM speed (to a point). The other reason it would be a huge improvement is that PCIe bandwidth is already fairly saturated before the cards in a multi-GPU setup start trying to use that same bandwidth to communicate, while the IF links could be dedicated.
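The 15.75 GB/s figure for a x16 PCIe 3.0 link checks out. A minimal sketch of where it comes from (per-direction raw bandwidth after 128b/130b line encoding, ignoring packet overhead):

```python
def pcie_x16_gbs(transfer_rate_gt_s):
    """Per-direction bandwidth in GB/s of a x16 link with 128b/130b encoding."""
    lanes = 16
    encoding_efficiency = 128 / 130  # PCIe 3.0/4.0 line code
    return transfer_rate_gt_s * lanes * encoding_efficiency / 8  # 8 bits per byte

print(f"PCIe 3.0 x16: {pcie_x16_gbs(8):.2f} GB/s")   # 8 GT/s per lane -> 15.75
print(f"PCIe 4.0 x16: {pcie_x16_gbs(16):.2f} GB/s")  # 16 GT/s per lane -> 31.51
```

So PCIe 4.0 roughly doubles the x16 figure to ~31.5 GB/s, which is still short of the on-package Infinity Fabric numbers quoted above.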
> No, no, I am sure it also makes an excellent space heater in the winter months. Just run 3DMark in the background.

Run F@H and actually do something productive.
> As someone who already has a 3080 FE and has just used it to game: it's seriously uncomfortable to be near it. It's that hot. The entire room gets flat-out hot, not just warm. I'm getting a waterblock to try to somewhat tame this inferno, but that's just making the heat radiate in a different way. All jokes aside, I'm glad we're going into winter, because this will be brutal in the summer with no AC on. I really don't want to imagine 2x 3090s with 3DMark pegging them. I did consider getting two myself, and still somewhat am, only because I have over a thousand dollars in company credit with one of the AIBs.

I'm planning to undervolt mine for this exact reason.