Ashes Of The Singularity Running On AMD And Nvidia Hardware Simultaneously

Megalith

Human sacrifice, dogs and cats living together…mass hysteria! Check out this demonstration of AMD and Nvidia cards working in tandem for RTS game Ashes of the Singularity.

Note that in this case, the hardware obviously isn't configured for any kind of SLI passthrough: AMD doesn't use physical bridges the way Nvidia still does, and this hardware couldn't take advantage of one anyway, since the two companies use very different pinouts. Nevertheless, the rig works, and works well.
 
What sorcery is this? It must be a thing of the Devil! Pure evil!

"Can't prevent it outright." We will see about that. If past behavior is any indicator of future actions, there is a coding team at NV already dedicating themselves to insuring this does not work for long, and their TWIMTBP team is likely continuing to do what it can to vendor lock devs onto the Nv specific path.
 
What sorcery is this? It must be a thing of the Devil! Pure evil!

"Can't prevent it outright." We will see about that. If past behavior is any indicator of future actions, there is a coding team at NV already dedicated to ensuring this does not work for long, and their TWIMTBP team is likely continuing to do what it can to vendor-lock devs onto the NV-specific path.

AFAIK, it's MSFT sorcery in DX12. The evilest of software. :D
 
Awesome.

In b4 teh crazy... oops too late.

Crazy, hmmm,
Like having code that disables or cripples your own card when it detects another vendor's card is present?
Or perhaps like preventing SLI in drivers if you have a motherboard that has not paid the extra fee to enable it?
Or maybe requiring an extra, completely extraneous chip be included on a motherboard to enable SLI?
Or coding AA in-game for a dev, and then making your competitor's card do the same work (before the vendor lockout), so it performs as if it is doing AA but does not actually render with AA. Well, that's not crazy I guess, just dirty. But the point stands nonetheless.

It's not crazy to expect they will try to block or hinder this somehow. It is crazy to expect they will not try to do so.
 
My big question ...

Does this mean I can finally choose freely between FreeSync and G-Sync monitors? It shouldn't matter what card is physically tied to my monitor ... so I could buy a cheap nVidia card and use a G-Sync or a cheap AMD card and use FreeSync, right?
 
These devs are just awesome so far. First Nvidia tried to shame them over async compute when it was Nvidia's own hardware limitation and driver problems; now they get multi-brand cards working together for the first time in a public release.

My big question ...

Does this mean I can finally choose freely between FreeSync and G-Sync monitors? It shouldn't matter what card is physically tied to my monitor ... so I could buy a cheap nVidia card and use a G-Sync or a cheap AMD card and use FreeSync, right?

Well, it will depend on the master card. AMD as the master card so far seems to work a lot better than Nvidia at this point in time. Yes, it matters which card is connected.
 
Crazy, hmmm,
...
It's not crazy to expect they will try to block or hinder this somehow. It is crazy to expect they will not try to do so.
Crazy is as crazy does. At least you're aware.
 
I find it hilarious: when I was lurking the forums a while back, all the Fanboiis from both sides were spewing diarrhea at the mouth about how DX12 won't do something like this, and here I am watching a demo of it happening.

Well, the proof is here everyone. For once I am legitimately excited for the future of GPU development.

Maybe I might not have to sell my last 290X. I'll hang on to it to do some testing on my own with one of my 980Ti's.
 
I recall a few years ago someone making a motherboard that allowed you to sli/crossfire ATI and Nvidia cards together. The name of it eludes me though.
 
I find it hilarious: when I was lurking the forums a while back, all the Fanboiis from both sides were spewing diarrhea at the mouth about how DX12 won't do something like this, and here I am watching a demo of it happening.

Citation needed. Because all I've ever seen posted is that it would be up to developers to implement per-game (which is correct), rather than being a default feature inherent to any DX12 title.
 
These devs are just awesome so far. First Nvidia tried to shame them over async compute when it was Nvidia's own hardware limitation and driver problems; now they get multi-brand cards working together for the first time in a public release.



Well it will depend on the master card. AMD as master card so far seems to work a lot better than nvidia at the point in time. Yes it matters which card is connected.

I meant it shouldn't matter, for the cross-GPU setup, which monitor is connected to which card. Obviously you would connect the nVidia to the G-Sync and the AMD to the FreeSync.
 
Does Vulkan have a similar option? Also, wouldn't this take all of the problems found in SLI and Crossfire setups and make them even worse?
 
SLI and Crossfire are still a clusterfuck of compatibility issues, anyone who actually thinks this is going to work well at any point in the foreseeable future is crazy.
 
SLI and Crossfire are still a clusterfuck of compatibility issues, anyone who actually thinks this is going to work well at any point in the foreseeable future is crazy.

It's not crossfire or SLi. That's what makes it awesome. DX12 can divide the load up as the developer sees fit. :)
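To illustrate the "divide the load as the developer sees fit" idea, here is a hypothetical sketch of split-frame rendering under explicit multi-adapter: scanlines are handed out proportionally to each card's measured throughput. The throughput numbers are invented for illustration; a real engine would benchmark the cards and submit command lists per adapter.

```python
# Hypothetical sketch: splitting one frame's scanlines between two GPUs
# of unequal speed, as a DX12-style explicit multi-adapter engine might.
# The throughput values are made up for illustration.

def split_frame(height, throughputs):
    """Divide `height` scanlines among GPUs proportionally to throughput."""
    total = sum(throughputs)
    rows = [height * t // total for t in throughputs]
    rows[-1] += height - sum(rows)  # hand any rounding remainder to the last GPU
    return rows

# e.g. a 1080-row frame split between a faster and a slower card
print(split_frame(1080, [60, 40]))  # → [648, 432]
```

The point is that the split ratio is the developer's choice, not something fixed by a driver-level SLI/Crossfire profile.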
 
It's not crossfire or SLi. That's what makes it awesome. DX12 can divide the load up as the developer sees fit. :)

Developers can't even be bothered to make sure multiple GPU setups work well now, you think they are going to lift a finger trying to optimize performance when a user has two cards with different performance, different frame times, and different drivers?
 
This will only work with a very select few games so still better to buy two of the same than go this route.
 
Developers can't even be bothered to make sure multiple GPU setups work well now, you think they are going to lift a finger trying to optimize performance when a user has two cards with different performance, different frame times, and different drivers?

I very much agree.
 
Nvidia and AMD,
Run together in peerrfect harmoneee,
Side by side on my motherrrrrboooooard,
Oh Lord, why can't weeeee?
 
and they said there was no reason to get windows 10

I think what they actually said was there's no reason to get 10 until there are at least a few DX12 games. 2016 for that.
 
Developers can't even be bothered to make sure multiple GPU setups work well now, you think they are going to lift a finger trying to optimize performance when a user has two cards with different performance, different frame times, and different drivers?

If only Batman Arkham Knight were DX12, then Warner Brothers wouldn't be having all those pesky problems!
 
IF this can work in a stable and reliable fashion and IF enough games include support for it, we are looking at a new world. One vendor's card is better at shaders and the other at tessellation? Buy one of each!
 
One vendor's card is better at shaders and the other at tessellation? Buy one of each!

That would never be how it would work. If one vendor's card is worse at shaders and the other is worse at tessellation, you'll end up with the worst from both. Lowest common denominator, not highest.
 
That would never be how it would work. If one vendor's card is worse at shaders and the other is worse at tessellation, you'll end up with the worst from both. Lowest common denominator, not highest.

No, he was right: a game could be made to target the stronger card for each task.
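A hypothetical sketch of that last point: rather than settling for the lowest common denominator, an engine could benchmark each render pass on each card and route the pass to whichever card scores higher. The card names, pass names, and scores here are all invented for illustration.

```python
# Hypothetical sketch: routing each render pass to the card that is
# faster at it, instead of the lowest common denominator.
# Cards, passes, and benchmark scores are invented for illustration.

BENCH = {
    "cardA": {"shading": 9, "tessellation": 4},
    "cardB": {"shading": 5, "tessellation": 8},
}

def assign_passes(bench):
    """Map each render pass to the card with the best score for it."""
    passes = next(iter(bench.values())).keys()
    return {p: max(bench, key=lambda c: bench[c][p]) for p in passes}

print(assign_passes(BENCH))  # → {'shading': 'cardA', 'tessellation': 'cardB'}
```

Whether real engines would bother with this level of per-pass scheduling is exactly the open question raised earlier in the thread.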
 