Name games that don't work with Crossfire

Archaea

Name games that don't work with Crossfire:

I'm posting a sister thread here:
https://hardforum.com/threads/name-games-that-dont-work-with-sli.1930282/

I'd like recent examples:
Let's put a timestamp on this little inquiry: specifically, the last year. I don't care what happened with your multi-GPU setup 5 years ago. How is that at all relevant today, with different drivers, different cards, different DirectX versions, OpenGL, Vulkan, etc.?
 
I've not had much historic experience with dual cards -- just a pair of GTX 560 Ti in SLI several years back (long enough ago that I won't comment on them here).

I keep reading that Crossfire and SLI are not worth messing with.
I can't see that based on my 5 month Crossfire experience because I haven't had a single issue with it since mid December 2016 when I got my second Fury X card.

I tend NOT to buy brand new games at full price but rather wait till they are $20-$30 -- so maybe that's why I'm not having any bad experiences, as by the time I buy, they've had some time to be patched. But I've literally done NO tinkering to make anything work with Crossfire. It just works. I put the cards in Crossfire in December, and I don't think I'd even taken them out until last week, when my friend wanted to borrow a card to test FreeSync. It's been flawless for me.

Games I've played with the Fury X Crossfire setup since December follow. I've verified each of these with my various game clients' "last played" dates.

AND -- as to whether or not each of these games actually truly benefits from Crossfire, I wouldn't know or care -- because if a game doesn't benefit from it, it hasn't been noticeably detrimental either. It hasn't caused me any grief, a bad play experience, or any reason to disable Crossfire -- and frankly a single Fury X is sufficient in older/indie titles to hit 75Hz at 1440p, which is my FreeSync max. So if a game is only using one card, I wouldn't know or care! I just know I've not had to fiddle with the setting, disable it, or futz with it in any way. It's been set and forget, so I can't complain about the Crossfire experience at all -- and I wonder what the larger rub is. In most every thread I read, you see the general guidance: DON'T USE MULTI GPU.


Steam, Origin, and Uplay games I've played with a pair of XFX Fury X cards in Crossfire, on my MSI Z87 GD65, i7-4770K @ 4.5GHz, 16GB RAM, since mid December 2016 -- WITH NO ISSUES!

  1. Assassin's Creed IV: Black Flag
  2. Battlefield 1
  3. Depth
  4. Dirt 2
  5. Dirt Rally
  6. Dragon Age: Inquisition
  7. The Elder Scrolls V: Skyrim
  8. Evolve Stage 2
  9. Need for Speed
  10. Need for Speed: Most Wanted
  11. Need for Speed: Rivals
  12. Mad Max
  13. Middle-earth: Shadow of Mordor
  14. Path of Exile
  15. Plants vs. Zombies: Garden Warfare
  16. Rise of the Tomb Raider
  17. Ryse: Son of Rome
  18. Shark Attack Deathmatch 2
  19. Star Wars Battlefront
  20. Titanfall
  21. Trine 2
  22. Windward
  23. The Witcher 3: Wild Hunt
  24. Wolfenstein: The Old Blood


I realize I haven't played anything BLEEDING edge (with the exception of Battlefield 1 at launch). But out of all those games I've played with Crossfire since December 2016 -- ALL worked without issue!

So why all the hate on Crossfire?

Somebody give me a game to try that DOESN'T work -- or one I have to tinker with to play successfully. I'll try it and relay my experience (if I own it).
Especially since I keep reading people dissing Crossfire setups (or more realistically, multi-GPU setups) all the time on this forum.

OH, as to the abhorred microstutter I read warnings against: I use FreeSync on a 1440p HP Omen 32 -- and whether FreeSync plays into it or not, I've never experienced microstutter in 5 months of use on these games. Not that I can see, anyway.
 
I've played with Crossfire since December 2016.

 
When Crossfire activated in a recent driver, Mass Effect Andromeda went from running pretty well to a glitchy nightmare. I had flickering issues in Titanfall 2 when it came out. I have had issues in dozens of modern titles with Crossfire flickering, but usually a patch or two later, or a driver or two later, it goes away.
So Crossfire can be frustrating for new titles that are "fresh out of the box". You will see notes about the flickering problems etc. in driver release notes for newer games all the time. It's pretty easy to track down the historical issues via the driver release notes.

I ran Nvidia SLI for years and years before this, and I had similar issues with brand new games and SLI. From what I understand, Kyle has found that AMD's Crossfire tends to have higher frame-time variance than SLI usually has, which can make the scene look jittery. I would have to agree with him. I didn't really experience that too badly with SLI, but have seen it a lot with Crossfire.

However, I have had Crossfire work more frequently across a broader list of titles than SLI did. I am, however, planning on just buying a single card for gaming next time -- which I have not done since the Nvidia FX (aka the renamed 5000 series) in 2003. So I have lived SLI and Crossfire for about 14 years now, and I finally realized it's about time to call it quits after seeing my wife play with one card all the time and just not have the same level of issues that I have.

It seems that neither Nvidia nor AMD quite have it working smoothly across most games. Note: I game at 3440x1440 resolution.
 

Thanks for the input! This side of the dual-pronged multi-GPU thread has been pretty quiet! Not sure if that's because there are fewer Crossfire users, or fewer problems with Crossfire?

Do you use freesync on your display? What AMD cards have you used for crossfire?

A patch or two after release to enable Crossfire doesn't seem so bad, all things considered -- but it would be nice if it was running from day 1. (I buy most of my games when they drop to $30 or come out as a game-of-the-year package with all of the expansions -- so by that time they've been patched. Perhaps that's why I haven't experienced issues.)

As to your other note: I've not seen frame-time jitter in any of the games I've played since starting to use Crossfire five months ago. I've been using FreeSync -- I'm curious if FreeSync eliminates that frame-time/jitter concern?

I game at 2560x1440 or 7680x1440 with three HP Omen 32s. FreeSync and 75Hz are enabled, driven by a pair of Fury X cards.
 
Most people don't buy two bleeding-edge expensive graphics cards to play old-ass shit. I play older games but I only have one mid-tier graphics card.
 

Thanks for your valuable input.

I bought two so I could play the games I own at 7680x1440 (11,059,200 pixels), as compared to 4K's 8,294,400 pixels.

So far I've been able to play any game I've tried within FreeSync range, at about a third more pixels than 4K, with the two Fury X cards crossfired -- at high settings or above. That suits my intentions well. I find value in not spending $60-70 on new games before they are patched and before they have all the DLC in place, especially when you can buy them for $30 or less six months later. I'm simply not in that big of a hurry, and my backlog catalogue is huge.
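The pixel-count comparison above is easy to verify with a few lines of arithmetic (a throwaway sketch, not anything from the thread):

```python
# Compare the triple-1440p surface to a single 4K (UHD) display.
def pixels(width, height):
    return width * height

triple_1440 = pixels(7680, 1440)  # three 2560x1440 panels side by side
uhd = pixels(3840, 2160)          # standard 4K UHD

print(triple_1440)        # 11059200
print(uhd)                # 8294400
print(triple_1440 / uhd)  # ~1.33, i.e. about a third more pixels than 4K
```

So the triple-monitor setup pushes roughly 4/3 the pixels of a 4K panel, not half again as many.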
 
And I said most. You are clearly an exception as is your decision to have two Fury X's crossfired.
 
Just Cause 3
The Division
CS:GO is spotty
Diablo 3 works sometimes, usually not for me
Path of Exile has no profile, and the standard AMD profiles are all known to cause flicker or worse, so I'd like to know what magic Archaea is using.
 

I started with 290X's -- I had 2, then 3, then 4 of them. Yes, I'm kinda crazy. Then I had one 390X 8GB that the wife used. She had no issues at all. Then I upgraded to two Furys: one is an ASUS Strix, the other a Gigabyte tri-cooler. They are both set to the same frequencies. The jitter problem usually happens to me in games that don't officially support Crossfire, as I run with it enabled anyway. I also tend to turn on V-Sync when that happens, and yes, usually it's OK then.
My monitor is an LG 34UM95-P with no FreeSync.

I do think FreeSync helps with it. It has also improved in the last year with the later drivers: I usually only see flickering now for a few weeks until it's fixed in games, instead of the jitter. But I did get jitter in the new Mass Effect when the driver enabled Crossfire. What I didn't realize is that the game changed all my settings with the update at the same time, so it took me a minute to get it situated and playable again. While it's a little bit of a battle here and there, it's really not all that bad.


My twin Furys overclocked are quite enough for my gaming at 1440 atm, but I am eyeing Vega. I may wait for its replacement; depends on how these hold up. The wife uses one Fury Nano and games all the time with no issues at all, and laughs at me and my overclocking and power craziness. What she doesn't realize is she is on an old de-lidded 3770K overclocked to the teeth at 4.5GHz :) with 32GB of DDR3-1600 and SSDs. I keep her rocking more than she realizes.

I would never install SLI or Crossfire for her. But I used to run three monitors for years, so it was a necessity. Now I just run ultrawide, and now hate Blizzard really badly (see the Overwatch 21:9 support threads). I haven't had a game where I didn't get the jitter solved eventually.
 

Path of Exile is my favorite game -- I have thousands of hours logged in the title. Not one issue with Crossfire enabled. I'm not sure it's helping either, though, because a single Fury X card pegs 2560x1440 at 75Hz (my monitor max), so the second card may or may not be beneficial -- but I've not found it detrimental either.
I own CS:GO and Diablo 3 from your list. I'll give them a try soon and report back. A buddy is borrowing one of my Fury X cards right now so he can try FreeSync; I've got to get it back first.
 
I would count "not beneficial" as a mark against mGPU, rather than "not detrimental" as a plus for mGPU, IMHO.

The whole reason you go mGPU is to get benefit from both GPUs. If games don't benefit from the second GPU -- be it because they don't support it, or because they're easy enough to run without a second GPU around -- there's not much point in going mGPU.
 

Which profile are you using on PoE? 1:1? Does another game use the same engine that's compatible? They all caused heavy tearing with my 295x2 so I've been running it single card.
 
You know going in you aren't always going to need two cards for all games. I attend LAN parties a few times a year, and at lots of them we play old games in the mix. Anything more than about three years old will be completely maxed out by a single top-end card at 1440, without question.

So no, that's not detrimental.

Detrimental is having to futz with settings half the time when you fire up a new game, or turn it off because your frame rate cuts in half versus a single card, or your computer freezes, or the game crashes, or you can't figure it out without lots of tinkering all the time. I knew full well going in I wouldn't get benefit in every game -- but if I was having trouble with 25-50% of all my games because multi-GPU was enabled, that would be detrimental and unacceptable. That's where the SLI guys seem to be.
 

DX11 for POE.

No tearing, EVER. Whatever Crossfire profile is default -- I haven't fiddled with any of that. It just works.

Maybe try a freesync monitor?
 
I have a 1440p 144hz freesync monitor. None of the settings work. Oh well, with everything on highest settings it will drop to 50FPS only with heavy particles flying. Maybe I'll try again.
 

You may already know this -- if so, my apologies.
Do you have vsync on or off?
With FreeSync (if vsync is off) you have to use Frame Rate Target Control to ensure you won't go over your display's Hz limit -- or you'll get tearing above your monitor's refresh rate, just like you will below FreeSync's minimum.

Are you by chance going above 144 FPS when you see the tearing?
I just use vsync and FreeSync with PoE.
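The rule of thumb above can be sketched as a tiny check (the 48-144 range values here are hypothetical examples, not this monitor's actual specs; real panels report their own min/max):

```python
# Sketch: adaptive sync only helps inside the panel's supported range.
def tear_free(fps, sync_min=48, sync_max=144, vsync=False):
    """Return True when a given frame rate should display without tearing.

    Inside the adaptive-sync window the refresh rate tracks the frame rate.
    Above the window you need vsync or a frame-rate cap to avoid tearing;
    below it tearing/stutter can reappear (drivers may compensate by
    duplicating frames, which this simple sketch ignores).
    """
    if sync_min <= fps <= sync_max:
        return True
    return vsync  # outside the window, only vsync prevents tearing

print(tear_free(100))              # True: inside the window
print(tear_free(200))              # False: above max with vsync off
print(tear_free(200, vsync=True))  # True: vsync catches it
```

Capping frame rate just under `sync_max` keeps you inside the window, which is why Frame Rate Target Control comes up in this context.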
 
I was really disappointed with EA/DICE over Battlefield 1. When BF4 came out, we had three- and four-way SLI/Crossfire support. Now with BF1 they got lazy: only two GPUs supported in DX11, and only one GPU supported in DX12.
 
Nice thread, and I don't think you are in the minority for NOT spending $60 on a new game when you can wait a few weeks (see Andromeda) and save $20 while having a smooth-running game after any patches.
Is a spring 2017 game going to 'blow you away' compared to an early 2017 game?
OP, your second Fury was basically free with your savings from NOT buying bleeding-edge games.

Worth it, IMHO.
 
Assuming "does not work with" means "does not get a benefit from", and not actually "does not work":

Master of Orion
Civilization 1
Civilization 2
Wolfenstein 3D
Spear of Destiny
Doom 1 (DOS)
Doom II (DOS)
CD-Man
Stunts (4D Sports Driving)
SimCity
SimAnt
The Incredible Machine 1
The Incredible Machine 2
UFO: Enemy Unknown
UFO: Terror from the Deep
 
Vsync enabled in-game, or 144Hz vsync in Crimson? Doesn't in-game vsync lock it to 60?
Vsync with double buffering limits your FPS to integer divisors of your Hz.

So if you are running 144Hz it will be:

1/1 = 144
1/2 = 72
1/3 = 48

etc., etc.
This is because both buffers are full, and you have to wait for the next refresh before you can swap them and start rendering again.
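That divisor behavior can be modeled in a few lines (an illustrative sketch of the mechanism, not driver code):

```python
import math

def double_buffered_fps(refresh_hz, frame_time_ms):
    """Model vsync with double buffering: a finished frame can only be
    shown on a refresh boundary, so each frame occupies a whole number of
    refresh intervals, and the effective rate is the refresh rate divided
    by that interval count."""
    interval_ms = 1000.0 / refresh_hz
    # Whole refresh intervals the frame occupies before the buffers can swap.
    intervals = max(1, math.ceil(frame_time_ms / interval_ms))
    return refresh_hz / intervals

# At 144Hz one refresh interval is ~6.94ms:
print(double_buffered_fps(144, 5))   # 144.0 -- frame fits in one interval
print(double_buffered_fps(144, 10))  # 72.0  -- frame takes two intervals
print(double_buffered_fps(144, 15))  # 48.0  -- frame takes three intervals
```

A frame that misses the refresh deadline by even a millisecond drops you to the next divisor down, which is why triple buffering (which decouples rendering from the swap) is often offered as an alternative.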
 


No -- as the first post says, I'm interested in games where you have to disable Crossfire to play: negative performance scaling, or buggy issues that force you to troubleshoot or turn off Crossfire. The Nvidia guys seemingly have to disable SLI fairly regularly because a game doesn't support it and it can cause game-breaking issues, leading to frustration. My brother recently upgraded from a pair of 970s in SLI to a 1080 Ti because he said he was tired of messing with SLI: it's not reliable for games, he has to turn it on and off all the time, and it's a hassle. He had SLI for 2-3 years or so, and he said he'll never do SLI again.

My observation on the Crossfire cards is that I've not had to disable Crossfire yet since December when I installed it, and I've not had one problem. So that's the point of the thread: to inquire about others' experiences.

I don't expect all titles to actually gain performance from Crossfire (or SLI), but I shouldn't see game-breaking bugs or negative performance scaling either (like Nvidia's SLI owners report).

As to those titles you listed: those may not get performance gains from Crossfire, as you say, but they are all so old that a single card will max out performance anyway -- so extra performance for such older titles is probably moot. Some of those games are 20 years old, or more!
 