Crossfire 580 vs single 1070

Stitch1 (Weaksauce -- Joined Dec 12, 2016 -- Messages: 106)
I currently have a GTX 1070 and am thinking about selling it off to buy two 580s to run in Crossfire, mostly because I bought a FreeSync monitor.

Do you guys think this is blasphemy, or should I do it? How well does Crossfire work these days, and what issues might I run into?

Any thoughts or ideas?
 
I would say the same thing as the previous poster, and add that two 580s will probably use more power than your 1070 and may require a new PSU.
 
Well, what games do you play? It's easy enough to check whether they're supported. I run two 480s, and in games that support Crossfire they absolutely kill it on maxed settings. Whether a given game will be supported is really a gamble, though. My advice is to wait and see what Vega does... six weeks to go.
 
I currently have a GTX 1070 and am thinking about selling it off to buy two 580s to run in Crossfire, mostly because I bought a FreeSync monitor.

Do you guys think this is blasphemy, or should I do it? How well does Crossfire work these days, and what issues might I run into?

Any thoughts or ideas?
Can you hold out a little longer? Vega should be comparable to the 1070.
 
Can you hold out a little longer? Vega should be comparable to the 1070.

If Vega is only comparable to the 1070, it's an EPIC fail for AMD. The Fury X that came out two years ago is already fairly comparable to the 1070.
 
The only time multiple GPUs make sense is if there isn't a single GPU that's fast enough for your purposes.

Since there are... literally six Nvidia GPUs that are faster than anything AMD sells, even the 580...

No one should ever, under any circumstances, be running Crossfire at this point in time.
 
If Vega is only comparable to the 1070, it's an EPIC fail for AMD. The Fury X that came out two years ago is already fairly comparable to the 1070.
I'm sure they will release multiple versions based on Vega. How would a lower-end version be an "epic fail" for being comparable to a 1070?
 
I always recommend staying away from multi-GPU; it's not worth it. Keep your 1070 and wait for Volta. I am sure it will be here in the next month or two. That will probably serve you better.
Fixed. Probably going to be out before Vega at this rate too lol.
 
My first few full builds were all AMD and ATI. I loved the idea of supporting the underdog that made a point of including gaming and OCing as major selling points, putting the enthusiast community first. My friends called me the "digital crack dealer", having convinced one after another to spend basically everything they had (back in high school, mind!) on a new PC to get the kind of experience I had. I was the guy that showed up at your house on Friday night with an Antec (Chieftec?) full tower, a Dell P1110 21" CRT in the car that weighed half what I did, a power strip (you didn't have enough plugs by half), a 10/100 switch, and a backpack full of keyboards, mice, and networking cables.

Here's my GPU upgrade history as near as I can remember, just so we're clear: S3 ViRGE to ATI Rage 3D, 3dfx Voodoo3 3500 AGP, ATI Radeon DDR, ATI Radeon 7800 Pro, (memory gets hazy), Nvidia 8800 GTX, 2-way SLI, 3-way SLI, ATI (AMD?) Radeon 4890 2-way Crossfire, AMD Radeon 6950 (unlocked) 2GB 2-way Crossfire, GTX 980 Ti 2-way SLI. I am brand agnostic; I have always gone where the power is.

AMD has done nothing but shit the bed like it's their job (with regard to high-end gaming, which to be fair is all I care about) ever since. ATI stayed competitive after the acquisition for a while, but in recent years has turned into a similarly pathetic shitshow.

Price to performance is great as a rule, don't get me wrong. I love competition! But when that's the only selling point for your product, when you literally can't offer anything even close to what your competitor has had out for months and instead have to say "look, their stuff is too expensive, ours isn't nearly as fast but it costs less for what you're getting", and then your competition comes back with "we can beat you on price, too" by means of the GTX 1060? Yeah, you "don' fucked up" and need to put nose to grindstone. This? This is the 290X/390X (and 7970 GHz edition bullshit) all over again. This is the height of corporate laziness. AMD's graphics division is putting out "Intel CPU" improvement numbers, except... they aren't market dominant like Intel is in the CPU market. Idiots.

Fuck you, AMD. FUCK. YOU. This (RX580) is a nothing product that should never have seen the light of day. Get your shit together, launch a new architecture that's at least as fast as your current top end competitor, or give up. Sell the assets to someone who's actually going to try, and move on. I'm tired of carrying this heavy-ass torch.
 
Definitely see what Vega has to offer. If a single Vega will suffice and be priced competitively its a win/win. If they aren't as powerful they will have to sell with a lower price so you could just Xfire those.
 
My first few full builds were all AMD and ATI. I loved the idea of supporting the underdog that made a point of including gaming and OCing as major selling points, putting the enthusiast community first. My friends called me the "digital crack dealer", having convinced one after another to spend basically everything they had (back in high school, mind!) on a new PC to get the kind of experience I had. I was the guy that showed up at your house on Friday night with an Antec (Chieftec?) full tower, a Dell P1110 21" CRT in the car that weighed half what I did, a power strip (you didn't have enough plugs by half), a 10/100 switch, and a backpack full of keyboards, mice, and networking cables.

Here's my GPU upgrade history in full, just so we're clear: onboard to Rage 3D, 3dfx Voodoo3 3500 AGP, Radeon DDR, Radeon 7800 Pro, (memory gets hazy), 8800 GTX

AMD has done nothing but shit the bed like it's their job (with regard to high-end gaming, which to be fair is all I care about) ever since. ATI stayed competitive after the acquisition for a while, but in recent years has turned into a similarly pathetic shitshow.

Price to performance is great as a rule, don't get me wrong. I love competition! But when that's the only selling point for your product, when you literally can't offer anything even close to what your competitor has had out for months and instead have to say "look, their stuff is too expensive, ours isn't nearly as fast but it costs less for what you're getting", and then your competition comes back with "we can beat you on price, too" by means of the GTX 1060? Yeah, you "don' fucked up" and need to put nose to grindstone. This? This is the 290X/390X all over again. This is the height of corporate laziness. AMD's graphics division is putting out "Intel CPU" improvement numbers, except... they aren't market dominant like Intel is in the CPU market. Idiots.

Fuck you, AMD. FUCK. YOU. This (RX580) is a nothing product that should never have seen the light of day. Get your shit together, launch a new architecture that's at least as fast as your current top end competitor, or give up. Sell the assets to someone who's actually going to try, and move on. I'm tired of carrying this heavy-ass torch.


Let's be honest: you only got Zen to be competitive because the GPU division had to give. AMD poured whatever resources they had into saving their company, and they are primarily a CPU company; if they fail there again, it's game over. They will get back in the GPU game eventually. Staying somewhat competitive at the cost of more power draw is still okay in my book. We will see how Vega does; I think it will probably improve on Polaris's power efficiency, but it likely won't match Nvidia. Navi might be what finally brings the power improvements they need.
 
Let's be honest: you only got Zen to be competitive because the GPU division had to give. AMD poured whatever resources they had into saving their company, and they are primarily a CPU company; if they fail there again, it's game over. They will get back in the GPU game eventually. Staying somewhat competitive at the cost of more power draw is still okay in my book. We will see how Vega does; I think it will probably improve on Polaris's power efficiency, but it likely won't match Nvidia. Navi might be what finally brings the power improvements they need.

If you can't even offer a "halo" product that at least matches your competition, you don't deserve to have a place in the market. I'm tired of making excuses for these hype-beasting idiots. AMD/ATI has been nothing but a constant source of disappointment to anyone interested in the top-end market for 7+ years now.
 
I would wait to see what Vega looks like rather than getting two 580s. I'm running 480 Crossfire on this rig. It's OK for some games, but there are lots of titles that either don't take advantage of it or get poor performance, glitches, etc.

The reason to mess around with SLI/Crossfire is running large resolutions (like 4K or Surround/Eyefinity), where one card can barely cut it. If you are at 1080p or even 1440p, you are better off just getting one decent card.

Normally I would say Nvidia would be better, but if you really want FreeSync I'd say wait for Vega. It hopefully will be something competitive.
 
I would wait to see what Vega looks like rather than getting two 580s. I'm running 480 Crossfire on this rig. It's OK for some games, but there are lots of titles that either don't take advantage of it or get poor performance, glitches, etc.

The reason to mess around with SLI/Crossfire is running large resolutions (like 4K or Surround/Eyefinity), where one card can barely cut it. If you are at 1080p or even 1440p, you are better off just getting one decent card.

Normally I would say Nvidia would be better, but if you really want FreeSync I'd say wait for Vega. It hopefully will be something competitive.

What games have you encountered Crossfire issues with? Are you talking about at game launch, or do you still have problems?

I've had my Crossfire setup for six months now with the Fury X, and I haven't had to turn it off a single time for any of the games I've played. It's been plug and play.
 
What games have you encountered Crossfire issues with? Are you talking about at game launch, or do you still have problems?

I've had my Crossfire setup for six months now with the Fury X, and I haven't had to turn it off a single time for any of the games I've played. It's been plug and play.

You're incredibly lucky, then.

Again, look at my history -- I've been running multiple cards every generation since the 8800 GTX. Went from that to 4890s, 6950s, now 980 Tis. There are /regularly/ titles where I have to disable multi-GPU or, at bare minimum, spend an hour or more messing with profile settings and editing INIs. This is more and more common, not less. Modern multi-GPU support is worse than it once was, and getting worse still by the day.

Let's be honest: you only got Zen to be competitive because the GPU division had to give. AMD poured whatever resources they had into saving their company, and they are primarily a CPU company; if they fail there again, it's game over. They will get back in the GPU game eventually. Staying somewhat competitive at the cost of more power draw is still okay in my book. We will see how Vega does; I think it will probably improve on Polaris's power efficiency, but it likely won't match Nvidia. Navi might be what finally brings the power improvements they need.

If this is the case, they deserve to die. When you as a company don't have the sense to find new people when your current staff are failing left and right, and to pay them what they're worth to make sure you're competitive... time to die.
 
Again, look at my history -- I've been running multiple cards every generation since the 8800 GTX. Went from that to 4890s, 6950s, now 980 Tis. There are /regularly/ titles where I have to disable multi-GPU or, at bare minimum, spend an hour or more messing with profile settings and editing INIs. This is more and more common, not less. Modern multi-GPU support is worse than it once was, and getting worse still by the day.

While multi-GPU can be finicky, I've not really had too many problems with it using a single 4K monitor. It can be a little trickier using Surround. And there's still a good number of titles that support it. ME did, Sniper Elite did; I don't know about Prey. RE7 didn't, but there's a profile for it.

Still, at this level, I think you're better off with a single card. Though how it really works out in the end will all depend on the games one plays.
 
I personally did not have a very good experience with CrossFire. My last setup was 290X CF, and when it worked, it was excellent, but in a lot of new titles it was broken for a while, and in some cases it remained broken (visual anomalies) months after release. Mantle in BF4 was excellent at the time, so it wasn't a hardware problem; it was just drivers + game support.

I've been running a 980 Ti and now a Titan X since then, and the consistent frame times make games a lot more enjoyable. I would keep the 1070 and buy a GSync monitor.
 
If this is the case, they deserve to die. When you as a company don't have the sense to find new people when your current staff are failing left and right, and to pay them what they're worth to make sure you're competitive... time to die.

It seems to me you do not understand economics or the marketplace. Just because AMD does not offer a product that is competitive on the high end does not mean they are not competitive on the whole. The market for CPUs and GPUs is vast. AMD's new products ensure that they will have a growing market share. That was their plan, and OMG, they are actually successful at it.
 
You're incredibly lucky, then.

Again, look at my history -- I've been running multiple cards every generation since the 8800 GTX. Went from that to 4890s, 6950s, now 980 Tis. There are /regularly/ titles where I have to disable multi-GPU or, at bare minimum, spend an hour or more messing with profile settings and editing INIs. This is more and more common, not less. Modern multi-GPU support is worse than it once was, and getting worse still by the day.
I'm a gamer from way back in the 1980s too. My first card was the S3, then a 3dfx Voodoo 1, and on up the chain -- though I've not had much experience with dual cards.

I just can't agree that the current Crossfire experience is poor, because I haven't had a single issue with it since mid-December 2016, when I got my second card.

I've not run Crossfire or SLI for any length of time before, with the exception of an Nvidia 560 Ti pair. The 560 Tis in SLI seemed less plug and play than the Fury Xs (but that was a few years ago, so I won't hold it against them now).

I tend NOT to buy brand-new games at full price, but rather wait till they are $20-$30 -- so maybe that's why I'm not having any bad experiences, as by that time they've had some patching. But I've literally done NO tinkering to make anything work with Crossfire. It just works. I put the cards in Crossfire in December, and I don't think I've even taken them out until last week, when my friend wanted to borrow a card to test FreeSync.

Games I've played with the Fury X Crossfire setup since December follow. I've verified each of these against my game clients' last-played dates.

As to whether or not each of these games actually benefits from Crossfire, I wouldn't know or care -- because if one doesn't, it hasn't caused me grief, a bad play experience, or any reason to disable Crossfire -- and frankly, a single Fury X is sufficient in older/indie titles to hit my monitor's 75 Hz FreeSync max at 1440p. So if it's only using one card -- who knows or cares?


Steam, Origin, and Uplay games played with the Fury Xs in Crossfire since mid-December 2016, when I got the second card (WITH NO ISSUES -- not one):

  1. Assassin's Creed IV Black Flag
  2. Battlefield 1
  3. Depth
  4. Dirt 2
  5. Dirt Rally
  6. Dragon Age: Inquisition
  7. Elder Scrolls V: Skyrim
  8. Evolve Stage 2
  9. Need for Speed
  10. Need for Speed Most Wanted
  11. Need for Speed Rivals
  12. Mad Max
  13. Middle Earth: Shadow of Mordor
  14. Path of Exile
  15. Plants vs. Zombies - Garden Warfare
  16. Rise of the Tomb Raider
  17. Ryse Son of Rome
  18. Shark Attack Deathmatch 2
  19. Star Wars BattleFront
  20. Titanfall
  21. Trine 2
  22. Windward
  23. Witcher 3 Wild Hunt
  24. Wolfenstein Old Blood

Benchmarks:
Futuremark
Unigine Superposition



So my question stands... I realize I haven't played anything BLEEDING edge or just-released (with the exception of Battlefield 1). But of all those games I've played with Crossfire since December 2016, all worked just fine (or at least didn't glitch up, require me to disable Crossfire, or require any tinkering at all to make them work).

So what is so awful about Crossfire (or AMD drivers, for that matter) that everybody is dissing it all the time? I mean, I'm sitting at a 100% success rate right now (out of 24 games in my game inventory), with no nonsense or frustrations. I don't get it. I've had a Fury X card for over a year now (Feb. 2016 purchase date) and the Crossfire setup for five months (Dec. 2016 purchase date). Somebody give me a game to try that DOESN'T work, that I have to tinker with to play, if we are going to keep bad-mouthing Crossfire setups on the forum.
 
silent-circuit

You wanna take a look down memory lane? Here you go -- post 40:
https://hardforum.com/threads/your-greatest-overclocks-of-all-time.1927339/#post-1042952388

My favorite CPU overclocks across the various systems I've owned.

As far as graphics cards, I'm afraid I might have had too many to recall them all. I bought and sold quite a few graphics cards over the years -- I was never buying top-end cards until recently, so I'd be trading various mid-tier cards all the time, basically buying and selling on eBay. I never got into overclocking graphics cards like the CPUs; the gains always seemed rather minimal for the extra heat, fan noise, and instability. I was Nvidia ONLY, IIRC, from about 2002 to 2015, mostly because at the LAN parties I'd host regularly, the AMD guys would always have driver problems with the older titles.

In 2015, I bought an AMD 285 SPECIFICALLY BECAUSE THEY STARTED SUPPORTING PLP (portrait/landscape/portrait) MONITOR CONFIGURATIONS (I had a 20"/30"/20" monitor setup). NVIDIA NEVER DID, AND STILL DOESN'T, SUPPORT PLP -- LAME. Then I bought the 380 pair to try Crossfire for better performance driving those three screens, but got a bad 380 card, so I sent them both back and went with a Fury X instead, then a second Fury X after I swapped out my PLP monitor setup for three 32" Omens for FreeSync (to better drive 7680x1440 resolution). But I've not missed my Nvidia experience. The drivers are pretty comparable between the two companies IMO -- I've been very satisfied with AMD on these last three cards. AMD drivers have been solid for me. Triple-screen Eyefinity is great when field of view is properly accounted for by the game developers. Sadly, most of the time it's not. Pretty much it's good for racing games and not much else (Shadow of Mordor was an exception, and it looked amazing at 7680x1440). If there is stretching on the outer monitors, I just play on the center monitor alone at 1440p. The Flawless Widescreen application helps when the game is supported.

 
Thanks for the posts everyone. I enjoyed the banter back and forth.

My gaming habits are changing the more I play PC games. I am much less about new releases; I'd rather wait till the first price drop or the GOTY version comes out. I play a lot of single-player shooters and RPGs. I am currently on my second playthrough of Fallout 4.

I got a great deal on a FreeSync monitor during Black Friday last year. I really have no motivation to upgrade it, as I would be spending FAR more on a GSync monitor than I would swapping cards. My 1070 seems fairly easy to sell; a 34" ultrawide is a bit more of a niche market. Plus, I really like this monitor, and I hate playing Russian roulette with backlight bleed, dead pixels, and the number of other things that could go wrong with a monitor upgrade.

While I do agree a single card does seem simpler in every fashion, AMD just doesn't offer one yet. I may stick it out till Vega, but two 580s seem to get me into the performance range I was looking for, with FreeSync added in.

Another option I have been considering is finding a used Fury card, but 4 GB of VRAM is already a settings limiter in Doom. That makes me worried that I may get blocked out of other features in the future.

If GSync was cheaper or if AMD had a true 1070 competitor this would be a much easier debate.
 
So, I've had this rig for about a month, and it's my first experience with Crossfire. Of the games I've tried, it's been hit or miss (mostly miss).

GTA V - Works great, getting 65-70 fps with max settings at 4K resolution. The one solid win, aside from an odd performance glitch that went away.
DOOM - Unplayable at 4K; doesn't seem to support Crossfire. 1080p was fine.
RE7 - Playable at 4K only on the lowest settings. Seems there is a way to get CF working, but I need to tweak things.
RoTR - Around 45 fps at 4K. CF is working, but the game is demanding. 1080p works well.
Saints Row IV - Looks great and smooth at 1080p but crashes frequently. Got one BSOD with an ATI driver error.
Assassin's Creed Brotherhood - Horrible choppiness/stutter throughout and some crashing. Eventually got it to work after heavy tweaking, but it took days.
Far Cry Primal - Does work and seems to support CF, but fps is still around 45-50 at 4K, even on medium settings. Granted, I can barely hit 60 fps on my other Titan X machine, so this is mostly the game.
Batman: Arkham City - Does work at 4K high settings in most areas, but certain parts will drop to 25 fps for no visible reason. With a single Nvidia GPU there were no problems, even on the same machine.
Borderlands - Bad performance at 4K. Even on almost the lowest settings, it barely reaches 60 fps. This is a game from 8 years ago; it should be possible to play it at 4K with two of AMD's top-of-the-line cards.

So overall I am not impressed. All the games I've tried so far had some issue, and GTA V was the only one that eventually ran at the theoretical performance of the machine. That said, I will still explore more and may get Vega Crossfire, but I can't say I'm not a tad disappointed.
 
For 4K FreeSync, I would wait for Vega and then make an informed decision. In my experience, when CFX worked it was a great experience; when it didn't, it was a rather big letdown. I am finding SLI to be similar, so whether it's good or bad may depend more upon you. Now, if one is talking 1440p resolutions, there's nothing wrong with an RX 580 by itself for most games; CFX would be overkill for some games and a disappointment in others. If we are talking about 3440x1440, that is a resolution where even a single 1070 struggles and you need to reduce settings at times, which means a single RX 580 would be much worse, while 2x RX 580 would be great whenever CFX works well enough and lousy when not. There, too, I would recommend Vega. FreeSync at 3440x1440 may help an RX 580 deliver smoother gameplay.

I keep my high-priced monitors around four years or more; my Dell 3440x1440, unless it blows up, has at least another two years, and the same goes for the 4K FreeSync monitor. I am not about to go out and buy a GSync monitor at an added cost and then be locked in, when Nvidia should be supporting the Adaptive-Sync standard to begin with. So the 6700K machine will most likely get a Vega card in the future. FreeSync is just too good not to use and makes 4K pretty much playable with a Nano; without FreeSync I would be stuck at 1440p more often than not. With FreeSync and 4K plus the Nano, I play Doom, BF1, Mirror's Edge, and Battlefront at 4K with maxed or near-maxed settings because of FreeSync. Add 40% to 60% improvement with a Vega card, and I will be in 4K heaven, so to speak.
 
You already have a GTX 1070. You're going to sell it at a loss in exchange for a pair of 580s. I understand wanting to be able to take advantage of Freesync, though. Just wait for Vega. It should only be a couple more months before it releases.
 
I'm taking a $100 loss on the 1070. Yes, that does kinda suck, but it's not like I bought it as an investment. This is a hobby and a form of entertainment. Also, I am not trying to push 4K with these two cards. However, I am tempted to wait it out for Vega. I am guessing we will get the official word on Vega during Computex.
 
I'm taking a $100 loss on the 1070. Yes, that does kinda suck, but it's not like I bought it as an investment. This is a hobby and a form of entertainment. Also, I am not trying to push 4K with these two cards. However, I am tempted to wait it out for Vega. I am guessing we will get the official word on Vega during Computex.

If you're going to go through with this foolishness, you should at least do yourself the favor of not selling the 1070 until you've tried out the CF mess for a week.

Freesync = the same thing as vsync-off, with no tearing. I'm not sure that's worth the trouble, since the best performance you will get is the same as the 1070.
 
Freesync = the same thing as vsync-off, with no tearing. I'm not sure that's worth the trouble, since the best performance you will get is the same as the 1070.

Nah, it's smoother too. 48 fps on my monitor feels smooth; 47 does not, because that's below the FreeSync range, and you can easily tell the moment the FPS dips out of the FreeSync Hz range. (This was very perceptible when I was playing through Shadow of Mordor at 7680x1440 with a single Fury X.)

I've not tried GSync, but I'd assume it's the same.

Not only that, but turn FreeSync on and vsync off and use Frame Rate Target Control, and it feels slightly smoother still.
 
I have limited experience with FreeSync and next to no experience with Crossfire. What I have used and seen of FreeSync looks really nice; I really like how smooth it makes games. But I am worried Crossfire could become a bit of a mess if games don't fully support it.

Looking at 480 Crossfire benchmarks, they don't seem too bad at all. They seem to fall between the 1070 and 1080 in most of what I am seeing.
 
What's everyone's thoughts on the R9 Fury series compared to the 580? Does a single Fury make more sense?
 
Imagine if Nvidia did the same shady business Intel has been doing for years: a 1-5% increase in performance every year. I'd be out of the gaming business forever.
Sure, Nvidia is ripping us off... but at least we're getting alright improvements! 980 Ti > 1080 Ti is a 70% or so jump; I'll gladly pay a premium price for that!
 
400-500 W and $500 to get $500 GTX 1080 performance in the limited cases where mGPU actually works? Sounds rather silly.

If you are die-hard set on an AMD card, wait for Vega. If you just want more performance, buy a 1080 or 1080 Ti.
 
What's everyone's thoughts on the R9 Fury series compared to the 580? Does a single Fury make more sense?
It's a decent performer that is bottlenecked by its 4GB of VRAM. In newer games you'll be brushing up against that limit even at 1920x1080. It's about on par with a GTX 1070 otherwise, so I wouldn't pay more than $350 USD for one.
 
Fury is a great 1440p card. It does very well, especially in FreeSync setups. It's by no means a 4K GPU unless you plan to run CFX. I have yet to run out of VRAM at 1440p, even in new games. I had a Pro Duo as well for a few weeks, and it rocked. CFX is scaling better than SLI. AMD driver support has improved and continues to do so. Fury is a decent deal at the moment, and it does well for a two-year-old GPU. Vega will be here in a few weeks, and if I were in your position I would wait it out until its release. I'm pretty sure that the low-end Vega will match the Fury X / 1070.
 
I spent the better part of the night looking at and watching every bit of content regarding Vega. It seems worth the wait. Plus, anything that is currently available isn't going anywhere. So I am gonna hop on that hype train and see where it leads.

Thanks everyone.
 
Fuck you, AMD. FUCK. YOU. This (RX580) is a nothing product that should never have seen the light of day. Get your shit together, launch a new architecture that's at least as fast as your current top end competitor, or give up. Sell the assets to someone who's actually going to try, and move on. I'm tired of carrying this heavy-ass torch.

You're not helping the situation by giving Jen-Hsun your money so he can pay developers to optimize games with loads of hidden tessellation and TWIMTBP features.
I'm still running 290 Crossfire because I refuse to give the GameWorks(TM) monopoly my money.
 
You're not helping the situation by giving Jen-Hsun your money so he can pay developers to optimize games with loads of hidden tessellation and TWIMTBP features.
I'm still running 290 Crossfire because I refuse to give the GameWorks(TM) monopoly my money.
The irony is that Quantum Break under DX12 was horrendous for Nvidia GPUs in terms of the game engine and its global illumination/volumetric lighting/compute shaders, so it does swing both ways.
And that is not the only game that is not structured to work great on Nvidia (their internal driver team can work some magic with DX11 to overcome some of the headaches, but even then it's not guaranteed).
Tessellation, at least, is something that can be controlled at a more global level by AMD in their driver settings, or directly by the studio and its developers; case in point is The Witcher 3, where tessellation can be controlled within the ini file or by the AMD driver.
Nvidia does not dictate that they must use 16x.
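
For anyone curious, the ini-level control mentioned above looks something like this. This is a minimal sketch from memory, not verified against the current game version -- HairWorksAALevel is the commonly circulated user.settings tweak for taming the HairWorks hit on AMD cards, but treat the exact key name as an assumption and check your own file before editing:

  ; The Witcher 3 user.settings -- commonly circulated community tweak
  ; (verify the key name against your own game version before editing)
  [Rendering]
  HairWorksAALevel=4  ; drop HairWorks MSAA from the default 8 to cut the cost of the tessellated hair

The driver-side route is similar in spirit: set Tessellation Mode to "Override application settings" in AMD's control panel and cap the maximum tessellation level (8x or 16x is the usual suggestion).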
Cheers
 
If you have to choose, IMO, FPS > adaptive sync. Vega may be a pipe dream, or it may not. Only time will tell.

You're not helping the situation by giving Jen-Hsun your money so he can pay developers to optimize games with loads of hidden tessellation and TWIMTBP features.
I'm still running 290 Crossfire because I refuse to give the GameWorks(TM) monopoly my money.
Low fps crusade, eh?
 