AMD vs Nvidia Multi GPU Scaling

plexx
n00b · Joined Dec 28, 2011 · Messages: 42
chart speaks for itself

[Image: Nvidia-AMD-Multi-GPU-Benchmarks.jpg]


[Image: amd-nvidia-multi-gpu-performance-per-dollar-1.png]
 
With no background, context, settings, test environment, etc., charts like this are completely pointless.
 
MultiGPU sucks from both vendors. Don't let them fool you.
 
Yeah it's that Korean site that tested with 9 out of 12 AMD Gaming Evolved games.
We talked about this before.

I believe my comment at the time was, and still remains, do you think we are stupid?

[Image: hVuGJuW.png]
 
All of those games run extremely well on Nvidia hardware. Remember, AMD doesn't have a GameWorks-style black-box requirement for Gaming Evolved games, so there is nothing holding them back from running well on Nvidia's hardware, and they deliver extremely nice frame rates when played.
 
All of those games run extremely well on Nvidia hardware. Remember, AMD doesn't have a GameWorks-style black-box requirement for Gaming Evolved games, so there is nothing holding them back from running well on Nvidia's hardware, and they deliver extremely nice frame rates when played.
The results are useful for testing CrossFire/SLI scaling in Gaming Evolved games that are 2-3 years old. It's useless data for anyone playing games released in 2015 and/or GameWorks games... which is probably everyone buying multiple Fury X's or 980 Ti's. Witcher 3? GTA V? GTA V isn't even a GameWorks game; why isn't it in their bench?

Throw those results in the garbage, start over.
 
Here we go again.

Did you take the time to look at these reviews before making such ridiculous statements?

The first one, in Korean, is just an aggregate score of accumulated frame rates. There is no detail there, and it does not provide an accurate representation of performance.

It's cool, but he needs to go into more detail to really make his review worthwhile.
 
People are using the data because he's the only person on the planet with 4 Fury X's and 4 980 Ti's as far as I know. He also has an agenda, apparently... Which is great because AMD needs all the good publicity they can get.

AFAIK CF scales better than SLI, which is particularly useful in lower tiers where AMD is more competitive. Based on these results, I have to assume that's not true for the Fury X/980 Ti, since the author makes such a serious effort to paint AMD in a positive light.
 
How are we supposed to take this seriously? I mean, it doesn't even mention what the buckets setting was for these tests.
 
It was 11.

Well, how can you know for sure? Do you speak Jive?

EDIT: until he shows us the actual performance numbers for individual games, this compilation of data is worthless. How can we check his conclusions when there is no data?
 
I don't think it is possible for anyone else to pick a more unscientific graph than the ones the OP picked...

EDIT: The graphs actually tell us more about the OP than whatever they are seemingly trying to depict.
 
AMD multi-GPU scaling has crushed SLI since they started using XDMA.
 
Give me the lowest 1% FPS and I might give a shit. Averages mean nothing. It's the lows that determine your game settings.

And again, multiGPU blows. I've tried it from both vendors.

Ooh look, Tainted shitting on AMD.

/shock

I don't know what you're talking about. Tainted is bipolar. Never know which one you'll get! (Pro-AMD or Pro-nVidia) :). Some call it "objective".
 
Ooh look, Tainted shitting on AMD.

/shock
It looks like both the Korean source and WCCF are burying their test suite as neither article mentions it. You have to go all the way back to the original benchmarks just to find the games they used -- lo and behold -- it's chock full of Gaming Evolved games. /shock

This is just as retarded as an SLi benchmark full of Ubisoft GameWorks games... It's a waste of time, an insult to our intelligence, and horribly disingenuous. Shame on the Korean author, shame on WCCFTech, shame on OP, and shame on everyone else for making me the first person to point it out.

And these benchmarks pop up while everyone in the AMD community whines endlessly about Nvidia bias on the internet (which is probably true, tbh). You can't start adding tech sites to your anti-AMD blacklist while simultaneously flaunting these benchmarks. Hypocrites... If the AMD community actually cared about integrity and "fair" reviews they would be shitting all over this, too.
 
AMD multi-GPU scaling has crushed SLI since they started using XDMA.

objective numbers can be incredibly misleading especially when it comes to multi gpu. plenty of sites for years have been posting stellar cf numbers without any mention of how poor the gaming experience really was. the only real thing xdma does is improve smoothness. without a subjective description the numbers aren't worth anything.
 
XDMA over the PCIe bus is roughly an order of magnitude faster than the narrow SLI bridge, and it's overall a superior implementation. Why are people surprised?

If anything AMD's lead should be larger in multi-GPU scaling than it currently is.
 
These tests are invalid because they do not include Project Cars.

To do a proper comparison you need to add Project Cars.
 
Yeah it's that Korean site that tested with 9 out of 12 AMD Gaming Evolved games.
We talked about this before.

I believe my comment at the time was, and still remains, do you think we are stupid?

[Image: hVuGJuW.png]

And when Brent uses more gamesdontwork titles you don't complain and whine. GTFOH.
 
And when Brent uses more gamesdontwork titles you don't complain and whine. GTFOH.

Personally, who the hell cares what games they use, as long as they don't use heavily biased games, like the one I mentioned a few posts up.

And for the record, Dirt is a heavily biased game as well. Just like Ryse.

Just my $0.02
 
The results are useful for testing CrossFire/SLI scaling in Gaming Evolved games that are 2-3 years old. It's useless data for anyone playing games released in 2015 and/or GameWorks games... which is probably everyone buying multiple Fury X's or 980 Ti's. Witcher 3? GTA V? GTA V isn't even a GameWorks game; why isn't it in their bench?

Throw those results in the garbage, start over.

Guaranteed he tested Witcher 3 and GTA 5, but the results didn't fit his narrative so they were omitted.
 
This is nothing surprising; the 980 Ti still wins in overall performance, but it's nice that you get a bit more performance for your buck when you CrossFire AMD cards.

With that said, the bridge idea has been outdated for a while. It's not seriously behind, so I don't fault Nvidia for not using the PCIe lanes like AMD does, but they should really consider ditching the bridge with Pascal.

The results are useful for testing CrossFire/SLI scaling in Gaming Evolved games that are 2-3 years old. It's useless data for anyone playing games released in 2015 and/or GameWorks games... which is probably everyone buying multiple Fury X's or 980 Ti's. Witcher 3? GTA V? GTA V isn't even a GameWorks game; why isn't it in their bench?

Throw those results in the garbage, start over.

Doesn't matter; as long as the CrossFire profile works with the game, you should see similar scaling with newer games. Remember, these graphs aren't showing overall performance; they show the performance gained when adding the second card (see the sketch below). AMD's solution is more efficient, but not astronomically so, and not enough to take the performance crown.

Personally I'm surprised Nvidia hasn't ditched the bridge yet.
 
Better scaling means F-all if the FPS is still lower than the competitor's.

I can claim a GPU scales 100% across 50 cards and still can't run Tetris without stuttering.
 
Actually, wouldn't it make sense to use whatever games are most popular, regardless of which GPU they're biased towards? Isn't the point of a benchmark to see which GPU is going to give you better performance in the games you want to play? How many people play Dirt 3? Why do I still see Tomb Raider and Bioshock Infinite all the time in new benchmarks? 4-way SLI??? 4-way CrossFire??? Look up 4K gaming on YouTube and look at people running 4 Titans: massive stutter. If you are doing multiple Titans for other resolutions... does it really fricken matter if one card has 276 FPS and the other card has 269 FPS? This is why people have to turn the buckets up to 11.

Let's get more reviews of games running at 640x480, just to be sure consumers are getting the real results they need.
 
This is nothing surprising; the 980 Ti still wins in overall performance, but it's nice that you get a bit more performance for your buck when you CrossFire AMD cards.

With that said, the bridge idea has been outdated for a while. It's not seriously behind, so I don't fault Nvidia for not using the PCIe lanes like AMD does, but they should really consider ditching the bridge with Pascal.



Doesn't matter; as long as the CrossFire profile works with the game, you should see similar scaling with newer games. Remember, these graphs aren't showing overall performance; they show the performance gained when adding the second card. AMD's solution is more efficient, but not astronomically so, and not enough to take the performance crown.

Personally I'm surprised Nvidia hasn't ditched the bridge yet.

I agree with you wholeheartedly. I wonder why Nvidia is still using that silly bridge. Maybe they will upgrade with Pascal or wait for PCIe 4.0. There was a guy on the forums who had 4 Titan X's and got little scaling with 3 and zero scaling with 4. Think his name was Bashaa? Something like that. Been a while.
 
Actually, wouldn't it make sense to use whatever games are most popular, regardless of which GPU they're biased towards? Isn't the point of a benchmark to see which GPU is going to give you better performance in the games you want to play? How many people play Dirt 3? Why do I still see Tomb Raider and Bioshock Infinite all the time in new benchmarks? 4-way SLI??? 4-way CrossFire??? Look up 4K gaming on YouTube and look at people running 4 Titans: massive stutter. If you are doing multiple Titans for other resolutions... does it really fricken matter if one card has 276 FPS and the other card has 269 FPS?

Lots of benchmarks use the same games because they've used them with older cards. It allows for comparison with cards that they may no longer have and are unable to test with newer games.

That said, I agree that it's not really of much relevance; if a game runs at 100+ FPS at high resolutions (1440p and above), then it most likely runs well even on lower-end hardware.

Nvidia does seem to have issues with scaling on the 980 Ti and Titan X. The results don't seem to give as big a boost as you get from two 970s (which are roughly equivalent to a single 980 Ti), for example. Could be anything from CPU bottlenecks to driver issues.

I doubt Nvidia will ditch the SLI bridge, though. Isn't NVLink going to be pretty much the replacement for it?
 
LOL, it has a single Fury X at higher performance per dollar than a single 980 Ti :rolleyes: LOLOLOLOLOLOLOLOLOL :rolleyes:
 
I doubt Nvidia will ditch the SLI bridge, though. Isn't NVLink going to be pretty much the replacement for it?

Why are so many people so hung up on thinking NVLink is going to be some super magic trick for consumer x86-based systems? NVLink is a high-bandwidth interconnect that allows IBM POWER CPUs and large numbers of Nvidia Tesla GPUs to communicate at speeds far beyond the current PCI-E 3.0 spec.

NVLink will never come to consumer mobos unless Nvidia can afford to pay Intel (or AMD, with their new Zen arch) billions to incorporate the necessary logic in their x86 CPUs... Without that, NVLink is just a marketing bullet point...
 
The joke here is CrossFire, where support is nearly nonexistent outside triple-A games, and even then expect the CrossFire profile a month after a game's release.
 
Unable to find a "fair" site in the United States or Europe, AMD has gone to North Korea to find a benchmark that favors them.

[Image: 2EOuaLF.gif]
 
The problem with CrossFire scaling is that, yes, it's great in average FPS, and it will be great in DX12 as it is in Mantle. But in DX11 the minimum FPS is currently hurt by the high-overhead driver, so those graphs are deceptive and don't represent true performance. This is also why FreeSync isn't really relevant yet: CF performance drops below the FreeSync threshold quite often.
 
The problem with CrossFire scaling is that, yes, it's great in average FPS, and it will be great in DX12 as it is in Mantle. But in DX11 the minimum FPS is currently hurt by the high-overhead driver, so those graphs are deceptive and don't represent true performance. This is also why FreeSync isn't really relevant yet: CF performance drops below the FreeSync threshold quite often.

Very true, and the reality is FreeSync will never gain traction because AMD will likely be out of business in the near future. As a consumer, I wouldn't even buy an AMD GPU right now, because driver support might cease in the next few years if the company goes under, much the same way 3dfx's did. The only way AMD will survive as a company (before their debt repayment in 2019) is to sell off Radeon Technologies Group to someone like Microsoft or even NVIDIA and go back to spending what little resources remain on CPUs. My prediction is that NVIDIA will be one of the companies bidding on RTG once the time comes, much like it did with 3dfx.
 
what drugs are you on?

There are a ton of FreeSync monitors on the market, and Intel is now going to support adaptive sync.
 
Joker seems to think the market will be better without AMD... because like his namesake, some people just want to watch the world burn. And talk about it endlessly.
 
I enjoy a good train wreck as much as the next person, but not so much if it affects my wallet.
 