AMD vs Nvidia Multi-GPU Scaling

Joker seems to think the market will be better without AMD... because like his namesake, some people just want to watch the world burn. And talk about it endlessly.
The market would be better off without Nvidia...
AMD is already in the consoles (good for ports), and look at all the stuff they've accomplished even with their pitiful budget. Imagine what AMD could do with Nvidia's cash. :D

Their biggest priority is hiring some new software engineers, and the only thing holding them back is cash flow.

Intel/AMD is a different story.
 
And when Brent uses more GameWorks ("gamesdontwork") titles, you don't complain and whine. GTFOH.
Probably because those games are new and push the limits of what GPUs can do, not games that already run well enough without multi-GPU.
 
what drugs are you on?

There are a ton of FreeSync monitors on the market, and Intel is now going to support Adaptive-Sync.

So are you going to be doing a lot of gaming with your Intel IGP on a FreeSync monitor? :rolleyes:


Joker seems to think the market will be better without AMD... because like his namesake, some people just want to watch the world burn. And talk about it endlessly.

Doesn't have anything to do with me thinking the market will be better off without the current AMD (which, you're right, I do, but not for the reasons you probably think), but rather the reality of the situation. They are holding massive debt that's due soon, with sharply declining revenue, no profits to speak of, and a lowered credit rating. It should be obvious that's why they created the Radeon Technology Group; that's next on the chopping block if things get really bleak in the next year. If AMD had to slash just 5% of its workforce, what do you think the odds are of them providing adequate GPU driver support in the next 1-2 years? They'll keep slashing until they have the bare minimum number of people working on this stuff.


The market would be better off without Nvidia...
AMD is already in the consoles (good for ports), and look at all the stuff they've accomplished even with their pitiful budget. Imagine what AMD could do with Nvidia's cash. :D

Their biggest priority is hiring some new software engineers, and the only thing holding them back is cash flow.

Intel/AMD is a different story.

There's been nothing to suggest AMD's console chips have resulted in any advantage over NVIDIA when those games are ported over; in fact, it's usually the other way around, with NVIDIA winning at the top end more often than not. And I can imagine what AMD would do with NVIDIA's cash; it would look something like this:

(image: burning-dollars wallpaper)
 
So are you going to be doing a lot of gaming with your Intel IGP on a FreeSync monitor? :rolleyes:




Doesn't have anything to do with me thinking the market will be better off without the current AMD (which, you're right, I do, but not for the reasons you probably think), but rather the reality of the situation. They are holding massive debt that's due soon, with sharply declining revenue, no profits to speak of, and a lowered credit rating. It should be obvious that's why they created the Radeon Technology Group; that's next on the chopping block if things get really bleak in the next year. If AMD had to slash just 5% of its workforce, what do you think the odds are of them providing adequate GPU driver support in the next 1-2 years? They'll keep slashing until they have the bare minimum number of people working on this stuff.
Intel's integrated graphics have gotten a lot better in Broadwell and Skylake, and low/mid-range parts that can't run a consistent 60+ fps benefit a lot from FreeSync/G-Sync. There's certainly nothing wrong with them adding that feature, and it means a LOT more people will have access to it.
 
The market would be better off without Nvidia... AMD is already in the consoles (good for ports)

I haven't seen any correlation, TBH. I've seen no evidence that AMD chips in consoles have made any difference in ports, for any IHV. I know that's been the oft-sung tale among bros since AMD chips were announced in them, but both consoles use extremely proprietary APIs. Can you think of a single game where AMD in the consoles has translated into a demonstrable advantage for AMD GPUs on the PC?

look at all the stuff they've accomplished even with their pitiful budget.

Mantle's a pretty big deal as it turns out, since they allowed Microsoft to co-opt it for DX12 (bad) and gave Khronos Vulkan (good). It's a shame they aren't able to capitalize on it more commercially; they aren't getting sufficient credit for it, IMHO. I reckon their long-term play was "giving it away will help our APU division eventually," but short term it feels like they're owed more.
 
Doesn't have anything to do with me thinking the market will be better off without the current AMD (which, you're right, I do, but not for the reasons you probably think), but rather the reality of the situation. They are holding massive debt that's due soon, with sharply declining revenue, no profits to speak of, and a lowered credit rating. It should be obvious that's why they created the Radeon Technology Group; that's next on the chopping block if things get really bleak in the next year. If AMD had to slash just 5% of its workforce, what do you think the odds are of them providing adequate GPU driver support in the next 1-2 years? They'll keep slashing until they have the bare minimum number of people working on this stuff.

Feel free to explain your reasoning for why AMD leaving the market would be a positive for consumers.
 
The market would be better off without Nvidia...
AMD is already in the consoles (good for ports), and look at all the stuff they've accomplished even with their pitiful budget. Imagine what AMD could do with Nvidia's cash. :D

AMD would skin you alive if they had no competitor.

The $550 7970 should be a good reminder of what they can do when they are ahead.
 
What's the point of better scaling when half the games don't work and/or drivers take a year to materialize (Far Cry 4)? Unless you are still playing Tomb Raider, Thief, and Hitman: Absolution or some other game from 2013/early 2014, nvidia is the only choice for today's games.

Their Win10 beta software is hit or miss, but on a mature OS like Win 7 or 8 you can get a better experience with SLI than CFX, granted you are doing single-screen gaming.

If you are on mGPU and multi-screen, then you are fucked sideways, lol, be it NV or AMD.
 
Feel free to explain your reasoning for why AMD leaving the market would be a positive for consumers.

AMD management has been incompetent for the better part of a decade, and things are not getting better, just worse. With them constantly cutting jobs just so they can pad the books and make sure their executives get paid (and paid BIG), they've let their product R&D drop to the lowest levels it's ever been, and this trend shows no signs of slowing down. At this point they are not remotely competitive on the CPU side, which gives Intel a clear monopoly in practice while preventing any talk of an x86 monopoly because AMD is just hanging on. On the GPU side it's really the same situation: with probably <20% market share right now and very little money to spend on R&D, marketing, dev relations, and promotion, they will continue to get pummeled by NVIDIA and not pose much real competition. So the people who always talk about doomsday scenarios without AMD don't realize that AMD is already absent and has been for a while.

If AMD's demise were sped up instead of being prolonged for the sake of cushy executive golden parachutes, there's hope that someone more competent could come in, get rid of the executives, maybe spin off RTG to NVIDIA or another company with a generous licensing agreement for the GPU IP, and start focusing on CPUs again. I know the question of the x86 cross-license always comes into play, but I'm of the belief something would be worked out.

An NVIDIA-dominated GPU market is already here; AMD no longer making GPUs would hardly make a dent, because NVIDIA has to keep innovating to drive purchases. Otherwise everyone would sit on their GPUs for 4-5 years and NVIDIA would go bankrupt. They can't afford 5-10% incremental gains like the x86 market, because GPU customers wouldn't go for it, at least not enough to sustain the type of growth and margins NVIDIA likes. So while things would continue as they have on the GPU side, AMD would no longer be fighting a war on two fronts and would be more cash-flush, which could give consumers more choice in the stagnant x86 market, especially if the new owners of AMD take over its debts and inject more into R&D. AMD's problem is not the engineering; they've got a lot of talented people. It's always been the trash at the top. One of the biggest mistakes in GPU history was the acquisition of ATi by AMD (they overpaid) and the subsequent sale of ATi's handheld tech to Qualcomm, but unfortunately that's how the leeches that run AMD do things: they keep slicing and dicing until there's nothing left, and then they leave.

Edit: Just want to add that I think in the long term even NVIDIA is screwed unless it comes up with a better strategy to diversify. The future is integrated mobile devices; AMD was on the right track but just couldn't execute. NVIDIA tried with Tegra and failed, and now depends on GeForce more than ever, which isn't a good thing.
 
I haven't seen any correlation, TBH. I've seen no evidence that AMD chips in consoles have made any difference in ports, for any IHV. I know that's been the oft-sung tale among bros since AMD chips were announced in them, but both consoles use extremely proprietary APIs. Can you think of a single game where AMD in the consoles has translated into a demonstrable advantage for AMD GPUs on the PC?

You can thank DX11 for that. With DX12, the closer-to-the-metal optimizations should allow games to exploit the similar feature set of the hardware.
 
AMD would skin you alive if they had no competitor.

The $550 7970 should be a good reminder of what they can do when they are ahead.

Let's not pretend nVidia wouldn't/doesn't do the exact same thing. How much did a full GF114/GF110 cost? And how much did a full GK104/GK110 cost?
 
What's the point of better scaling when half the games don't work and/or drivers take a year to materialize (Far Cry 4)? Unless you are still playing Tomb Raider, Thief, and Hitman: Absolution or some other game from 2013/early 2014, nvidia is the only choice for today's games.

Their Win10 beta software is hit or miss, but on a mature OS like Win 7 or 8 you can get a better experience with SLI than CFX, granted you are doing single-screen gaming.

If you are on mGPU and multi-screen, then you are fucked sideways, lol, be it NV or AMD.

I've seen many, many more people bitching about Win10 drivers on Nvidia than on AMD... single card or more. People with experience with both typically say AMD is better on W10. Why is that?

But oh, of course, in the green world there are more of them out there, and AMD drivers suck regardless of what people say.
 
I've seen many, many more people bitching about Win10 drivers on Nvidia than on AMD... single card or more. People with experience with both typically say AMD is better on W10. Why is that?

But oh, of course, in the green world there are more of them out there, and AMD drivers suck regardless of what people say.

I've got a VAIO with a 640M LE that runs great under Windows 10. My desktop has Titan X SLI and I've had zero problems with it in games and Windows 10.
 
I've seen many, many more people bitching about Win10 drivers on Nvidia than on AMD... single card or more. People with experience with both typically say AMD is better on W10. Why is that?

But oh, of course, in the green world there are more of them out there, and AMD drivers suck regardless of what people say.
I have already stated that Win10 beta software is hit or miss for nvidia. However, if you read the latest [H] review, Fury was a miss in 3 of the 6 games in their suite, whereas nvidia ran all 6 games fine. FWIW.
 
I've got a VAIO with a 640M LE that runs great under Windows 10. My desktop has Titan X SLI and I've had zero problems with it in games and Windows 10.

GTX 970 SLI here, zero problems with Windows 10. I was a huge ATI fan, but I got sick of their horrible drivers and of waiting forever for a CrossFire profile for even AAA titles. Nvidia usually releases an SLI profile for AAA titles either the day of release or within a week. I also just jumped ship from an FX-8350 to a 4790K and can't believe the difference. I was a HUGE AMD/ATI fan, but I'm not going to put up with subpar crap just because I "like" a company.
 
Oh god, this is BEYOND fanboy. The first chart ever showing a single Fury X > 980 Ti. Thanks for the laugh, OP.
 
Very true, and the reality is FreeSync will never gain traction because AMD will likely be out of business in the near future. As a consumer, I wouldn't even buy an AMD GPU right now, because driver support might cease in the next few years if the company goes under, much the same way 3dfx's did. The only way AMD will survive as a company (before their debt repayment in 2019) is to sell off the Radeon Technology Group to someone like Microsoft or even NVIDIA and go back to spending what little resources remain on CPUs. My prediction is that NVIDIA will be one of the companies bidding on RTG once the time comes, much as it did with 3dfx.

People need to pay attention to the above post, especially those of us vets who were around at the beginning of 3D graphics in gaming.

The loss of 3dfx changed everything: Glide API games were forced onto OpenGL (assuming they had that support), leaving a lot of people in the lurch, even though the drivers remained downloadable for a while.

Beware of this with AMD. The graphics division would likely be gobbled up by Intel or, worse, Apple, who would not support PCs going forward.

The processor business would be sold off as bits of IP, leaving nothing cohesive for continued development.

This is a real danger- and it's not good for the industry.
 
All businesses fail. One day Nvidia, AMD, IBM, etc will be gone. All civilizations fail also. Who thought Rome would die off? Doesn't mean that you spend all day waiting for the sky to fall. :)
 
All businesses fail. One day Nvidia, AMD, IBM, etc will be gone. All civilizations fail also. Who thought Rome would die off? Doesn't mean that you spend all day waiting for the sky to fall. :)
There are people buying graphics cards right now who will still be using those exact cards when AMD goes under. Something to think about.
 
So what would you do if nvidia were the only GPU maker and released two cards, the 970 and the Titan X, and sold the 970 claiming 4GB when half a gig of the memory was lower-performance RAM that caused stuttering at 1080p? Either spend $1K, accept the lie, or don't PC game.
 
So what would you do if nvidia were the only GPU maker and released two cards, the 970 and the Titan X, and sold the 970 claiming 4GB when half a gig of the memory was lower-performance RAM that caused stuttering at 1080p? Either spend $1K, accept the lie, or don't PC game.

What would YOU do?
 
What's the point of better scaling when half the games don't work and/or drivers take a year to materialize (Far Cry 4)? Unless you are still playing Tomb Raider, Thief, and Hitman: Absolution or some other game from 2013/early 2014, nvidia is the only choice for today's games.

Their Win10 beta software is hit or miss, but on a mature OS like Win 7 or 8 you can get a better experience with SLI than CFX, granted you are doing single-screen gaming.

If you are on mGPU and multi-screen, then you are fucked sideways, lol, be it NV or AMD.

Nvidia is the only choice? Really, dude? When the 290X is still beating nvidia for less? LOL

BTW SLI sucks at the moment, you should know that... so please....
 
Nvidia is the only choice? Really, dude? When the 290X is still beating nvidia for less? LOL

BTW SLI sucks at the moment, you should know that... so please....


I don't understand the blanket hate of SLI. I've been running SLI for a couple of years now without any major issues: first GTX 760s and now GTX 970s, first on an AMD setup (FX-8350 on an Asus Crosshair V), now on an Intel setup (see sig). I ran CrossFire before this (6950s, shader-unlocked) and had a ton more problems than I have had with SLI. It scales great in the games I play (BF and Batman, along with many others), and I don't have any stability issues.

I guess maybe I'm just lucky or something, but I don't understand the blanket "SLI/multi-GPU sucks" statements that are always made around here.
 
SLI is certainly not without its problems (e.g. MFAA not working with SLI, and DSR not working with SLI if you have a G-Sync monitor hooked up; that problem has been around since at least December last year).

From the reviews I have seen, when both CrossFire and SLI are supported, CrossFire works better in general. But the other two parts of that comparison (actual CrossFire/SLI support and individual GPU horsepower) can often skew the end result.

Fury X CrossFire is a perfect example of the above scenario: the 980 Ti is better than the Fury X in most regards, but pair them up and Fury X CrossFire trades blows with 980 Ti SLI in games where both CrossFire and SLI are supported.
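For what it's worth, "scaling" in these comparisons usually means how much extra performance the second card adds over a single card. A minimal sketch of the arithmetic (the fps figures below are made up for illustration, not taken from any review):

```python
# "Scaling" = how much of a second GPU's potential a CrossFire/SLI
# setup actually delivers, as a fraction of single-card performance.

def scaling_efficiency(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Extra performance the second card adds, relative to one card.
    1.0 means perfect 2x scaling; 0.0 means the second card did nothing."""
    extra_fps = dual_gpu_fps - single_gpu_fps
    return extra_fps / single_gpu_fps

# Hypothetical example: 40 fps on one card, 72 fps on two.
eff = scaling_efficiency(40, 72)
print(f"{eff:.0%} scaling")  # 80% scaling
```

By this measure a weaker card with great CrossFire scaling can close the gap on, or beat, a stronger card with mediocre SLI scaling, which is exactly the Fury X vs 980 Ti situation above.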
 