AMD vs Nvidia; GTX 1060 vs. RX 480 - An Updated Review.

cageymaru

GTX 1060 vs. RX 480 - An Updated Review.
http://www.hardwarecanucks.com/foru.../73945-gtx-1060-vs-rx-480-updated-review.html

Hardware Canucks tested reference and OC'd variants of AMD RX 480 and GTX 1060 cards in a head-to-head showdown at 1080p and 1440p. They tested a TON of games to see how the drivers for these cards have improved since launch. If anything, you should be able to look at your favorite games and see which of these cards is faster for the games that YOU play.


The GTX 1060 6GB versus RX 480 8GB saga obviously doesn’t end here and if the last few months are anything to go by, these cards will be fighting tooth and nail until the day they’re replaced. What AMD has accomplished between Polaris’ initial rollout and now is impressive to say the least, but their board partners have given the RX 480 a slight premium above its $240 launch price. NVIDIA, on the other hand, is hanging on doggedly and their board partners have responded by lowering the GTX 1060’s entry price. This has caused what should have been a runaway AMD win to degenerate into a tit-for-tat situation.

So which one of these would I buy? That will likely boil down to whatever is on sale at a given time, but I’ll step right in and say the RX 480 8GB. Not only has AMD proven they can match NVIDIA’s much-vaunted driver rollouts, but through a successive pattern of key updates they have made their card a parallel contender in DX11 and a runaway hit in DX12. That’s hard to argue against.

(Benchmark summary charts from the review: GTX-1060-UPDATE-100.jpg, GTX-1060-UPDATE-101.jpg)


 
Wow, a real unbiased website updated a review with the newest drivers and updates to the games? This usually does not happen since it is so time-consuming... Tbh I did not expect these results, especially in DX11
 
Wow, a real unbiased website updated a review with the newest drivers and updates to the games? This usually does not happen since it is so time-consuming... Tbh I did not expect these results, especially in DX11

Yes, drivers DO make a difference over time. Pretty startling how well AMD did in some GameWorks titles also. Lots of good reading in there.
 
I saw this article this morning and thought about starting a thread on it. If you read this forum regularly you'd probably have the impression that the RX 480 is a POS, but this shows that it is pretty equal to a 1060 in most games and comes out on top in quite a few. VR is, of course, a different story as the [H] reviews show, but for regular gaming the RX 480 is a great card, and that has been my experience with the RX 480.

As an aside, the VR deficiencies of the RX 480 must be driver related. AMD really needs to focus on improving the drivers for a lot of the VR titles out there.
 
I saw this article this morning and thought about starting a thread on it. If you read this forum regularly you'd probably have the impression that the RX 480 is a POS, but this shows that it is pretty equal to a 1060 in most games and comes out on top in quite a few. VR is, of course, a different story as the [H] reviews show, but for regular gaming the RX 480 is a great card, and that has been my experience with the RX 480.

As an aside, the VR deficiencies of the RX 480 must be driver related. AMD really needs to focus on improving the drivers for a lot of the VR titles out there.

AMD's upcoming ReLive driver contains some VR improvements. Whether that's just PR BS or actually delivers, we'll have to see.
 
Misleading: they swapped out games in their test suite. I randomly compared a few of the games and the margins are basically the same.

Will need to compare lists, I bet they swapped out 1 or 2 Nvidia-favorables for AMD-favorables. When talking about a few % difference, that comes down to literally 1 game swinging the results. They could have used one game like Watch Dogs 2 (Nvidia) or Civ6 (AMD) and it would have flipped back in the 1060's favor. If you're going to compare single-digit differences over time, you really need to use the same games.
 
Misleading: they swapped out a bunch of games in their test suite. I randomly compared a few of the games and the margins are basically the same.

Will need to compare lists, I bet they swapped out 1 or 2 Nvidia-favorables for AMD-favorables. When talking about a few % difference, that comes down to literally 1 game swinging the results. If you're going to compare single-digit differences over time, you really need to use the same games.

Pointless to do a driver comparison if they are not gonna bother using the same games. If they are gonna add new games, they might as well call it a new review instead of an update.
 
Pointless to do a driver comparison if they are not gonna bother using the same games. If they are gonna add new games, they might as well call it a new review instead of an update.
Looking at the popularity of this article I might be able to guess why they did what they did.
AMD FineWine™ technology is REALLY GOOD at driving traffic to a website...
 
I think they used enough games that swapping one or two shouldn't matter.
 
Looking at the 10-point swing in DX11...

Games in the original review (July) which are gone: Far Cry 4, RottR, Battlefront.
Two games that favor Nvidia, one Gaming Evolved.

Which were replaced by: Infinite Warfare, Mankind Divided, BF1.
All three favor AMD, two Gaming Evolved.

If you want to see how much those games impacted the final result I guess you could average the difference of all 3 from both reviews and compare them. Then remove them from the final calculation and you will get a true 'driver improvement' comparison (those 6 games induce error).
 
Misleading: they swapped out games in their test suite. I randomly compared a few of the games and the margins are basically the same.

Will need to compare lists, I bet they swapped out 1 or 2 Nvidia-favorables for AMD-favorables. When talking about a few % difference, that comes down to literally 1 game swinging the results. They could have used one game like Watch Dogs 2 (Nvidia) or Civ6 (AMD) and it would have flipped back in the 1060's favor. If you're going to compare single-digit differences over time, you really need to use the same games.

Uff :(

Pointless to do a driver comparison if they are not gonna bother using the same games. If they are gonna add new games, they might as well call it a new review instead of an update.

Yep.
 
Looking at the 10-point swing in DX11...

Games in the original review (July) which are gone: Far Cry 4, RottR, Battlefront.
Two games that favor Nvidia, one Gaming Evolved.

Which were replaced by: Infinite Warfare, Mankind Divided, BF1.
All three favor AMD, two Gaming Evolved.

If you want to see how much those games impacted the final result I guess you could average the difference of all 3 from both reviews and compare them. Then remove them from the final calculation and you will get a true 'driver improvement' comparison (those 6 games induce error).

Well, ROTTR has improved tremendously on AMD cards, Battlefront runs great on AMD cards, and Far Cry 4 has always been a poster child for AMD performance. So if you added those back in, the lead for AMD would increase significantly, in my opinion.

FC4 performance from a million years ago.
http://www.techspot.com/review/917-far-cry-4-benchmarks/page3.html
 
Hardware Canucks is using their own results, which have the 1060 up in all 3.
Here are the results of the 6 games. % is 1060 over/under 480.

Old
Far Cry 4: 11.85%
ROTTR: 22.8%
SW: 10.6%

New
DEMD: -7.44%
IW: -16.27%
BF1: 4.17%
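Just to put those six numbers side by side, here's a quick back-of-the-envelope sketch using the figures above. It assumes equal weighting per game, which the review's actual suite averages may not use, so treat it as a rough estimate of how much the swap alone moves the suite average:

```python
# 1060 margin over the 480 (%) in the three dropped and three added games,
# taken from the figures quoted above. Positive favors the 1060.
old_games = {"Far Cry 4": 11.85, "ROTTR": 22.8, "SW Battlefront": 10.6}
new_games = {"DEMD": -7.44, "IW": -16.27, "BF1": 4.17}

def avg(margins):
    # Simple unweighted mean; the review may weight its averages differently.
    return sum(margins.values()) / len(margins)

old_avg = avg(old_games)   # roughly +15.1, favoring the 1060
new_avg = avg(new_games)   # roughly -6.5, favoring the 480
swing = old_avg - new_avg  # roughly 21.6 points from the swap alone

print(f"dropped: {old_avg:+.1f}%  added: {new_avg:+.1f}%  swing: {swing:.1f} pts")
```

So just replacing those three titles shifts the average by about 21.6 points before any driver improvement is counted.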
 
You're missing the point. HardwareCanucks removed 3 games with a net gain for the 1060 and replaced them with 3 games with a net loss for the 1060. Then, in their conclusion, they compared the average result between the 2 benchmark suites as a difference over time.

Even in the bench you linked, the 1060 is still ahead or at least tied.
 
You're missing the point. HardwareCanucks removed 3 games with a net gain for the 1060 and replaced them with 3 games with a net loss for the 1060. Then, in their conclusion, they compared the average result between the 2 benchmark suites as a difference over time.

Even in the bench you linked, the 1060 is still ahead or at least tied.

Barely. If you could start up your PC and not have GeForce Experience or Crimson icons in the taskbar, I doubt you could honestly tell which card is running ROTTR. And that's the point in dropping games that nobody is playing at this time: it saves time for the reviewer and allows newer games that people care about to replace them. Imagine that they had tested Dragon Age: Inquisition and you saw that it had its own test page. You and I would have skipped right over it, as it is almost as old as Far Cry 4 and we know it runs fine on all hardware.

You have to cater to your audience and they upgrade their games FASTER than their hardware.
 
AMD's upcoming ReLive driver contains some VR improvements. Whether that's just PR BS or actually delivers, we'll have to see.

Yup, I lean AMD if it's close, but VR performance of the 1060 made it a no-brainer as it's not even close. They have to fix that but it's been a known issue for months now.
 
Yup, I lean AMD if it's close, but VR performance of the 1060 made it a no-brainer as it's not even close. They have to fix that but it's been a known issue for months now.

If I were doing VR, I would lean towards an Nvidia card. To be honest, Kyle is right that one should consider a GTX 1070 as entry level for VR. Nobody wants to get sick because of frame rate. Also, the GTX 1060 doesn't have SLI capabilities, and the new Serious Sam VR game shows tremendous mGPU gains for AMD and some for Nvidia.

So the fact that the FIRST mGPU VR game doesn't support the GTX 1060 completely eliminates it from discussion. Of course, the RX 480 is an afterthought for VR due to whatever driver issues or game engine optimizations need to be done to bring its performance up in VR. The fact that it is awesome in CrossFireX under mGPU VR gives me some hope that AMD can fix whatever they need to.

Just my opinion. Quite sure that others have one also. ;)
 
Barely. If you could start up your PC and not have GeForce Experience or Crimson icons in the taskbar, I doubt you could honestly tell which card is running ROTTR. And that's the point in dropping games that nobody is playing at this time: it saves time for the reviewer and allows newer games that people care about to replace them. Imagine that they had tested Dragon Age: Inquisition and you saw that it had its own test page. You and I would have skipped right over it, as it is almost as old as Far Cry 4 and we know it runs fine on all hardware.

You have to cater to your audience and they upgrade their games FASTER than their hardware.
The premise of the article is false, or at least the conclusions people are drawing from it are false, which is why I said it was misleading.
You can't take an original benchmark that had games favoring Nvidia, then do a new benchmark with 3 new games that favor AMD, and claim that AMD has "improved performance" over the last 6 months. Replacing games in the suite doesn't mean anybody lost or gained performance. By the same logic we might claim HardwareCanucks used too many Nvidia-favorable titles back in July which exaggerated the 1060's lead.

And again, instead of DEMD + IW + BF1 they could have used WD2 + Civ6 + Dishonored 2, which would have skewed the results in the opposite direction, and they could have claimed the 1060 gained performance since July. If you want to see how much the relative performance of GPUs has changed over a period of time, you have to test the same games. Their conclusion is entirely dependent on the games they chose; in this case those 3 games in particular change the result more than anything else.

We also can't make any claims about those old games' performance today (you cited ROTTR) since HardwareCanucks didn't re-test them in their suite, and it's not accurate to compare benchmarks from different websites.

And just to be clear, the article itself points out that they are using different games. People reading the article are ignoring that and citing FineWine, i.e. driver improvements.
 
The premise of the article is false, or at least the conclusions people are drawing from it are false, which is why I said it was misleading.
You can't take an original benchmark that had games favoring Nvidia, then do a new benchmark with 3 new games that favor AMD, and claim that AMD has "improved performance" over the last 6 months. Replacing games in the suite doesn't mean anybody lost or gained performance. By the same logic we might claim HardwareCanucks used too many Nvidia-favorable titles back in July which exaggerated the 1060's lead.

And again, instead of DEMD + IW + BF1 they could have used WD2 + CIV6 + Dishonored 2 which would have skewed the results in the opposite direction and they could have claimed the 1060 gained performance since July. If you want to see how much the relative performance of GPUs has changed over a period of time, you have to test the same games.

Their conclusion is entirely dependent on the games they chose; in this case those 3 games in particular change the result more than anything else.

And just to be clear, the article itself points out that they are using different games. People reading the article are ignoring that and citing FineWine, i.e. driver improvements.

I understand what you're saying. I'm just saying that nobody is playing some of those games anymore, so Hardware Canucks' viewership doesn't care about them. I mean, [H]ardOCP probably would test Crysis 1 if it would get more page clicks. ;) Watch Dogs 2 just came out, so you know that AMD and Nvidia performance will increase over time; is it fair to include that? Infinite Warfare, Battlefield 1, and Mankind Divided have been out at least a month and many bug fixes have been released, so it makes sense to include those. Civilization 6 literally just got its DX12 patch, and I'm not sure how many bug-squashing revisions for that title have been released.

Oh and AMD performance supposedly increased in Dishonored 2 after the 1.03 patch. Why do I know this even though I don't own the game? I read too many articles. :( Ha ha!
 
If I were doing VR, I would lean towards an Nvidia card. To be honest, Kyle is right that one should consider a GTX 1070 as entry level for VR. Nobody wants to get sick because of frame rate. Also, the GTX 1060 doesn't have SLI capabilities, and the new Serious Sam VR game shows tremendous mGPU gains for AMD and some for Nvidia.

So the fact that the FIRST mGPU VR game doesn't support the GTX 1060 completely eliminates it from discussion. Of course, the RX 480 is an afterthought for VR due to whatever driver issues or game engine optimizations need to be done to bring its performance up in VR. The fact that it is awesome in CrossFireX under mGPU VR gives me some hope that AMD can fix whatever they need to.

Just my opinion. Quite sure that others have one also. ;)

Well, I'm using a 1060 now for VR with Oculus and haven't gotten sick at all. I can usually maintain 90 fps in the games I play most, so it's not an issue, even with my anemic CPU.

Minimum for Oculus is a GTX 960, and a GTX 1060 is roughly on par with a GTX 980 (non-Ti), so it's plenty for a good experience. Will a 1070 be better? Sure. $150 worth of better? Not for me.
 
Oh come on HardwareCanucks... You had one job...

This review is totally worthless as a 'performance over time' comparison if they swapped those games lol.

We should go look at the games that actually were tested twice and focus on the change in relative performance on those.
 
If they swapped only 3 games, doesn't that say enough about the magical game performance of the GTX 1060?
 
Oh come on HardwareCanucks... You had one job...

This review is totally worthless as a 'performance over time' comparison if they swapped those games lol.

We should go look at the games that actually were tested twice and focus on the change in relative performance on those.

Yes, because nobody buys new games to play. I think it would be a good idea to test the original MechWarrior, as it used to KILL my 386 PC on the highest settings when I was in college around 1991.

So if, 5 years from now, Hardware Canucks wants to do a retro RX 480 vs GTX 1060 "Where are they now?" series, should they only use the games from the original review instead of including games released in that 5-year window? Would it be blasphemous to alter the testing suite? To kick out titles that nobody is playing because the new shiny is out?

If I were to test classic car performance over time, could I only use leaded gas from that time period even though it was outlawed? I guess that means the Tesla can't be tested against a 1950s Mercedes-Benz.

I understand your desire for pure apples-to-apples over time, but nobody cares about Far Cry 4 nowadays. I own Battlefront and I think it's cool to reminisce about in my EA library just before I fire up Battlefield 1 or Titanfall 2.


I think the next level of video card testing should use unpatched games, to keep the playing field level and apples to apples... If a developer patched the game, it would skew the results, right? Especially if they did something like add Vulkan or DX12 support at a later date!
 
Nah, what people are saying is that you need to ensure settings/games are the same, because otherwise the conclusion of the article isn't justified, since there is no direct correlation between the end %, which is a valid argument.

Now, if Hardware Canucks had kept the same games and then had a separate part of the review with different games, we could see the correlation of how things shifted. But without the original baseline (same games/settings across that time frame) you can't see the changes taking place. Furthermore, trying to add in the new games can make the shift even more dramatic, which causes margins of error to be exceeded.

Now, you might be able to make a general conclusion, but by no means is that conclusion 100% absolute, because it's like looking at a trend instead of actual results. A trend might be wrong, and the less data you have, the more likely it is to be wrong. That is exactly what you have here: a trend, compounded by the fact that you need to exclude some of the data points because there is nothing to compare them to, for the reasons above.

Keep this in mind: remember the trend of Steam numbers and graphics card market share %? I stated that the only way that trend could be trusted was that other trends from retailers were mimicking it. If it had been only Steam, or only one of the other retailers, showing that trend, I would not have backed up those numbers.
 
I can't believe anyone would have an issue with the suite of games - if you want to determine which card is better now, you use the games that are popular now, it's that simple. And, in the games that are popular now, the 480 has taken a small lead vs the games that were popular when the original evaluation took place.
 
I can't believe anyone would have an issue with the suite of games - if you want to determine which card is better now, you use the games that are popular now, it's that simple. And, in the games that are popular now, the 480 has taken a small lead vs the games that were popular when the original evaluation took place.


I don't think anyone has a problem with the suite of games. The problem is that there are differences in the test settings of those games, and different games are being used too.
 
I can't believe anyone would have an issue with the suite of games - if you want to determine which card is better now, you use the games that are popular now, it's that simple. And, in the games that are popular now, the 480 has taken a small lead vs the games that were popular when the original evaluation took place.

There is a huge issue with not using the same suite of games (and not even with the same settings in the games they kept...) in both reviews and then trying to draw conclusions about performance decrease/increase over time. It's comparing apples to oranges.
 
On a side note, I doubt it will sway many people to choose a different card. Most people have their minds set on a video card from the tier of performance and the brand, not a tiny increase in a few games compared to the other. AMD fans will buy the RX 480 and Nvidia fans will buy the 1060. They are close enough in performance that the margin gets filled with brand loyalty anyways.
 
The point of the article is to assess performance improvements since release... Testing anything other than the exact same games at the exact same settings on the same exact machine doesn't contribute whatsoever to the conclusion.
 
The point of the article is to assess performance improvements since release... Testing anything other than the exact same games at the exact same settings on the same exact machine doesn't contribute whatsoever to the conclusion.

So if a game developer suddenly patches in, say, DX12 out of the blue, then it is WRONG to test it in a follow-up test? If a card's drivers or a game patch alleviates a performance bug, then it is wrong to use different settings in a follow-up test of a game? That's kinda weird, don't you think? So what WOULD you be testing in that scenario? It would be disingenuous to run the same settings if something has changed to make the situation better. That's the whole point of showing what was possible then and what is possible now.
 
The point of the article is to assess performance improvements since release... Testing anything other than the exact same games at the exact same settings on the same exact machine doesn't contribute whatsoever to the conclusion.

But is that the point of the review? On the first page he states:
With all of this taken into account and countless numbers of gamers looking towards the RX 480 versus GTX 1060 debate as being central to their buying choices, I’ve decided to delve back into this battle. As such, this particular article will use many of today’s newest triple-A titles alongside some old faithful games to see how things stack up now that both these cards have had nearly four months to settle in their lineups. Has AMD been able to leverage their frequently-marketed DX12 superiority to good effect? Has NVIDIA’s supposed driver superiority been able to keep their card ahead? Before we find out, let’s talk about the competitors since they run the gamut from reference to pre-overclocked.

He then goes on to say:
The entire focus of this “re-review” is to figure out where the GTX 1060 6GB and RX 480 8GB stand after more than 120 days on the market. Out of box performance in the newest games is the aim here and I won’t even touch upon overclocking since that’s a slippery slope where one sample may benefit over the other. The goal here is to see which card offers the most bang for your buck and now that both AMD and NVIDIA have settled things with drivers and new games have launched before Christmas, this should be a perfect time for an update to our original reviews.

To me (and if you read the rest of the first page) it sounds like the point of the article is to compare the two cards as they stand in the market today, against the current crop of relevant games. This includes performance in the latest games as well as the latest prices for each card. It is not to see how drivers have improved performance relative to launch drivers.

I don't understand why people are getting so bent out of shape about this.
 
Nah, what people are saying is that you need to ensure settings/games are the same, because otherwise the conclusion of the article isn't justified, since there is no direct correlation between the end %, which is a valid argument.
Maybe people should read what the article stated before drawing conclusions or dismissing them.

I don't understand why people are getting so bent out of shape about this.
It is weird eh :)

The point of the article is to assess performance improvements since release... Testing anything other than the exact same games at the exact same settings on the same exact machine doesn't contribute whatsoever to the conclusion.
from the review :
The entire focus of this “re-review” is to figure out where the GTX 1060 6GB and RX 480 8GB stand after more than 120 days on the market. Out of box performance in the newest games is the aim here and I won’t even touch upon overclocking since that’s a slippery slope where one sample may benefit over the other. The goal here is to see which card offers the most bang for your buck and now that both AMD and NVIDIA have settled things with drivers and new games have launched before Christmas, this should be a perfect time for an update to our original reviews.

That is his point not what you write ...
 
The misleading comparison tables at the end are causing people to draw false conclusions about AMD's performance changes, re: driver improvements.

There are a few very specific problems:
1. The article is going viral based off misinformation from people who didn't actually read it.
2. His justification for changing the games versus the relative results at the end is flimsy at best. Like I said before, he replaced 2 GameWorks games with 1 AMD GE game, which swings their relative performance unnecessarily in AMD's favor. The choice was subjective on his part; had he chosen any pro-Nvidia titles, it would have swung the result back in Nvidia's favor and made it appear as though the GTX 1060 aged better.

The actual change in performance from identical games is only a few %; had that been the headline, this story obviously would have fallen flat. In its current state it's borderline clickbait.

So to summarize, the article and the benchmarks themselves are fine. The problem is that at a glance (which is how most people view these articles) it gives the wrong impression.
 
But is that the point of the review? On the first page he states:


He then goes on to say:


To me (and if you read the rest of the first page) it sounds like the point of the article is to compare the two cards as they stand in the market today, against the current crop of relevant games. This includes performance in the latest games as well as the latest prices for each card. It is not to see how drivers have improved performance relative to launch drivers.

I don't understand why people are getting so bent out of shape about this.


How does that compare with having different settings across DX11 and DX12 in the same game then?
 
I would postulate the article and benchmarks were specifically designed to make AMD look like it gained more performance than it actually did, but that's just my speculation.
The goal here was to make a viral, FineWine-clickbait article for /r/AMD to salivate over and it worked.
 
Maybe people should read what the article stated before drawing conclusions or dismissing them.


It is weird eh :)


from the review :


That is his point not what you write ...


Oh Pieter, maybe someone should look at the settings used across the DX11 and DX12 versions of the same game then? No one is making assumptions about the article without looking at what was done.

Look at [H] reviews: why do you think Kyle, Brent, and the other [H] guys put in the APPLES TO APPLES comparisons a few years ago? Because that is the baseline from which a reader can analyze.

Without that baseline you have no analysis, just a guesstimate. A guesstimate is no better than a trend without other metrics backing it up.
 
I would postulate the article and benchmarks were specifically designed to make AMD look like it gained more performance than it actually did, but that's just my speculation.
The goal here was to make a viral, FineWine-clickbait article for /r/AMD to salivate over and it worked.


That is the thing: it can go both ways, and no one knows because there is no baseline. I wouldn't say they did it for clickbait lol, Hardware Canucks tend to do a fairly good job; most likely they just didn't realize what they were doing, for whatever reason. Lack of time?

One of the few sites that have done driver revision tests, so I will give them the benefit of the doubt.
 
How does that compare with having different settings across DX11 and DX12 in the same game then?

Are you talking about differences in the settings in DX11 vs DX12 solely in the new article, or as compared to the old article? I honestly didn't pay much attention to the DX11 vs DX12 settings in the new article, but which ones are different? If it were me, I would have used the same settings...
 
Both. That is very important; you need to look at both and see where things changed. Just as an example, the ROTTR settings are way different between the two reviews.

This isn't even margin of error on that one; it's just not right to begin with.
 