AMD RX Vega 64 Outperforming NVIDIA GTX 1080 Ti in Forza 7

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,000
ComputerBase has published a number of Forza 7 benchmarks, and the majority of them list the RX Vega 64 on top. The test system included an Intel Core i7-6850K overclocked to 4.3GHz, paired with 16GB of DDR4 memory running at 3000MHz in quad-channel mode. The drivers used were Crimson ReLive 17.9.3 for AMD and 385.69 for NVIDIA.
 
 
Interesting to see the RX 580 in its rightful place with the 1070 and 1060, which means one of these things:


Vega architecture is good
AMD have their drivers sorted for this game

or Nvidia have messed-up drivers, which seems unlikely, as the RX 580 is in its rightful place by its usual margin.

If the first scenario holds true it adds somewhat more value to Vega, though it's still not good by any stretch. But I think the Vega refresh may be a better hit, just like 2900 XT -> 3870 -> 4870: those were practically the same arch with die shrinks, scaled to larger designs and making use of more modern API functions and loads.
Or it's a one-time event and the card is just bad :D

Anyway, it may show what future games will bring, as it would be the 1st, 2nd, 3rd, or 4th time AMD cards age well...
I like to say that AMD know where they need to go, just not when.
 
Something is wrong with the Nvidia drivers they're using, then. Vega 56 should not be beating/matching a Ti. Maybe some of the console special sauce made it into the PC release, since the Xbone uses AMD hardware.
 
They need to stop showing benchmarks for one game. It's kinda meaningless. I don't care how graphics cards stack up in one game if it's not really indicative of anything concrete. Show some sort of aggregated score across as many games as you can. That will tell me which graphics card I want.
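For what it's worth, review sites usually build that kind of aggregate as a geometric mean of per-game relative performance, since averaging raw FPS lets high-framerate games dominate. A minimal sketch; the game names and FPS numbers below are made up purely for illustration:

```python
from math import prod

# Hypothetical per-game average FPS -- made-up numbers, purely for illustration.
fps = {
    "Game A": {"Vega 64": 90, "GTX 1080": 100},
    "Game B": {"Vega 64": 120, "GTX 1080": 110},
    "Game C": {"Vega 64": 60, "GTX 1080": 75},
}

def geomean_relative(results, card, baseline):
    """Geometric mean of `card` FPS relative to `baseline` across all games."""
    ratios = [game[card] / game[baseline] for game in results.values()]
    return prod(ratios) ** (1 / len(ratios))

# One number summarizing the whole suite instead of one cherry-picked game.
print(f"Vega 64 vs GTX 1080: {geomean_relative(fps, 'Vega 64', 'GTX 1080'):.2f}x")
```

A single standout result (like Forza 7 here) barely moves a geometric mean across a large suite, which is exactly the point being made.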
 
We will see.

I am a little bit of a skeptic, and think Nvidia's drivers may just be messed up on this one. We don't usually see this type of radical difference from title to title, or with driver/optimization maturity.

Truth is, Vega is essentially still built on the same GCN architecture that goes back to Southern Islands, right? So middling performance is likely not due to a lack of optimization, but rather to being saddled with a legacy architecture. It will be interesting to see what Navi will do, being the first design fully led by Raja Koduri (well, at least up until his sabbatical).
 
I remember reading something about Forza 7 on PC being limited to using only one core? Not sure if that, combined with drivers on either side, accounts for this.

My take on it is this: a broken clock is still right twice a day. Showing one single game out of the crap load of AAA heavy duty titles out there tells me absolutely nothing. Seems AMD is always really good at picking *ONE* single game and showcasing it to show how awesome their rehashed silicon is.

They are kicking ass on the CPU front and I wish them the best, but they haven't done anything in the GPU world that really challenges team green in quite a while.
 
Forza games have never been well optimized, so this seems like a random thing more than anything else.
 
I remember reading something about Forza 7 on PC being limited to using only one core? Not sure if that, combined with drivers on either side, accounts for this.

My take on it is this: a broken clock is still right twice a day. Showing one single game out of the crap load of AAA heavy duty titles out there tells me absolutely nothing. Seems AMD is always really good at picking *ONE* single game and showcasing it to show how awesome their rehashed silicon is.

They are kicking ass on the CPU front and I wish them the best, but they haven't done anything in the GPU world that really challenges team green in quite a while.

We've exhausted the whole theory about the one-core BS and have proven it uses more than one core, so that's dead. Either way, the game was developed specifically for AMD hardware, so it doesn't surprise me that it performs better on it. We've seen similar results on the Nvidia side when games are developed specifically for their cards. The results don't really surprise me all that much, but I'm sure Nvidia will figure it out, given they have the money and resources to do it. Ultimately it'll come down to whether or not Nvidia cares enough.
 
Favorable benchmarks like this whet the appetites of AMD fanboys and keep them happy, while the competition is good for Intel and Nvidia. That benefits those of us who prefer Intel and Nvidia.
 
A more well-rounded test would have been 8x MSAA at 1080p, 4x MSAA at 1440p, and 2x MSAA at 4K, though it might not have altered the outcome much. I think it boils down to the GPU-dependency versus CPU-dependency relationship going on with these MSAA settings and resolutions. It's like the inverse of the Ryzen vs Intel situation, oddly enough. At first it might not seem to make a lot of sense, yet it makes a huge amount of sense when you really stop and think about it.

If this is somewhat consistent, it means AMD and Radeon can offset each other's weaknesses and balance out fairly neutral on performance around 1440p, but improve towards 4K on either front next generation. All they have to do is shift their focus more heavily in the opposite direction for their GPUs and CPUs, and anyone with a current AMD or Radeon CPU or GPU benefits by leaps and bounds in relative terms. I don't know if that's what's going on, but it makes me wonder, and it would be a smart play in terms of long-term strategy. It's funny, because Intel and Nvidia don't have such a luxury unless they happen to be in conversation with each other about the direction they're headed, but they're competitors at the same time, so that's rather unlikely.

I'd like to see both Vega cards and both 1080 cards tested at these three resolutions with 2x, 4x, and 8x MSAA, just for comparison's sake. It'll be interesting to see how the Vega cards stack up against the 1080s once Ryzen is refreshed in 2018: does the gap narrow, or does it widen?
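Incidentally, the full sweep being asked for here is small enough to enumerate programmatically. A sketch of the test matrix; the resolutions and MSAA levels are from the post, but the card lineup is my assumption:

```python
from itertools import product

# Resolutions and MSAA levels from the post; the card lineup is assumed.
resolutions = ["1080p", "1440p", "4K"]
msaa_levels = [2, 4, 8]
cards = ["Vega 56", "Vega 64", "GTX 1080", "GTX 1080 Ti"]

# Every (card, resolution, AA) combination the proposed test would cover.
runs = list(product(cards, resolutions, msaa_levels))
for card, res, aa in runs:
    print(f"{card} @ {res}, {aa}x MSAA")
print(f"{len(runs)} benchmark runs total")  # 4 cards x 3 resolutions x 3 AA levels
```

Thirty-six runs per driver revision is a real time sink, which is presumably why reviewers pick one AA level per resolution instead.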
 
This makes me happy. AMD deserves at least one little ray of sunshine on the only game in existence that it's good at. Correction: it's good in a lot of games, just not the best. I'm glad this is a standout game, if nothing more than for the owners of AMD's new cards. I'm a huge proponent of underdogs; I've loved AMD for this very reason for years. I don't currently use AMD GPUs, but I'm open to the possibility in the future. I'm hoping the company gets bought so we can get some real leadership and talent to catch up to nVidia. Competition is so crucial. We've already seen how zero competition from AMD allowed nVidia to release the 1070 Ti to go after the AMD Vega 56, which of course wouldn't have been done had AMD released more powerful GPUs.
 
Looks slightly hopeful for Vega 64 but it's still horrendously overpriced at the moment ($620 - $650).
 
They need to stop showing benchmarks for one game. It's kinda meaningless. I don't care how graphics cards stack up in one game if it's not really indicative of anything concrete. Show some sort of aggregated score across as many games as you can. That will tell me which graphics card I want.

It is DX12, so only a one-game result is possible, since there are no other similar games using the same engine and the same features.

An overall result across DX12 games wouldn't matter either, because each game uses features tailored to its own needs, which is much different from performance on the DX11 API.

Every time someone mentions DX12 benchmarks I have a quiet chuckle, because there is no such thing :)
 
Eventually everyone gets dethroned, it's the way of things, but there's some smoke and mirrors going on here.

In terms of 1080 or 1080 Ti performance, it's been known for some time that these cards aren't really designed for 1080p gaming. At that resolution a lot can get shifted over to the CPU, and the cards don't really get pushed much. I've done similar tests with my 1080 SLI at 1080p and seen much the same for many games. These are made for 1440p/4K, not 1080p.

I read much of the thread over at Guru3d and saw some interesting points regarding frame caps in the engine, sampling gimmicks, and some unconventional workarounds to get the caps unlocked. This is far from a reliable test, but more a set of instructions on how to optimize your Vega for this game. Good for them!

I still love my Pascals, but honestly it's been a crapshoot when it comes to their drivers. I still believe the hardware is superior, but I know from experience that if you can optimize something, you'll get impressive results. The latest drivers finally restored my HDR and other color depths on my 4K displays, as well as SLI support for more than a few games, so I can once again enjoy my 1080 SLI rig. However, I'd say there's something wrong with them in this game. One post I saw on that thread related to DX12, and I suspect it's closer to the truth than not.
 
So a $650 video card isn't really an issue anymore?

I tip my hat to you guys, you are doing better than me.
 
I read much of the thread over at Guru3d and saw some interesting points regarding frame caps in the engine, sampling gimmicks, and some unconventional workarounds to get the caps unlocked. This is far from a reliable test, but more a set of instructions on how to optimize your Vega for this game. Good for them!

I loved those comments where the Nvidia crowd said "driver problem." Then this guy bursts everyone's bubble by posting this
 
They need to stop showing benchmarks for one game. It's kinda meaningless. I don't care how graphics cards stack up in one game if it's not really indicative of anything concrete. Show some sort of aggregated score across as many games as you can. That will tell me which graphics card I want.

At least wait for the damn thing to release?
 
The butthurt is strong in this thread

Nah. I don't think it's butthurt. AMD upping their game literally helps everybody but nvidia. So unless some of the people here in the thread work for nvidia or own their stock, I don't think it's butthurt.
 
Nah. I don't think it's butthurt. AMD upping their game literally helps everybody but nvidia. So unless some of the people here in the thread work for nvidia or own their stock, I don't think it's butthurt.
Not sure an unintended developer bug counts as AMD "upping" their game. The 4K performance demonstrates it's not a GPU issue. Apparently the devs are fixing it; one CPU core is maxing out, causing a bottleneck.

Considering a Vega 64 barely keeps up with a 1070 in most games, I don't think anyone is going to factor a temporary fluke in a single game seriously into any buying decision.
 
Looking at the bigger picture, the so-called fluke may just become more and more common. I own two 1080 Tis and a couple of Vega cards; I could give two shits about who is faster. The FreeSync on the Samsung C32HG70 makes it a nice combo :)
 
Looking at the bigger picture, the so-called fluke may just become more and more common. I own two 1080 Tis and a couple of Vega cards; I could give two shits about who is faster. The FreeSync on the Samsung C32HG70 makes it a nice combo :)

My point exactly about optimizing. Match your parts and enjoy the ride. I feel the same way about my TI/G-sync rig.
 
So a $650 video card isn't really an issue anymore?

I tip my hat to you guys, you are doing better than me.


I don't think a $650 card is a major issue because the prices of other things have fallen, such as SSDs and monitors. I also think buying a top-tier card will be more future-proof, so you won't need to upgrade as often. In the past, I'd buy low to mid-end video cards but found myself upgrading often to keep up with newer games.

The two primary components I splurge on are the video card and monitor, since they create the visual aspect of computing...
 
I think Pascal's ~70% compression is definitely coming into play at 4K with 8x MSAA. It explains why the Fury X 4GB completely tanks at 4K, why the RX 580 8GB suddenly loses to the GTX 1060 6GB, and why the GTX 1080 8GB finally edges out Vega 56 8GB while Vega 64 8GB is edged out by the GTX 1080 Ti 11GB.

Forza 7 is presumably a well-multithreaded game, based on how well it performs at 1080p on Vega; that, or 8x MSAA is a bit favorable to AMD GPUs until they run out of, or get near, their VRAM limit. Vega also seems to heavily favor Ryzen over Intel as you begin upping the resolution, while Pascal's compression becomes an advantage at higher resolutions once VRAM usage comes into play.

This same test with a Radeon Frontier Edition included at 4K would be much more revealing about what's going on with VRAM usage. Running the same test at 2x and 4x MSAA at 4K would have given us a more definitive grasp of what's happening and why. You can also see early on that Vega gains about 30 FPS by reducing AA while Pascal only gains about 20 FPS, so it appears to respond better to reduced image quality, at least here.
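To put rough numbers on the VRAM angle, here's a back-of-the-envelope sketch of just the multisampled color+depth render-target footprint at 4K. It ignores textures, G-buffers, compression, and driver overhead, so treat these as loose lower bounds, not measurements:

```python
# Rough multisampled render-target footprint, ignoring textures, G-buffers,
# compression, and driver overhead -- a lower bound, not a measurement.
def msaa_target_mib(width, height, samples, color_bytes=4, depth_bytes=4):
    """MiB for one MSAA color target plus one MSAA depth/stencil target."""
    pixels = width * height * samples  # each sample is stored separately
    return pixels * (color_bytes + depth_bytes) / (1024 ** 2)

for s in (2, 4, 8):
    print(f"4K {s}x MSAA: {msaa_target_mib(3840, 2160, s):.0f} MiB")
```

At 8x that's ~500 MiB for a single render target before anything else is loaded, which is why a 4GB card like the Fury X would feel the squeeze at 4K well before an 8GB or 11GB card does.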
 
I think Pascal's ~70% compression is definitely coming into play at 4K with 8x MSAA. It explains why the Fury X 4GB completely tanks at 4K, why the RX 580 8GB suddenly loses to the GTX 1060 6GB, and why the GTX 1080 8GB finally edges out Vega 56 8GB while Vega 64 8GB is edged out by the GTX 1080 Ti 11GB. Forza 7 is presumably a well-multithreaded game, based on how well it performs at 1080p on Vega; that, or 8x MSAA is a bit favorable to AMD GPUs until they run out of, or get near, their VRAM limit. Vega also seems to heavily favor Ryzen over Intel as you begin upping the resolution, while Pascal's compression becomes an advantage at higher resolutions once VRAM usage comes into play. This same test with a Radeon Frontier Edition included at 4K would be much more revealing about what's going on with VRAM usage. Running the same test at 2x and 4x MSAA at 4K would have given us a more definitive grasp of what's happening and why. You can also see early on that Vega gains about 30 FPS by reducing AA while Pascal only gains about 20 FPS, so it appears to respond better to reduced image quality, at least here.

By far the most technical explanation I've seen in any thread regarding this.
 
Further reading:
https://www.techspot.com/news/71209-amd-vega-64-burns-past-gtx-1080-ti.html

AMD cards always seem to have "longer legs" than their nVidia counterparts.

Look how badly the R9 390 destroys the GTX 970. Those two are historically close in performance.

It was the same story back during the GTX 680 vs HD 7970 days. At launch, the GTX was favored. Different story years later.

Kepler aged shockingly badly.

It is almost like nVidia does this on purpose with newer games, to push loyalists into buying a new card every two years.

I have a feeling Maxwell performance will be even worse compared to Hawaii/Fiji for games released in FY18.
 
The butthurt is strong in this thread

I am more amazed at the random-ass conclusions people are jumping to for a game that's not even released. Both vendors usually have bugs in the first week or two.

I don't know how to describe it. Tech news feels more and more like the "normal" news. One asinine thought to the next, ignoring actual facts...
 
I am more amazed at the random-ass conclusions people are jumping to for a game that's not even released. Both vendors usually have bugs in the first week or two.

I don't know how to describe it. Tech news feels more and more like the "normal" news. One asinine thought to the next.

Oh, I completely agree with you. This is way too early and way too small a sample size to get any meaningful data out of, let alone come to a conclusion about anything, but the loyalists sure jumped in here fast, didn't they :p
 
This is a good article worth looking at that helps put some of this into perspective as well.
https://www.techspot.com/review/1490-ryzen-vs-core-i7-vega-64-geforce-1080/page8.html

They happened to use a 6-core Intel CPU in this Forza 7 test, however, if I'm not mistaken. To me it's not so shocking that it can perform great in new game software that can really leverage the hardware's capabilities. I think AMD/Radeon put more emphasis on the future of multithreaded, load-balanced gaming than Nvidia when they designed Vega and Ryzen, as they knew what was coming, whereas NV could at best preemptively guess what to expect from AMD. I think we are nearing a point where multithreaded performance could carry more weight than single-thread performance in software, given the hardware available, now that the focus is shifting more heavily in that direction.
 
So a $650 video card isn't really an issue anymore?

I tip my hat to you guys, you are doing better than me.
You can thank Nvidia for that - they can't compete in this one unreleased game so AMD can charge whatever they want for Vega 64. :troll:
 
That article is clearly fake, written in some gobbledygook made up language. :)
 