Forza, Vega destroys 1080ti

Are we seriously entertaining the argument that a single data point indicates a trend? "If only devs would fully utilize Vega!" Uh, okay. If that's the world limbo championship low bar, then sure:

My 980ti nearly matches the Vega 64. Is my 980ti a better card? Of course not. Just like Vega 64 doesn't even come close to the performance OR efficiency of a 1080ti. Everyone else can also feel free to add in your own irrelevant single game benchmarks.


Here is the truth: It doesn't matter. Vega is DOA and we all know it.
 
Are we seriously entertaining the argument that a single data point indicates a trend? "If only devs would fully utilize Vega!" Uh, okay. If that's the world limbo championship low bar, then sure:

My 980ti nearly matches the Vega 64. Is my 980ti a better card? Of course not. Just like Vega 64 doesn't even come close to the performance OR efficiency of a 1080ti. Everyone else can also feel free to add in your own irrelevant single game benchmarks.


Here is the truth: It doesn't matter. Vega is DOA and we all know it.



In a quite dated game... The whole question here is: is this the full potential of DX12? Is this the future of games coded with XBONE X performance in mind? Nvidia confirmed there is nothing wrong with the results, or at least that's what was reported.
 
I guess the reality is that AMD and Nvidia have enough design/architectural differences that developers will always have to make choices that sometimes benefit one vendor more than the other.

Probably those choices are based on business decisions more often than technical ones, so I find some of the arguments in these forums quite amusing.

But I guess some people need to show off their "knowledge superiority" somewhere.


It's possible the dev optimized for AMD's architecture, since Forza is a console game first and foremost, but this isn't the first time this developer has had issues with nV products in previous Forza titles; 6 also had this problem, and 3 did too. Nor are these the only titles we have seen with this type of behavior; the other games were AMD Gaming Evolved titles and console-first as well. So is it a business decision? Yeah, it could be; the dev is comfortable with GCN architectures because they focus on console first. The engine developer pretty much stated that the dev's choice to do things a certain way is not how the engine was made to be used, so most likely that is the reason: time, resources, money.

I would think by now the dev would be experienced enough, or have enough foresight, to know that if they do similar things as before, the same problems will occur. We saw the frame times and knew the CPU usage was most likely the culprit; we didn't even need the actual CPU/GPU usage figures to draw conclusions, because we have seen it before. The dev should have more insight than us and be wary of what would happen even before it happens.
 
Looks like the gaming experience is great on all cards; the performance difference at the top end doesn't look like it matters for the actual gaming experience in the end.

That being said, I think a lot of us want to know whether this is showing an architectural advantage when the hardware is actually used, or whether Nvidia's hardware is simply not being used fully. How does it compare in IQ and gameplay to Project CARS 2? That is, does Forza 7 give better image quality (lighting, materials, effects, geometry, objects) or is it the other way around, and how does the performance compare between the two? The point, if it could ever be remotely determined beyond being somewhat subjective, is: if you optimized for each hardware design (Nvidia vs. RTG), which would give more? Rhetorical questions or thoughts.
 
Forza is still going to look graphically beautiful I am sure, so as long as it runs smoothly on my ultra wide I will be happy. It sure looked like it was CPU bottlenecked or something.
 


Looks like it happens on AMD GPUs too, but to a lesser extent, so yeah, it's a dev issue.

So there is performance being left on the table on all hardware, but it affects nV more: roughly a 20-25% difference in utilization between nV cards and AMD cards. If utilization were equalized, that is the performance that is missing in the benchmarks.
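As a rough back-of-the-envelope illustration (the utilization figures here are assumed purely for the arithmetic, not taken from any particular benchmark): if an nV card averages around 75% GPU utilization in a scene where an AMD card averages around 95%, and the scene is otherwise GPU-bound, equalizing utilization would imply roughly

$$\frac{0.95}{0.75} \approx 1.27 \quad\Longrightarrow\quad 60\ \text{fps} \times 1.27 \approx 76\ \text{fps},$$

which is the ballpark of the 20-25% being left on the table.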
 
So what is your take on other games where the 1070 is equal to or better than Vega 64? Can't have it both ways.

Other games have other inefficiency issues? That's not 'both ways', that's the same way.

The only unique thing here is not that an issue exists, but the magnitude of the issue.
 
Let's be fair to the developer. Even with the supposed Nvidia bug, the game is still running at 4K on a 1080 above 60 fps, which is impressive.

And the Vega 56 doing so well (70+ fps at 4K) shows, I think, how much performance Vega has in a highly optimized modern-API title.

Yes, I agree that there's likely a CPU bottleneck, but the game (if benchmarks are true) is still running great.

I'm downloading now and will confirm myself.
 
It's 100GB. No way it's finishing tonight. Will try to do a video tomorrow.
 
That is not a game bug though, that is a Vega driver issue, because if it were a bug in AMD's path you would see it on other AMD cards, which doesn't happen.

And those games are few and far between, like one out of every 10 games.

Thanks for a balanced viewpoint on this issue. A whole lot of Vega's erratic performance is driver related. It will take a couple more months before the drivers are worked out. I have a Vega 56 and am awaiting my Alphacool Eiswolf GPX 120 AIO GPU block. Even now, my performance with undervolting and overclocking is far better than a 1070's. With the Eiswolf I expect another 10% performance gain. That should put me at or slightly better than a 1080 FE.
 
Isn't this a beta? I'm not putting my chips on the table just yet.
Full game has been available since last Friday for Ultimate Edition preorders.
Are we seriously entertaining the argument that a single data point indicates a trend? "If only devs would fully utilize Vega!" Uh, okay. If that's the world limbo championship low bar, then sure:

My 980ti nearly matches the Vega 64. Is my 980ti a better card? Of course not. Just like Vega 64 doesn't even come close to the performance OR efficiency of a 1080ti. Everyone else can also feel free to add in your own irrelevant single game benchmarks.


Here is the truth: It doesn't matter. Vega is DOA and we all know it.

Not the whole truth. This is just the issue with DX12 rearing its ugly head again. Being lower level, DX12 requires specific coding paths for each IHV to take advantage of their hardware. As mentioned, with this game being developed for the Xbox, the code path probably contains a lot of specific optimizations for GCN 4 and 5 and little or none at all for Pascal.
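To make the "specific coding paths for each IHV" point concrete, here is a minimal sketch (purely illustrative, not from Forza 7 or its engine) of how a DX12 renderer could pick a vendor-specific path at startup by reading the DXGI adapter's PCI vendor ID. The enum and function names are made up; the DXGI calls and vendor IDs are real.

```cpp
// Illustrative only. Requires Windows 10 with IDXGIFactory6 support and
// linking against dxgi.lib; error handling kept minimal.
#include <dxgi1_6.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

enum class GpuVendor { Amd, Nvidia, Intel, Other };

GpuVendor DetectPrimaryAdapterVendor()
{
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return GpuVendor::Other;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapterByGpuPreference(
            0, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE, IID_PPV_ARGS(&adapter))))
        return GpuVendor::Other;

    DXGI_ADAPTER_DESC1 desc{};
    adapter->GetDesc1(&desc);

    switch (desc.VendorId)                    // PCI vendor IDs
    {
    case 0x1002: return GpuVendor::Amd;       // AMD / ATI
    case 0x10DE: return GpuVendor::Nvidia;    // NVIDIA
    case 0x8086: return GpuVendor::Intel;     // Intel
    default:     return GpuVendor::Other;
    }
}

// An engine could then branch on the result to select tuned shader variants,
// async-compute usage, barrier batching strategy, and so on for each IHV.
```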
 
Got Forza 7 running 4K Ultra Settings with 1 RX Vega 64.



No issue with CPU bottleneck here, and I'm getting great performance.

Looks like your core 0 is pegged at 100% like everyone else, meaning your performance could be even better if there were any load balancing going on.

Aside from that, you're still using dynamic quality.
 
Hmm. OK. I can try to tweak the settings. In any case, GPU usage is close to 100% so I don't see how that's a CPU bottleneck.
 
Looks like your core 0 is pegged at 100% like everyone else, meaning your performance could be even better if there were any load balancing going on.

Aside from that, you're still using dynamic quality.
You mean his CPU 4? The one dedicated only to input, which has nothing to do with graphics, is affecting his performance?

The GPU is running at 97% and many of the other cores are active, so how is that not multi-threaded?
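For what the "load balancing" remark would mean in practice, here is a minimal D3D12 sketch (illustrative only, not how Forza 7 or its engine actually does it; it assumes a valid device and queue already exist, and omits fences/sync) of spreading command-list recording across worker threads instead of serializing it on one core:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordAndSubmitInParallel(ID3D12Device* device,
                               ID3D12CommandQueue* queue,
                               unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread>                       workers;

    for (unsigned i = 0; i < workerCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each worker records its slice of the frame's draw calls on its own core.
    for (unsigned i = 0; i < workerCount; ++i)
    {
        workers.emplace_back([&, i] {
            // ... set PSO/root signature, record this worker's draws ...
            lists[i]->Close();   // a list must be closed before execution
        });
    }
    for (auto& t : workers) t.join();

    // Submission still happens from one thread, but the expensive recording
    // work was spread across cores instead of pinning core 0.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```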
 
Hmm. OK. I can try to tweak the settings. In any case, GPU usage is close to 100% so I don't see how that's a CPU bottleneck.


Did you do the update already? The one that came out Oct 3rd? That already fixes some of the CPU usage issues.
 


After the patch, GPU usage on nV cards is so much better (there's still a bit of a bottleneck, though; you can see the higher-end cards hitting it).

Just for comparison, same guy with the demo:



The huge difference is the CPU bottleneck. It still hasn't been entirely removed for nV cards, but it's getting there.

Ironically, the higher resolution of the demo had a worse CPU bottleneck; it would be nice to see it with all settings equalized, though.
 
Looks like excellent news to me! Hey, if you use the hardware properly, it will make a large difference in performance.
 
Looks like excellent news to me! Hey, if you use the hardware properly, it will make a large difference in performance.

That's been the case with GCN cards all along. They have great top theoretical power, but it is an unbelievable PITA to get anywhere near it due to how you have to program it.
Gotta get the horsepower to the wheels or it doesn't matter how much you have. I hope they revise the control model for Navi, or they will continue to have hit-and-miss performance, and struggle overall.
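To put numbers on "great top theoretical power" (commonly quoted reference-spec figures, given only as a rough illustration): peak FP32 throughput is shader count × 2 (an FMA counts as two ops) × boost clock, so

$$\text{Vega 64: } 4096 \times 2 \times 1.55\,\text{GHz} \approx 12.7\ \text{TFLOPS}, \qquad \text{GTX 1080: } 2560 \times 2 \times 1.73\,\text{GHz} \approx 8.9\ \text{TFLOPS}.$$

On paper Vega 64 even tops the 1080 Ti (~11.3 TFLOPS), yet in most games it lands around the plain GTX 1080, which is exactly the "horsepower to the wheels" problem.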
 


After the patch, GPU usage on nV cards is so much better (there's still a bit of a bottleneck, though; you can see the higher-end cards hitting it).

Just for comparison, same guy with the demo:



The huge difference is the CPU bottleneck. It still hasn't been entirely removed for nV cards, but it's getting there.

Ironically, the higher resolution of the demo had a worse CPU bottleneck; it would be nice to see it with all settings equalized, though.

60 fps cap versus no cap ???
 
It's not just the Vega vs 1080, same with RX 580 vs GTX 1060 - unlocked frame rate and latest update:

 
It's not just the Vega vs 1080, same with RX 580 vs GTX 1060 - unlocked frame rate and latest update:




That it's much less than before was the point; if you look at the 1060, it was going down to 50% GPU usage in the demo too....
 
https://www.overclock3d.net/news/gp...-25_performance_boost_in_forza_motorsport_7/1

Our driver team spends a considerable amount of time developing optimizations and improvements for the latest games before they're launched, ensuring you have a great experience the second they're available. The work doesn't end at launch, however, as our team continually searches for further improvements in game code and our drivers, collaborating with developers whenever possible.

The fruits of this labor can be seen today in our new Game Ready driver, which introduces performance improvements of between 15-25%, depending on your system configuration, in the recently-released Forza Motorsport 7.
 

Nvidia roars back to take the 4K crown on the GTX 1080. What a difference (a second) Game Ready driver makes! Also looks like they need a 3rd driver to fix 1080p and 1440p. :)
https://www.computerbase.de/2017-10...eiber-pascal-vega/#diagramm-forza-7-3840-2160
 
All those numbers mean shit, actually; in fact, everything posted in this thread just proves how bad canned benchmarks are, how harmful built-in benchmarks are for the industry, and how much confusion they can create for the end user and gamer. The recent [H] review of Vega 64 vs 1080 vs 1080 Ti proves exactly that.

These are their results WITH the 385.69 drivers, so still without the "magic drivers that improve performance", which means still about 15-25% extra performance to come for Pascal (which should also be tested to see if what Nvidia claims is true)... Too much shit about "Forza, Vega destroys 1080" and too many dreamers wanting to see Vega in a position that isn't accurate, just because AMD optimized their drivers for a built-in benchmark that doesn't represent any truth about real-world gameplay. It's nice to see [H] once again destroying and debunking built-in game benchmarks.

lol, poor Anarchist, I wonder what they have to say about it now... =)

([H] benchmark chart)
 
All those numbers mean shit, actually; in fact, everything posted in this thread just proves how bad canned benchmarks are, how harmful built-in benchmarks are for the industry, and how much confusion they can create for the end user and gamer. The recent [H] review of Vega 64 vs 1080 vs 1080 Ti proves exactly that.

These are their results WITH the 385.69 drivers, so still without the "magic drivers that improve performance", which means still about 15-25% extra performance to come for Pascal (which should also be tested to see if what Nvidia claims is true)... Too much shit about "Forza, Vega destroys 1080" and too many dreamers wanting to see Vega in a position that isn't accurate, just because AMD optimized their drivers for a built-in benchmark that doesn't represent any truth about real-world gameplay. It's nice to see [H] once again destroying and debunking built-in game benchmarks.

lol, poor Anarchist, I wonder what they have to say about it now... =)

([H] benchmark chart)
First, the original claims were about 1080p; this is 4K. Second, the 1080 Ti looks like SHIT here. Your argument here is actually worse than any rabid fanboi's, as it is just as ignorant of the topic as the fanbois you condemn, if not more so.
 
I hope the new drivers fixed the spikes in the GTX 1080Ti graph. Jumping from 60 fps to 160 fps and then back to 60 would give me a headache. My buddy bought a GTX 1080Ti last week and he loves the Forza series. I don't want him to have a bad experience with the game. Especially since I knew we were going to play together, so I bought the VIP version.
 
I hope the new drivers fixed the spikes in the GTX 1080Ti graph. Jumping from 60 fps to 160 fps and then back to 60 would give me a headache. My buddy bought a GTX 1080Ti last week and he loves the Forza series. I don't want him to have a bad experience with the game. Especially since I knew we were going to play together, so I bought the VIP version.
Judging by how good the 99th percentile now is with the GTX 1080, there's no doubt about that.

The fact that the GTX 1060 stomps all over the Fury X (and Polaris) is rather interesting. So, which was supposed to be the future-proof DX12 card again?
 