Forza Horizon 4 DX12 Free Benchmark

Good for AMD; it's nice to see them get a win. I do believe Vega is quite powerful, perhaps as powerful as a GTX 1080 Ti, but it's held back by software and optimization, like most things AMD.
Maybe AMD can rent the Nvidia driver team.
 
The system configuration, including driver versions, is on page 15. I do find it odd, though, that the 1080 Ti is slower than the 1070 Ti at 1920x1080. It must have something to do with how the benchmark calculates the numbers. Is the GPU score supposed to be the final combined number?

Why are they under-clocking the 7700K? That chip should run at 4.2 GHz stock... Do I smell some shenanigans?

**edit**

Just ran the benchmark at the 4K "ultra" preset (same settings as the linked PDF) and my 7700K with a GTX 1080 (under water, mind you) averaged 67.9 fps... so either my 1080 is a bit faster than a 1080 Ti, or AMD is stacking the deck.
[screenshot: FH4.png]


**Edit 2**

Just tried out the 1080 "High" settings... Here are my benchmark results (top) compared to the Vega 64 benchmark they published in that PDF... Not sure how they're getting their numbers.
[screenshot: Forza Horizon 4 Benchmark Guide_09 13 2018-7.png]
 
4790K @ 4.5 GHz, 16 GB, 970, 1080p with a mix of high/ultra (minus SSR and blur), and getting 75+ fps.
Runs great. I vsync to the monitor; GPU usage is ~80% and CPU ~50%, peaking at 70% during loading.
 
Obviously, this is the same issue as Forza 7: the 1080 Ti here is slower than even a 1070 Ti, and all NVIDIA GPUs are slower than AMD's. NVIDIA will release a driver that fixes this.
 
Obviously, this is the same issue as Forza 7: the 1080 Ti here is slower than even a 1070 Ti, and all NVIDIA GPUs are slower than AMD's. NVIDIA will release a driver that fixes this.
The demo runs fine on NVIDIA hardware. It's just AMD shenanigans; the PDF is basically an advertising brochure. Check out the game version differences. It could very well be that, at the time, the game was falling back to generic rendering code on NVIDIA hardware, and the current version now has the vendor-specific code for NVIDIA that the DX12 world requires.
 
I'd like to point out that a decent V64 has regularly been in stock on Newegg for $499 for the last couple months. Specifically, the Nitro+, although it keeps selling out at that price point. It was in stock again earlier this week, but is sold out again until the next batch arrives. They are usually taking a couple days to sell out though, so an auto-notify email is usually enough to get one if you have the money on hand and you really want one.
https://www.newegg.com/Product/Prod...14202321&cm_re=vega_64-_-14-202-321-_-Product
I was just going by what you could purchase yesterday.
 
Does anyone know what the "GPU limited" stat actually means? Is it the percentage of time the GPU is limited by the CPU, or the other way around? Based on the results I saw, I assumed it's the time the GPU was limited by the CPU, since at 4K it was 4.4 and at 1080p it was 96.6. But then Kyle's benchmark has it pegged even at 4K ultra, so I'm really not sure.
 
I'd like to point out that a decent V64 has regularly been in stock on Newegg for $499 for the last couple months. Specifically, the Nitro+, although it keeps selling out at that price point. It was in stock again earlier this week, but is sold out again until the next batch arrives. They are usually taking a couple days to sell out though, so an auto-notify email is usually enough to get one if you have the money on hand and you really want one.
https://www.newegg.com/Product/Prod...14202321&cm_re=vega_64-_-14-202-321-_-Product
Good luck finding one in stock at that price.
 
Does anyone know what the "GPU limited" stat actually means? Is it the percentage of time the GPU is limited by the CPU, or the other way around? Based on the results I saw, I assumed it's the time the GPU was limited by the CPU, since at 4K it was 4.4 and at 1080p it was 96.6. But then Kyle's benchmark has it pegged even at 4K ultra, so I'm really not sure.
I don't know how it's calculated here, but generally "GPU limited" means the CPU is waiting on the GPU to finish, and "CPU limited" is the opposite. You generally want the former, as stuttering and desync happen when the CPU can't keep up with the GPU. The discrepancy here may be because Horizon 4 uses all of the 6950X's cores efficiently, while the 7700K has far fewer cores to work with.
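As a rough illustration of that definition, here's how a "GPU limited" percentage could be derived from per-frame CPU and GPU times. This is purely a sketch of the general idea, not the Forza benchmark's actual method; the function and numbers are hypothetical.

```python
# Rough sketch: count the frames where the GPU took longer than the CPU,
# i.e. the CPU spent the frame waiting on the GPU. Illustrative only;
# not the Forza benchmark's actual calculation.

def gpu_limited_percent(cpu_ms, gpu_ms):
    """Percentage of frames where the GPU was the bottleneck."""
    gpu_bound = sum(1 for c, g in zip(cpu_ms, gpu_ms) if g > c)
    return 100.0 * gpu_bound / len(cpu_ms)

# 4K: heavy pixel load, so the GPU is the bottleneck nearly every frame
print(gpu_limited_percent([8, 9, 8], [20, 21, 19]))   # 100.0
# 1080p: the GPU finishes first; CPU-side work (draw submission) dominates
print(gpu_limited_percent([12, 11, 12], [6, 7, 6]))   # 0.0
```

Under that reading, a higher resolution should push the stat up, which is why the 4.4-at-4K number in the PDF looks backwards.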
 
The discrepancy here may be because Horizon 4 uses all of the 6950X's cores efficiently, while the 7700K has far fewer cores to work with.
Not saying you're wrong, but both the CPU simulation and render scores were higher with the 7700K than the 6950X? It'd be nice if more games started using all the cores; I've been itching to scrap the 7700K and do a build with a 1950X, but I just can't justify it yet.
 
Can it be unlocked to the full game? Or does the full version require a separate installation, putting you well over 56 GB in total?
 
GPU limited means just that: the system's CPU could push more fps, but the GPU can't take them. I.e., the CPU could push 200 fps but the card can only do 100, therefore it's GPU limited.
 
GPU limited means just that: the system's CPU could push more fps, but the GPU can't take them. I.e., the CPU could push 200 fps but the card can only do 100, therefore it's GPU limited.
Is it possible this game is more GPU limited at 1080p than at 4K?
 
Doubt it. If you're going by that first pic you posted, it was locked at 60 fps, so that will change things.
Derp. I'll have to run it again.

Just ran it at the 4K "ultra" preset with the frame rate unlocked; GPU-limited percentage mystery solved:
[screenshot: FH4-4ku.png]
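For intuition on why a 60 fps lock masks the stat, here's a small hypothetical model: when the GPU finishes its frame inside the vsync budget, it idles until the next refresh and doesn't count as limiting anything. The function and frame times below are made up for illustration, not taken from the benchmark.

```python
# Sketch: with vsync / a 60 fps cap, the GPU often finishes early and idles
# until the next refresh, so few frames register as "GPU limited" even in a
# GPU-heavy scene. Hypothetical model, not the benchmark's actual logic.

FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at a 60 fps cap

def gpu_limited_pct(cpu_ms, gpu_ms, capped):
    """Percentage of frames where the GPU was the bottleneck."""
    gpu_bound = 0
    for c, g in zip(cpu_ms, gpu_ms):
        if capped and g < FRAME_BUDGET_MS:
            continue  # GPU met the vsync budget: it idled rather than limited
        if g > c:
            gpu_bound += 1
    return 100.0 * gpu_bound / len(cpu_ms)

cpu = [8, 9, 8, 9]      # CPU frame times (ms)
gpu = [14, 15, 14, 15]  # GPU slower than the CPU, but still under 16.7 ms

print(gpu_limited_pct(cpu, gpu, capped=True))   # 0.0  (cap hides the GPU bound)
print(gpu_limited_pct(cpu, gpu, capped=False))  # 100.0 (uncapped: GPU bound)
```

That would explain why the stat only pegged once the frame rate was unlocked.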
 
They're in stock right now, at that price. They're pretty consistent. They also have Vega 56s for $399. I picked up two of those.
Damn, you're right! If only CrossFire support were better, I'd grab another Vega 64. Good find.
 
A question for those of you more versed in GPU technology than I am: why does AMD focus on this level of future-proofing? It's nice to see them gain an advantage when the technology is finally used, but isn't that kind of thinking always going to shoot yourself in the foot, considering how long adoption of things like DX12 takes?
Because the bulk of games will run fine in DX9, DX10, DX11, or even OpenGL. It's about when you want something extra as a game developer.
Some developers make engines like Frostbite https://en.wikipedia.org/wiki/Frostbite_(game_engine)

This saves a lot of time if you have multiple platforms or multiple games. The code between platforms isn't 100% the same, but most of it is reusable (rather than starting from scratch).
Even if people think it's extra effort, DX12 tends to be more efficient.
 