Ryzen 7 1700 + 1080 Ti low GPU utilization in Fortnite @ 144 Hz

buttons (2[H]4U) · Joined: Oct 12, 2011 · Messages: 2,174
This weekend I sold my gaming computer on Craigslist to prepare for the upcoming 3000-series AMD chips. The guy also wanted a monitor, so I sold him my spare 24" 144 Hz one my kids were using with an RX 560 2GB. Gaming at 144 fps is not something I've ever spent any time troubleshooting or tweaking for. As long as I could max the settings and maintain 60+ fps, I never felt there was an issue -- which is exactly what I was getting with the 1080 Ti.

However, this guy is trying to maintain 144+ fps. He is using esports settings (low graphics settings + long view distance), and GPU utilization sits around 30%. When we run benchmarks, the GPU hits 100%. We tried overclocking the CPU from 3 GHz to 3.8 GHz, but it didn't seem to affect GPU utilization. I have one 1080 Ti left, but it's currently in my old computer with an FX-6300 @ 4 GHz, and I'm actually seeing the same behavior in Fortnite.

In Creative mode I get 150+ fps with 100% CPU utilization and 65-75% GPU utilization. However, if I switch to Battle Royale mode, my FX-6300 utilization drops to 65%, my GPU drops to 40%, and FPS drops to around 80. I have Windows power settings set to High Performance, the NVIDIA Control Panel set to Maximum Performance, and the Fortnite FPS limiter set to unlimited. I've tried a couple of different driver revisions and a fresh install of Windows. I spent a few hours googling this topic and I see tons of people with the same issue, but I never see a "fix." Any ideas?
 
Only responding because no one has yet, but I think this is one of those cases where you need an Intel CPU. The utilization numbers seem low, but I'm sure there are reasons -- memory, or perhaps some part of the CPU gets hammered while the rest of it sits waiting.
Not very helpful or technical, but it's reasonably well documented that AMD chips, even Ryzen to date, struggle with games above 120 fps.
I have read several times that if a solid 144+ fps is a requirement, you need team blue.
 
You need to check per-core CPU utilization. My bet is one core is pegged at or near 100%, and that's where your bottleneck is.
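To illustrate the point: the overall CPU % can look healthy while a single core is maxed out. A quick sketch with made-up per-core numbers (not from any real capture) shows how the summary figure hides a pegged core:

```python
# Hypothetical per-core readings (percent) from a hardware monitor,
# e.g. an 8-core CPU while a game's main thread is pegged on core 0.
per_core = [98, 35, 22, 18, 15, 12, 10, 8]

overall = sum(per_core) / len(per_core)          # what a summary view shows
bottlenecked = [i for i, u in enumerate(per_core) if u >= 95]

print(f"overall CPU: {overall:.0f}%")            # 27% -- looks like plenty of headroom
print(f"cores at/near 100%: {bottlenecked}")     # the real story: core 0 is pegged
```

So a game can be hard CPU-bound while Task Manager's summary reads under 30%.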
 
The Ryzen is using 3000 CL16 memory with XMP set in the BIOS. My FX-6300 is using 1600. I just thought it was odd that Creative mode hits 100% on my FX-6300. My buyer is having other issues with the system, so I am buying it back from him. I never had any issues with it for the eight months or so I had it, and truthfully I'll be glad to have my other 1080 Ti + AX1200 back. I know Intel chips have better single-threaded performance and overclock better, but on the NVIDIA/Fortnite forums tons of Intel users are reporting the same issue.

Who wants to see an FX-6300 with dual 1080 Tis in SLI benchmarks? :-P

Memory: G.Skill Trident Z RGB 3000 CL16
 
Which resolution? What happens when you drop the resolution even lower? Does GPU usage decrease, or does FPS increase?
 
Who wants to see an FX-6300 with dual 1080 Tis in SLI benchmarks? :-P
Haha, yes!
I just sold my FX-6300 and, truth be told, it was a very good budget gaming chip. Still is, as far as I can tell. I still have an 8320E in the house, but it's in a crappy motherboard.
 
Which resolution? What happens when you drop the resolution even lower? Does GPU usage decrease, or does FPS increase?

I am gaming on a 65" Vizio P-Series at 1920x1080 @ 120 Hz. The other guy was doing 1920x1080 @ 144 Hz. Dropping the settings to low apparently had no impact. I get it back Saturday; I'll update then.
 
Haha, yes!
I just sold my FX-6300 and, truth be told, it was a very good budget gaming chip. Still is, as far as I can tell. I still have an 8320E in the house, but it's in a crappy motherboard.

You know, I can't get over how well it does. I paid maybe $40 for it new from Micro Center. I have it set to 4 GHz with no power savings enabled. It seems really snappy, and I personally have no problems using it... but I don't really want to be sitting on it while we march on to third-generation Ryzen. People thought it was a paperweight before first-gen Ryzen was even here.
 
Do you have the power profile in the NVIDIA settings set to Maximum Performance instead of Balanced? Also, try setting the system to use four cores only, disabling the second CCX, and see if that helps as well. And I assume you have vsync disabled in game, etc., and the monitor actually set to 144 Hz. I've derped those a few times.
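If you want to test the single-CCX idea without touching the BIOS, Windows can launch a process pinned to chosen logical cores with `start /affinity <hexmask>`. Here's a rough sketch for building that mask; the assumption that logical cores 0-7 map to the first CCX on a Ryzen 7 1700 (8C/16T) should be verified against your own system's topology:

```python
# Build an affinity bitmask for a list of logical core indices.
# Assumption: on a Ryzen 7 1700 with SMT, logical cores 0-7 belong to CCX0.
def affinity_mask(cores):
    mask = 0
    for c in cores:
        mask |= 1 << c
    return mask

ccx0 = list(range(8))                  # logical cores 0-7
mask = affinity_mask(ccx0)             # 0xFF
# Example launch line (executable name is illustrative):
print(f"start /affinity {mask:X} FortniteClient-Win64-Shipping.exe")
```

This is reversible per launch, unlike a BIOS CCX disable, so it's a cheap way to check whether cross-CCX latency is part of the problem.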
 
I wonder if disabling the High Precision Event Timer in Device Manager would have any effect... I guess I would at least try it.
 
I got the computer back today. I reinstalled Windows, ran updates, and installed fresh drivers (all of which he claimed to have tried). I ran Fortnite at 2560x1440 with settings on Epic and saw GPU utilization of 90+%. With the FPS cap set to 160, I was often capped, but it certainly dipped into the 80s sometimes, more so in Battle Royale mode than Creative. While gaming, overall CPU utilization was around 30%, but CPU core 10 was at 100% -- I suspect that's my limitation.

I think he was expecting FPS would never dip below 144 in Fortnite, and for that to be a reality we may need a 5 GHz Intel.
 
Ja, you're going to be more CPU-dependent if you're running at a lower resolution with low graphics settings. The guy you sold it to is basically using graphics settings that shift all the burden onto the CPU.
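One way to see why low settings shift the cap onto the CPU: at 144 Hz every frame has roughly a 6.9 ms budget, and FPS is set by whichever of CPU or GPU takes longer per frame. A back-of-the-envelope sketch with made-up per-frame times:

```python
# FPS is limited by the slower component per frame.
# The millisecond figures below are illustrative, not measured.
def max_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

budget_144 = 1000.0 / 144                     # ~6.94 ms per frame to hold 144 fps
print(f"{budget_144:.2f} ms budget at 144 Hz")

# Low settings: GPU work per frame is tiny (say 3 ms), but the game's
# main thread (simulation + draw calls) takes, say, 12 ms.
print(f"{max_fps(12.0, 3.0):.0f} fps (CPU-bound)")   # ~83 fps, matching the dips
print(f"GPU busy roughly {3.0 / 12.0:.0%} of each frame")  # ~25%, i.e. low utilization
```

Dropping graphics settings shrinks the GPU's 3 ms but does nothing to the CPU's 12 ms, which is why lowering settings "had no impact."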
 
My buddy is obsessed with Fortnite on an i7-7700. He often gets below 100 fps in certain areas.
 
You are setting the settings so low that the CPU is probably doing all the work, not the GPU.
 
I just downloaded Fortnite and played it on my 1080 Ti and 2950X Threadripper.

The game runs really well as far as I can tell.

I NEVER play this game, so I have no idea what I am doing other than playing FPS games and PUBG.

Clearly there is little CPU usage and little GPU usage -- in fact, not enough GPU usage to trigger a boost clock on my 1080 Ti. This game is so underoptimized for AMD it's not even funny. I'm using like 16% CPU and low GPU.

 

I think with a 2950X you are supposed to turn on Game Mode in the BIOS, which disables half your cores. Otherwise your gaming experience will be generally forgettable.
 

Nah... I use Process Lasso. That's about as good as it gets for gaming on AMD. The game is pure shit as far as AMD optimization goes. If you look, lots of YouTube videos suggest the same thing. I know I said the game runs well, but after watching more videos on the subject last night, the pattern is there. The people that made Fortnite couldn't give two pisses in the wind about AMD optimizations. Glad I don't find the game fun at all.
 
The 1080 Ti is completely bottlenecked by the 1700. I have the same combo, but I play at 1800p at 60 fps and have noticed a number of games are CPU-bound even at 1800p with everything maxed.
 