New RX570 4G - massive frame drops?

MrGuvernment · Fully [H] · Joined: Aug 3, 2004 · Messages: 21,817
Hello all,

Recently bought an RX 570 4G. I'd wanted one ages ago, but then the ETH mining craze took off and left places with no stock.

First, here are my system specs, in case one piece of the build is causing my issues.


Before this I had an AMD RX 550 to hold me over, and I don't recall these massive frame drops (I still have the 550, so I'm going to toss it back in to see how TF2 does).

I'm playing two games thus far:

  • Team Fortress 2
  • BF1

I have tested both games on my LG 23" 2560x1080 widescreen and on my BenQ 24" 1080p screen, in both windowed and full-screen modes, with the same results on each.

As we know, TF2 is not a GPU-hungry game and an RX 570 should eat it alive... and yet cl_showfps shows upwards of 150+ FPS and then BAM! It drops down to 30-40 FPS and then spikes back up. I even turned off AA and AF, same issue, and the Sapphire tools don't even show the GPU hitting 100% usage.
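For anyone who wants the same readout, this is the stock Source-engine counter; a minimal sketch of a tf/cfg/autoexec.cfg that turns it on (the net_graph line is an optional extra suggested here, not something used in the tests above):

```
// tf/cfg/autoexec.cfg -- executed automatically when TF2 launches
cl_showfps 1   // on-screen FPS counter referenced above
net_graph 1    // optional: fuller overlay with FPS, ping, and frame times
```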

BF1 is more intense, but this card should handle it at Ultra with most settings at 1080p; yet even with everything on medium, I'm dropping from a steady 59/60 down to 30-40 out of nowhere.

I originally installed the latest Cats (17.11.2), and since some people said they got stuttering with that release, I dropped down to 17.11.1. No change.

I have a clean install of Windows 10 1709 on my Patriot 256GB SSD; my games run off the 960GB Crucial.

As noted above, I do have the card in an x8 slot, but I would think that should still be plenty for an RX 570.

Any thoughts? Does anyone know of any current issues with the RX 570?


 
I think the issue is the CPU being unable to feed the GPU properly; those low-clocked Sandy Bridge cores are certainly a problem for most games.

As you guessed, TF2 relies mostly on single-core/single-thread CPU performance... and the lower the settings you apply, the more the burden shifts onto the CPU.

BF1 should be decent, but the same rule applies: the lower the settings, the higher the CPU burden. Your card should be able to play BF1 at Ultra perfectly, so crank the settings up.

As for the frame drops, I'm pretty confident they're due to the low clocks of the CPUs. You could also try tweaking the affinity: keep the game running on a single CPU and leave the second CPU for other tasks (see the sketch below). Generally speaking, multi-socket systems tend to have issues with games when a thread momentarily migrates from one CPU to the other... and you will not see 100% GPU usage at 1080p with those CPUs.
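For anyone who would rather script the pinning than click through Task Manager every launch, here is a minimal Win32 sketch; the PID is a placeholder you would look up in Task Manager, and the affinity mask would need adjusting to cover exactly one physical CPU on your particular topology:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD pid = 1234;       /* placeholder: the game's PID from Task Manager */
    DWORD_PTR mask = 0x3F;  /* bits 0-5 = first six logical cores; adjust so
                               the mask covers only one physical CPU */

    /* SetProcessAffinityMask requires PROCESS_SET_INFORMATION access */
    HANDLE h = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                           FALSE, pid);
    if (h == NULL) {
        fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* Restrict the process (and all of its threads) to the cores in the
       mask, so the game never bounces over to the second socket. */
    if (!SetProcessAffinityMask(h, mask)) {
        fprintf(stderr, "SetProcessAffinityMask failed: %lu\n", GetLastError());
        CloseHandle(h);
        return 1;
    }

    printf("Pinned PID %lu to affinity mask 0x%llx\n",
           pid, (unsigned long long)mask);
    CloseHandle(h);
    return 0;
}
```

The same effect is available without code: Task Manager's Details tab has a "Set affinity" entry, and from a command prompt you can launch with `start /affinity 3F game.exe` (same hex mask).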
 
Never thought to play with the affinity! Will give that a try now, crank up the settings, and see what happens.

The CPU can do 2.9GHz Turbo, but I'll keep an eye on overall CPU usage as well when playing, to see if any cores spike.

Thanks for a fresh starting point to review from.
 
Well, the CPU could be a cause in TF2: after assigning affinity to just one processor, the drops are nowhere near as bad! Only a few dips below 60.
This was TF2 maxed out on my 2560x1080 LG monitor; you can see the one core gets up there and stays there often!
[Screenshot: kGNXmDB.png]



And GPU usage, even with everything cranked, barely registers. Averages:

[Screenshot: Loa7tvh.png]



Now to try BF1
 
BF1 assigned to one processor: definitely some peaks there. It drops into the 40 FPS range with everything on Ultra @ 2560x1080, so not bad. The GPU ran a bit harder and peaked at 100% usage, but was still low on the overall average.

[Screenshot: OrDjLF9.png]
 
Yep, I think some games have pretty bad issues running on dual-CPU setups; it's not the first time I've heard of it. That, and the fact that you're pretty limited with low CPU clocks. I bet it's pretty decent at workstation tasks! It's possible that NVIDIA drivers work better in those situations... it really depends on the game a fair amount, I bet. You could always try the 17.7.2 driver, though for me 17.11.2 is working nicely with Wolf 2, but of course that's a different game.
 
That they do, and once it was mentioned I did recall that from way back in the day. I would have thought that this CPU could handle TF2 pretty easily, even though Valve games are CPU dependent.
 
Well, it's kind of akin to Fallout 4 on my old 8350 versus my R7-1800X now. Fallout ran fine, but man, those dips and stalls with the 8350 made me put it on the back burner. All other games were fine and issue-free, but I could see the end coming, with new games and my 8350 beginning to falter below 75fps where it had always easily managed before. Then I saw others who moved from FX to the R7/R5 talking about how well Fallout 4 ran, so I bought in, along with my LC Vega 64, about 3-6 months earlier than planned (I'd already bought the wife the 1600 (non-X) in May and saw how effortlessly it ran WoW compared to her previous 7870K). And I gotta say, they were right: Fallout 4 runs absolutely fantastic now. The 8350 would drop to 30fps or just under when moving; the R7 only dips to the mid-80s. I hypothesized that the compression Fallout uses for its files was the culprit, hence the R7's ease and far better performance. Now I just have to see how it goes when I get to Concord (new game).
 
You have a badass setup for sure! It's just that most game devs don't put any effort into coding games for systems like yours, unfortunately... but I think you're on the right track for a decent workaround. It's also possible that AMD's drivers aren't tuned to help setups like yours; admittedly, your dual setup is a tiny minority of the user base. Personally I think it's more the way the games are coded... but I know you'll get it sorted the best it can be.
 
First, check that the power supply's 12V rail is enough to supply what the card asks for; my GTX 770 4GB calls for 46 amps on the 12V rail just to feed the card.

Next, I would power down, pull the card, and do a hard CMOS clear (jumper it and pull the battery).

Where the hell did you pull those numbers from? Never in its life will a card like the GTX 770 need 552 watts... not even a hardware-modded one with an unlimited-power BIOS, overclocked to 1500MHz, would pull that insane amount of power.
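For the record, the arithmetic tying the two figures together is just power = voltage x current; the 552W above is the 46A claim converted to watts, and the reverse conversion shows what a 230W-TDP card (the figure cited later in the thread) would actually demand from the 12V rail:

```latex
P = V \times I = 12\,\mathrm{V} \times 46\,\mathrm{A} = 552\,\mathrm{W}
\qquad
I = \frac{P}{V} = \frac{230\,\mathrm{W}}{12\,\mathrm{V}} \approx 19\,\mathrm{A}
```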
 

They may be pulling it from the box for the GPU. Manufacturers usually overestimate to cover their asses: the box for a GT 710, for instance, recommends a 300 watt PSU even though it can run on a potato.
 

Yeah, all manufacturers recommend a PSU capacity based on many different systems, and those figures are calculated for total system load, not just GPU load, which is OK and understandable. But that's exactly where the difference starts, between GPU needs and system needs: he is stating that his 230W GPU by itself needs to be fed 552 watts, lol. It was actually funny, but I had to ask, because that's how easily a misinformed user can misinform another one.

About the power: well, I can make my old 980 Ti pull as much as 600W when going a bit over 1600MHz, so that power demand is certainly plausible for certain cards and conditions, but never on a GTX 770/680.
 
It is an MSI Twin Frozr GTX 770 4GB OC Gaming card.

My Sapphire Tri-X R9 290X New Edition has a 6-phase power design to deliver 375 watts to the Hawaii chip; it doesn't throttle and runs 1020/1350 clocks with 2x 8-pin connectors. I have no idea what it draws at gaming 3D clocks paired with, say, a Xeon X5660 clocked at 4.2GHz under load, but the 12V rail needs to be powerful enough to feed a card that can draw 375 watts on its own; an undersized rail could burn out sockets, and it needs to be clean 12V power to give you a 100% stable platform before people start blaming drivers or platform issues.

This has always been the golden rule of designing a true gaming platform, a 100% rock-solid build at any budget: work out the power needs of the planned build and buy the correct power supply for the system's needs. (YOU ALWAYS NEED HEADROOM OVER TOTAL SYSTEM POWER DRAW; your power supply will love you and last 8+ years.) So the 95-watt X5660, overclocked, should eat another 150 watts easily with its onboard water cooler, and then there are all the other parts that need their power as well.

Stop smoking whatever you're using, derailing threads with off-topic, useless misinformation, spreading ignorance at the highest levels, and please stop embarrassing yourself. You know, you write exactly like an old banned guy who loved to spread how much he loved his new Sapphire 290X that "doesn't throttle"; pretty weird, almost the exact same words coming from another guy. =) You know, the only thing left to complete that guy's sentences, spread across a lot of threads, is "this is the same PCB used on the new 390X". You also name all modern GPUs "XXX-GTX" in exactly the same way; pretty weird coincidence, isn't it?

Also, you know a nice fact? I have the Sapphire 390X Tri-X, and even overclocked to 1150MHz with its 8GB overclocked to 6500MHz, the card comes close to pulling 360W, which is just a bit less than 30 amps on the 12V rail. And you're here still saying a card like the GTX 770 needs 46 amps, HAHAHA, what a joke... not even the most power-hungry GPU on the market right now (RX Vega 64) needs that insane amount of power.
 