Ryzen 3 1200 with RX 570 crushed FX 8150 with RX 580 on Firestrike

OhSigmaChi

Limp Gawd
Joined: Jan 27, 2019
Messages: 194
So, I was upgrading a friend's computer; he had an old iBuypower machine with an FX 4100, an HD 7750 (awful), and 8 GB of 1033 MHz RAM. This thing couldn't even spell Firestrike, let alone run it (it wouldn't even run the Heaven Extreme benchmark without crashing).

I was able to get him a new PSU, FX 8150, 16GB of 1866 DDR3, and an XFX RX 580 8GB for just a hair over $200. I kept his case, mobo and HDD (he's getting an SSD right now at Microcenter).

What's blowing my mind is that his machine could not keep up with the cheapest Ryzen build I have ever done. Both machines had very mild CPU/GPU overclocks, and the R3 1200 with an ASUS Strix RX 570 and 8 GB of DDR4-2666 just ran away with the Firestrike results...

"Talk amongst yourselves"
[Attachments: Ryzen 3 Firestrike.jpg, FX 8150 firestrike.jpg]
 
I mean, yeah. The FX Bulldozer/Excavator chips were really, really not good at gaming, despite what AMD's marketing dept said. My most extreme example is from when I was into Minecraft: my Surface Pro 2 (2C/4T i5, 1.6 GHz base/2.6 GHz boost, 4 GB memory, HD 4400 iGPU) had a significantly better framerate and less lag & stutter than my 8320 @ 4.2 GHz all-core with 16 GB and a Radeon 290X. Yikes. The FX system was hell on wheels for photo and video editing, though!
 
The Bulldozer arch was OK at best, but you really, really, really need to overclock those chips as much as you can to get the best performance out of them. My FX 6300 does so much better at 4.4 GHz on all cores than at stock. The problem is, no matter what you do, the minimum frames still sucked compared to Intel chips of the time.
 
So, I was upgrading a friend's computer; he had an old iBuypower machine with an FX 4100, an HD 7750 (awful), and 8 GB of 1033 MHz RAM. This thing couldn't even spell Firestrike, let alone run it (it wouldn't even run the Heaven Extreme benchmark without crashing).

I was able to get him a new PSU, FX 8150, 16GB of 1866 DDR3, and an XFX RX 580 8GB for just a hair over $200. I kept his case, mobo and HDD (he's getting an SSD right now at Microcenter).

What's blowing my mind is that his machine could not keep up with the cheapest Ryzen build I have ever done. Both machines had very mild CPU/GPU overclocks, and the R3 1200 with an ASUS Strix RX 570 and 8 GB of DDR4-2666 just ran away with the Firestrike results...

"Talk amongst yourselves"
The 570 is only a little bit worse than the 580, and it has a much better CPU architecture backing it up with that Ryzen. So yeah, no surprise.
 
So what is REALLY killing me is my oldest kiddo's computer has an Athlon X4 860K OC'd to 4.4 GHz and an RX 580 8GB, and it is maybe only 200 points slower in Firestrike than the newer FX chip (FM2+ vs. FX).

I mean, if you're gaming there's little reason to go AM3(+) over FM2+ if you're super budget constrained.

And, yeah, I know one benchmark doesn't tell the whole story, but I did play Metro 2033 Redux on both the FX and the Ryzen 3 and felt the Ryzen was probably smoother. I haven't played Metro on my son's computer.
 
IIRC the IPC on Ryzen is 40% higher than the old AMD chips, so you would need around 1.6 GHz higher clockspeed to compensate for the difference with a 4.0 GHz Ryzen chip (this is approximate napkin math, just to give a rough idea).
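The napkin math above checks out; here is a quick sketch of it (the ~40% IPC figure and the 4.0 GHz clock are the post's rough estimates, not measured values):

```python
# Napkin math from the post: if chip A has X% higher IPC than chip B,
# chip B needs roughly (1 + X) times chip A's clock to match it.
# This ignores memory latency, scaling limits, etc. -- rough idea only.

def equivalent_clock(base_clock_ghz: float, ipc_advantage: float) -> float:
    """Clock the lower-IPC chip would need to match the higher-IPC one."""
    return base_clock_ghz * (1 + ipc_advantage)

ryzen_clock = 4.0   # GHz, the post's example
ipc_gap = 0.40      # Ryzen ~40% higher IPC than FX (the post's estimate)

fx_needed = equivalent_clock(ryzen_clock, ipc_gap)
print(f"FX would need ~{fx_needed:.1f} GHz")  # ~5.6 GHz, i.e. ~1.6 GHz higher
```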
 
So what is REALLY killing me is my oldest kiddo's computer has an Athlon X4 860K OC'd to 4.4 GHz and an RX 580 8GB, and it is maybe only 200 points slower in Firestrike than the newer FX chip (FM2+ vs. FX).

I mean, if you're gaming there's little reason to go AM3(+) over FM2+ if you're super budget constrained.

I have some old lanparty gaming rigs I keep around for when friends come over. One has an FX 8350, and one has an A10 7860K. On modern games the 8350 gets a good bit better framerate. Those extra threads make a big difference. I was benchmarking Far Cry 5 specifically. I was meaning to compare to an i3 7100 as well; maybe tonight I can get those numbers and post them.
 
This is why I always point out that you should not ignore your CPU, and I get extremely frustrated at the people who complain about a new game not working on their 11-year-old processor. You don't need to upgrade it as often as the video card, but it shouldn't be neglected just because the loudest voices on the internet say that any old CPU is good for gaming so long as you have a strong video card. My upgrade philosophy is a new video card every 2 years, and a new CPU or platform upgrade every 6-7.
 
No surprise here, but that doesn't mean you need to run out and buy a new machine, regardless. :) (Like I ever need an excuse, though. :D ) A Ryzen-based Xbox will be really good; looking forward to it.
 
This is why I always point out that you should not ignore your CPU, and I get extremely frustrated at the people who complain about a new game not working on their 11-year-old processor. You don't need to upgrade it as often as the video card, but it shouldn't be neglected just because the loudest voices on the internet say that any old CPU is good for gaming so long as you have a strong video card. My upgrade philosophy is a new video card every 2 years, and a new CPU or platform upgrade every 6-7.


By that same logic, AMD rigs bought 4 years ago (pre-Ryzen) are still good for another 2 or 3. There is more involved here than just the time frames...
 
By that same logic, AMD rigs bought 4 years ago (pre-Ryzen) are still good for another 2 or 3. There is more involved here than just the time frames...

Yeah, I don't really put a timestamp on it; I just continually tinker with most of my systems anyway. I generally will keep a mobo until I have hit the end of the line with economical CPU upgrades (both kids had A10-5800K CPUs before they got Athlon X4s). I will keep them until they can't sustain 60 fps on low settings or they introduce unplayable stutter.

But it looks like I will just skip FX chips altogether and get a used first-gen Ryzen mobo for them and a couple of R3 1200s. That is probably a few years' worth of second-hand CPU upgrades on that path.
 
I have some old lanparty gaming rigs I keep around for when friends come over. One has an FX 8350, and one has an A10 7860K. On modern games the 8350 gets a good bit better framerate. Those extra threads make a big difference. I was benchmarking Far Cry 5 specifically. I was meaning to compare to an i3 7100 as well; maybe tonight I can get those numbers and post them.

I'd love to see those benchmarks! What kind of GPUs are you running with the FX and A10?
 
RX570, RX580, and a GTX 1060 have been swapped around. Also a 7950 and 7970. I'll finish running numbers and try to organize them.
 
By that same logic, AMD rigs bought 4 years ago (pre-Ryzen) are still good for another 2 or 3. There is more involved here than just the time frames...

Need to not start with crap, you mean?

So what is REALLY killing me is my oldest kiddo's computer has an Athlon X4 860K OC'd to 4.4 GHz and an RX 580 8GB, and it is maybe only 200 points slower in Firestrike than the newer FX chip (FM2+ vs. FX).

I mean, if you're gaming there's little reason to go AM3(+) over FM2+ if you're super budget constrained.

They're both Bulldozer, so you should expect them both to be equally slow... ;)

[Bulldozer was a regression in performance compared to its predecessor, much like the Pentium IV, but unlike the Pentium IV, it didn't scale very far]
 
The Piledriver arch seems to be better than the x1xx series, but this surprised me. Maybe try playing games instead of playing benchmarks... My FX 6300 was basically on par with my newer Ryzen 1600. But I'm mostly on well-threaded games. Currently I have an 8320 w/ 580 beside a 1600 w/ 580 and they perform nearly identically in BFV.
Motherboards are a big problem with the FX series; a bad one will have your CPU throttled all the time.
 
RX570, RX580, and a GTX 1060 have been swapped around. Also a 7950 and 7970. I'll finish running numbers and try to organize them.

You and I have a very similar stable. I have a couple of R9 290Xs lying around not doing anything... kinda want to build a "retro" Crossfire rig.
 
So what is REALLY killing me is my oldest kiddo's computer has an Athlon X4 860K OC'd to 4.4 GHz and an RX 580 8GB, and it is maybe only 200 points slower in Firestrike than the newer FX chip (FM2+ vs. FX).

I mean, if you're gaming there's little reason to go AM3(+) over FM2+ if you're super budget constrained.

And, yeah, I know one benchmark doesn't tell the whole story, but I did play Metro 2033 Redux on both the FX and the Ryzen 3 and felt the Ryzen was probably smoother. I haven't played Metro on my son's computer.
Reading this, it sounds like there is a problem with the 8150 system...
 
Need to not start with crap, you mean?



They're both Bulldozer, so you should expect them both to be equally slow... ;)

[Bulldozer was a regression in performance compared to its predecessor, much like the Pentium IV, but unlike the Pentium IV, it didn't scale very far]

Actually, the 860K should have better IPC than the 8150 due to the newer Piledriver cores. FM2+ is newer than AM3+.
 
Reading this, it sounds like there is a problem with the 8150 system...

It runs productivity loads like a scalded dog (I'm Southern; it means very fast). It evidently just doesn't have the IPC to outgun an Athlon X4 860K with a 4.4 GHz all-core OC.

Actually, the 860K should have better IPC than the 8150 due to the newer Piledriver cores. FM2+ is newer than AM3+.

Interesting, I wasn't aware of that.
 
Actually, the 860K should have better IPC than the 8150 due to the newer Piledriver cores. FM2+ is newer than AM3+.

Bulldozer family; AMD didn't pull much out of the core and it was slower (than their own products) at introduction.
 
Bulldozer family; AMD didn't pull much out of the core and it was slower (than their own products) at introduction.

Agreed; you can only do so much with the Bulldozer arch no matter how many revisions. It definitely didn't have the jump in IPC over the original BD CPUs that was claimed.

PS: my bad, the 860K had Steamroller cores, which are newer than Piledriver cores.

STEAMROLLER
AMD estimated that these improvements would increase instructions per cycle (IPC) by up to 30% compared to the first-generation Bulldozer core while maintaining Piledriver's high clock rates with decreased power consumption. The final result was a 9% single-threaded IPC improvement and an 18% multi-threaded IPC improvement over Piledriver.
 
So for single-core IPC performance, is the Athlon X4 860K > FX 8150?

In theory, yes. In actual use, probably a wash. The 8150 has twice as many cores, so I'd stick with that. To me that would be more important than the small IPC gains.

The 860K is a more entry-level CPU with less cache than the 8150.
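To put rough numbers on the cores-vs-IPC tradeoff (napkin math only: the ~9% Steamroller-over-Piledriver figure quoted above is reused here, the 8150's cores are treated as the 1.0 baseline even though the true gap over Bulldozer is a bit larger, and perfect scaling across all cores is assumed, which flatters the 8-core chip):

```python
# Toy throughput model: cores x relative IPC x clock. Assumed inputs:
# FX 8150 IPC = 1.0 baseline; 860K (Steamroller) ~ 1.09, using the ~9%
# single-thread gain over Piledriver quoted above. Perfect core scaling
# is assumed -- real games rarely scale that well across 8 cores.

def throughput(cores: int, rel_ipc: float, clock_ghz: float) -> float:
    """Idealized multi-threaded throughput, in arbitrary units."""
    return cores * rel_ipc * clock_ghz

fx_8150 = throughput(cores=8, rel_ipc=1.00, clock_ghz=3.6)  # stock base clock
x4_860k = throughput(cores=4, rel_ipc=1.09, clock_ghz=4.4)  # the post's OC

print(f"FX 8150: {fx_8150:.1f}, 860K: {x4_860k:.1f}")
# With every core loaded, the 8150 wins on paper. In lightly threaded
# games, though, the per-core figure (IPC x clock) is what matters, and
# there the OC'd 860K comes out ahead -- matching the Firestrike result.
```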
 
Firestrike is primarily for measuring epeen anyway. How's it run your favorite game?
 
Agreed; you can only do so much with the Bulldozer arch no matter how many revisions. It definitely didn't have the jump in IPC over the original BD CPUs that was claimed.

PS: my bad, the 860K had Steamroller cores, which are newer than Piledriver cores.

I should state beforehand that I was impressed with the direction AMD took with Bulldozer: the idea of 'hardware hyperthreading', essentially, looked promising, right up until the benchmarks came out.

When you look at the progression of the Bulldozer family, it looks a lot like K7 --> K10, and a lot like Zen -> Zen+ -> Zen 2. In all three cases, the 'core' of the architecture changed very little, but the 'uncore' around it changed to better support the core. For example, for those with good Zen (say, 1700X) samples, good 3200C14 DDR4, and a good motherboard, even Zen 2 isn't really a performance improvement in terms of IPC. The problem is almost no one could get Zen to run like that; Zen+ improved a little, and with Zen 2, AMD threw boatloads of cache at the problem.

The problem with Bulldozer is that they did the same thing Intel did with NetBurst (the P4): they relaxed IPC at the heart of the design with the intent of increasing clockspeed. Which worked! Except that they relaxed IPC so much that the K10 was faster, while Intel made a huge jump in IPC (at lower clockspeeds) with the Core 2. And in typical AMD fashion, they couldn't keep power draw under control, because they were running their parts at the ragged edge out of the foundry while their competition was refining power draw.

We went from the height of the P4 era with literally no reason for an enthusiast to buy an Intel product to the height of the Core era with no reason to buy an AMD product ;).


[end trip down memory lane]
 
Need to not start with crap, you mean?

Well, crap is relative. I replaced my FX-8350 with a Ryzen 7 1700 at the Ryzen launch. The FX-8350 was not a hindrance to the games I play and was actually an asset to the machine's other purpose, encoding my video collection. It served that purpose very well, and the upgrade from my Phenom II X6 1090T was simply the cost of the CPU. That CPU also lasted an additional 2 years as the core of my Media Center hosting Plex before I inherited the Dell PowerEdge R515 that now serves that purpose. It should be stated that the FX-8350 is particularly well-suited for encoding Plex streams and was utter overkill in that role.

Could I have gotten a faster Intel chip? Absolutely. Would I have paid a whole lot more for it? ABSOLUTELY. I know there was faster stuff out there, but the cost/performance ratio for me was right. I got my money out of it, plus some. I can't call that "crap."
 
Could I have gotten a faster Intel chip? Absolutely. Would I have paid a whole lot more for it? ABSOLUTELY. I know there was faster stuff out there, but the cost/performance ratio for me was right. I got my money out of it, plus some. I can't call that "crap."

Let me go ahead and admit that 'crap' is a bit of hyperbole on my part. AMD was giving those CPUs away, so at those prices, well, who cares. I built a system with a used one for a sister to use for content creation and really could not complain.
 
I'd love to see those benchmarks! What kind of GPUs are you running with the FX and A10?

Here are some benches.

All tests were with Far Cry 5 and recent drivers. Resolution was 1920x1080.

i3 7100 was stock, since it is locked.

R5 2600 was stock.

FX-8350 was clocked to 4.2 GHz with RAM slightly overclocked as well.

7860K was also overclocked to 4.2 GHz with RAM overclocked as well. (I forget exactly what clock speed.)

GPU was an ASUS RX 480 4GB that at stock does not maintain clocks very well. In game, the core sits around 1150 MHz.


[benchmark charts]



I also did a mild overclock on the Ryzen 2600 to ~4.0 GHz, and then tested these GPUs.

RX 480 OC was -43 mV, 126% TDP, 1900 RAM. Raising the TDP brought the core up to a solid 1300 MHz. Could have tried higher, but the fan was getting loud.

RX 570 OC was 126% TDP, 1330 core, 1900 RAM. Likely could have clocked higher.

1060 OC was 110% TDP, +220 core, +400 RAM. Any higher and this card was crashing.

7970 OC was 1135 core, 1575 RAM. Huge boost here.


[benchmark charts]
 

Yeah, I was somewhat surprised how close it was, considering it came out in 2012.

I have four of them (from two different Crossfire systems back in the day). They are actually quite capable, but the 3 GB of VRAM starts to lose steam in new AAA titles with very long draw distances.
 