Gears of War 4 i7-6700K vs FX-8370

cageymaru
Since we haven't had a drama thread in a while, and there is a hurricane headed to flood my area, I figured why not post some i7-6700K vs FX-8370 benchmarks in Gears of War 4. It's the end of the world as we know it; might as well go out happy. You'd need a translator for the text, but I'm sure you all can read charts without one. 1080p, 1440p, and 4K. Go to their website for the charts.

Gears of War 4 benchmarked with DirectX 12 (Page 3)

GameGPU did more testing and it seems that the game will use 16 threads! Those monster Intel processors finally get a workout. Not going to lie; that i7-5960X makes my heart all fluttery inside.
Gears of War 4 GPU test | Action / FPS / TPS | GPU Test


[Attached: GameGPU CPU benchmark charts]

Anyway, have fun reading.
 
It seems to use only 6-8 cores effectively. Don't get fooled by Windows throwing threads around.

[Attached: core/thread scaling test chart]
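
One way to check that yourself, rather than eyeballing Task Manager: look at the sum of per-core utilization, not the spread. One busy thread being bounced around by the scheduler lights up every core a little but still sums to roughly one core of work. A minimal sketch, assuming Linux and /proc/stat (on Windows the same idea works via performance counters); run it while the game is loaded:

```c
/* Sketch: tell real multi-core scaling apart from the scheduler
 * "throwing threads around" by summing per-core busy time.
 * Eight cores at ~12% each is ONE core of actual work migrating,
 * not 8-wide scaling.  Build: gcc -O2 corecheck.c -o corecheck */
#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define MAX_CPUS 64

typedef struct { unsigned long long busy, total; } snap_t;

static int snapshot(snap_t *s)
{
    FILE *f = fopen("/proc/stat", "r");
    char line[256];
    int n = 0;

    if (!f) return -1;
    while (fgets(line, sizeof line, f) && n < MAX_CPUS) {
        unsigned long long user, nice, sys, idle, iow, irq, sirq;
        /* per-core lines look like "cpu0 ..."; skip the aggregate "cpu " */
        if (strncmp(line, "cpu", 3) != 0 || line[3] < '0' || line[3] > '9')
            continue;
        if (sscanf(line, "cpu%*d %llu %llu %llu %llu %llu %llu %llu",
                   &user, &nice, &sys, &idle, &iow, &irq, &sirq) != 7)
            continue;
        s[n].busy  = user + nice + sys + irq + sirq;
        s[n].total = s[n].busy + idle + iow;
        n++;
    }
    fclose(f);
    return n;
}

int main(void)
{
    snap_t a[MAX_CPUS], b[MAX_CPUS];
    int n = snapshot(a);
    double sum = 0;

    sleep(2);                          /* sampling window */
    snapshot(b);
    for (int i = 0; i < n; i++) {
        double pct = 100.0 * (double)(b[i].busy - a[i].busy)
                           / (double)(b[i].total - a[i].total);
        printf("cpu%-2d %5.1f%%\n", i, pct);
        sum += pct;
    }
    printf("sum   %5.1f%%  (~%.1f cores of actual work)\n", sum, sum / 100.0);
    return 0;
}
```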
 
The only thing I got out of all this horse shit is the fact that AMD has always been way behind. A 2600k can stay right with it.

What is the point of this thread? To try and prove AMD is still relevant in the CPU market? Big fail.
 
Wait, I was told that a dual-core i3 can take an eight-core AMD in all situations. What I see is the $150 AMD 8350 beating the $250 Intel 6600 in this title; now that can't be right, can it? Might as well forget this happened and chalk it up to any number of canned responses (obviously the generic 'console port' excuse will be thrown around, along with questioning the validity of the tests). I guess this test does show the minimal performance increases Intel has been peddling for years.
 
It is funny, Cageymaru: in your link I saw the 4K numbers on the German website, and they're identical for each CPU across the same graphics cards. But this should not be a surprise; the 2560×1440 numbers were a little lower for the AMD CPU.

That is where GoW 4 is still CPU-bound, I guess.
 
Wait, I was told that a dual-core i3 can take an eight-core AMD in all situations. What I see is the $150 AMD 8350 beating the $250 Intel 6600 in this title; now that can't be right, can it? Might as well forget this happened and chalk it up to any number of canned responses (obviously the generic 'console port' excuse will be thrown around, along with questioning the validity of the tests). I guess this test does show the minimal performance increases Intel has been peddling for years.

By all means, go buy that 8350 and be happy.
 
By all means, go buy that 8350 and be happy.

I wouldn't tell someone to run out and buy an FX processor today, as they are officially 5 years old next week. But going by the numbers, I also don't see a reason to upgrade. I was just posting it to show how little Intel has done to increase their lead in raw gaming power over the past 5 years. It's sad for me, as I really want something to upgrade to. Power savings mean absolutely nothing to me. Spreadsheets loaded just fine on my old Q6600.

A 1TB SSD is a much better upgrade for the average gamer than a new processor if your gear is less than 5 years old, as you can visually see levels and maps load faster in games like Battlefield 4. A GTX 1070 / Fury X / GTX 980 Ti would be a godly upgrade compared to a similarly priced Intel processor. Heck, I'd even go so far as to say that a new 40" TV / monitor or a VR headset would be a much better entertainment value than a new Intel processor. If you have a library of movies, then a NAS would be a better upgrade than a new Intel processor.

Yes, I understand that they went for power savings. I thought they would use the power savings to generate less heat, and that the lower temps would let Intel processors hit 6GHz+ within 5 years at least.

AMD and Intel need to figure out how to get more data flowing through these processors and show consumers why they need to upgrade. I'm tired of twiddling my thumbs at every new processor release. I want to throw some money at one of these companies for a legitimate upgrade in raw power. Legitimate means 25% or more faster at gaming and my other interests than my current CPU. I don't think that's asking for much, as some of my gear is 5 years old.

So you're right, I wouldn't tell someone to buy a new FX processor. I surely wouldn't tell them to "upgrade" to an Intel processor from an FX processor either, as there is very little value in doing so. And everyone should stay away from the i3 processors. Those things should only be used in NUC boxes.
 
CPU upgrades being meaningless when it comes to games is hardly news, but I do get the OP's point. Boredom is as good a reason for some drama as anything.

I never planned to keep my 2500K for so long, but here we are. Maybe things will change once Zen is released. These new multicore console SoCs are paving the way for us all.

Right now, though? Sometimes I forget all about my dev VMs when I game, and it never makes any visible difference. No, I don't do any heavy I/O in them, hence my experience.
 
It seems to use only 6-8 cores effectively. Don't get fooled by Windows throwing threads around.

[Attached: core/thread scaling test chart]

Did WCCF do this test in a CPU-limited scenario (low res, but otherwise highest settings)? Otherwise, it's worthless.

But otherwise it would not be surprising, since these console games are still targeted at 7 cores.

But it still gives a nice boost to AMD owners (who top out at 8 cores) and to 6700K buyers (a 50% increase in minimum performance).
 
Buying one, new or used, over an i3 still makes no sense.

Not the greatest choice for a main rig but perfectly good for an HTPC. I bought a G3258 back in 2014 and I am very happy with it.
 
Not the greatest choice for a main rig but perfectly good for an HTPC. I bought a G3258 back in 2014 and I am very happy with it.
We are talking about cheap entry-level rigs, but neither will be suitable for a mid-range build.
 
Yeah, it would be a terrible choice for gaming. I've seen plenty of recommendations for it over AMD's offerings for gaming rigs, though.
 
Nobody should be recommending the current 32/28nm AMD CPUs for almost any kind of new build; the tech is just too old in this day and age. But it is nice to see my 8150 (5 yrs old) and my 8350 (3 yrs old) still hanging in there in at least a few modern games.
 
It wasn't the average frame rates that scared me off of AMD CPUs but the minimum frame rates. Good to see they stay competitive here, or at least playable.
 
If you're not an FPS whore and don't have a ton of cash to spend, you can get a very good, completely enjoyable gaming experience from an AMD FX-8xxx based system.
With a little OCing this system would be even better. Yes, there is no real upgrade path, but that isn't always needed.

 
Did WCCF do this test in a CPU-limited scenario (low res, but otherwise highest settings)? Otherwise, it's worthless.

But otherwise it would not be surprising, since these console games are still targeted at 7 cores.

But it still gives a nice boost to AMD owners (who top out at 8 cores) and to 6700K buyers (a 50% increase in minimum performance).

You can look up wccftech and see if they say anything.

Consoles are more like 4+3, since the last 3 cores are relatively useless due to the massive penalty of cluster-to-cluster communication at ~200 cycles. You still have i3s beating 6-core FX chips for the same reason.
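
That cluster-to-cluster penalty is easy to measure with a cache-line ping-pong between two pinned threads. A rough sketch, assuming Linux and GCC (build with gcc -O2 -pthread); run it once with both cores in the same module and once with cores in different clusters, and compare the round-trip times (for reference, 200 cycles at the consoles' ~1.6GHz Jaguar clock is roughly 125ns):

```c
/* Sketch: core-to-core round-trip latency via an atomic flag that
 * two pinned threads bounce back and forth.  Linux-specific affinity.
 * Usage: ./pingpong <coreA> <coreB>                                 */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdatomic.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define ROUNDS 1000000

static _Atomic int flag = 0;

static void pin(int cpu)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);
    pthread_setaffinity_np(pthread_self(), sizeof set, &set);
}

static void *ponger(void *arg)
{
    pin((int)(long)arg);
    for (int i = 0; i < ROUNDS; i++) {
        while (atomic_load_explicit(&flag, memory_order_acquire) != 1)
            ;                                            /* wait for ping */
        atomic_store_explicit(&flag, 0, memory_order_release);  /* pong */
    }
    return NULL;
}

int main(int argc, char **argv)
{
    int a = argc > 1 ? atoi(argv[1]) : 0;
    int b = argc > 2 ? atoi(argv[2]) : 1;
    pthread_t t;
    struct timespec t0, t1;

    pthread_create(&t, NULL, ponger, (void *)(long)b);
    pin(a);
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < ROUNDS; i++) {
        atomic_store_explicit(&flag, 1, memory_order_release);   /* ping */
        while (atomic_load_explicit(&flag, memory_order_acquire) != 0)
            ;                                            /* wait for pong */
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    pthread_join(t, NULL);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("cores %d <-> %d: %.0f ns per round trip\n", a, b, ns / ROUNDS);
    return 0;
}
```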

I think the game either lacks some finishing polish on the PC, or someone is trying to make sure there's a sales case for some Coffee Lake later ;)
 
Nenu, I thought you might find this interesting. How is your loop working out? Happy with it? ;)

Mafia 3 unlocked frame rate CPU charts from GameGPU. Mafia 3 is another one of the new breed of console games that will use 16 threads. I hope this trend continues, as that's pretty exciting! Wonder if it will use 20 when running the newest 10-core flagship from Intel?
Mafia III GPU test | Action / FPS / TPS | GPU Test

The FX-9590 looks to be holding its own in newer titles according to the GameGPU charts. I have my FX-9370 running at 4.7GHz while sharing a loop with my RX 480 OC'd to 1480MHz core / 8800 memory. The water temps in the shared loop are 29C idle and peak at 47C or so. It really depends on the game.

[Attached: GameGPU Mafia III CPU benchmark charts]

Here is my RX 480 after running Black Desert Online with really high settings and a thunderstorm happening in-game. I logged in, turned in some quests, waited for the rain to end, dried some fish, and put them on the auction house. Just one pump and a 240 radiator + 120 radiator. The 120 is in there because it came with the box of stuff when I purchased it; really, I don't need it. As you can see, the minimum idle water temp is 27C (really 29C idle, to be honest) and the max peak is 47C. I'm just showing this because the FX-9370 shares the same water loop and it's easier to just read the temps from the RX 480.

The Alphacool Eisbaer 240 CPU AIO Cooler review that was done here on [H]ardOCP tests an excellent cooler, one I would have gotten if I had it to do all over again. I bet it could easily run my entire system at the same temps. With that said, I was looking at an Alphacool 560 Monsta radiator to make a completely silent build. I just can't decide on a theme. I'll probably wait for Zen's performance metrics to come out before I start.
Introduction - Alphacool Eisbaer 240 CPU AIO Cooler Review

[Attached: water temperature log screenshot]
 
Nenu, I thought you might find this interesting. How is your loop working out? Happy with it? ;)

Mafia 3 unlocked frame rate CPU charts from GameGPU. Mafia 3 is another one of the new breed of console games that will use 16 threads. I hope this trend continues, as that's pretty exciting! Wonder if it will use 20 when running the newest 10-core flagship from Intel?
Mafia III GPU test | Action / FPS / TPS | GPU Test

The FX-9590 looks to be holding its own in newer titles according to the GameGPU charts. I have my FX-9370 running at 4.7GHz while sharing a loop with my RX 480 OC'd to 1480MHz core / 8800 memory. The water temps in the shared loop are 29C idle and peak at 47C or so. It really depends on the game.

Cheers, that's great info.

I sorted a basic issue with my loop.
The delidded chip's lid slid forward when applying the CPU lever. Fixing that stopped some cores jumping about 15C, brought average load temps down around 3 to 4C, and allowed a small drop in VCore. Coool.
The block could probably do with a clean by now, but the temps are as good as I hoped. This is using a totally silent 200mm fan on my rad which doesn't really fit; my 250mm fan broke. Some more fans are on the way.
(The block is extremely finely finned and clogs a little, but is great with an external silent, low-speed, high-reliability pump - an Eheim 1048: 600L/h, 1.5m head, 10W. 12.5 years old now and still like new. Wow, didn't realise that!)

The Mafia results are very interesting.
It's good to see that stock 4+4 and 6+6 chips can max out a 1080.
The stock AMD chips do well (assuming 4GHz and 4.7GHz are stock clocks...).
With a bit of an overclock there's plenty of headroom for the SLI (or Titan X) crowd to get 100+ fps minimums.

PS: soz about my grouchy comment.
I was busy and assumed the results for the 8370 were missing, or that you'd included the wrong images.
 
I was busy and assumed the results for the 8370 were missing, or that you'd included the wrong images.

Yeah, that was a bit of a search at the start. The German website is using an animation (not sure if it is Flash or some derivative) to show all kinds of stats, but it does not allow "linking", so you have to visit it; that part does not need a translation :).
 
Where I live an FX-8350 is $70 cheaper than an i5-6600 and an FX-6300 is $20 cheaper than an i3-6100. People can call the AMD chips old, but that doesn't change the fact they are holding their own against chips that cost more.

The idea is that in DX12 you get performance as long as you do it right, and when you do it right there is hardly any difference due to the API. Multi-core CPUs were introduced into the consumer market a long time ago, and now we can finally "use" them "all".
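
To put a little code behind "doing it right": the headline change in DX12/Vulkan is that command recording no longer funnels through one driver thread; every worker records its own command list and only submission is serialized. Here's a toy sketch of that pattern using plain C threads, where the "command lists" are just strings standing in for the real API objects, not actual D3D12 calls:

```c
/* Sketch of the DX12-style pattern: parallel recording, serial submit.
 * The cmds buffer stands in for an ID3D12GraphicsCommandList.
 * Build: gcc -O2 -pthread record.c -o record                         */
#include <pthread.h>
#include <stdio.h>

#define WORKERS 4
#define DRAWS_PER_WORKER 3

typedef struct {
    int  id;
    int  len;
    char cmds[256];          /* stand-in for a real command list */
} cmdlist_t;

static void *record(void *arg)
{
    cmdlist_t *cl = arg;
    /* Each thread writes only its own list: no locks while recording. */
    for (int i = 0; i < DRAWS_PER_WORKER; i++)
        cl->len += snprintf(cl->cmds + cl->len, sizeof cl->cmds - cl->len,
                            "draw(%d,%d) ", cl->id, i);
    return NULL;
}

int main(void)
{
    pthread_t th[WORKERS];
    cmdlist_t cl[WORKERS];

    for (int i = 0; i < WORKERS; i++) {
        cl[i] = (cmdlist_t){ .id = i };
        pthread_create(&th[i], NULL, record, &cl[i]);
    }
    /* Submission is the only serial step, like ExecuteCommandLists. */
    for (int i = 0; i < WORKERS; i++) {
        pthread_join(th[i], NULL);
        printf("submit worker %d: %s\n", i, cl[i].cmds);
    }
    return 0;
}
```

Under DX11 the recording itself was effectively the serial step, which is why extra cores sat idle; that's the difference these benchmarks are picking up.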
 
I wouldn't tell someone to run out and buy an FX processor today, as they are officially 5 years old next week. But going by the numbers, I also don't see a reason to upgrade. I was just posting it to show how little Intel has done to increase their lead in raw gaming power over the past 5 years. It's sad for me, as I really want something to upgrade to. Power savings mean absolutely nothing to me. Spreadsheets loaded just fine on my old Q6600.

A 1TB SSD is a much better upgrade for the average gamer than a new processor if your gear is less than 5 years old, as you can visually see levels and maps load faster in games like Battlefield 4. A GTX 1070 / Fury X / GTX 980 Ti would be a godly upgrade compared to a similarly priced Intel processor. Heck, I'd even go so far as to say that a new 40" TV / monitor or a VR headset would be a much better entertainment value than a new Intel processor. If you have a library of movies, then a NAS would be a better upgrade than a new Intel processor.

Yes, I understand that they went for power savings. I thought they would use the power savings to generate less heat, and that the lower temps would let Intel processors hit 6GHz+ within 5 years at least.

AMD and Intel need to figure out how to get more data flowing through these processors and show consumers why they need to upgrade. I'm tired of twiddling my thumbs at every new processor release. I want to throw some money at one of these companies for a legitimate upgrade in raw power. Legitimate means 25% or more faster at gaming and my other interests than my current CPU. I don't think that's asking for much, as some of my gear is 5 years old.

So you're right, I wouldn't tell someone to buy a new FX processor. I surely wouldn't tell them to "upgrade" to an Intel processor from an FX processor either, as there is very little value in doing so. And everyone should stay away from the i3 processors. Those things should only be used in NUC boxes.

Why would they, though? I really don't think Intel has given a shit, considering the level of competition AMD has been over the years, including before the FX series. And now that software is finally taking advantage of more than two to four cores, the FX 8*** chips are looking better on the gaming front. I want to give out a "whoopie" there, because it just doesn't seem like a big deal this late in the game. And correct me if I'm wrong, because I haven't looked at any benchmarks since the last time I posted in here - but doesn't the FX series struggle compared to Intel when getting into 4K gaming?

Personally, I like that Intel has worked on lowering power consumption while increasing IPC by what little percentage they manage each time. I'm pro any kind of savings, but even more important is the reduction in heat.
 
Why would they, though? I really don't think Intel has given a shit, considering the level of competition AMD has been over the years, including before the FX series. And now that software is finally taking advantage of more than two to four cores, the FX 8*** chips are looking better on the gaming front. I want to give out a "whoopie" there. Of course, correct me if I'm wrong, because I haven't looked at any benchmarks since the last time I posted in here - but doesn't the FX series struggle compared to Intel when getting into 4K gaming?

Personally, I like that Intel has worked on lowering power consumption while increasing IPC by what little percentage they manage each time. I'm pro any kind of savings, but even more important is the reduction in heat.

When you increase the resolution, the name and model of the processor matter less and less; 100% of the bottleneck shifts to your GPU. For 4K gaming, you'd have to go way back in time to find a processor that would choke your frame rate.
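
The arithmetic behind that is simple: frame time is roughly max(CPU time, GPU time), CPU cost per frame barely moves with resolution, and GPU cost scales roughly with pixel count. A toy model with made-up numbers, just to show the shape of it:

```c
/* Sketch: why CPU choice fades at 4K.  All numbers are invented;
 * the point is the max(cpu, gpu) structure, not the values.       */
#include <stdio.h>

int main(void)
{
    double cpu_ms[]   = { 6.0, 10.0 };           /* fast vs slow CPU, per frame */
    const char *cpu[] = { "fast CPU", "slow CPU" };
    double gpu_1080   = 7.0;                     /* GPU ms per frame at 1080p   */
    double px[]       = { 2.07e6, 3.69e6, 8.29e6 };
    const char *res[] = { "1080p", "1440p", "4K" };

    for (int c = 0; c < 2; c++)
        for (int r = 0; r < 3; r++) {
            double gpu_ms = gpu_1080 * px[r] / px[0];   /* scale with pixels */
            double frame  = cpu_ms[c] > gpu_ms ? cpu_ms[c] : gpu_ms;
            printf("%-8s @ %-5s  %5.1f fps\n", cpu[c], res[r], 1000.0 / frame);
        }
    return 0;
}
```

With these invented numbers the slow CPU loses 30% at 1080p and exactly nothing at 4K, which is the same pattern as those identical 4K results on the German site.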

Now, your statement is very true about Intel, in my opinion, where they simply don't have competition. But do you notice that PC CPU sales keep dropping? That's because consumers no longer give a shit about how old their processor is. I reinstalled Windows 10 on an older Intel system a couple of weeks ago. It ran just as well as my nephew's i7-5820K on the desktop. Guess when that person will be upgrading their CPU? Never, as they couldn't care less if it costs 50 cents more per month to run than a new build. It would literally take hundreds of years to recoup the price of a new build.
 
When you increase the resolution, the name and model of the processor matter less and less; 100% of the bottleneck shifts to your GPU. For 4K gaming, you'd have to go way back in time to find a processor that would choke your frame rate.

Now, your statement is very true about Intel, in my opinion, where they simply don't have competition. But do you notice that PC CPU sales keep dropping? That's because consumers no longer give a shit about how old their processor is. I reinstalled Windows 10 on an older Intel system a couple of weeks ago. It ran just as well as my nephew's i7-5820K on the desktop. Guess when that person will be upgrading their CPU? Never, as they couldn't care less if it costs 50 cents more per month to run than a new build. It would literally take hundreds of years to recoup the price of a new build.

Gotcha on the 4K gaming. For some reason I swear I saw something that showed Intel's higher IPC benefiting 4K.

I also remember the CPU sales argument from the past, and I don't deny it. But it's not like decreased sales aren't going to hurt AMD as well. AMD has me scratching my head a lot more than Intel with what they've done.

I know there are no cost savings in upgrading CPUs. It's just a nice thing to know - the CPU I upgraded to is using less power than my previous one. The reduction in heat is much more important to me. Knowing that I was going to get an "okay" improvement in IPC from my i7 930 to the 4770K was alright. Having the itch to upgrade also played a role. A little more future-proof? Yep. Handing down the i7 930 to a buddy was also a reason to upgrade. But the number one reason why I really wanted to upgrade? The amount of f'n heat the i7 930 produced. Combined with the 7970 I had paired with it, it was like a space heater running. It actually surprised me just how much heat that machine put out. My 4770K with the 980 is like a damn icebox in comparison.

Side note - I realize I'm comparing Intel with Intel in my upgrade example, but I've always heard those FX chips produce a fair amount of heat themselves. Really, the only FX chip that's caught my eye was the 8320E, because of the reduced wattage. I had contemplated upgrading my 1055T for one a couple of years ago.
 
Gotcha on the 4K gaming. For some reason I swear I saw something that showed Intel's higher IPC benefiting 4K.

I also remember the CPU sales argument from the past, and I don't deny it. But it's not like decreased sales aren't going to hurt AMD as well. AMD has me scratching my head a lot more than Intel with what they've done.

I know there are no cost savings in upgrading CPUs. It's just a nice thing to know - the CPU I upgraded to is using less power than my previous one. The reduction in heat is much more important to me. Knowing that I was going to get an "okay" improvement in IPC from my i7 930 to the 4770K was alright. Having the itch to upgrade also played a role. A little more future-proof? Yep. Handing down the i7 930 to a buddy was also a reason to upgrade. But the number one reason why I really wanted to upgrade? The amount of f'n heat the i7 930 produced. Combined with the 7970 I had paired with it, it was like a space heater running. It actually surprised me just how much heat that machine put out. My 4770K with the 980 is like a damn icebox in comparison.

Side note - I realize I'm comparing Intel with Intel in my upgrade example, but I've always heard those FX chips produce a fair amount of heat themselves. Really, the only FX chip that's caught my eye was the 8320E, because of the reduced wattage. I had contemplated upgrading my 1055T for one a couple of years ago.

My system idles at 28C and hits 55C peak while gaming. That's with the video card and CPU overclocked to hell and back. For me that's not hot, but to someone else this might be a reason to upgrade. Everyone is different! :)
 
My system idles at 28C and hits 55C peak while gaming. That's with the video card and CPU overclocked to hell and back. For me that's not hot, but to someone else this might be a reason to upgrade. Everyone is different! :)

I believe my 930 idled between 40-45C. Gaming, it would hit 70C, probably a little hotter. Granted, I had it overclocked pretty good to 4.0GHz, but I also had frequency stepping on so it wouldn't run at 4.0GHz when it wasn't being utilized. I actually asked Hard Forum members what they thought was causing my 930 to idle so damn hot, and it turned out to be the voltage being applied for the OC. All I did was overclock in the BIOS and let the mobo adjust the voltages for me, which ran a pretty high voltage on the CPU. The 7970, on the other hand, I didn't touch. It was just a hot GPU.

Idling at 28C and peaking at 55C gaming wouldn't have me wanting to upgrade. But you know... I was using an i7 930. It wouldn't have made much sense for me to swap over to FX at the time. Plus I bought a used 4770K for about 70 bucks or so less than retail.
 
I believe my 930 idled between 40-45C. Gaming, it would hit 70C, probably a little hotter. Granted, I had it overclocked pretty good to 4.0GHz, but I also had frequency stepping on so it wouldn't run at 4.0GHz when it wasn't being utilized. I actually asked Hard Forum members what they thought was causing my 930 to idle so damn hot, and it turned out to be the voltage being applied for the OC. All I did was overclock in the BIOS and let the mobo adjust the voltages for me, which ran a pretty high voltage on the CPU. The 7970, on the other hand, I didn't touch. It was just a hot GPU.

Idling at 28C and peaking at 55C gaming wouldn't have me wanting to upgrade. But you know... I was using an i7 930. It wouldn't have made much sense for me to swap over to FX at the time. Plus I bought a used 4770K for about 70 bucks or so less than retail.

I'm on water. :) Even on an H100i I was at the same temps. I like the custom loop for the looks and the fun of building it myself. I used to be a car mechanic, and stuff like this keeps me interested in PC hardware and gaming. Plus it was a pain to have one AIO cooler on the CPU and another on the GPU.
 
I'm on water. :) Even on an H100i I was at the same temps. I like the custom loop for the looks and the fun of building it myself. I used to be a car mechanic, and stuff like this keeps me interested in PC hardware and gaming. Plus it was a pain to have one AIO cooler on the CPU and another on the GPU.

I started out using a good-looking Zalman air cooler on my 930. Then I decided I wanted to try my hand at a custom water loop, not to mention the OC I had on the 930 was about all the Zalman wanted to do; I figured water cooling would be added protection. After doing the water cooling loop, I actually prefer AIOs myself, lol. The custom loop was alright, a great experience, but I ran into a few snags along the way because it was all new to me, which turned me off some. Then when I decided to do the upgrade it was a total PITA: trying to work around the tubing, removing mobos, then having to redo the tube lengths for the CPU because the CPU sat in a different spot... Even adding/removing GPUs can be a headache. It does a fine job cooling, but I installed an AIO on the 1055T and love it. It looks great, was super easy, and is an excellent cooler. I also don't have my GPU cooled by the water loop. I had plans to do so, but the workload just doesn't seem worth it at this point.
 
Not really sure what you are getting at. Trying to say that the FX 8370 cannot cut it? Trying to say that the Nvidia card is poor at multi-threaded DX12? Sorry, but I'm not really sure what you are getting at.

He is just posting some stuff we already saw: 1080p is not that brilliant, but the numbers for higher resolutions are ;). Then again, if he could find 800×600 benchmarks he would have posted those as well...
Then again, everyone that buys a Titan X is buying it so they can game at 1080p...
 
Not really sure what you are getting at. Trying to say that the FX 8370 cannot cut it? Trying to say that the Nvidia card is poor at multi-threaded DX12? Sorry, but I'm not really sure what you are getting at.

Not trying to prove anything; this thread is about Gears of War 4 and the i7-6700K vs FX-8370, right? That's what I've posted, no more and no less. The surprise? A reputable site with totally different results than GameGPU, as always...

He is just posting some stuff we already saw: 1080p is not that brilliant, but the numbers for higher resolutions are ;). Then again, if he could find 800×600 benchmarks he would have posted those as well...
Then again, everyone that buys a Titan X is buying it so they can game at 1080p...

Yeah, why not? I still game at 1080p, and I'm going to keep playing at this resolution until we have hardware powerful enough to play at 1440p at the same smooth, fast 120Hz/144Hz with maxed settings, which we certainly don't have yet. So yes, a Titan X Pascal + OC'd 6700K being barely enough for that task in this game is relevant to me and to all the people who use high-refresh-rate panels.

BTW, this same TechSpot benchmark was posted by Cageymaru in the GPU section, but of course here in the AMD CPU sub-forum everything posted has to show AMD being competitive, right? That's why he forgot to post the TechSpot results here, right?
 
BTW, this same TechSpot benchmark was posted by Cageymaru in the GPU section, but of course here in the AMD CPU sub-forum everything posted has to show AMD being competitive, right? That's why he forgot to post the TechSpot results here, right?

The Guru3D link has the same thing, linked in one of my previous posts:
[Attached: Guru3D benchmark chart]
 
The Guru3D link has the same thing, linked in one of my previous posts:
[Attached: Guru3D benchmark chart]

Which is good, but it only shows average FPS and not minimums, which is something I appreciate way more; I'll even adjust settings to some extent in order to keep the highest stable minimums, because the experience can certainly be screwed by jumping from, for example, 70 to 120 FPS, or 70 to 144 FPS. I'm not being hostile or anything close to that; in fact, one of the reasons I'm more of a passive user lately on [H] is the excessive hostility. But as said above, if this thread is called i7-6700K vs FX-8370 in Gears of War 4, which is a game relevant to me, then any benchmark related to the mentioned products in the mentioned game is worth something to me. My post didn't attack AMD or Intel or anything: one image and one link about the topic, that's all.
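
Agreed on minimums; it's easy to show why an average hides exactly the jumps you're describing. A small sketch with synthetic frame times, where a spike every hundredth frame barely moves the average but craters the 1% low:

```c
/* Sketch: average FPS vs 1% low from frame times (synthetic data). */
#include <stdio.h>
#include <stdlib.h>

static int cmp(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    enum { N = 1000 };
    double ft[N], sum = 0;

    /* mostly 8 ms frames, with a 30 ms stutter every 100th frame */
    for (int i = 0; i < N; i++)
        ft[i] = (i % 100 == 0) ? 30.0 : 8.0;

    for (int i = 0; i < N; i++) sum += ft[i];
    double avg_fps = 1000.0 * N / sum;

    qsort(ft, N, sizeof *ft, cmp);        /* ascending frame times */
    double p99 = ft[N - N / 100];         /* start of the worst 1% */
    printf("average: %.0f fps   1%% low: %.0f fps\n", avg_fps, 1000.0 / p99);
    return 0;
}
```

That prints an average around 122 fps against a 1% low of about 33 fps: the averages chart would call it silky smooth while it actually hitches roughly once a second.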
 