cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
22,060
Call of Duty: Infinite Warfare: AMD FX-8370 versus Intel Core i7-5960X. Let's check out the CPU performance in this game to see what 5 years of Intel engineering gets us over our beloved FX 8000 processors.

A Guru3D article tackled the topic. Check out their article for an analysis of CPU and GPU performance in this title.
http://www.guru3d.com/articles_page...e_warfare_pc_graphics_benchmark_review,8.html


Things start off pretty much dead even to me. This test is with an AMD video card under DX11, the supposed kryptonite for AMD performance metrics. I doubt anyone can tell the difference between the FX-8370 and i7-5960X in this game. The higher the resolution, the more similarly the two processors perform.
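That convergence at higher resolutions can be sketched with a toy bottleneck model. All numbers below are made up for illustration and are not taken from the Guru3D article: the point is only that the delivered framerate is capped by whichever side is slower, so once the GPU is the bottleneck, a faster CPU stops showing up in the charts.

```python
# Toy bottleneck model -- illustrative, made-up numbers, not Guru3D's data.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Framerate seen on screen: the slower of the CPU and GPU limits."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU-side framerate ceilings for a slower and a faster CPU:
fx_limit, i7_limit = 90.0, 140.0

for resolution, gpu_limit in [("1080p", 120.0), ("1440p", 80.0), ("4K", 45.0)]:
    fx = delivered_fps(fx_limit, gpu_limit)
    i7 = delivered_fps(i7_limit, gpu_limit)
    print(f"{resolution}: FX-like {fx:.0f} fps, i7-like {i7:.0f} fps")
# Only the 1080p row differs; at 1440p and 4K both CPUs land on the GPU cap,
# which is exactly the "sameness" the higher-resolution charts show.
```

In this sketch the two CPUs only separate when the GPU limit sits above the slower CPU's ceiling, which is why low-resolution, fast-GPU tests are the ones that expose CPU differences.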




Next up is the venerable GTX 1070. Again, if you can tell the difference between a $200 and a $1,200 processor, then I have some oceanfront property in Indiana, USA to sell you. (Indiana is in the middle of the USA, for people outside the USA.) :) :) :)



For the GeForce GTX 1070 we see very similar behavior: a tiny bit of a performance drop for the AMD processor, but trivial at best. Again, no worries here; the $200 processor plays just as well as the $1,200 one does.

Another new release, another game where 5-year-old tech is just as good as today's technology. A David vs. Goliath matchup. Guess we need some Kaby Lake reviews to have a reason to upgrade?
 
This, with Zen coming, would make things rather weird. In the end, would you need a new CPU? Not likely, if everything is GPU bound...
 

I've been thinking of that very scenario for a long time. Luckily I could actually use the i7-5820K for jobs like rendering video. So for scenarios not involving gaming, there is a reason to upgrade. For gaming, I'm hoping that Zen / Kaby Lake delivers. Better single core performance would make up the difference in games made by Codemasters for example. To be honest all of those Codemasters games run at a nice frame rate in spite of the slower single core performance.
 
I'm aware that most games still do better with improved IPC, but I'm hearing nothing positive about Zen clock speeds. Another month until those X370 boards are out; maybe then we'll get to know retail Zen clock speeds.
 
Well if AMD can stop the leaks for a change, it means that they are doing something right. :)
 
Now do the same for StarCraft II.

Oh wait, you will not because it does not fit your agenda.

To be fair he was linking an article by somebody else. He didn't make the article comparing them with Infinite Warfare. If you have a link to a similar article where they used Starcraft II then by all means share it with us.
 
Oh, he does this with like every not-too-CPU-demanding game release; that's what's getting annoying.
 
сod_proz_amd.png


Who would have thought: a different scene, a different load. And a 5-year-old dual core beating most of the FX line.
 
Picking one benchmark for yourself. Using a good multithreaded game engine shows something totally different. AMD beats out Intel 8-core processors in these tests when you are less GPU limited and going SLI:

http://www.technologyx.com/featured/amd-vs-intel-our-8-core-cpu-gaming-performance-showdown/4/

Surprising results, with a 5-year-old processor design beating out an Intel 8-core.
 
Picking one benchmark for yourself
Precisely what OP has done, fair play there.

AMD beats out Intel 8-core processors in these tests when you are less GPU limited and going SLI:
And then you proceed to pick out one benchmark where the 5960X at 3.3 GHz and at 4.4 GHz performs exactly the same, down to a 2 fps difference in the 0.1% minimums. "Less GPU limited" my ass.

Surprising results, with a 5-year-old processor design beating out an Intel 8-core.
In a GPU-bound title. 10/10 article; unsurprising Jason has the top comment on that.

P.S. Sorry, 15/10. They actually tested a 960 at 1080p, a 970 at 1440p, and SLI'd 970s at 4K to make sure every side remains GPU bound.
 
Not one there, but several; most important was the Crysis 3 benchmark, which even being DX11 has a very threaded engine design, and in this case it shows the AMD superior to Intel's 5960X. Still, look at all those SLI benchmarks and tell me that $1,200 processor is the way to go here for SLI gaming! So no, not one benchmark.

So if the FX 9590 is better with two 970s in SLI than an Intel 5960X, how would it perform with two Titan X (Pascal) cards?
 
So while you'd think getting another 1070 for SLI would be limited by my FX 9590, the truth of the matter is it may not be at all, and in highly threaded games it may be the best route to go and by far the best bang for the buck. With DX12 and the use of more cores, the FX may be coming into its own.
 
Not one there, but several; most important was the Crysis 3 benchmark, which even being DX11 has a very threaded engine design, and in this case it shows the AMD superior to Intel's 5960X.
http://www.techspot.com/review/642-crysis-3-performance/page6.html
http://www.tomshardware.com/reviews/crysis-3-performance-benchmark-gaming,3451-8.html

I see it struggles to keep up with a stock Ivy Bridge without an overclock :p

Still, look at all those SLI benchmarks and tell me that $1,200 processor is the way to go here for SLI gaming! So no, not one benchmark.
Show me Titan XP SLI benchmarks, then we'll talk :p

So if the FX 9590 is better with two 970s in SLI than an Intel 5960X, how would it perform with two Titan X (Pascal) cards?
Obviously get rekt by the 5960X, because 970s at 4K are purely GPU limited. In fact, SLI'd Titan XPs probably would be too. Listen, I understand: you are the poor guy that actually bought into the 9590 knowing full well it was a turd. But by this time you would be through to the acceptance stage, not staying in denial, wouldn't you?
 
Sorry but the denial is completely in your court. Ignorance as well in heaping loads.

First, this is the AMD CPU section. So if the OP only ever posted positive wins for AMD, then by God he is doing it the right way. If he was posting them in the Intel forum then yes, you might have a point, but since he has not, you have nothing.

Second, being that this is the AMD section, it would be nice if the lot of you (insert any and all expletives here) would refrain from posting here, as you lack the decency to show respect to those that own AMD hardware and visit here to get info, preferably free from angst and degradation.

Oh, and try this... fire up your fav game... now, does it feel good... does it affect my gameplay in any way... does my gameplay affect you in any way... See how asinine your posts here are now? Context, which nearly all your anti-AMD posts lack, is king, and it is why your arguments fall flat.

FOR FACT: I can play every single one of my games at >=60fps with abundant beauty. No matter what you think of my setup, I enjoy it and have not once regretted having it.
 
First this is the AMD CPU section
Comparisons between AMD and Intel CPUs are entirely relevant in the AMD CPU section, 0/10
. So if the OP only ever posted positive wins for AMD then by God he is doing it the right way.
He certainly cherry-picks his stuff the right way. But that does not preclude anyone from cherry-picking another benchmark that disproves the OP's cherry-pick. Basic stuff, really.
If he was posting them in the Intel forum then yes you may have a point but since he has not then you have nothing.
That thread would be just as relevant in the Intel CPU section, but nobody there cares enough to prove their superiority over AMD when this very site's own reviews have done so.
Second being this is the AMD section it would be nice if the lot of you (insert any and all explicatives here) would refrain from posting here as you lack the decency to show respect to those that own AMD hardware that visit here to get info preferably free from angst and degradation.
Insert the South Park Safe Space song here
FOR FACT: I can play every single one of my games at >=60fps with abundant beauty. No matter what you think of my setup, I enjoy it and have not once regretted having it.
So can I, and I do not even own a dGPU. Yet you will not see me posting benchmarks of StarCraft 1, NetHack, and Dwarf Fortress illustrating that Intel iGPUs are totally fine for gaming, since they are not.
See how asinine your posts here are now
The asinine part is the fact that you really did resort to a safe space argument.
 
No, it is called being a decent human being. I do not post in the Intel or Nvidia forums because I have no interest in them, and as many a good parent has stated, "if you can't say something nice, don't say anything at all".

As far as the OP's post goes, it is about one game and not any other. Using old games as fodder is pathetic and desperate. I remember the years of Skyrim posts where I should have been getting 35-40fps, whereas I was actually getting 75fps (capped) with 100 mods.

If anything, what we are seeing with a lot of current-gen games is less reliance on IPC, and having extra cores is now a boon. This age-old argument you continue to foster is outdated and irrelevant. Mentioning StarCraft is as desperate as the Skyrim benches that flooded every AMD CPU thread in forums everywhere. It is a very badly coded game that at best is relevant to about <1% of gamers and not at all indicative of any portion of games released recently. I play Warcraft at 90fps (capped) with minimums rarely hitting ~50fps. I am one of the best players in my guild, and never has my framerate or PC given me issue in remaining a top player.

In these posts here in the AMD forum, the numbers speak to great gameplay on FX CPUs. At no point do they mention AMD matching Intel's IPC or processing power. The general consensus is that those with FX CPUs, or those interested in the viability of one, can be pleased with said performance, even against CPUs far more expensive. Does that mean that every game released from here on out will have these same results? No. But based on recent benches and the current trend, it is safe to assume that they can be viable as gaming machines, especially for those that might be fund-limited.

On a bit of a side note: you are really throwing stones at us FX owners when you can't play a single game at levels even half of what we can... That is sad.
 

I think any enthusiast with half a brain would not recommend a $1,200 CPU just to game....
 
as many a good parent has stated, "if you can't say something nice, don't say anything at all".
That's a bad parent, actually.
As far as the OPs post, it is about one game and not any other.
OP's post is largely taken in the context of his past 5 or so posts that were exactly the same in narrative.
Using old games as fodder is pathetic and desperate.
You never said which games you meant, however, so it was entirely fair on my part to turn your argument on its head given the lack of information.
If anything in this case what we are seeing with a lot of the current gen games is less reliance on IPC and having extra cores is now a boon.
You mean the days when the 6700K is by far the best gaming CPU on the market? Not really.
Mentioning starcraft is as desperate as the Skyrim benches that flooded ever AMD CPU thread in forums everywhere.
Notice how I mention it in the context of gaming just fine on an iGPU, not the game's CPU performance. But since you are so blindfolded by your own bias, you fail to even read my post.
In these posts here in the AMD forum the numbers speak to great gameplay on FX CPUs.
Sure, why would an owner of an AMD CPU make posts shitting on his own CPU? Of course there will be an attempt to justify that it will be just fine. The manner in which they are done, however, has little value, considering how a simple counterexample from the very same game leads to this conversation.
No. But based on recent benches and the current trend, then it is safe to assume that they can be viable as gaming machines especially for those that might be fund limited.
But here's the issue: the current trend speaks quite clearly. An i3-6100 keeps up with the FX-8xxx just fine in the majority of games, beating it in some. And it is basically confirmed that the i3-6100's performance is moving to the Pentium G4620, a certain-to-be under-$100 SKU. Newsflash: FX-8xxx chips are currently *more expensive*.
On a bit of a side note: You are really throwing stones at us FX owners when you cant play a single game at levels even half of what we can... That is sad.
Yep, my current position, for pragmatic reasons, does not allow me to shell out some more for a 1060 and play TW3 and the like. Instead I can play Dwarf Fortress at better FPS than any FX owner will ever have, while helping a friend do his research at better speeds than any FX owner ever would without rewriting his code. First world problems.

And finally the worst impact of these posts: they may legitimately convince some poor soul to buy an FX CPU in 2016.
 

Do you really want to go there with your immensely weak case? Even you must know how few benchmarks your view would actually work in. And most of those depend on either a light scene or a scripted benchmark.

Not to mention how often your showcase ends up losing to 5-year-old i3 CPUs in said cases.
 
Let's try something different.

First, you obviously lack any real experience, basing your whole argument on benchmarks done by others. That foremost makes a great deal of your input anecdotal and irrelevant.

Had you any experience with gaming, or anything in general, on Intel 2-core chips, even with HT, you would know they absolutely suck for anything like gaming or heavy multitasking. It used to be we got frametime graphs, which showed how erratic 2-core processors were/are. But I guess we can't be bothered with the details, so...

So now to the new part:

So can you prove that playing on an FX system will not produce great gameplay? Really prove it, not with some random benchmark but with actual gameplay experience. Doubt it.

So are you saying someone looking for a CPU should buy the i3 and not even get the i5 or i7, because obviously they give equal results?

Honestly, if someone were in the position of HAVING to get a CPU now, I would recommend as follows: i7>i5>FX>APU>i3. If they wanted to know if an FX is a viable option for great gameplay, then the answer is a FACTUAL yes. If they could in fact wait, then I would advise doing so for results on Zen, but I am not expecting leaps in performance (in regards to gaming, much like we see in charts like the OP's).

See, I have no need for bias, as I try not to let it influence my recommendations. You, however, seem bent by it and then have the audacity to lay that label on others. And since you have no real current experience with gaming, with games like in the OP, maybe you should reconsider your desire to post as if you have some insight, which you obviously lack.
 
Comparisons between AMD and Intel CPUs are entirely relevant in the AMD CPU section, 0/10

Goodbye. Since you don't decide what is relevant, why would you bother visiting this section anyway, since it is all irrelevant and Intel is superior?
 
So while you'd think getting another 1070 for SLI would be limited by my FX 9590, the truth of the matter is it may not be at all, and in highly threaded games it may be the best route to go and by far the best bang for the buck. With DX12 and the use of more cores, the FX may be coming into its own.

...

There are people screaming at the top of their lungs that my 2500K at 4.7 GHz isn't up to the task with a pair of 980 Tis, and at least in terms of minimum frame rate I agree. Thankfully I'm only looking for 60 FPS.

No, the FX will not keep up. Time to stop deluding yourself.
 
Goodbye. Since you don't decide what is relevant, why would you bother visiting this section anyway, since it is all irrelevant and Intel is superior?
I see someone's offended? Ha.
Had you any experience with gaming, or anything in general, on Intel 2-core chips, even with HT, you would know they absolutely suck for anything like gaming or heavy multitasking.
Absolutely correct, I do indeed have experience of actual gaming on 2 Penryn cores. I never looked back once I got that C2Q and then the Skylake upgrade. But here is the problem with your post.
It used to be we got frametime graphs, which showed how erratic 2-core processors were/are. But I guess we can't be bothered with the details, so...
If the absolute minimums of the i3-6100 end up sitting above the absolute minimums of the 8350, it means one and only one thing: the frametime graph is irrelevant; the i3-6100 has consistently higher fps. Being condescending won't change that one bit.
So can you prove that playing on a FX system will not produce great gameplay? Really prove it not some random benchmark but actual gameplay experience. Doubt it.
Once again, you do not define conditions. I could absolutely fire up BF1 DX12 or Civ VI on an 8350 + Fury X and, after throwing up, extract the frametime graph. But since you did not actually provide conditions, go figure.
So are you saying someone looking for a CPU should buy the i3 and not even get the i5 or i7 because obviously they give equal results?
Someone looking for a cheap CPU should get an i3 or that new Kaby Lake Pentium for games over an FX any freaking day, on the sole basis that if that CPU doesn't do well, you can actually replace it without throwing away a whole bunch of money on new memory and a mobo. The fact that it will do as well as or better than the FX in the majority of games is a nice cherry on top.
Honestly if someone were in the position of HAVING to get a CPU now I would recommend as follows i7>i5>FX>APU>i3. If
Are you seriously recommending those gimped APUs that CPU-wise only keep up with my old E5450? I would understand if that was over a 2c/2t Pentium, because, as I've agreed, having only 2 threads sucks. But APUs? Ahahaha.
See I have no need for bias as I try to not let it influence my recommendations.
Quite the opposite: advising the not-even-Bristol-Ridge APUs for a gaming system in 2016 can only be done out of bias. And finally, you are the last one to talk about bias, with your posts in the video cards section. Those were golden as well.
And being you have no real current experience with gaming
When you name yourself JustReason, just reason remains in name only, apparently.
with games like in the OP,
Oh, finally you have bothered to provide context. Well, you got me. I do indeed have no time to play Call of Duty of any sort.
maybe you should reconsider your desire to post as if you have some insight, which you obviously lack.
Sorry, but hearing that from someone advising an A10-78xx in 2016 is just so funny, I can't help it.
 
Still not one bit of context or real experience from you.

And for the record, absolute minimums do not speak to frametimes whatsoever. A perfect example of this is Intel's iGPU. When we used to get frametime graphs, Intel's graph was erratic as all get out. Hence the lower-average APU from AMD was always the winner, because it was far more stable from one frame to the next. See how that works? I gave a concrete example and explanation to further my point. Now add to this that having fewer cores means a lot more than a single-game performance view. You won't be streaming music or recording your gameplay without interference to that fps. Anyone that has ever used one knows that tabbing out is another PIA with i3s. But let's live with our heads in the sand, as you allude we do, so your arguments have merit.

You keep trying, without success, to paint many of us as fanboys so that our arguments need not be considered. Yet I gave what can only be considered an unbiased hierarchy of CPU purchases, i7>i5>FX>APU>i3, and have given ample reasons for how I came to that conclusion. In reality I hate Intel as a business and refuse to own anything Intel if I can help it. That being said, does my hierarchy make sense? Based on your claims of my bias, it should look like this: FX>APU>Consoles>ARM>anything else not Intel, THE END. However, that is not the case, as aforementioned.

Now, looking at the bench in the OP and others over the last year, we can easily see the trend of IPC not being as big a factor as in the past. As far as 60Hz monitors go, CPU choice will have little bearing on the end result for 99% of users; the GPU choice will have a far greater influence. Hence the OP's point that spending 3-6 times more will provide precious little to the end gaming result, whereas there are a plethora of GPU benches that show what a difference $30 can make.

Using http://www.tomshardware.com/reviews/multi-core-cpu-scaling-directx-11,4768.html we get an idea of core-count scaling.
upload_2016-11-5_16-2-48.png

Just using Intel's, but you can see why 2 cores kinda suck. For reference, this is the FPS chart:
upload_2016-11-5_16-3-38.png


BF4
upload_2016-11-5_16-4-19.png


upload_2016-11-5_16-4-41.png


A different site, for easier access to current games.
Now Titanfall:
upload_2016-11-5_16-11-9.png


CIV VI:
upload_2016-11-5_16-14-8.png


GoW4:
upload_2016-11-5_16-15-12.png


Well, you see that for the lion's share of the market and 60Hz gaming, AMD's FX series is doing very well, and the 2-core i3s are random at best. Hence why I stick to my recommendations. How about you validate yours, for gaming? That other stuff you may do is nowhere near indicative of what others do and not on topic for this thread.
 
And for the record, absolute minimums do not speak to frametimes whatsoever.
They freaking do. They may not carry the same amount of information as an entire frametime/framerate rundown, but I am not claiming that either. In fact, your very own examples demonstrate my point: frametimes will not get worse than the absolute worst. So, if the absolute worst framerate/frametime is only 10% worse than average, it means that all framerates are at least 90% of average at all times.
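The frametime/minimum relationship above is simple arithmetic. Here is a tiny sketch with made-up frametime samples (hypothetical numbers, not taken from any of the linked reviews) showing that the absolute-minimum fps is pinned entirely by the single worst frametime:

```python
# Made-up frametime samples in milliseconds (hypothetical, for illustration).
frametimes_ms = [16.7, 16.9, 17.1, 16.8, 22.0, 16.7]

# Instantaneous fps for each frame: 1000 ms / frametime.
fps = [1000.0 / t for t in frametimes_ms]

# Average fps over the run: total frames divided by total seconds.
avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

# The absolute-minimum fps comes straight from the longest (worst) frametime.
min_fps = min(fps)
worst_frametime = max(frametimes_ms)

print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps (worst frame {worst_frametime} ms)")
# A reported minimum of X fps therefore guarantees no frametime spike longer
# than 1000 / X ms anywhere in the run.
```

So a chip whose absolute-minimum fps sits higher cannot be hiding a worse frametime spike; the minimum bounds the spikes, even though a full frametime graph still tells you more about how often they happen.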

When we used to get frametime graphs that Intels graph was erratic as all get out. Hence why the lower avg APU from AMD was always the winner because it was far more stable
Just saying: until Iris, Intel's GPUs were losing to APUs on average framerates as well, by a significant margin, frametimes be damned. Not to mention, I did not actually see a frametime analysis of iGPUs.

See how that works? I gave a concrete example and explanation to further my point.
No, you are giving concrete examples below that prove me correct, which makes it funnier.

As far as 60hz monitors, CPU choice will have little bearing on the end result for 99% of users. The GPU choice will have a far greater influence. Hence the OPs point that spending 3-6times more will provide precious little to the end gaming result whereas there are a plethora of GPU benches that show what difference $30 can make.

Yes, precisely.

Now, onto your examples, because you were kind enough to demonstrate that the i3-6100 is just fine for gaming. But wait a second.

You won't be streaming music or recording your gameplay without interference to that fps. Anyone that has ever used one knows that tabbing out is another PIA with i3s
I won't speak to recording gameplay, since the last time I did that I was using a dual-core Penryn and was generally going from 40 fps to 30 and the like. But I will call you out on your bullshit about streaming music. As for tabbing out, I used to do that on a K10 dual core with zero impact, beyond the chance of the application simply crashing, because well-managed fullscreen was not very common back then.

Just using Intels but you see why 2 cores kinda suck. But for reference this is the FPS chart:
First: this graph was done WITH HYPER-THREADING TURNED OFF ON EVERY SINGLE ONE OF THEM, ROFL. You could not pick a worse example if you tried to stay "objective".

Now Titanfall:
Where the i3-6100 delivers a technically better experience than all FX chips below the 9590. In practice, a better experience than every non-8xxx FX chip (and by extension probably APU).

Where it delivers a technically worse experience than the 8350, but in practice is basically equal with those. And a slightly better experience than every non-8xxx Piledriver/Bulldozer FX (and probably APU).

Finally, a game where the i3-6100 is clearly beaten by the 8xxx chips, and your APU recommendation is still ridiculous.

And the 2 core i3 are random at best.
Quite the opposite: I see that 2-core i3s are consistently competitive with FX chips in most games.

How about you validate yours, for gaming.
You have done that for me, genius. First by talking bullcrap about frametimes and absolute minimums without realizing what these things mean and how they are connected. Then, instead of evidence, you bring up a graph of a 2-core, 2-thread i3 that does not exist in real life and try to pretend it is relevant to your point.
Then you bring up a bunch of games in which we see only one where the FX-8xxx is faster by any significant margin than the basic i3-6100, which is cheaper than the FX-8350, not to mention has a platform you can actually upgrade in the future to get a decent CPU instead.

And finally.


You keep trying, without success, to paint many of us as fanboys so that our arguments need not be considered. Yet I gave what can only be considered an unbiased hierarchy of CPU purchases, i7>i5>FX>APU>i3, and have given ample reasons for how I came to that conclusion. In reality I hate Intel as a business and refuse to own anything Intel if I can help it.
Precisely: you are not guided by reason, JustReason. But even you understand that if you aligned your hierarchy FX>APU>i7>i5>i3, as it is in your brain, everybody would laugh, and just that.

You may do what you want, but the only thing this convo does is discredit you further. But I'll give you credit for providing the tomshardware link so I could verify that you took it out of context.
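Since the argument above hinges on frametimes versus averages, a quick illustration of why the two can diverge may help. The numbers below are invented for demonstration, not taken from any benchmark in this thread:

```python
# Two hypothetical runs with similar average FPS but very
# different smoothness (all frametimes in milliseconds).
smooth = [16.7] * 100
stutter = [12.0] * 95 + [110.0] * 5  # occasional long frames

def avg_fps(frametimes_ms):
    """Average FPS implied by a list of frametimes."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def worst_percentile_fps(frametimes_ms, pct=99):
    """FPS implied by the pct-th percentile (near-worst) frametime."""
    idx = int(len(frametimes_ms) * pct / 100) - 1
    return 1000.0 / sorted(frametimes_ms)[idx]

for name, run in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: avg {avg_fps(run):.1f} FPS, "
          f"99th-percentile {worst_percentile_fps(run):.1f} FPS")
```

Both runs average near 60 FPS, but the stuttering run's 99th-percentile figure collapses to single digits, which is why frametime percentiles and absolute minimums are measured separately from the average.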
 
blah blah blah
You may do what you want, but the only thing this convo does is discredit you further. But I'll give you credit for providing the tomshardware link so I could verify that you took it out of context.

A person who somehow misses the whole purpose of the thread posts in #6 about StarCraft II and something about an agenda. If anyone deserves no credit, it is you, derailing it from your first post on ...
 
A person who somehow misses the whole purpose of the thread posts in #6 about StarCraft II and something about an agenda. If anyone deserves no credit, it is you, derailing it from your first post on ...
The purpose is the exact same as in previous threads: to calm FX owners. Nothing else. It is done by cherry-picking benchmarks. That's it.

People get up in arms when someone cherry-picks another benchmark in which the picture is suddenly much grimmer for FX chips? Well, that defensive reaction is entirely expected, yet amusing nonetheless.

Before you answer, if you even bother: remember Kyle's editorial and the reaction to it by a certain audience :)
 
The purpose is the exact same as in previous threads: to calm FX owners. Nothing else. It is done by cherry-picking benchmarks. That's it.

People get up in arms when someone cherry-picks another benchmark in which the picture is suddenly much grimmer for FX chips? Well, that defensive reaction is entirely expected, yet amusing nonetheless.

Before you answer, if you even bother: remember Kyle's editorial and the reaction to it by a certain audience :)

So what do you have against ISO 9001?
 
This is a fantastic conversation, and I do not even need to unhide any posts to know who is being responded to. :) :D On a different and more relevant note, it is nice to see that my FX 8300 would have no issues with COD: IW if I chose to buy it. Thanks OP. :)
 
So what do you have against ISO 9001?
Literally nothing, except the occasional PTSD when I remember it was one of my product certification exam questions.
This is a fantastic conversation and I do not even need to unhide any posts to know who is being responded too.
Yet, it is built into DirectX 12, which makes all the difference
Ah, you are that guy.
 
The purpose is the exact same as in previous threads: to calm FX owners. Nothing else. It is done by cherry-picking benchmarks. That's it.
People get up in arms when someone cherry-picks another benchmark in which the picture is suddenly much grimmer for FX chips? Well, that defensive reaction is entirely expected, yet amusing nonetheless.
Before you answer, if you even bother: remember Kyle's editorial and the reaction to it by a certain audience :)

If you are living in the past, old benchmarks will do, I guess; you don't need anything else ...
 
Nothing said above shows that the FX series is unsuitable for modern multi-threaded game engines. In fact, the data shows that multi-threaded games give very smooth gameplay on it. Now I had to laugh (very entertaining, really) at the idea of an i3 equaling an eight-core FX processor.

I have an i5-6500 with a 1060, an i7-6700K with a Nano, an FX 8350 with a 290, and an FX 9590 with a 1070, not to mention an i7 Haswell laptop (I forget the processor) with an 860M. The FX 8350 creams the i5-6500 in processor-heavy tasks such as rendering, and the i5-6500's multi-tasking ability is noticeably worse. What is even funnier is that the i7-6700K with hyperthreading can be slower than an i5 in certain games (yeah, that is right!).

With older games, DX11, and very lightly threaded code that does not fully use any multicore CPU, Intel does have an advantage. It is not either CPU's fault; the game code itself is the limiting factor by not taking advantage of the whole CPU's processing ability. That is exactly how I see it.

You just can't keep improving IPC forever, or even make a big dent in it now, and expect huge gains from one generation to the next based on IPC. Performance is going to come from the use of more and more cores, as well as multi-GPU setups, in the future. No way are you going to get 4K per eye at 90+ fps from a few threads, DX11, and high-IPC CPUs. While Zen should be a big improvement in IPC, it will be more important for it to be a great multi-threading core that can handle many threads efficiently and not stall out due to cache inadequacies, memory conflicts, insufficient buffers, etc. I couldn't care less if Kaby Lake is 10% better in IPC when Zen will have 4 extra cores that will plainly blow it away in real CPU workloads.
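The core-count argument in the paragraph above is essentially Amdahl's law: the speedup from N cores is capped by the fraction of the workload that stays serial. A minimal sketch, where the parallel fractions are illustrative assumptions rather than measurements from any real game engine:

```python
def amdahl_speedup(cores, parallel_fraction):
    """Upper bound on speedup from `cores` workers when only
    `parallel_fraction` of the workload can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# An engine that parallelizes half its frame work barely benefits
# from extra cores; one that parallelizes 95% scales much further.
for frac in (0.50, 0.95):
    for cores in (2, 4, 8):
        print(f"{frac:.0%} parallel, {cores} cores: "
              f"{amdahl_speedup(cores, frac):.2f}x")
```

On this model, eight cores at 50% parallel work top out around 1.8x, while at 95% they approach 6x, which is why per-core speed still dominates in lightly threaded engines and extra FX cores only pay off in well-threaded ones.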

Now anyone can look at a poster's posting history, and from the looks of it there are many AMD haters here that are really hurt AMD fanboys for some reason. Why else would they post so much in the AMD section, far away from their so-called favorite worship brand, be it Nvidia or Intel? :ROFLMAO:
 
Do you really want to go there with your immensely huge loser case? Even you would know how few benchmarks your view actually holds up in. And most of those depend on either a light scene or a scripted benchmark.

Not to mention how often your showcase ends up losing to 5-year-old i3 CPUs in said cases.
Reading is very fundamental; I suggest you take another look. Also, Zen will be more important soon anyway. I don't know how many times I have been told my AMD processor is just a poor gaming CPU, yet I have played dozens of games smooth as butter. I guess I can't believe my bleeding eyes and must take it from Internet experts like yourself, or at least the self-proclaimed kind. Yes, there are games where lower-IPC processors may have issues (I have not seen many of these, in fact zero), like ARMA (which I don't have), but the more modern the game, the better the FX performs, from what I am seeing. The future really belongs to multi-threading/cores/GPUs, and the programmers and developers that can take full advantage of the new tech will be the ones that win out. The ones using old methods will simply die out and become irrelevant.
 
What is even funnier is that the i7-6700K with hyperthreading can be slower than an i5 in certain games (yeah, that is right!).

Please give a few examples.

Reading is very fundamental; I suggest you take another look. Also, Zen will be more important soon anyway. I don't know how many times I have been told my AMD processor is just a poor gaming CPU, yet I have played dozens of games smooth as butter. I guess I can't believe my bleeding eyes and must take it from Internet experts like yourself, or at least the self-proclaimed kind. Yes, there are games where lower-IPC processors may have issues (I have not seen many of these, in fact zero), like ARMA (which I don't have), but the more modern the game, the better the FX performs, from what I am seeing. The future really belongs to multi-threading/cores/GPUs, and the programmers and developers that can take full advantage of the new tech will be the ones that win out. The ones using old methods will simply die out and become irrelevant.

We hear the same story all the time. Benchmarks don't show it (assuming it's not scripted to begin with so it can run 60 FPS on a Jaguar/Atom/VIA). Smoother but slower. DX12/multithreading/something in the future will fix it all. And then we have to watch the cherry-picking. Just as in CPU benchmarks when someone tries to say that their FX is fine based on a sub-5% case. And did I forget the i3-vs-FX hypocrites?

Try Fallout 4: go to Diamond City maxed out and watch your FX CPU drop below a 30 FPS average, as one example.

The funny part is when Zen is out, the same people claiming their FX CPUs are more than enough will drop them in droves for something better that they previously didn't need.
 
Please give a few examples.



We hear the same story all the time. Benchmarks don't show it (assuming it's not scripted to begin with so it can run 60 FPS on a Jaguar/Atom/VIA). Smoother but slower. DX12/multithreading/something in the future will fix it all. And then we have to watch the cherry-picking. Just as in CPU benchmarks when someone tries to say that their FX is fine based on a sub-5% case. And did I forget the i3-vs-FX hypocrites?

Try Fallout 4: go to Diamond City maxed out and watch your FX CPU drop below a 30 FPS average, as one example.

The funny part is when Zen is out, the same people claiming their FX CPUs are more than enough will drop them in droves for something better that they previously didn't need.
Well, I started Fallout 4 and got bored with it. I also doubt it would drop to 30 FPS, but I will look out for it if I ever get to that point; it sounds like a very limited game engine.

So you use one game to justify your case; please link to such results so some verification can be done. Multi-threading is the future, or there will be virtually zero reason for most folks to update their rigs in the next 10 years. Intel has just about maxed out IPC with x86/x64. Using more cores, which means more threads, is the future, and the poorly done game engines will go bye-bye in time. Also, more threads did not fix anything CPU-wise other than using the CPU more fully and getting more processing done.

GTA V would be another one you could have used, and of course the mother of all limited-threaded game engines, ARMA. Still, I play GTA V just fine on my FX 9590, and it never goes below 30, 40, or even 45 fps unless I set the graphics options too high; it is utterly GPU-limited with my 1070 at 3440x1440. When I go SLI (that is, if I do) I will probably be at 60 fps pretty much constantly, meaning it does not get much better than that on my 60 Hz monitor (except that I may be able to OC the monitor). In essence, zero CPU bottlenecks that affect my gameplay.
 
Literally nothing, except the occasional PTSD when I remember it was one of my product certification exam questions.


Ah, you are that guy.

Yep, I am that guy: the guy who understands what works in reality and does not feel the need to look down on others who do not use what I use. Oh well, I guess spending less money on a product that works very well just is not the in thing with some of you guys. Enjoy.
 