Tweaktown's AMD FX-8350 vs Intel 4930k 4K showdown. David slays Goliath again.

cageymaru

Tweaktown built an FX-8350 system for $300 and pitted it against an Intel i7-4930K which costs $1052. Both systems were tested with GTX 780 SLI and GTX 980 SLI at 4K resolution. The AMD system bested the Intel i7 in most of the tests, which is pretty sad. Guess I'll have to wait another generation for a CPU upgrade, as 4K gaming is my next purchase.

Link to article.
 
We have not been CPU bound for quite some time in the enthusiast gaming market. It is interesting to see AMD come out on top in most of those tests though.
 
I stopped reading that review the moment I saw the FX-8350 scoring higher than the 4930K, when in real life my 8350 can't even match my 3770K in the same system (except for motherboard and CPU, of course). Fail review.
 
There hasn't been a compelling reason for me to spend over 150 bucks on a CPU for a long, long time.

Kind of wish I did something more CPU-intensive; I almost feel like a normal user.
 
Certainly something is extremely wrong there. Just as an example, my 8350 at 4.8GHz scores around 9,300 in the Fire Strike physics test, while my 3770K at 4.5GHz scores over 12,000, and that number changes the whole picture of the combined test and the final result.

I would like to see the validation page for their 3DMark Fire Strike Extreme results, because look at this stock 4930K + stock 780 SLI:

Total: 9037
Graphics: 9928
Physics: 14688
Combined: 4018

Validation Page.

How can they score 5281 with 780 SLI and the same chip, and 8345 with 980 SLI? Fishy, fishy. For that simple reason alone, I stopped right at that moment.
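
For reference, Futuremark's technical guide describes the Fire Strike total as a weighted harmonic mean of the three sub-scores. A quick sanity check against the validated run above, with the caveat that the 0.75/0.15/0.10 weights are recalled from memory and should be treated as an assumption:

Code:
# Fire Strike total as a weighted harmonic mean of the graphics,
# physics, and combined sub-scores. Weights are an assumption
# (recalled from Futuremark's technical guide, not verified here).
def fire_strike_total(graphics, physics, combined, w=(0.75, 0.15, 0.10)):
    scores = (graphics, physics, combined)
    return sum(w) / sum(wi / si for wi, si in zip(w, scores))

print(round(fire_strike_total(9928, 14688, 4018)))  # -> 9038, vs. 9037 validated

That reproduces the validated total to within a point, which is exactly the kind of internal consistency the review's 5281 and 8345 figures fail.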
 
If you add streaming games online, the AMD CPU is just better at doing that without dropping FPS the way the Intel machines would.
The good news is that the CPU is becoming less of an issue and the GPU more.
 
Those results are quite fishy, tbh. Intel i7 CPUs tend to murder AMD's in 3DMark (in physics and combined score).

What is true is that the higher the resolution you go, the more GPU-dependent games become.
 
The conclusion that I am seeing is that 4K resolutions shift the bottleneck significantly toward the GPU subsystem. Maximum FPS is awesome, but if you can get a playable experience for 1/3 the cost, assuming that your primary focus is gaming and that you do not use your computer for other tasks where the Intel CPU would benefit you more, that is not a bad deal. It leaves a lot more budget on the table for better video cards and that 4K display.
 
The conclusion that I am seeing is that 4K resolutions shift the bottleneck significantly toward the GPU subsystem. Maximum FPS is awesome, but if you can get a playable experience for 1/3 the cost, assuming that your primary focus is gaming and that you do not use your computer for other tasks where the Intel CPU would benefit you more, that is not a bad deal. It leaves a lot more budget on the table for better video cards and that 4K display.

At some point the FPS difference won't matter, as you won't notice it.
Seeing FPS numbers in a benchmark versus actually noticing a difference in use is one such example.
 
I've been saying for ages that I see almost no difference between my FX-8320 and my i7 4930k. I really don't understand all the whining about AMD's lower IPC.
 
http://www.3dmark.com/3dm/4000335?

These are my results. Memory and video card at stock speed (lazy as hell). CPU @4.8GHz.

These are my Sky Diver results (strangely, the results are lower with 14.9 compared to the 14.7 beta).

This is my Fire Strike (you can run Fire Strike too, to compare).

Both running the GPU at stock settings, with my 8350 @ 4.8GHz too.

Do you see how I'm getting a similar "Overall" score to yours, even though you are using an R9 290 and I'm using an R9 280X, both at stock settings, while the graphics scores are completely different (almost 6,500 points apart)? Now try running it with your CPU at FX-8350 speeds, and you will see that there is no way in this world an FX-8350 can post an overall score almost 1,200 points higher than an i7-4930K, yet 3,800 points lower than any real-world validated Fire Strike Extreme result.

The rest of the results are probably OK, but I stopped reading the review the moment I saw that Fire Strike Extreme result. As an owner of both Intel and AMD platforms, I know those results are completely wrong.

What part of "synthetic" does not ring a bell?

Synthetic benchmarks do not represent real-world performance, but they are supposed to be consistent enough to serve as a comparison tool between hardware. That review's Fire Strike Extreme preset does not perform according to the validated scores that can be found on the 3DMark page. That's what I found fishy. The rest of the results are probably OK.
 
It does not say whether the AMD CPU was overclocked or not. It does say the Intel CPU was at stock speeds. I think they overclocked the AMD chip and ran the Intel at stock.
 
Not saying that Vishera isn't a totally capable chip, but this review does feel kind of misleading in its comparison. Disclosing the clocks to the readers would be a good thing. The "lol AMD is cheaper" angle bugs me too: since the games aren't all that CPU-bound anymore, it'd be fairer to use something like a 4770K and a more budget Intel board. The price difference then would only be about $150 at most for mobo + CPU, and the results on the Intel side would likely not be much different. The same could be said of a comparison between Intel's 1150 and 2011 sockets.
 
I've been saying for ages that I see almost no difference between my FX-8320 and my i7 4930k. I really don't understand all the whining about AMD's lower IPC.

In some games I've seen cases where a lowly i3 was ahead of AMD's top CPUs; Alien Isolation is a recent example.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Alien_Isolation_/test/alien_proz_nv.jpg


Intel is destroying AMD even as a budget option when it comes to games, IMHO.
 
In some games I've seen cases where a lowly i3 was ahead of AMD's top CPUs; Alien Isolation is a recent example.

Intel is destroying AMD even as a budget option when it comes to games, IMHO.

Luckily the article was about 4K gaming where the CPU really doesn't matter.
 
Synthetic benchmarks do not represent real-world performance, but they are supposed to be consistent enough to serve as a comparison tool between hardware. That review's Fire Strike Extreme preset does not perform according to the validated scores that can be found on the 3DMark page. That's what I found fishy. The rest of the results are probably OK.

Synthetic in the real world means _optimized_ for Intel.
 
In some games I've seen cases where a lowly i3 was ahead of AMD's top CPUs; Alien Isolation is a recent example.

Intel is destroying AMD even as a budget option when it comes to games, IMHO.

Assuming that chart is showing FPS(?), every CPU on there was achieving 100+ FPS...

You can certainly pay more for the Intel solution if you want, but don't expect a substantial performance difference. You aren't going to see a difference between 160 FPS with an AMD FX vs 230 FPS with an Intel i7, and in truly multi-threaded workloads the 8 core AMD chips are very competitive.
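
Just to put frame times on those numbers (plain arithmetic, not benchmark data):

Code:
# Per-frame render time at a given FPS vs. the refresh interval of
# common displays. Plain arithmetic, not measured results.
for fps in (160, 230):
    print(f"{fps} FPS -> {1000 / fps:.2f} ms per frame")
for hz in (60, 120, 144):
    print(f"{hz} Hz panel -> {1000 / hz:.2f} ms per refresh")

At 160 FPS the rig is already producing frames faster than even a 144 Hz panel can display them (6.25 ms vs. 6.94 ms), so the extra headroom of the 230 FPS machine never reaches your eyes.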
 
I was rather perplexed by Tweaktown's results, so I thought I'd do some digging of my own. I tested three games on an AMD FX-9370 system at 5GHz and an Intel i7-3770K system at 4.5GHz, with EVGA GTX 780 6GB cards in SLI. The full system specs are as follows.

FX-9370 @ 5GHz - 16GB G.Skill 1600 - Sabertooth 990FX 2.0 - Samsung 840 Pro 128GB

i7-3770K @ 4.5GHz - 16GB G.Skill 1600 - TPower TZ77-XE3 - Kingston SSDNow V300 128GB

sys.png
system-1.png


Please pardon the discrepancy in driver versions; the results recorded for the Intel system are with the 344.11 drivers.

I tested Crysis 3 and Metro: Last Light with FRAPS gameplay recordings, while using the built-in benchmark for Tomb Raider, just because I haven't been very deep into that game. Here are the settings run on both platforms at 4K. The Crysis gameplay run is performed on the 'Root of All Evil' stage and consists of exiting the cave, disposing of all enemies in the area, and making your way to the elevator door at the left of the dam. The Metro: Last Light gameplay run is performed on the 'D6' stage and consists of listening to Miller's speech, fighting off the invaders, and waiting for the Reds' train to crash through.
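
If anyone wants to crunch their own logs the same way, here is a minimal sketch; it assumes the usual two-column frametimes CSV that FRAPS writes (a 'Frame, Time (ms)' header with cumulative timestamps), and the file name is made up:

Code:
# Summarize a FRAPS frametimes log: per-frame times are the deltas
# between consecutive cumulative timestamps. File name is hypothetical.
import csv

with open("crysis3_frametimes.csv") as f:
    reader = csv.reader(f)
    next(reader)  # skip the "Frame, Time (ms)" header row
    stamps = [float(row[1]) for row in reader]

deltas = [b - a for a, b in zip(stamps, stamps[1:])]
avg_fps = 1000 * len(deltas) / (stamps[-1] - stamps[0])
print(f"avg: {avg_fps:.1f} FPS, worst frame: {max(deltas):.1f} ms")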

crysis32014-10-0920-42-02-67.png
crysis32014-10-0920-42-09-28.png
MetroLL2014-10-0921-12-47-49.png
MetroLL2014-10-0921-12-56-96.png

TombRaider2014-10-0921-28-37-89.png


The results are pretty interesting. The AMD system performed terribly in Crysis 3 but impressively in Metro: Last Light. Regardless of what the numbers indicate, in Crysis 3 the AMD system was a stuttery mess; the Intel system was the better platform there, and gameplay was buttery smooth. Conversely, though it played well, the Intel system trailed the AMD system in Metro: Last Light by nearly a 10-frame average. Both systems performed very similarly in the Tomb Raider benchmark.

Results for AMD...

log.png
TombRaider2014-10-1004-20-11-67.png


...now Intel...

log1.png
TombRaider2014-10-1004-55-54-67.png
 
Love the tests, Rennyf77! I have found that with Crysis 3, the best strategy for getting the most out of the game is to use settings that let it run at a 60 FPS average. This doesn't mean the frame rate has to stay at 60 or better, as frame dips inevitably happen in that game with high settings. But aiming for a 60 FPS target with 50 FPS dips is much more enjoyable than a 30 FPS run with 22 FPS slideshow dips. When the frame rate tanks to 22 FPS, that is more than likely what is causing your stuttering problem.
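
Some quick frame-time arithmetic backs this up; it's the size of the jump during a dip that reads as stutter:

Code:
# Extra frame time added during a dip: 60->50 FPS vs. 30->22 FPS.
# Plain arithmetic, not measured data.
for target, dip in ((60, 50), (30, 22)):
    t, d = 1000 / target, 1000 / dip
    print(f"{target}->{dip} FPS: {t:.1f} ms -> {d:.1f} ms (+{d - t:.1f} ms)")

A 60-to-50 dip adds only about 3 ms per frame, while a 30-to-22 dip adds about 12 ms, which is why the latter feels like a slideshow.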

Great tests though. And thank you again. :)
 
Assuming that chart is showing FPS(?), every CPU on there was achieving 100+ FPS...

You can certainly pay more for the Intel solution if you want, but don't expect a substantial performance difference. You aren't going to see a difference between 160 FPS with an AMD FX vs 230 FPS with an Intel i7, and in truly multi-threaded workloads the 8 core AMD chips are very competitive.

This is crucial for 120/144 FPS gamers. Still, the performance difference is there, and it defies logic to compare an i3 with an FX-9590. That is just sad, because I love my FX-8350. It hurts badly: all of that power consumption and heat, those flawless 5GHz with 8 "real" cores, versus a 2-core/4-thread chip at 3.5GHz.

Synthetic in the real world means _optimized_ for Intel.

Funny enough :D ... isn't that same 3DMark optimized for AMD GPUs? Or Unigine Valley too? ;)
 
Can you retest the 9370 rig with RAM at 1866 CL10 or lower?

Is the RAM speed/timing that big of a factor for AMD?

I remember back with the Sandy Bridge generation that RAM faster than 1333 CL9 didn't make much of a difference while gaming. If I recall correctly, the same was true of Ivy Bridge and 1600 CL10. Assuming that the trend continued through the current Intel chips, then RAM faster than 1866 CL10/11 should be the uppermost point for noticeable performance gains.

I am running DDR3-2133 CL11 on my APU rig because the memory speed directly influenced IGP performance, but I was thinking of sticking with DDR3-1600 on the FX-6300 setup I am tinkering with. It is running 1333 right now because that is what I had available, but I was figuring I would buy the 1600 at some point in the near future if I find a decent sale. Is going with faster RAM worth the price increase for an AMD non-APU chip?
 
Is the RAM speed/timing that big of a factor for AMD?

I remember back with the Sandy Bridge generation that RAM faster than 1333 CL9 didn't make much of a difference while gaming. If I recall correctly, the same was true of Ivy Bridge and 1600 CL10. Assuming that the trend continued through the current Intel chips, then RAM faster than 1866 CL10/11 should be the uppermost point for noticeable performance gains.

I am running DDR3-2133 CL11 on my APU rig because the memory speed directly influenced IGP performance, but I was thinking of sticking with DDR3-1600 on the FX-6300 setup I am tinkering with. It is running 1333 right now because that is what I had available, but I was figuring I would buy the 1600 at some point in the near future if I find a decent sale. Is going with faster RAM worth the price increase for an AMD non-APU chip?

It depends on how memory-intensive the game is. When I was RMAing one of my GPUs, I did some testing after Corsair released their BF4 benchmarks with different RAM speeds. I thought it was a selling tactic and BS, but I was able to back up their claims on my own.

With 1 GPU at 5760x1080, I went from 29 FPS to 34 FPS simply by going from 1600 CL10 to 1866 CL10.

AMD IMCs can't run really high RAM speeds on average, but they do respond better to a combination of speed and latency. The normal setup seems to be 1866 CL10, but the best would be 2133 CL9 or something comparable in speed vs. CL. Many guys love their G.Skill Tridents on AMD FX chips due to the combination of decent speed with lower latency.
 
The G.Skill kit in the rig runs at CAS 9. I might have a CAS 10 1866 kit somewhere.
 
The G.Skill kit in the rig runs at CAS 9. I might have a CAS 10 1866 kit somewhere.

If you can up the voltage on the RAM and run it at 1866 CL9, that would be almost as good as it gets, short of 2133 CL9. Anything above that speed is usually problematic. Going from 1600 to 1866 at CL9 is about 15% faster (mathematically). Going from 1600 CL9 to 1866 CL10 is only about 5% faster.
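
For anyone who wants the math behind those percentages, first-word latency is CAS cycles divided by the I/O clock, which is half the DDR transfer rate. A quick sketch of that arithmetic:

Code:
# First-word latency in nanoseconds: CAS cycles / I/O clock (MHz),
# where the I/O clock is half the DDR transfer rate. Plain math.
def latency_ns(rate_mt_s, cas):
    return cas / (rate_mt_s / 2) * 1000

base = latency_ns(1600, 9)
for name, rate, cas in (("1600 CL9", 1600, 9), ("1866 CL9", 1866, 9),
                        ("1866 CL10", 1866, 10), ("2133 CL9", 2133, 9)):
    ns = latency_ns(rate, cas)
    print(f"{name}: {ns:.2f} ns ({(base / ns - 1) * 100:+.1f}% vs 1600 CL9)")

That works out to roughly +17% for 1866 CL9 and +5% for 1866 CL10 over 1600 CL9, in line with the figures above, and about +33% for 2133 CL9 if the IMC can handle it.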
 
I stopped reading that review the moment I saw the FX-8350 scoring higher than the 4930K, when in real life my 8350 can't even match my 3770K in the same system (except for motherboard and CPU, of course). Fail review.

FYI: This is the guy who thinks an i5-750 at 3.8 is a major bottleneck in most games. In reality, he needs to justify his pointless Ivy purchase.
 
FYI: This is the guy who thinks an i5-750 at 3.8 is a major bottleneck in most games. In reality, he needs to justify his pointless Ivy purchase.

So I have 2x i7-2600, 2x i7-3930K, 1x FX-8350, 2x FX-6350, and my personal 3770K, and I need to justify the purchase of my 3770K? What the hell does your crappy i5-750 have to do with this thread? That i5 is a bottleneck for any recent game; you wouldn't be able to squeeze the full potential out of any recent GPU on the market with that CPU. Are you trying to justify not buying a better CPU? An i5-2500K will crush your overclocked i5-750, and even a 2500K at 4.2GHz isn't enough to maintain 60 FPS in the vast majority of recent games. Up to you whether you believe it or not. =)
 
FYI: This is the guy who thinks an i5-750 at 3.8 is a major bottleneck in most games. In reality, he needs to justify his pointless Ivy purchase.

An i7-920 @ 4GHz is on par with a stock i7-3770K in multi-GPU scenarios. At 4.5GHz the Ivy Bridge dusts the 920 when it comes to minimum frames. If one were looking to keep two or more GPUs well fed, then araxie is right.
 
There is a lot of rambling in this thread.

SLI benefits a lot from MHz. When comparing both at stock, with the GPU choking before the CPU does, the CPU with the higher clock will typically edge out. In this case the FX CPU was able to edge out, just barely, with its +600MHz. If you OC the Intel, the results would skew a bit. Keep in mind that the results would still be rather close in gaming, and the price would still be really high for the Intel setup.

Intel's HEDT CPUs are the best, but that doesn't mean they crush the competition in every metric, and Intel charges more because they are the best. Intel's low end is where the money is: they are very price-competitive with AMD and typically much better, except against the APUs, where we still have no competition from Intel.

Building a high end gaming computer on a budget is still clearly in AMD's court.
 
There is something horribly wrong with those FSE numbers. I dug back into my benching archives and found I surpassed their 780 SLI testing on BOTH of those rigs with a 3570K and a pair of 770s, and obliterated the 980 SLI result with a pair of 290s and a 3930K by over 2,000 points on both overall and GPU score, with everything at stock clocks. Overclocked, the same pair of 290s (at a whopping 1050MHz) easily passes 10,000 in FSE with an FX-8150. WTF, Tweaktown?
 
That is the exact reason why I stopped reading the moment I saw those Fire Strike results...
 
FYI: This is the guy who thinks an i5-750 at 3.8 is a major bottleneck in most games. In reality, he needs to justify his pointless Ivy purchase.

And? That's kind of just your opinion, and it's troll bait. Why else bring it up? You are just trying to piss him off by going out of your way to discredit him. Grow up. Sounds like you are the one being pointless.
 