AMD FX-8150 Multi-GPU Gameplay Performance Review @ [H]

LOL, everyone on this forum is into BF3.

Looking for BF3 BD SLI/Tri-SLI benchmarks... oh, none. Hey, nice review.

How exactly do you bench a multiplayer game properly? [H] doesn't use the canned benchmarks you're used to. Go back to Tom's Hardware.

If you had to make a conclusion based on the rather large list of games in this review......

I haven't seen a CPU review this well done in years.
 
Thanks for the review.
Looks like Bulldozer dug its own hole...and buried itself alive.
 
[attached image: intel_amd.jpg]


Zarathustra[H]'s condensed history of AMD's struggle with Intel seems fairly accurate.. although I wouldn't have used the term "inferior" when describing AMD's CPUs.. they just weren't quite as fast as Intel's. Looking at an old issue of boot magazine.. actually the very last boot (July/Aug 1998) before it became MaximumPC.. on pg. 45 there is a comparison between AMD's new K6-2 333MHz and the PII 333MHz.. the PII beats it in most benchmarks (games and synthetic), and the best the K6-2 could do was tie a couple of times. It should also be noted that only once was an AMD CPU ever used in a boot/MaximumPC "Dream Machine".. the 2005 DM had dual AMD Opterons.. and that is only one out of close to 20 total DMs built. Even though AMD has been competing with Intel since the 1980s, only for a brief 2003-2006 period did AMD produce the clearly superior CPU. We all remember that time, so we're naturally under the impression that since AMD did it back then, it can do it again.

Just to give AMD some positive vibes as I finish writing this, this was once written about a new AMD processor core..

..the Phenom is a disappointment. It comes close to meeting a clock for clock battle at 2.4GHz or so, but still just does not do it. Phenom has not caught up with what Conroe had to offer so many months ago. As the clocks scale, Phenom begins to look even worse against the new Intel Yorkfield processors.

I run a Q6600 in my own box right now, and I would not hesitate to drop a Phenom in here. I don't think I would be losing anything real, but I don't think I would be gaining either.

"AMD thinks that the hurdles it is facing now with Phenom will be smoothed out by the time it has to transition to 45nm in 2008 and will allow for ramping clock speeds that will hold Intel at bay. "

Those words were written by Kyle on November 19, 2007 after he reviewed the Phenom X4 9600. AMD was able to come back fairly well from that bad launch.. let's hope they can do it again.
 
but win8 and BIOS updates and patches will fix BD

Tom's did a beta run of Windows 8 with its improved scheduler and saw roughly a 10% increase in performance, depending on the task. That's nowhere near enough to bring it up to a level where it's competitive at the current asking price. And then you should ask yourself why you would buy a chip if you need to pay another $150 to upgrade your OS to see the benefit. Saying that Windows 8 will improve performance isn't necessarily a good thing for AMD or for us consumers if you consider what it implies: the chip doesn't do well right now and might do well with future operating systems. It performs better under Linux because Linux adapts far faster, but there too it's lagging in IPC (which ultimately also hurts its multi-threaded performance). Phoronix wasn't impressed either.

The BIOS updates may stop certain games from crashing. But that's not exactly a silver lining. They shouldn't be crashing at all.

The biggest fix for Bulldozer would be to dump the architecture and start over. Shorter pipelines, lower cache latency, more focus on FP.
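FWIW, the Windows 8 scheduler change mentioned above is about thread placement: keeping light workloads on one core per module instead of packing two threads into a shared module. On Linux you can approximate that yourself with CPU affinity. A minimal sketch, assuming (hypothetically) that logical CPUs pair up as modules (0,1), (2,3), and so on; check your real topology with `lscpu` before trusting that pairing:

```python
import os

def one_thread_per_module(pid=0):
    """Pin a process to every other logical CPU, approximating
    'one thread per module'. The (0,1)/(2,3) module pairing is an
    assumption, not a guarantee."""
    cpus = sorted(os.sched_getaffinity(pid))   # CPUs we may run on
    picked = set(cpus[::2])                    # keep every other one
    os.sched_setaffinity(pid, picked)
    return picked
```

`pid=0` means the calling process; `os.sched_setaffinity` is Linux-only, so this won't help on the Win7 boxes being benchmarked here, it just illustrates the placement idea.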
 
I'm sure it was already asked, but is there any chance of getting comparable CrossFire results to see if the actual problem is not the CPU and more something to do with the implementation of SLI on the 990FX chipset? Not trying to start a conspiracy post or anything, I just wonder if there's more to the story than just shitty CPU performance.
 
anyone willing to play thru CIV V all the way to some kind of an end game position deserves my undying respect - personally, I'd rather watch paint dry :p

(ps - FWIW, SMAC/X is always on my machines & still get regular play around here :))
 
anyone willing to play thru CIV V all the way to some kind of an end game position deserves my undying respect - personally, I'd rather watch paint dry :p

(ps - FWIW, SMAC/X is always on my machines & still get regular play around here :))

I've done it, but I'm lazy and I cheat to make it faster.. trainers FTW!!! But I do it in sandbox mode, so the only way to win is wiping out all the AI.
 
anyone willing to play thru CIV V all the way to some kind of an end game position deserves my undying respect - personally, I'd rather watch paint dry :p

(ps - FWIW, SMAC/X is always on my machines & still get regular play around here :))

I've beaten the game several times, on the marathon setting no less.

Thus far I've had victories by time, science, domination and culture.

I love that game. Can't get into anything else quite as well.

It's difficult to find the time though, and it pisses off the wifey when I spend all day on the computer and none of it with her, so I tend to play FPS games I can join for an hour or so and quit after a while instead.

You can always save Civ games and continue later, but often if I don't play through them the same weekend, I lose interest and never pick them back up again.
 
No offense mate, but shouldn't this topic be sent to the equestrian graveyard? Bulldozer's slow. We know it. Intel processors walk all over it. The next-gen graphics cards are going to make it look even worse. FWIW, I'd like to see someone do an article explaining how this CPU ever came into existence. We'd have better performance if they would just shrink MC and release enthusiast socket G34 boards. What were they thinking?

"Designing microprocessors is like playing Russian roulette. You put a gun to your head, pull the trigger, and find out four years later if you blew your brains out."

Robert Palmer, CEO of Digital, later a member of the AMD board.

Additionally, this CPU was primarily a server design adapted to the desktop.

At low clocks (which servers usually have) power usage is very good, and it really does seem to be a kickass server CPU.
 
The biggest thing all this tells me is CPU limiting will be very real again with the next generation of GPUs.
 
I think what we have here is an Intel faggot posting fake benchmarks!

Arma II is a game I play regularly, and his review saying the only playable viewing distance is the lowest, 500, with very high detail is total bullshit. I've seen another review I would believe over this one.

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/9

I have an MSI 990FXA-GD80 with dual GTX 480s, and my FX-8150 is on its way from newegg.com. Its expected delivery date is this Monday the 7th.

When my CPU arrives Monday I will prove it!
 
I think what we have here is an Intel faggot posting fake benchmarks!

Arma II is a game I play regularly, and his review saying the only playable viewing distance is the lowest, 500, with very high detail is total bullshit. I've seen another review I would believe over this one.

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/9

I have an MSI 990FXA-GD80 with dual GTX 480s, and my FX-8150 is on its way from newegg.com. Its expected delivery date is this Monday the 7th.

When my CPU arrives Monday I will prove it!
Neat, except that's a canned benchmark, which means most of the CPU work is predone.
 
I think it will be interesting to see how well BD plays out as a stepping stone to future many-core CPUs.

Perhaps they will be able to ramp BD to a huge number of cores much more easily than Intel will with their roadmap.

Of course, someone has to come up with a reason to have so many cores in a PC. Something I have yet to come across.
 
I think what we have here is an Intel faggot posting fake benchmarks!

Arma II is a game I play regularly, and his review saying the only playable viewing distance is the lowest, 500, with very high detail is total bullshit. I've seen another review I would believe over this one.

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/9

I have an MSI 990FXA-GD80 with dual GTX 480s, and my FX-8150 is on its way from newegg.com. Its expected delivery date is this Monday the 7th.

When my CPU arrives Monday I will prove it!

Did you really create an account to troll people as an AMD fanboy? You silly goose, everyone knows that's an endangered species and all current specimens were born around 10 years ago!
 
I think what we have here is an Intel faggot posting fake benchmarks!

Arma II is a game I play regularly, and his review saying the only playable viewing distance is the lowest, 500, with very high detail is total bullshit. I've seen another review I would believe over this one.

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/9

I have an MSI 990FXA-GD80 with dual GTX 480s, and my FX-8150 is on its way from newegg.com. Its expected delivery date is this Monday the 7th.

When my CPU arrives Monday I will prove it!

Details, details, details............... Both reviews do conclude, however, that Bulldozer still loses to i5 in ARMA2.... So, great purchase there buddy...... Enjoy....:rolleyes:
 
Not at all. These are going to be huge at Worst Buy. (LOOK FOLKS! TWICE THE CORES!!)

Very unlikely... A) Intel has more cash to throw at BB to push their systems, not to mention OEMs. They got their wrists slapped over that, but the damage is done. B) Joe Sixpack stopped caring about CPU performance a few years ago, hence the advent of netbooks and other "fast enough" platforms. I hear neophytes talking about computer specs, and if anything they mention RAM and HDD size, not CPU speed or cores.

Intel's and AMD's obscure model numbers didn't help matters either, but that's just a result of the MHz race being over. Besides, laptops outsell everything else these days, and AMD is lagging even more there outside of budget stuff (Llano does destroy Atom, and it's not much more expensive)... Consumers are increasingly demanding about battery life, and this architecture looks ill suited for any kind of mobile use.
 
Details, details, details............... Both reviews do conclude, however, that Bulldozer still loses to i5 in ARMA2.... So, great purchase there buddy...... Enjoy....:rolleyes:

Don't feed the trolls... At least I truly hope he's not for real, maybe I'm just an optimist. :p
 
Finally... "hard" evidence that dispels the baseless myth that both chips carry equal weight in gaming with modern games. Is there any chance of adding GPU usage (%) graphs to this review, showing when and where GPU usage isn't locked at 100% (thus clearly revealing CPU limitation, for those who continue to doubt the obvious)? And might there be any chance of running these tests with a single 560 Ti/6950 at 1080p and below, at lower settings, so that FX-8150 users can be assured by an unbiased reviewer that their chip will frequently limit slightly older, less GPU-intensive but extremely CPU-intensive games (e.g. Far Cry 2) at lower settings as well? (Please pardon me if this was already done... they deserve to know the truth.) In my opinion, minimum framerates are the figures we need to focus on the most... and the difference, as outlined here, is startlingly clear.
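On the minimum-framerate point: pulling the minimum out of a frame-time log is trivial, and it shows why averages hide the problem. A hedged sketch, assuming a hypothetical log of per-frame render times in milliseconds:

```python
def fps_stats(frame_times_ms):
    """Return (minimum FPS, average FPS) from per-frame render
    times in milliseconds (hypothetical log format)."""
    fps = [1000.0 / t for t in frame_times_ms if t > 0]
    return min(fps), sum(fps) / len(fps)

# e.g. fps_stats([10, 20, 40]) -> min 25.0 FPS, average about 58.3
```

A run that averages nearly 60 FPS can still bottom out at 25 on a single slow frame, which is exactly the gap between the two figures.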
 
LOL, everyone on this forum is into BF3.

Looking for BF3 BD SLI/Tri-SLI benchmarks... oh, none. Hey, nice review.

BUT GAIZ, IN DRAGON AGE 2, BETWEEN 156 AND 161 SECONDS, BULLDOZER IS CLEARLY FASTER!!!!ONE!!

On topic though: all these people who said a magical software fix is in the works that will increase BD performance by 40%? I just did a quick tally of the first 2 or 3 pages of tests, and BD is 72% slower than the 2500K on average. SEVENTY-TWO PERCENT.

So even if a software fix improved Bulldozer's performance by 40%, it would still be a huge amount behind Intel's mid-level enthusiast chip, cost $100 more (the same amount as a processor which can REALLY run 8 threads), and use probably 200 W more power (numbers taken from the [H] desktop performance review). Which means for every 5 or so hours of gaming where the CPU is fully loaded, and it doesn't seem too hard to load it up, you will be paying for an extra 1 kWh of power.
My country's average rate for power is about $0.30 per unit. So it's going to cost me probably $1.50 a week more to run this thing, were I to get one. $75 a year. Plus the initial cost of ownership, which is around $150 more. I really don't see how this stacks up to a $600 CPU, let alone a $190 one.
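The arithmetic above, as a quick sketch (the 200 W delta and $0.30/kWh rate are this post's assumed figures, not measurements):

```python
def extra_energy_cost(extra_watts, hours, price_per_kwh):
    """Cost of an additional power draw sustained for `hours`."""
    return extra_watts * hours / 1000.0 * price_per_kwh

# 200 W extra for 5 h = 1 kWh -> $0.30; ~25 h/week of loaded use:
weekly = extra_energy_cost(200, 25, 0.30)  # $1.50/week
yearly = 52 * weekly                       # $78/year, in line with the estimate above
```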

And I have ignored the additional cooling that would be required (no 212+ here) and the new PSU I would have to buy.
AMD, if you are listening, and I know you are not, this is how you fail as a company. Anyone heard from John Fruehe recently?
Is AMD still going to answer those questions we were asked to put forward?
 
How exactly do you bench a multiplayer game properly? [H] doesn't use the canned benchmarks you're used to. Go back to Tom's Hardware.

If you had to make a conclusion based on the rather large list of games in this review......

I haven't seen a CPU review this well done in years.

Why not do BF3 like this picture? Just show us some comparisons from the built-in CPU/GPU render timings; they have their own graphs. Did I ask for MP? Oh, you did, lol!

We could easily tell which one spikes higher, the Intel or the BD. Furthermore, that game is the most futuristic as far as how it's rendered. That's all I care about seeing right now. Well, that and Metro 2033; its one chart was the only graph I looked at in that entire review.

[attached image: BF3 perf g2.JPG]
 
Very unlikely... A) Intel has more cash to throw at BB to push their systems, not to mention OEMs. They got their wrists slapped over that, but the damage is done. B) Joe Sixpack stopped caring about CPU performance a few years ago, hence the advent of netbooks and other "fast enough" platforms. I hear neophytes talking about computer specs, and if anything they mention RAM and HDD size, not CPU speed or cores.

Intel's and AMD's obscure model numbers didn't help matters either, but that's just a result of the MHz race being over. Besides, laptops outsell everything else these days, and AMD is lagging even more there outside of budget stuff (Llano does destroy Atom, and it's not much more expensive)... Consumers are increasingly demanding about battery life, and this architecture looks ill suited for any kind of mobile use.

Are you sure you quoted the right thing?
 
Is the CPU's role more integer- or FPU-related?
Which games use multiple threads..... or ???

It's not too good, but using older games for this CPU is not going to work well. It's been pretty much established that this is a multitasking/multithreading CPU design with a so-so FPU, so how did you really think it would work on older games?

Not very well. No surprises here. The CPUs should also have been run at their native speeds.
With a supposedly longer pipeline, BD is clocked higher to make up for part of this, right? Clocking them the same really isn't fair in this case. Test them as they come out of the box for real-world results if you want people to see what they can expect of it.

And even then, the design is future-minded, for better-optimized software that can make use of its new (not old) design. You might as well have used a 6-core AMD CPU as well, as that is more representative of current and older programs. This has pretty much become the accepted status of the BD CPU. It's far too new a design to be compared this way. This is how AMD thought out its future; they sure didn't design this for past software, even though it would be nice for it to run older single-threaded software better. It is what it is.

I really think the benchmarks should wait a few months, either for new software, including patches for Win7 or Win8 itself, or for newer software that can make better use of the new CPU's design. It's so different that there have to be programs designed with it in mind to give a more realistic view of its true performance levels, if they are there to be coaxed out of this CPU. It's too soon to be running benchmarks on it, I think.
It may be bad timing by AMD putting it out before Windows 8, or they want it out so software designers have more of a chance to work with it, so that when Win8 or a patched Win7 and better software arrive, it will really start to show.
This is my view on this CPU anyway. AMD must have known how it would perform, and has better info on what direction future software is going to push its hardware, and designed it to take this into account. I doubt they would be that dumb not to know these things, or its design would be much different.
Also, with their plans to go with APUs, they may have designed it for a future that uses the video part to offset FPU workloads, so they knew they didn't have to design a better FPU than it has. But that is just guessing. It makes a little sense, but only time will tell with this CPU. It's too early to know.
In programs that have been optimized for this CPU, it can perform very well. I will not write this CPU off based on current software.
 
Because I don't use multi-GPU, and I never will, so this has little effect on my gaming.

So.... with a processor that's about $60 cheaper (2500K) and faster than Bulldozer, you would still pick Bulldozer?
 
Kyle or Brent, do you have any sense at this point just how much of the performance deficit with Bulldozer you're seeing with dual and triple-GPU setups in games' FPS has to do with driver/platform issues and how much has to do with the FX architecture? I'm asking this because we're actually seeing those wild fluctuations in frame rates on the FX platform that often indicate poorly optimized drivers. I'm also aware that these fluctuations could be indicative of cache-thrashing, core-contention issues, you name it. In other words, if the FX platform's FPS were smooth, but simply lower than the Intel platform's, then I'd buy that it's all the fault of the CPU, and it's simply slower than the Intel chip, plain and simple, but when I see those sawtooth patterns in the FPS graph, I'm thinking somebody just took an Intel-optimized graphics card driver [nVidia, I'm looking at YOU :)] and tossed it onto the AMD FX platform, even though the FX chip has a radically different architecture. Or worse, the game engine and driver are both so poorly optimized for the FX chip that a recompile of either (or both) could lead to significant improvements in FPS and smoothness of FPS.

I'm not disputing your findings in any way: it's clear the Intel chip is running these games much more smoothly and at higher FPS. What I'm trying to wrap my mind around is how much of the bad gaming performance of the FX chips has to do with real 'flaws' in the design, and how much has to do with code that is simply not compiled/optimized to run on such a different CPU architecture. I am VERY reluctant to use a car analogy, but I can't help myself here ;) - it seems a little bit like we could be trying to run 87 octane fuel through a Formula 1 race car engine, and seeing that it doesn't perform as smoothly as a Corvette street racer on 87 octane, we're saying that the Formula 1 racecar 'sucks'.

Like a few others have pointed out, the much newer Battlefield 3 engine seems to like the FX chip, but so far, I haven't seen trustworthy multi-GPU Battlefield benchmarks run on the FX. I know this is a hard thing to nail down, but in your extensive experience (you've certainly seen drivers improve for newly released GPUs; perhaps this may also be true for graphics drivers running on such a radically different CPU), would you say that 'flaws' (e.g. very high cache latency, FPU sharing between pairs of cores) in the FX CPU architecture are entirely responsible for the performance gulf with Intel, no matter how optimized the code is, or do you think improved/optimized GPU drivers (and possibly optimized game engines) for the FX platform might allow Bulldozer and its descendants to make significant gains down the road? Just curious for your thoughts.
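One hedged way to put a number on the 'sawtooth vs. simply slower' distinction: compare a high percentile of the frame times to the median. A smooth-but-slow platform stays near 1.0; heavy stutter pushes the ratio well above it. A minimal sketch, assuming a hypothetical log of per-frame times in milliseconds:

```python
def pacing_spread(frame_times_ms):
    """Ratio of the 99th-percentile frame time to the median.
    ~1.0 = even pacing (merely lower FPS); much greater than 1.0
    = spiky, sawtooth-style frame delivery."""
    xs = sorted(frame_times_ms)

    def pct(p):  # nearest-rank percentile, no interpolation
        return xs[min(len(xs) - 1, int(p * (len(xs) - 1)))]

    return pct(0.99) / pct(0.50)
```

For example, 90 frames at 10 ms with 10 spikes to 50 ms gives a spread of 5.0, even though the average frame time barely moves.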


I agree on all parts. Until there's an apples-to-apples comparison between AMD CFX and SLI, it's kind of hard to say if the problem really is the processor itself. For all we know the SLI drivers are broken as hell for the 990 chipset (and honestly I wouldn't doubt that from Nvidia). All we got was half the cookie; now we need the rest of it. Without it, the review doesn't really tell us as customers/readers anything.


So.... with a processor that's about $60 cheaper (2500K) and faster than Bulldozer, you would still pick Bulldozer?

Sure, the processor may be cheaper, but the entire platform sure as hell isn't. Yet people seem to love ignoring that fact.
 