Possible explanation (excuse) for horrid single-threaded BD performance... read on!

mzs, if there were something like that in the pipeline, wouldn't AMD at least make it clear? I see no reason why they wouldn't. AMD releasing BD in the state it's in, in my opinion, proves there is no magic patch that will fix it. This sounds like the 2900 XT all over again, where the drivers were supposedly going to make it bitch-slap the 8800 GTX.

Way to turn this into a video card thread, guys... :rolleyes:

Anyways, AMD is aware of this (how could they not be), and most likely is working with MS to get a better handle on the resource sharing/handling... They have to... BD is just the beginning of where their entire CPU line-up is headed... I'd suppose one thing they'll be concentrating on is working with software makers to push for greater optimization and utilization of the new arch, so by the time BD2 comes out (Dec 2012, I hear?) it will be a fully supported platform. I'm not defending AMD, I'm just trying to give them the benefit of the doubt. They are not sitting on their hands, they can't be... I hope not, anyways... :confused:
 
Anyways, AMD is aware of this (how could they not be), and most likely is working with MS to get a better handle on the resource sharing/handling ... so by the time BD2 comes out (Dec 2012, I hear?) it will be a fully supported platform.

AMD most likely is looking at Windows 8.

It makes sense for them in a way.

Next year Bulldozer will filter into Trinity and basically be their laptop chip. So AMD will want to make a splash with Windows 8 performance; that is what will get them market share and money.

I'm also sure that in 2013, maybe '14, they will most likely have Bulldozer all the way down to the tablet market.
 
It would be interesting to see how these things compare on other schedulers (like under Linux, for instance)...
 
I don't see how Fermi is bad. It was the first graphics card that played GTA IV with high detail levels and no lag. My 5850 lagged around water where my GTX 480 didn't.
 
The XF link is live again and there are some rumblings about performing some gaming tests with the 4 half-modules, to see what kind of improvement may be accomplished... Even chew* is in on it... Watching it patiently...
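For anyone who wants to try the "one core per module" idea without touching the BIOS, here is a minimal, Linux-only sketch that pins a benchmark to every other logical CPU. It assumes module siblings are enumerated as adjacent pairs (0/1, 2/3, ...), which is worth verifying under /sys/devices/system/cpu/cpu0/topology/ first, and it won't fully reproduce a BIOS-level core disable, since the second core in each module still exists as far as the hardware is concerned:

```python
# Hypothetical helper: restrict a benchmark to one core per Bulldozer module.
# Linux-only (os.sched_setaffinity); the even/odd CPU numbering is an assumption.
import os
import subprocess
import sys

def run_on_half_modules(cmd):
    n = os.cpu_count()
    one_per_module = set(range(0, n, 2))     # e.g. {0, 2, 4, 6} on an 8-thread FX
    os.sched_setaffinity(0, one_per_module)  # children inherit this affinity mask
    return subprocess.call(cmd)              # run the benchmark on those CPUs only

if __name__ == "__main__":
    sys.exit(run_on_half_modules(sys.argv[1:]))
```

On Windows the rough equivalent would be setting processor affinity from Task Manager, but that's outside this sketch.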
 
What is the final difference in performance between Phenom (Agena), Phenom II (Deneb), and Phenom II (Thuban)?

This is very relevant to this discussion. If we can gauge the performance bump in each iteration, it will show us just how long it should take AMD to get the Bulldozer microarchitecture right and what performance benefits we'll see as the architecture matures.
 
No it didn't. The 5970 still wiped the floor with it. The Fermi 400 series is almost the same as this flop, except with AMD, instead of wood screws, you get ridiculous marketing slides.

It appears that the failure of BD has caused some serious malfunctioning in some posters; this one is a perfect example.
 
Maybe AMD released unoptimized cores. BD was so far past its target release date that maybe they shipped early designs that weren't fully optimized.
 
Maybe AMD released unoptimized cores. BD was so far past its target release date that maybe they shipped early designs that weren't fully optimized.

I just think it's an L1 cache bug (plus its size) and the process problems from GlobalFoundries. It was a one-two punch that they couldn't overcome.

Hopefully the L1 cache issue is fixed in Piledriver, and the process problems will just need to get worked out over time.
 
AMD has lost a lot of enthusiasts with false promises around the Barcelona release. This is the second time AMD has been a complete letdown. The performance is mediocre in multithreaded benchmarks, but the power consumption is just wow. Also, needing 2 billion transistors to achieve this result doesn't leave much room for faith that they'll be able to fix it.

But who knows, maybe they can pull off what nVidia pulled post-Fermi.

The problem with using Fermi as a comparison is that Fermi was never a bad chip; the only problem was power consumption. Even if AMD could clock this to 4.2-4.4GHz it would still lose (and lose in power consumption too). That's one of the reasons the Pentium 4 could never compete with the Athlon 64 CPUs: clock for clock, the Athlon 64s were just too much better. Lastly, let's not forget that even at 4.4GHz a 2600K still uses less power than a stock 8150; Intel could just bump stock clocks to 4GHz if they thought they were in any danger.
 
I just think it's an L1 cache bug (plus its size) and the process problems from GlobalFoundries. It was a one-two punch that they couldn't overcome.

Hopefully the L1 cache issue is fixed in Piledriver, and the process problems will just need to get worked out over time.
No, they would not have released it like this if they thought it was something fixable. Also, if it were something fixable, they would be claiming more than 10% for Piledriver (which is probably all clock speed).
 
No, they would not have released it like this if they thought it was something fixable. Also, if it were something fixable, they would be claiming more than 10% for Piledriver (which is probably all clock speed).

Well, 1) it seems like they did try to fix it: the release revision is C0, while there were test chips floating around at the B1 and B2 levels.

2) I bet they tried fixing it and found it would be a huge pain and would take months longer, at which point Piledriver would already be out. So they just moved the fix to Piledriver.

3) The claims for Piledriver (which are 10-15%) could have been made before they found the respins wouldn't fix the cache problem. Piledriver, from what I've heard, will have a 10-15% increase without counting clock speed. So who knows.

If the cache is a big problem along with the process node, then Piledriver could end up being really good once the cache is fixed.
 
That's all well and good, but what about

GAEMZ!!1

??

[image: dirt-3-eyefinity.png]

[image: deus-ex-eyefinity.png]

This is the 8150 with two 6970s.

I don't think the CPU is going to be a problem in the majority of games.
 
Hmm, so you would spend $250 on a CPU for games to get a slower CPU than the $220 competitor product, which will work better if you have a game that needs CPU power, just because you might not notice that the $250 one is slower in some cases?

Fanboy logic at its best, really.
 
And the reverse can be said about you. You are willing to spend 30 dollars less to be slower in most games just on the chance you might be faster in some games. Being a fanboy does go both ways, even if you try to claim that is not the case.

Bulldozer truly is a mediocre chip, but let's be honest here: Bulldozer did perform well in a lot of games and in encoding. So for the most part, it performed well in the scenarios most users would be using it for.

Bulldozer's flaws truly are its high power usage and lower IPC. If AMD can correct those two problems, then they will have a great chip on their hands.
 
And the reverse can be said about you. You are willing to spend 30 dollars less to be slower in most games just on the chance you might be faster in some games. Being a fanboy does go both ways, even if you try to claim that is not the case.

OK, show me all those games where BD will be faster than my 4.5GHz 2500K.
 
I found this cool site that has data on just such a scenario, maybe you've heard of it.

That's cool, and how about more than one game?

Because among the huge sample size of three games [H] tested, the FX-8150 is slightly faster in one game, slightly slower in one game, and significantly slower in one game.

The FX costs $250 and the 2500K costs $220.

And then there's this small graph:

[image: 1318034683VZqVQLiVuL_9_2.png]
 
That's cool, and how about more than one game? ... And then there's this small graph:

Given the data available, it's faster in 2 out of 3 gaming scenarios when compared to a 2500K running at greater than your requested 4.5GHz.

And, judging by the video card listed in your signature, I wouldn't have guessed that power efficiency at load is something you would consider a high priority.
 
Given the data available, it's faster in 2 out of 3 gaming scenarios when compared to a 2500K running at greater than your requested 4.5GHz.

And, judging by the video card listed in your signature, I wouldn't have guessed that power efficiency at load is something you would consider a high priority.

Do you really and honestly want me to throw out a few dozen benchmarks which clearly show which CPU is faster in games?

PS. Find me a GPU with a better price-to-performance ratio than the 200-euro GTX 470 I bought 14 months ago.
 

The only thing that I'll point out is that the first article dates from 2000, 11 years ago, and is talking about the 2.2 and 2.4 Linux kernels, which are barely used anymore.

We are now up to 2.6.39 on the 2.6 line, and there is even a stable 3.0.4 now (though not widely used yet).

I don't know any details at all on how the Linux scheduler may have changed over this time, but judging by how drastically other things have changed since the 2.2 and 2.4 days, I wouldn't be surprised if that article is rather out of date.
 
That's cool, and how about more than one game? ... And then there's this small graph: [image: 1318034683VZqVQLiVuL_9_2.png]

We've been over the power graph before.

It is mostly irrelevant unless you fold/Bitcoin-mine on your CPU (which I understand most people don't anymore, as this is instead done on the GPU).

At idle (where most CPUs spend 95% of their lives), BD is actually MORE power efficient than SB when both are overclocked.

It's only at heavy load that it uses more power than SB (and less than an i7-920), and heavy load is a very small minority of processing time. Even when playing games, CPUs are usually not loaded more than 35%. It would be interesting to see how they compare power-wise at a 35% load.
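On the "games rarely load the CPU past 35%" point, that's easy enough to measure rather than argue about. Here's a minimal sketch that logs overall CPU utilization while you play; it assumes the third-party psutil package is installed, and the sampling interval and duration are arbitrary choices, not figures from the thread:

```python
# Log average CPU utilization over a play session (assumes `pip install psutil`).
import psutil

def log_cpu_load(duration_s=600, interval_s=1.0):
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent blocks for interval_s and returns utilization across all cores
        samples.append(psutil.cpu_percent(interval=interval_s))
    avg = sum(samples) / len(samples)
    print(f"average load over {duration_s}s: {avg:.1f}% (peak {max(samples):.1f}%)")
    return avg

if __name__ == "__main__":
    log_cpu_load()
```

Run it in the background during a gaming session and you at least have a number to argue with instead of a guess.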
 
Has anyone seen a review where they pit BD against the i5/i7 under Linux yet?

Not yet. I presume Phoronix will review them at some point (they usually do) but they also usually lag behind the Windows/Gamer sites with their reviews by quite a bit.
 
And, judging by the video card listed in your signature, I wouldn't have guessed that power efficiency at load is something you would consider a high priority.

Power efficiency is a huge issue with Bulldozer. For a CPU that's supposed to be all about overclocking, something is not right with how much power it leaks. I hope it's just an immature fab process and they will be able to fix it. But look at this 4.8GHz power hog:

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10

300+ watts more than the equivalent 2600K, and that's not to mention Bulldozer is supposed to be able to hit higher clocks to compensate for its lower IPC. I really hope it's something they can fix with a stepping or fab tweak, but considering the CPU has at least twice as many transistors as any other CPU it competes with, I highly doubt it.

This is the first AMD CPU I really cannot defend. There is nothing good about this CPU.

- I would be fine with less single-threaded performance, as long as it wasn't slower than Thuban.
- I would be fine with the multithreaded performance, as long as it didn't use twice as much power to achieve a marginally better result than Thuban.

Thuban is an AMD CPU built on GlobalFoundries' old 45nm process, and it's more power efficient per unit of work than Bulldozer. If AMD were smart they would scrap Bulldozer and start a new project using Thuban or Stars as its basis.

I am fully certain Trinity, Piledriver, or any future CPU based on the Bulldozer architecture will be a failure.

People say this makes a good server CPU. I disagree: it makes a good server CPU if all you do is sit idle, but as soon as you start doing any real work your power consumption will go through the roof. Intel will be much more power efficient in this arena, because it will complete tasks 2 to 3 times faster, allowing it to sit idle, not wasting power, for longer.
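The race-to-idle argument is easy to put numbers on. A minimal sketch with made-up illustrative wattages, not figures from any review: for a fixed batch of work inside a one-hour window, the chip that finishes faster spends the rest of the hour at idle power.

```python
# Race-to-idle energy for a fixed job inside a fixed window (illustrative numbers only).
def energy_wh(load_w, idle_w, busy_hours, window_hours=1.0):
    """Total energy (Wh): load power while busy, idle power for the remainder."""
    return load_w * busy_hours + idle_w * (window_hours - busy_hours)

# Hypothetical system-level draws: the slower chip is busy the whole hour,
# the faster chip finishes the same work in a third of the time.
slow = energy_wh(load_w=280, idle_w=70, busy_hours=1.0)    # 280 Wh
fast = energy_wh(load_w=180, idle_w=70, busy_hours=1 / 3)  # ~107 Wh

print(f"slow chip: {slow:.0f} Wh, fast chip: {fast:.0f} Wh for the same work")
```

Swap in measured idle and load numbers for the actual parts and the comparison becomes concrete.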

I've probably influenced various companies to purchase hundreds of thousands of dollars of AMD products in the past, like Opteron-based Sun boxes in 2005, but I will tell you I would never recommend a Bulldozer server CPU to any of my clients or employers.

AMD may think enthusiasts aren't important, but this is where they are wrong. People we socialize with know we're into CPU tech and they value our opinions; they may be guys putting a gaming machine together, or guys responsible for ordering servers for a project. There was a TED talk covering the importance of pleasing your early adopters. AMD fucked up big time.
 
Power efficiency is a huge issue with Bulldozer. ... There was a TED talk covering the importance of pleasing your early adopters. AMD fucked up big time.
I still don't see how the power consumption is so different between bit-tech and [H].

1. [H]'s testing shows that the 8150 @ 4.6GHz system uses 452W at load, whereas bit-tech shows 586W at load: a difference of +134W.
2. [H] shows that the i7-920 @ 4.0GHz system uses 505W at load, but bit-tech's 920 system only pulls 411W at load with a GTX 590: a difference of -94W.

Why such differences?
 
I still don't see how the power consumption is so different between bit-tech and [H].

1. [H]'s testing shows that the 8150 @ 4.6GHz system uses 452W at load, whereas bit-tech shows 586W at load: a difference of +134W.
2. [H] shows that the i7-920 @ 4.0GHz system uses 505W at load, but bit-tech's 920 system only pulls 411W at load with a GTX 590: a difference of -94W.

Why such differences?

Different CPUs (of the same model) have different overclocking properties. When you reach a chip's OC limit, power consumption starts growing exponentially; it skyrockets because you start getting a lot of current leakage.

[H] overclocked to 4.6GHz and bit-tech to 4.8GHz, so that extra 200MHz may well have required the extra 134 watts.
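A rough way to see why overclocking gets so expensive in watts: dynamic power scales roughly with C·V²·f, and higher clocks usually demand higher voltage, so power grows much faster than clock speed even before leakage is counted. A minimal sketch with guessed voltage/frequency pairs and an arbitrary capacitance constant, not measured Bulldozer figures:

```python
# Dynamic power ~ C_eff * V^2 * f (leakage excluded); all constants are illustrative.
def dynamic_power(voltage, freq_ghz, c_eff=12.0):
    return c_eff * voltage ** 2 * freq_ghz

stock = dynamic_power(voltage=1.25, freq_ghz=3.6)  # ~68 W
oc = dynamic_power(voltage=1.45, freq_ghz=4.8)     # ~121 W

print(f"~{100 * (oc / stock - 1):.0f}% more power for "
      f"~{100 * (4.8 / 3.6 - 1):.0f}% more clock")
```

Leakage only makes this worse, since it also climbs with voltage and temperature.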

Every single review that dug into BD's power consumption confirmed the same thing: BD is insanely power hungry. When you factor in the ~2 billion transistors required to deliver the performance it delivers, things make sense and the results are very disappointing.

If you compare the performance it delivers, its power consumption, and its transistor count to Sandy Bridge CPUs, Bulldozer gets lapped. Thuban is clearly built on a better architecture than BD.

At a time when AMD needed to do some catching up to Intel, AMD went two steps back.
 
Different CPUs (of the same model) have different overclocking properties. ... At a time when AMD needed to do some catching up to Intel, AMD went two steps back.

Agree completely. Assuming Bit-Tech's tests were done right, they probably had a better Core i7 than [H] did. The variability between CPUs with the identical part and stepping number can be huge; that's why launch-day reviews are sometimes sketchy, as it's difficult for companies to resist the urge to cherry-pick the CPUs/GPUs that get sent to reviewers.

As far as FX goes, all the data I have seen suggests that its power usage currently goes up VERY fast with clock.

To repost a smaller version of my chart from earlier:
[image: 6238187253_59c29c23a1.jpg]

In the lowest-clocked server parts, they've got 16-core parts at 85W TDP... That's close to 5W per core!

But crank up the frequency and the power use goes up FAST.

This is a typical symptom of process immaturity. Hopefully we will see this get much better as GloFo's manufacturing processes improve.
 
In the lowest-clocked server parts, they've got 16-core parts at 85W TDP... That's close to 5W per core!

How about performance per watt? Watts per core is not that meaningful when BD performs worse per core than Phenom II and way worse per core than SB at the same frequency.
 
How about performance per watt? Watts per core is not that meaningful when BD performs worse per core than Phenom II and way worse per core than SB at the same frequency.

You could plot that as well, but my point with this chart was simply to point out how quickly the power consumption goes up with clock speed within the Bulldozer family, not to compare it to other chip designs or vendors.
 
You could plot that as well, but my point with this chart was simply to point out how quickly the power consumption goes up with clock speed within the Bulldozer family

I am with you on that.

not to compare it to other chip designs or vendors.

Maybe not in your chart; however, you have made several comments in this thread saying that BD is more energy efficient than SB and Phenom II.
 
It'd be interesting to see how disabling one core from each module would impact power consumption.
 
OK, show me all those games where BD will be faster than my 4.5GHz 2500K.

Basically, it is just WinRAR, POV-Ray rendering, and single-pass encoding. The fanboys can't find anything else because there isn't anything else.
 
... What games see any appreciable difference between BD and a 2600K, or even an i7 and an i3?

Civilization games, Paradox games, Dwarf Fortress, Aurora, Total War games, Supreme Ruler games, SC2, Space Empires, etc. In other words, games that aren't console FPS ports with light pathfinding and basic AI, but strategy games that need lots of processing power (especially IPC) to run quickly and well.
 
Maybe not in your chart; however, you have made several comments in this thread saying that BD is more energy efficient than SB and Phenom II.

At idle when overclocked :p

It would be really interesting to see how the power use figures look over a week of typical use, which would include a lot of idle/desktop/web time, an encode or two (maybe an album and a movie), a bit of gaming, and a couple of movies, but I don't have that data.
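For a back-of-the-envelope version of that week, a minimal sketch; every hour count and wattage below is a made-up placeholder to show the shape of the calculation, not data from any review:

```python
# Rough weekly energy estimate from an assumed usage profile (illustrative numbers).
WEEK_HOURS = 24 * 7

def weekly_kwh(idle_w, light_w, load_w, light_hours, load_hours):
    """Anything that isn't light use or full load is counted as idle."""
    idle_hours = WEEK_HOURS - light_hours - load_hours
    watt_hours = idle_w * idle_hours + light_w * light_hours + load_w * load_hours
    return watt_hours / 1000.0

# Hypothetical profile: ~20 h desktop/web/movies, ~8 h gaming/encoding, rest idle.
print(f"{weekly_kwh(idle_w=60, light_w=110, load_w=300, light_hours=20, load_hours=8):.1f} kWh/week")
```

Plug in the measured idle and load numbers for an FX system and a 2600K system and the "week of typical use" comparison falls straight out.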
 