"AMD's Bulldozer server benchmarks are here, and they're a catastrophe"

Thoughts are... this is bad... I haven't bought/owned an AMD CPU for myself since the Athlon 64 X2 days way back when on Socket 939, but competition existing is a good thing. AMD's really sliding and this will hurt the market as a whole. :(
 
I would not say they are a catastrophe, but they are not as good as AMD originally promised. I mean, "50% more performance with 33% more cores at the same power draw."

Although I believe that was revised close to BD launch to 35% more performance with 33% more cores at the same power draw.
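For what it's worth, here's a quick sanity check of what those headline claims imply per core. The 12-core vs. 16-core reference points are my own assumption (Magny-Cours vs. Interlagos), not from AMD's slides:

```python
# Back-of-the-envelope: what the aggregate claims imply per core.
# Assumed reference: 12-core Magny-Cours -> 16-core Interlagos ("33% more cores").
base_cores = 12
new_cores = 16

for aggregate_claim in (1.50, 1.35):  # original claim, then the revised one
    per_core = aggregate_claim * base_cores / new_cores
    print(f"{aggregate_claim:.0%} aggregate -> {per_core:.3f}x per-core throughput")
# 150% aggregate -> 1.125x per core
# 135% aggregate -> 1.012x per core (i.e., essentially flat per core)
```

So even taking the revised claim at face value, per-core performance was only ever promised to be roughly flat; the whole bet was on core count.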
 
Looks like instead of developing an awesome CPU, they've invented time travel...into the past. I was hoping server-wise they would be great :(
 
Don't get your feelings hurt mate. Most of us here are AMD veterans. Haven't seen these specific results posted yet so I linked the story.
 
Wow. I kinda thought they would have done pretty well with their new server chips, since they've had such success in that market for a while. Looks like AMD's failure is now complete. Sad day indeed for AMD fans such as me. Oh well, nothing to do now but save up for Ivy Bridge while hoping against hope that Piledriver will be something competitive.
 
Don't get your feelings hurt mate. Most of us here are AMD veterans. Haven't seen these specific results posted yet so I linked the story.

Okay, the benchmarks showed the new chips to be faster, pretty much, than the chips they replaced. Then the author goes on to say that's not good enough because they did it by adding more cores and that doesn't count, therefore it's a disaster.

That's a bit of a leap. Essentially the title of his article, and of your subsequent thread, that you thought the community needed to see, is hyperbole.

No, it isn't a catastrophe, and neither was the desktop performance. The hardware community has completely gone off the deep end about Bulldozer, and it's pretty shameful to see so many people parroting the same things. People who have no practical experience with the product repeat whatever random hardware sites say, words like "catastrophe" and "disaster" get thrown around, and the snowball keeps growing.


tl/dr
 
Okay, the benchmarks showed the new chips to be faster, pretty much, than the chips they replaced. Then the author goes on to say that's not good enough because they did it by adding more cores and that doesn't count, therefore it's a disaster.

That's a bit of a leap. Essentially the title of his article, and of your subsequent thread, that you thought the community needed to see, is hyperbole.

No, it isn't a catastrophe, and neither was the desktop performance. The hardware community has completely gone off the deep end about Bulldozer, and it's pretty shameful to see so many people parroting the same things. People who have no practical experience with the product repeat whatever random hardware sites say, words like "catastrophe" and "disaster" get thrown around, and the snowball keeps growing.


tl/dr

I have not read the article. I have 2 running AMD machines. (One older one has been donated.) I have 2 running Intel machines.

AMD brought about the "disaster" due to marketing, not technology. If they'd labelled the cores as being 1/2 the number they list on the packaging, I bet no one would say they'd failed. However, they made some very big claims for performance/core and overall power use...and failed to meet their self-proclaimed achievements. That's a fail. You could reasonably argue that the technology is pretty good. You could even have a sound basis for saying it's a step better than Phenom II. But you cannot say that they met the numbers or claims that they so publicly claimed for Bulldozer.

Yeah, a snowball it is. However, up there at the top of the hill are the AMD marketers. They may've been making a snowman and it just kinda rolled on down away from them.

Too bad. It is, however, entirely self-caused by AMD.
 
"If they'd labelled the cores as being 1/2 the number they list on the packaging, I bet no one would say they'd failed."

Who cares about core count? Whether it's 100 cores or a single core, it's about performance vs. cost. If it were one big core that performed like the 8150, would it be an awesome processor since it's a single core vs. Intel's 4 and 6 core chips?
 
Who cares about core count? Whether it's 100 cores or a single core, it's about performance vs. cost. If it were one big core that performed like the 8150, would it be an awesome processor since it's a single core vs. Intel's 4 and 6 core chips?

That was the whole argument from AMD marketing: delivering 2 times the core count of their competitor at about the same cost. A lot of us (including me) assumed that meant AMD would do very well in highly multithreaded tasks with this approach. We never expected the cores to be weaker, or the cache to be much slower, than the previous generation.
 
That was the whole argument from AMD marketing: delivering 2 times the core count of their competitor at about the same cost. A lot of us (including me) assumed that meant AMD would do very well in highly multithreaded tasks with this approach. We never expected the cores to be weaker, or the cache to be much slower, than the previous generation.

My point was that c3k said things would be different if they'd labelled each module as one core instead of two, like they currently do. That wouldn't change the benchmarks, wouldn't change the fact that AMD misrepresented performance, and wouldn't change anything about the chips.
 
I have not read the article. I have 2 running AMD machines. (One older one has been donated.) I have 2 running Intel machines.

AMD brought about the "disaster" due to marketing, not technology. If they'd labelled the cores as being 1/2 the number they list on the packaging, I bet no one would say they'd failed. However, they made some very big claims for performance/core and overall power use...and failed to meet their self-proclaimed achievements. That's a fail. You could reasonably argue that the technology is pretty good. You could even have a sound basis for saying it's a step better than Phenom II. But you cannot say that they met the numbers or claims that they so publicly claimed for Bulldozer.

Yeah, a snowball it is. However, up there at the top of the hill are the AMD marketers. They may've been making a snowman and it just kinda rolled on down away from them.

Too bad. It is, however, entirely self-caused by AMD.

AMD doesn't market, it's one of the primary issues a ton of people have complained about for years. Bulldozer doesn't meet expectations, and all of a sudden it was the AMD marketing machine that caused it.

It is an 8-core processor with some shared resources between pairs of cores. If they were pretend cores like you seem to think, they wouldn't ever show benefits, which isn't true.

It's a completely new way to do things, they struggled to meet design goals, and the product was not as high a performer as many would like.

The snowball is FUD; it's the community going batshit over something that isn't real. AMD wasn't on my TV telling me how great their processors were; they weren't in my magazines or on my radio.

AMD has a product that can potentially nip at Intel's heels for the first time in recent memory. It's a power hog, and like any completely new architecture, it's had some bumps along the way. That's a rational way to look at it. All this catastrophe nonsense is just that. People saying AMD marketing was magically effective ONCE in the entire history of the company, right before the Bulldozer launch, is more BS.

It's not a shitty product; it's pretty unremarkable, power hungry at high clock speeds, but it has shown potential.
 
As with all things bulldozer, you get opinions and benchmarks both ways.

http://www.crn.com/news/data-center...qbfpGLrKVCpDjNJ4Cmhhw**.ecappj02?cid=nl_alert

The Ars troll title is a little ridiculous. If you read the review, it's an aggregation of random stuff, including Anand's inconclusive testing; it's not that great and kinda biased. :\ The frustrating thing about Bulldozer is the conflicting reports. I've seen 2 or 3 reviews on legit sites that praise Interlagos quite a bit. :|
 
arstechnica is a metrosexual's xbit/XS. In other words, complete troll bullshit with journalism so bad it couldn't even be in a PC magazine
 
AMD doesn't market, it's one of the primary issues a ton of people have complained about for years.

They do not advertise, but they do market. They are pretty happy to point out "Experience the world’s first native 8-core desktop processor" in their product literature... which, considering its performance, is to me just as bad as when Phenom 1 was released and they would not stop talking about their native quad core and how much better it was to be native, even though they were losing nearly every single benchmark to Core 2 Quads. I was a serious AMD fanboy back then and it made me sick to keep hearing and reading that crap.
 
They do not advertise, but they do market. They are pretty happy to point out "Experience the world’s first native 8-core desktop processor" in their product literature... which, considering its performance, is to me just as bad as when Phenom 1 was released and they would not stop talking about their native quad core and how much better it was to be native, even though they were losing nearly every single benchmark to Core 2 Quads. I was a serious AMD fanboy back then and it made me sick to keep hearing and reading that crap.


What do you want them to say, that their stuff is crap? When was the last time a company pushing a product told you it wasn't good? Just look at actors pushing their movies: all of them are so awesome. Do you boycott everything?
 
Folks have gotten a bit obsessed over the core count thing. A four-"module" part does not have eight fully independent cores, but is close enough that calling it "eight core" is not a big stretch. The real problem is the poor IPC; why in the hell they'd pull a Pentium 4 with the pipeline stages is anyone's guess, we all know how well that worked for Intel :rolleyes: Also seems like the decode logic may not be robust enough to keep two "cores" properly fed...
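For anyone curious what the module scheme is supposed to buy, here's a toy model. The ~80% second-core scaling is AMD's pre-launch claim for CMT; everything else (the function, the comparison) is my own illustrative sketch, not a measurement:

```python
# Toy model of Bulldozer's module ("CMT") scaling -- illustrative numbers only.
# AMD's pre-launch claim: the second integer core in a module adds roughly 80%
# of a full core's throughput (shared fetch/decode/FPU being the bottleneck),
# vs. the ~20-30% a conventional SMT/Hyper-Threading thread typically adds.
def module_throughput(threads: int, second_core_scaling: float = 0.8) -> float:
    """Relative throughput of one two-core module running 1 or 2 threads."""
    return 1.0 if threads == 1 else 1.0 + second_core_scaling

fx_fully_loaded = 4 * module_throughput(2)   # four modules, eight threads
eight_true_cores = 8 * module_throughput(1)  # eight fully independent cores
print(fx_fully_loaded / eight_true_cores)    # -> 0.9, i.e. ~90% of 8 real cores
```

Which is why "eight core" isn't a huge stretch: by AMD's own claimed scaling, a fully loaded FX should land around 90% of eight true cores, far better than a four-core SMT part would manage.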

What most on the desktop side expected was a product that could beat/match/very-nearly-match Intel (Sandy Bridge); what we got barely beats out the previous gen at best, and sometimes loses. That's a fail any way you slice it, not to mention Bulldozer is bigger (more expensive to produce). Interlagos is much better, but still behind Intel.

arstechnica is a metrosexual's xbit/XS. In other words, complete troll bullshit with journalism so bad it couldn't even be in a PC magazine

Their "work" consists more of commentary that I often disagree with than good reporting/testing, I'll give you that.
 
What do you want them to say, that their stuff is crap? When was the last time a company pushing a product told you it wasn't good? Just look at actors pushing their movies: all of them are so awesome. Do you boycott everything?

No one said anything about boycotting but you? :confused:
 
arstechnica is a metrosexual's xbit/XS. In other words, complete troll bullshit with journalism so bad it couldn't even be in a PC magazine

Dude, BD is slower than Phenom; how is that trollish? Did you read the HardOCP review?
 
Dude, BD is slower than Phenom; how is that trollish? Did you read the HardOCP review?

Yeah, performance kinda sucks, but its bad performance is overblown. Just like this review says, it's not a catastrophe. From the review: "The desktop Bulldozer benchmarks were a horror show performance for AMD. The newest and greatest architecture often failed to beat its predecessor, let alone the Intel competition. There were no such disasters when looking at server workloads."

No such disasters? Catastrophe?
 
Arstechnica is a wannabe tech site; I would wipe my butt with their site if I were able to.
 
I'm surprised people are not used to the trolls by now.

Don't let this last salvo of fake benchmarks from the Intel trolls get to you.

Let the trolls have fun while they can.

Don't be afraid of the Intel trolls or fear you might be proved wrong with the official benchmarks. Post what you think and why you think Bulldozer is going to be a winner.

That article required a lot of work. As Bulldozer approaches the Intel trolls are getting desperate.

Plowing right through Intel! What do you have on this trolls?

Don't feed the trolls people. Just Bulldoze them in a few months.

Intel trolls are having their fun while they can. Only a few more weeks until Bulldozer plows right through them!

It is called trolling. These Intel trolls can't stop the Bulldozer!

Ok Intel trolls, you had your fun. yawn.

Lots of Intel trolls in this thread trying to spread bad info.

 
BD (FX & Opteron) is a capable, reasonably current CPU. It will get the job done both on the desktop and in the rack. The point that seems to evade most observers is that it had been widely heralded, not just by fanboys but by AMD staff (and we know who we're talking about), as a trendsetting microprocessor that would restore AMD's long-lost competitiveness on the high end. From that perspective it is not just a disappointment but a facepalm epic phail. Rarely is there a CPU that would have benefited its manufacturer more by not being released than BD.

"Wait for the next stepping... Wait for Piledriver... Wait for..." gimme a break, guys!
 
Yeah, performance kinda sucks, but its bad performance is overblown. Just like this review says, it's not a catastrophe. From the review: "The desktop Bulldozer benchmarks were a horror show performance for AMD. The newest and greatest architecture often failed to beat its predecessor, let alone the Intel competition. There were no such disasters when looking at server workloads."

No such disasters? Catastrophe?

It is a catastrophe when you consider the time/money invested into Bulldozer/Interlagos. They also go on to state that the performance, albeit slightly better than previous Opterons, isn't that competitive in performance-per-watt even if priced well. Throw in that we'll be seeing SB server chips soon, and it is catastrophic. They also went on to state that the Interlagos chips may be an upgrade over previous-gen Opterons, but that's not the market they should be aiming at, as it doesn't increase your market share (duh: replacing an AMD chip with an AMD chip is easy; it's replacing an Intel chip with an AMD one that they should be aiming for).

Server benchmarks are tricky and difficult to read and interpret. I think AnandTech decided to try to purge server benchmarks of their "picky" nature and produce their own. They weren't quite as harsh on Interlagos as the Ars article, but they too were unimpressed and reached much the same conclusions.

Future revisions of the chip don't equate to massive architectural changes, which is what Bulldozer needs in order to stay competitive. Therefore, hoping that the inevitable Bulldozer/Interlagos revisions will dramatically improve the performance of the chips is hopeless. You may see some bug fixes and maybe a slightly lower power draw and vcore, but you won't be seeing the IPC gains that we as enthusiasts have been asking for on the desktop. Chip revisions may help Interlagos in the server space more than the desktop, but still not enough to catch Sandy-based Xeons.

Is Bulldozer/Interlagos better than the Phenom II-based Opteron 6100s? Yes, although that's a far more resounding yes in the server space, and even there it's just "good." But what most tech sites have been missing in their reviews is that AMD has stated we'll be seeing a 10-15% increase in performance annually. I think that's by far the biggest disappointment when you consider that they'll need more than 10-15% just to catch their own Phenom IIs in IPC, and another 30%+ on top of that to get to Sandy Bridge, with Ivy Bridge right around the corner.
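To put that cadence in perspective, here's a rough compounding calculation. The ~45% combined gap is my own illustrative composite of the figures above (10-15% to Phenom II IPC plus ~30% to Sandy Bridge), and it generously assumes Intel stands still, which it won't:

```python
# How long does a 10-15%/year cadence take to close a ~45% per-core deficit?
# The 1.45x figure is a hypothetical composite of the gaps described above,
# not a measured number, and Intel is assumed frozen in place.
import math

deficit = 1.45  # competitor at 1.45x per-core performance (illustrative)
for annual_gain in (1.10, 1.15):
    years = math.log(deficit) / math.log(annual_gain)
    print(f"{annual_gain - 1:.0%}/yr -> {years:.1f} years to close a 45% gap")
# 10%/yr -> 3.9 years
# 15%/yr -> 2.7 years
```

Three to four years of flawless execution just to reach where Sandy Bridge is today, while Ivy Bridge and its successors keep moving the target.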

The sky isn't falling, but we have to remember that Intel had enough money and "means" (illegal, if you guys recall) of staying afloat after their ship sank during Pentium 4. AMD isn't in the same position and doesn't have the money to do that. They're 100% committed to this new architecture -- and right through Piledriver which will make some improvements but won't be an overhaul. This may be dandy for their figures against their own previous gen/revision Bulldozers and Phenoms, but they're not competing against themselves here; they're competing against Intel.

Now you can claim that it's not a catastrophe, but in order to do so you'd have to subsequently assume that Ivy Bridge won't do what Sandy did to Nehalem and that AMD will sometime soon decide to undergo a massive restructuring of their bulldozer architecture. As of right now, I think both of those claims aren't only highly optimistic, but they also include a failure to read AMD's own charts and public statements.

TL/DR version: Either completely restructure bulldozer arch or dump it altogether if they want to see this through, and preferably with the quickness.

EDIT: All that said, I wouldn't mind buying a bulldozer 8120 for the price of current Thubans, that ~$170 range, because at that price range they start to look attractive in comparison to both Intel and Phenom II's. But with 2 billion transistors and low yields they'd be taking a loss with each chip sold. Again, architecture problem and not a GloFo one.
 
Therefore I ask the question, why is a hack like Peter Bright writing these stories instead of Jon Stokes?
 
Therefore I ask the question, why is a hack like Peter Bright writing these stories instead of Jon Stokes?

Yeah, I think several people said the same thing in the comments. I'd have to agree that the article wasn't up to par. The conclusion could have been clearer, and the title should certainly be changed to reflect the main tone of the article; even if the article is a bit harsher than Anand's was, it still doesn't justify their current title. They're usually a great site, though not as focused on hardware as [H] or AT, but they have some great journalists, like Stokes, and great articles too :)

Now if only they'd quit focusing so much on crApple...
 
What do you want them to say, that their stuff is crap? When was the last time a company pushing a product told you it wasn't good?

No, but bragging about this is, to me, in bad taste. It's just there to take advantage of the average Joe who has never seen the benchmarks.

Just look at actors pushing their movies: all of them are so awesome. Do you boycott everything?

I would not pay to see a bad movie, just like I would not pay the current prices for the current BD. You do have a good point, though. I see a lot of advertising where a company brags about some feature that in reality is nothing to get excited about compared to the competition. In most of those cases I am not as annoyed/offended as I am by these statements from AMD, since I generally do not care as much about those products as I do about CPUs.
 
Bright is a hack looking for page hits and that article shows it (the title should have given it away). Stokes is the one who should have written it. Ars has some good writers, but there are a couple of very unprofessional ones there as well.
 
The power usage for these kind of makes them a no-go.

I'm kind of unsure why anyone would really buy them. G34 is end-of-lifed after this, and you don't really gain much over Magny-Cours unless you're going up to 16 cores. The only reason to get one, I guess, is if you're CPU-bound with a bunch of 12-core Magny-Cours chips.

It's just such a weird product. It's like they created it in a vacuum without paying attention to anyone's use cases. It's not very good as a desktop chip, the power usage and upgrade path makes it an iffy server chip, and it's maybe OK as a rendering/workstation chip?

Very strange.
 
The power usage for these kind of makes them a no-go.

I'm kind of unsure why anyone would really buy them. G34 is end-of-lifed after this, and you don't really gain much over Magny-Cours unless you're going up to 16 cores. The only reason to get one, I guess, is if you're CPU-bound with a bunch of 12-core Magny-Cours chips.

It's just such a weird product. It's like they created it in a vacuum without paying attention to anyone's use cases. It's not very good as a desktop chip, the power usage and upgrade path makes it an iffy server chip, and it's maybe OK as a rendering/workstation chip?

Very strange.

I think if they could get a 5-10% IPC improvement along with getting power usage down, it would make a very good server chip. Still not sure about desktop, though. Just my 2 cents...
 