AMD Engineer Explains BD Fiasco

I think by handcrafting, he meant the organization and design of the transistors and how they're arranged on the die, not actually building them by hand.
 
I'd wager that the next iteration of Bulldozer will be significantly better than Bulldozer is today. What does that mean? Well, I don't know. As I said before, it will probably be at least as big an improvement as Phenom II was over Phenom I. That doesn't say much, but when the AMD fanboys were prepped and ready for Phenom II to pay off and kill Intel's Penryn and Yorkfield processors, I knew better. I expected improvement, just as I do from Bulldozer's successor. However, you can only take evolutionary steps so far. You don't make incremental adjustments and get revolutionary results.

Definitely agree with that.

AMD would most likely have to follow in Intel's footsteps in how it approaches chip design. Looking back at what Intel did going from Core 2 to i7 (1366) to i7 (1155), for the most part they kept what made those processors work well and scrapped what didn't, then redesigned the CPU going into Sandy Bridge. The core arrangement connected via a ring bus within the CPU seems to be the most elegant CPU design I've seen so far.

Bulldozer/FX, by comparison, looks more like a traditional arrangement, and it's more crowded, so to speak, especially with all that cache in between the modules.

In my opinion, instead of going down the path of incremental improvements like Phenom I to Phenom II, they should start from scratch again. Keep the Bulldozer modules but improve their per-core performance and IPC, and rearrange the modules into a more efficient design, similar to how Intel created Sandy Bridge. I'm not saying copy it directly, but however you read it, Intel did something right when they moved away from Nehalem's design.
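To put some rough intuition behind the ring bus point: each stop on the ring only talks to its two neighbors, so the wiring grows linearly with the number of stops while a full crossbar grows quadratically. A back-of-the-envelope sketch in Python (the hop counts are illustrative, not Intel's actual Sandy Bridge latencies):

[code]
# Ring vs. crossbar, first-order only. Illustrative numbers, not real hardware data.

def ring_avg_hops(n_stops):
    """Average hop count between a stop and each other stop on a bidirectional ring."""
    total = sum(min(d, n_stops - d) for d in range(1, n_stops))
    return total / (n_stops - 1)

def crossbar_links(n_stops):
    """Dedicated point-to-point links a full crossbar needs: n*(n-1)/2."""
    return n_stops * (n_stops - 1) // 2

for n in (4, 6, 8, 12):
    print(f"{n:2d} stops: avg ring hops {ring_avg_hops(n):.2f}, "
          f"ring links {n}, crossbar links {crossbar_links(n)}")
[/code]

Average latency creeps up as you add stops, but the wiring stays dead simple, which is presumably why the design scales so cleanly as core counts grow.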

As for the socket fiasco that is AM3+, I don't know. I prefer LGA-style sockets, and AMD does use them for the Opteron, but my issue isn't with the socket itself but rather the mounting holes and the distance to the DIMM slots. What they really need to do is go to a symmetrical four-hole design, and if they could match Intel on this precisely, allowing us to use the same exact thermal solutions for both camps, that would be fantastic. I'd like to see the distance between the DIMM slots and the CPU socket increased, but I don't know if that's possible without replacing HyperTransport. They could bypass the need to do so by emulating Sandy Bridge's approach with an integrated I/O controller, but I don't know if this is feasible. It's certainly possible, but financially it may not make sense for AMD to even try that route. Perhaps some kind of signal booster could be designed to allow HT to stretch farther? AMD may have investigated this and deemed it too costly, and that might well be the case. It is, after all, additional hardware, and margins in this business are very thin already.

Again, this probably points to the huge disparity in financial backing between Intel and AMD. Intel can weather the costs of designing a new CPU; AMD, not so much.

It does make me wonder, though: what is all the money AMD is earning being spent on?

Seven years on Bulldozer, and the money and time spent on it don't seem to have helped much. Intel works on a "tick-tock" schedule, but they can afford to do that. Seven years seems like an awfully long time to work on a single CPU architecture only to have it match or be outperformed by its previous generation in everything except memory bandwidth. Intel churns out a new CPU architecture every year or two, right?

As I understand it, the rumor is that AMD will build Bulldozer's successors around socket FM1. And AGAIN AMD screwed up with FM1, as it has the same exact flaws that AM2, AM2+, AM3 and AM3+ all have. The devil is in the details, and AMD doesn't seem to care about those details. Again they had the opportunity to create a new design and failed to do so; they just changed the pin count of the socket itself. :rolleyes: In all fairness, if compact self-contained water cooling units take off the way heat pipe coolers did five or more years ago, then it won't be a real problem anymore. So far we have every indication that this might happen, given the sheer volume of coolers following that basic design being released all the time. Am I a fan of this? No. Will it get the job done? Yes. AMD may have seen the writing on the wall and said, "screw it, we'll let heatsink and fan makers worry about that." AMD may feel the approach they've taken with sockets is the nicer one with regard to upgrades and keeping customers happy. For the most part that seems to be the case, as everyone with an AM3 / AM3+ board seems to be really happy they've got upgrade options.

And if you still end up needing a new motherboard because you don't get the proper BIOS update from your motherboard vendor, then it's not AMD's fault, it's your board maker's fault. (You AM2+ guys should remember how this game is played.) Secretly, I'd bet AMD wants most of the motherboard vendors to avoid updating their BIOSes to some degree, forcing people into buying new motherboards now that AMD is back in the chipset business. This way they can save face and make the motherboard vendors the bad guys in all of this.

BIOS updates can only go so far. You also have to worry about whether the motherboard can handle the CPU's thermal envelope and power requirements. For example, my current AM2+ board can only support 95W CPUs, but the newest BIOS adds Phenom II X6 support, and most of those are 125W. That rules my motherboard out of contention for the X6 processors.
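The check that actually matters there is dead simple; here's a hypothetical sketch of it (the board limit and the CPU list are example values, not a real support database):

[code]
# Hypothetical board/CPU TDP compatibility check. Example values only.

BOARD_MAX_TDP_W = 95  # what the board's power circuitry is rated for

CPUS = {
    "Phenom II X4 945 (95W model)": 95,
    "Phenom II X6 1055T (95W model)": 95,
    "Phenom II X6 1090T": 125,
}

for name, tdp in CPUS.items():
    verdict = "OK" if tdp <= BOARD_MAX_TDP_W else "exceeds board limit"
    print(f"{name}: {verdict} ({tdp}W vs {BOARD_MAX_TDP_W}W max)")
[/code]

The BIOS can add all the microcode it wants; if the power circuitry isn't rated for the wattage, the answer is still no.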

On the other hand, you can expect a socket 1155 board to support Sandy Bridge and the upcoming Ivy Bridge with a BIOS update, regardless of the board. You only have to worry about what PSU to use to supply all the parts with power.

I wonder why there's such a big difference between AMD and Intel boards.

It just feels like, if I were the CEO of AMD (the new one, not Ruiz) and I saw the reviews and criticism the company is taking for their FX-series CPUs, I would just scrap everything old, including the sockets, and start over again. Begin with a better version of HyperTransport that doesn't require the memory slots to be so close to the CPU socket. Design a better internal core arrangement with increased IPC and per-core performance. Redesign the Bulldozer module so that it performs better in both single- and multi-threaded applications, not just multi-threaded alone.

But that would most likely be a major financial undertaking, and I'm not quite sure AMD can afford it unless Intel decides to get friendly and offer them a loan.
 
If Microsoft was able to give Apple a loan in 1997, I wouldn't be surprised if Intel did the same. Intel is so big that if AMD goes away, I'm pretty sure the U.S. government and the FTC will ask Intel to split in half, as it would be a monopoly.

Intel's ring bus approach works very well for them, and AMD already used one on the R520/R580/R600/RV670 series, so I think the expertise is there. I think the ring bus alone would help lower the transistor budget of the uncore; 800+ million transistors seems like a little too much for the uncore alone. This chip is almost a Cypress GPU in terms of transistor budget and die size.
 
If Microsoft was able to give Apple a loan in 1997, I wouldn't be surprised if Intel did the same. Intel is so big that if AMD goes away, I'm pretty sure the U.S. government and the FTC will ask Intel to split in half, as it would be a monopoly.

Intel's ring bus approach works very well for them, and AMD already used one on the R520/R580/R600/RV670 series, so I think the expertise is there. I think the ring bus alone would help lower the transistor budget of the uncore; 800+ million transistors seems like a little too much for the uncore alone. This chip is almost a Cypress GPU in terms of transistor budget and die size.

Yep, allowing AMD to survive is much cheaper for Intel than letting them die.
 
I wonder why there's such a big difference between AMD and Intel boards.

This question has an easy answer. Intel controls the specifications for motherboards that use its chipsets much more strictly, and Intel's VRD specifications are considerably more robust than AMD's. From a BIOS standpoint, manufacturers also have an easier time because they don't have to support so much legacy microcode.
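To illustrate the legacy microcode point: every CPU family a socket keeps supporting is another patch blob squeezed into the BIOS ROM. A toy sketch; the family names are real, but the patch sizes and ROM budget are made-up placeholders:

[code]
# Toy illustration of BIOS microcode bloat. Sizes are invented, not real blobs.

microcode_patches = {
    "K8 rev F (AM2)":        2048,  # bytes, hypothetical
    "K10 (AM2+/Phenom)":     2048,
    "K10.5 (AM3/Phenom II)": 2048,
    "Bulldozer (AM3+)":      4096,
}

rom_budget = 8 * 1024  # pretend 8 KiB of the ROM is reserved for microcode

used = sum(microcode_patches.values())
print(f"{len(microcode_patches)} CPU families -> {used} of {rom_budget} bytes")
if used > rom_budget:
    print("over budget: drop older families or ship a bigger ROM chip")
[/code]

Which sounds a lot like the trade-off AMD board vendors end up making, and one an Intel board supporting only a generation or two never faces.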
 
Didn't Intel recently give AMD an enormous sum of money? I figure AMD finally pumped out Bulldozer just to be done with it and is now using that money for its APU projects and to refine Bulldozer into something useful. The next iteration will actually be worthwhile.
 
Intel is so big that if AMD goes away, I'm pretty sure the U.S. government and the FTC will ask Intel to split in half, as it would be a monopoly.

I see this comment all the time, but a monopoly isn't illegal, just abusing it is. If AMD went away, it's not automatic that the government would step in and do something.
 
I see this comment all the time, but a monopoly isn't illegal, just abusing it is. If AMD went away, it's not automatic that the government would step in and do something.

Of course they would. Proving guilt in monopoly cases is the easiest thing in the world, and it brings the government a bunch of money.
 
Didn't Intel recently give AMD an enormous sum of money? I figure AMD finally pumped out Bulldozer just to be done with it and is now using that money for its APU projects and to refine Bulldozer into something useful. The next iteration will actually be worthwhile.

Only due to litigation; Intel was penalized for antitrust practices.
 
If it's true, it's sad. Not only are they getting the disadvantages of automated tools (say, up to 20% bigger and slower), but they aren't getting the benefits. The benefits should be faster implementation and faster changes.

But they didn't get that. First, it took forever to get out. Second, if you read the end of the Tom's Hardware review, Piledriver will have an IPC increase of less than 5%, with the rest of the improvement coming from clock speed. They are just making changes like "structure size increases." It sounds a lot like the minor changes they made to K8 all those years. With these tools, you'd expect them to redesign parts of it and fix the actual bottlenecks, like ATI did after the 2900XT, but from what was said about Piledriver, they clearly aren't doing that.


Athlon 64 > Athlon X2 > Phenom > Phenom II > Phenom II X6 > Llano

Over five years of products with just small, relatively simple improvements to the core itself, mostly faster clocks and/or more cores. I think this is the same crap they've got planned for future iterations of BD, despite these automated tools.
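For what it's worth, the math on that Piledriver rumor is simple: performance scales roughly as IPC times clock. A quick sketch; the "<5% IPC" figure is from the review mentioned above, while the clock speeds are my placeholder guesses, not announced specs:

[code]
# Performance ~ IPC x clock. Clock figures below are guesses, not real specs.

base_ipc, base_clock_ghz = 1.00, 3.6  # normalize Bulldozer to IPC 1.0 @ 3.6 GHz
new_ipc = base_ipc * 1.05             # at most +5% IPC, per the rumor
new_clock_ghz = 4.0                   # hypothetical Piledriver clock

gain = (new_ipc * new_clock_ghz) / (base_ipc * base_clock_ghz) - 1
print(f"overall gain: {gain:.1%}")    # ~16.7%, most of it from the clock bump
[/code]

In other words, even a healthy clock bump on top of <5% IPC only buys a mid-teens gain, exactly the kind of incremental step being criticized here.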
 
AMD and ATI can share common components, it's much quicker to design processors, and you don't need to QA as hard (as the program should have done that for you), and you don't need as skilled engineers to do it for you (as you have the process to make the things).

Maybe if AMD had a monopoly on CPUs they wouldn't need to QA as hard and wouldn't need as skilled engineers, but clearly they need to QA more and have better engineers (or at least make better use of the ones they have).
 
I think by handcrafting, he meant the organization and design of the transistors and how they're arranged on the die, not actually building them by hand.

Well... yeah, obviously. It's physically impossible to "build" transistors by hand at the nanoscale.

Nevertheless, organizing and laying out every single one of those 2 billion transistors seems unlikely. I'm guessing some small pieces were designed, then duplicated many times.
 
Well... yeah, obviously. It's physically impossible to "build" transistors by hand at the nanoscale.

Nevertheless, organizing and laying out every single one of those 2 billion transistors seems unlikely. I'm guessing some small pieces were designed, then duplicated many times.

That's how it should be. A huge percentage of the die is cache. Cache is... cache. You build it once and then lay it down a whole lot of times.

Other things like I/O, memory addressing, etc. can have some parts hand-tweaked, but they are usually usable in more than one core; i.e., you build it once and keep it as part of a "standard library" you can reference in the future. This is pretty much how Intel made the Atom (they just used existing schemas with slight changes).

Intel only hand-tweaks specific parts of their cores moving forward, as a lot of the work is already done. But to create a 2-billion-transistor CPU entirely by automation... crazy, even for a company as large as Intel.
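That "build it once, lay it down many times" idea is also how you'd sanity-check a transistor budget. A rough sketch; every count below is a placeholder guess, not AMD's actual floorplan data:

[code]
# "Design once, instantiate many times." All counts are rough placeholders.

MACROS = {                      # transistors per instance (assumed)
    "6T SRAM bit cell":   6,
    "integer core":       50_000_000,
    "shared FPU/SIMD":    40_000_000,
    "memory controller":  30_000_000,
}

INSTANCES = {
    "6T SRAM bit cell":   8 * 1024 * 1024 * 8,  # ~8 MiB of cache, in bits
    "integer core":       8,                    # 4 modules x 2 integer cores
    "shared FPU/SIMD":    4,                    # one per module
    "memory controller":  1,
}

total = sum(MACROS[name] * count for name, count in INSTANCES.items())
print(f"estimated total: {total / 1e9:.2f} billion transistors")
[/code]

Note how the SRAM bit cell, designed exactly once, contributes hundreds of millions of transistors on its own. "Hand-crafted" never meant placing 2 billion things individually.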
 
I do think Bulldozer has a lot of untapped potential in its future iterations, like the K7-K8-K10 line did. I also think AMD should take some lessons in transistor density from ATI; those engineers really know how to pack transistors and keep die size in check. You've seen several GPUs, like the RV670, RV770, RV870 and Cayman, that pretty much doubled the computing units of their previous iteration, and yet the die size and power consumption never doubled.
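You can put rough numbers on that with RV770 vs. Cypress: a 55nm-to-40nm shrink roughly halves the area per transistor, so doubling the units needn't double the die. The specs below are rounded public figures, so treat the result as an approximation:

[code]
# Rounded public specs for RV770 and Cypress; approximations only.

rv770   = {"process_nm": 55, "transistors_m": 956,  "die_mm2": 256}
cypress = {"process_nm": 40, "transistors_m": 2150, "die_mm2": 334}

# Naive scaling: die area ~ transistor count x (feature size)^2
predicted = (cypress["transistors_m"] / rv770["transistors_m"]) \
          * (cypress["process_nm"] / rv770["process_nm"]) ** 2
actual = cypress["die_mm2"] / rv770["die_mm2"]
print(f"predicted area ratio: {predicted:.2f}x, actual: {actual:.2f}x")
[/code]

More than double the transistors, but only about 30% more die area; that's the kind of density discipline I'm talking about.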
 
If Microsoft was able to give Apple a loan in 1997, I wouldn't be surprised if Intel did the same. Intel is so big that if AMD goes away, I'm pretty sure the U.S. government and the FTC will ask Intel to split in half, as it would be a monopoly.

Well, AT&T was broken up in 1984 not because it was a monopoly (which is legal), but because it was *abusing* its monopoly, which is illegal. But I do agree that the disappearance of AMD on the CPU front would lead to greater federal regulation and oversight.
 
Well, AT&T was broken up in 1984 not because it was a monopoly (which is legal), but because it was *abusing* its monopoly, which is illegal. But I do agree that the disappearance of AMD on the CPU front would lead to greater federal regulation and oversight.
If Intel pulls a "1984 AT&T" kind of move, then yes, they MAY be broken up. It all depends on who is running the FTC at the time.
 
I see this comment all the time, but a monopoly isn't illegal, just abusing it is. If AMD went away, it's not automatic that the government would step in and do something.

Especially true if ARM becomes Intel's main competitor.
 
It does make me wonder, though: what is all the money AMD is earning being spent on?
As far as I know, it's normal to screw up once per major architecture redesign. Intel has produced plenty of CPUs that turned out to be terrible:
486 -> PPro
PPro -> Itanium
PIII -> NetBurst
Nehalem -> Larrabee

The problem, IMHO, is that AMD is a fabless design company trying to pretend it owns a foundry. They had no business making big money bets on the early success of GF's 32nm process OR their untested new architecture. They should never sell a single piece of silicon at a loss in order to gain market share in a highly competitive market. They're going to have to scale back their spending and try to find a new niche in some growing but under-invested area.

It's going to be tough because there are already too many designs chasing after the same highly valuable fab space.
 
I just realized that they may have decided not to compete with Intel in that regard anymore and instead opted to move more resources to Llano and other Fusion-based CPU/GPU chips and go after the mobile market with a vengeance. This could also explain why Bulldozer took so long to design and produce.
It would be really disappointing if AMD decided to shift their focus from desktop processors to the mobile market :(

I think by handcrafting, he meant the organization and design of the transistors and how they're arranged on the die, not actually building them by hand.

Definitely. It's about designing the layout of the transistors, where you want to place them as close together as possible and in a way that shortens the metal paths connecting them. The metal interconnects have their own capacitance, which affects power consumption and performance, among other things.
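The first-order model for that is P = alpha * C * V^2 * f: every bit of wire capacitance gets charged and discharged each switching cycle, so a sloppier layout means more watts. A small sketch with assumed numbers (none of these are measured silicon values):

[code]
# First-order switching power: P = alpha * C * V^2 * f. Illustrative values only.

def dynamic_power_w(alpha, cap_f, volts, freq_hz):
    """Power burned charging/discharging capacitance every cycle."""
    return alpha * cap_f * volts ** 2 * freq_hz

alpha, vcore, clock_hz = 0.1, 1.2, 3.6e9  # activity factor, volts, hertz

tight  = dynamic_power_w(alpha, 20e-9, vcore, clock_hz)  # compact layout
sloppy = dynamic_power_w(alpha, 24e-9, vcore, clock_hz)  # +20% wire capacitance

print(f"tight layout:  {tight:.1f} W")
print(f"sloppy layout: {sloppy:.1f} W (+{sloppy / tight - 1:.0%} from longer wires)")
[/code]

Power scales linearly with that capacitance, so a hand-packed layout pays off on every single clock edge.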
 
What is AMD supposed to live on between now and the "few years" it will take to unscramble the automated mess they made of BD? Their market cap is less than 4% of Intel's, and many would argue that's mostly video cards. They had to use software to design the bloated, over-transistored POS because they can't afford real R&D. With BD selling like turdcakes, GloFo unable to get 32nm right, let alone 22nm, and their share price dropping continuously (it's 10% of what it was a few years ago), where exactly are BD and AMD going? * FLUSH *

BD doesn't matter as much as you think; the worst that happened is that in 6 months they get another go at it. Even _if_ AMD had beaten the 2600K, their market share wouldn't have changed much; it is just impossible for AMD to produce large amounts of anything on 32nm.

Their R&D is alright; the comments from the ex-engineer were aimed more at the latter stages of development, where they now rely on a program where they used to rely on engineers to finish the design.

Bulldozer is going forward. On other websites (XS), Chew spilled some of the beans on this: there wasn't enough to go on to do a die shrink of Thuban (from a conversation between him and an AMD engineer).

In some benchmarks Bulldozer really does beat the 2600K, which is still a sign that not everything is bad. On the [H] forums there are just too many people who can't judge what it does and doesn't do well, so it all gets reduced to one word they can deal with, and it tends to be "sucks," "failure," or some other expression their minds can handle.

Unlike what you read here, it seems that within its intended framework Bulldozer still works well enough for companies like Cray.
 
As far as I know, it's normal to screw up once per major architecture redesign. Intel has produced plenty of CPUs that turned out to be terrible:
486 -> PPro
PPro -> Itanium
PIII -> NetBurst
Nehalem -> Larrabee

The problem, IMHO, is that AMD is a fabless design company trying to pretend it owns a foundry. They had no business making big money bets on the early success of GF's 32nm process OR their untested new architecture. They should never sell a single piece of silicon at a loss in order to gain market share in a highly competitive market. They're going to have to scale back their spending and try to find a new niche in some growing but under-invested area.

It's going to be tough because there are already too many designs chasing after the same highly valuable fab space.

Until the Athlon 64, AMD always sold their processors below cost; that was the only way they could move them. In fact, performance (or lack thereof) wasn't the largest problem; it had more to do with the fact that AMD did not mass-produce chipsets. Did you know that prior to the ATI acquisition, AMD had only produced 4 (yes, 4) chipsets: the 750, 760, 760MP/MPX and 8111? (The list would be 5, but the 640 was basically a VIA VP2.)

Intel offered OEMs a CPU and a chipset/IGP chipset, and all the OEMs had to do was go to ODMs such as Foxconn, Asus (Pegatron, now) and Quanta, who would build the boards, saving the OEMs lots of money (definitely so, since Intel gave them SEVERE discounts on CPU prices for buying an Intel chipset). (Which in itself is funny, because now that Intel has successfully locked everyone else out of producing chipsets for their CPUs, they have eliminated the reason for giving those discounts.)
 
BD doesn't matter as much as you think; the worst that happened is that in 6 months they get another go at it. Even _if_ AMD had beaten the 2600K, their market share wouldn't have changed much; it is just impossible for AMD to produce large amounts of anything on 32nm.

Their R&D is alright; the comments from the ex-engineer were aimed more at the latter stages of development, where they now rely on a program where they used to rely on engineers to finish the design.

Bulldozer is going forward. On other websites (XS), Chew spilled some of the beans on this: there wasn't enough to go on to do a die shrink of Thuban (from a conversation between him and an AMD engineer).

In some benchmarks Bulldozer really does beat the 2600K, which is still a sign that not everything is bad. On the [H] forums there are just too many people who can't judge what it does and doesn't do well, so it all gets reduced to one word they can deal with, and it tends to be "sucks," "failure," or some other expression their minds can handle.

Unlike what you read here, it seems that within its intended framework Bulldozer still works well enough for companies like Cray.

As I have said repeatedly, I have no interest in servers or low- to mid-end consumer CPUs. My sole interest is the highest-performing CPU I can affordably purchase. From that perspective, FX sucks more than a dirty old ho. So even if AMD overnight got 90% market share in servers and totally dominated the netbook segment, I would still call FX a total disappointment.
 
Until the Athlon 64, AMD always sold their processors below cost; that was the only way they could move them.
Below cost, or below what Intel was selling them for? My understanding is that profits from the clones/K5/K6 basically saved AMD from going under as US government spending on semiconductors declined and the Japanese/Koreans forced memory prices down.

In fact, performance (or lack thereof) wasn't the largest problem; it had more to do with the fact that AMD did not mass-produce chipsets.
OK, but back then AMD was a company that made things. Today, AMD doesn't make anything; they just design things. They aren't going to compete with Intel by producing designs that require parity or superiority in manufacturing process. If they had come out with a terrible integrated scalar/vector processor 2 or 3 years ago, it wouldn't have mattered, because the market was (and still is) underserved, both at the low end (mobile) and the high end (clusters/supercomputers).

Unfortunately, Intel has already had very successful launches of Atom and MIC. The window of opportunity is rapidly fading.
 
Xbitlabs is financed by Intel marketing; they've been publishing dirty-tricks "reviews" for years.
 
Xbitlabs is financed by Intel marketing; they've been publishing dirty-tricks "reviews" for years.

Yeah! It's a conspiracy! BD on just one core actually outperforms IBM Watson but it's Intel who's paying off the reviewers! Let's start an Occupy Intel protest! Freedom from Intel! :rolleyes:
 
I think most successful small companies break through because they're passionate about their products. It seems AMD has lost that, and if they don't take the right steps to fix it, they'll be reduced to an even smaller company after this.
Here's hoping their (somewhat) new CEO is passionate enough to see that.
 
Yeah! It's a conspiracy! BD on just one core actually outperforms IBM Watson but it's Intel who's paying off the reviewers! Let's start an Occupy Intel protest! Freedom from Intel! :rolleyes:

Yeah, I've noticed they even stoop so low as to have people post stupid topics in the AMD forum...
 
Who is to blame? Ruiz? Meyer? Both?

EDIT:
@aviat72. Thanks for your post. Very interesting.

Everybody involved in this fiasco is an idiot, including the two you named.

I think most successful small companies break through because they're passionate about their products. It seems AMD has lost that, and if they don't take the right steps to fix it, they'll be reduced to an even smaller company after this.
Here's hoping their (somewhat) new CEO is passionate enough to see that.

Oh sure, passion is all AMD is missing; if they had some passion, AMD would have ripped Intel to shreds this round. :rolleyes:
 
Welcome to the age of... corporate mentality. Reduce costs by cutting manpower, sell the branding rather than the product, and milk the company for bonuses before you get fired. That all works great when you have the largest portion of the market share, but when you are behind in the race, you need innovation and loyalty.
 
BTW, when this was first posted months ago it was dismissed as Intel troll propaganda and the revenge of an ex-worker ;)
 
AMD and ATI can share common components, it's much quicker to design processors, and you don't need to QA as hard (as the program should have done that for you), and you don't need as skilled engineers to do it for you (as you have the process to make the things).

I'm assuming the biggest advantage is that, since so many common components are shared thanks to this process, combining them into a Fusion APU is easier than ever.

[Twilight Zone theme] Machines designing machines. They themselves will replace their god. Soon they won't even need us any more. [/theme] :eek:
 
Welcome to the age of... corporate mentality. Reduce costs by cutting manpower, sell the branding rather than the product, and milk the company for bonuses before you get fired. That all works great when you have the largest portion of the market share, but when you are behind in the race, you need innovation and loyalty.

The problem with that mentality for AMD is that they don't have the branding to float them. They need a good product, and they don't have one.
 
Well, AT&T was broken up in 1984 not because it was a monopoly (which is legal), but because it was *abusing* its monopoly, which is illegal. But I do agree that the disappearance of AMD on the CPU front would lead to greater federal regulation and oversight.

The current and recent past administrations (at least since Clinton, IMO) and regulators have allowed the banks, cell phone companies, ISPs, etc. to consolidate like crazy.

I mean shit, they haven't even tried to break up the banks and they fucking destroyed a huge chunk of the economy for personal gain. The government actually made them _bigger_ as a "solution" to help prop up balance sheets and prevent bank failures.

This administration is incredibly pro-megacorp/monopoly.

Unless Intel does some absolutely insane monopoly abuse, they'd likely be allowed to remain a single company and keep playing at least some of their old dirty tricks and bullying on OEMs.
 
BTW, when this was first posted months ago it was dismissed as Intel troll propaganda and the revenge of an ex-worker ;)

I still remember the flak I got for collecting C. Maier's and Mitch Alsup's comments and posting them here and at XS.
 
It's amazing how much this so-called (by many) "disgruntled ex-employee" knew, and how well he predicted Bulldozer's performance almost 1.5 years ago! Here is a quote he posted in April 2010 on the MacRumors forum:

On paper bulldozer is a lovely chip. Bulldozer was on the drawing board (people were even working on it) even back when I was there. All I can say is that by the time you see silicon for sale, it will be a lot less impressive, both in its own terms and when compared to what Intel will be offering. (Because I have no faith AMD knows how to actually design chips anymore). I don't really want to reveal what I know about Bulldozer from my time at AMD. I know less about bobcat. From what I can tell it's just the latest name for a project that had been kicking around since 2005 when we acquired new design teams in two new locations. None of these designers had any experience with design in the <10W range. I don't know any of the people currently working on bobcat, but given the price it is likely to sell for, it's not going to make AMD a lot of money (and it will be competing not only against Intel, but against numerous ARM variations).


Man, there's been a LOT of people eating crow since Oct 12... Ouch... :eek:
 
As I have said repeatedly, I have no interest in servers or low- to mid-end consumer CPUs. My sole interest is the highest-performing CPU I can affordably purchase. From that perspective, FX sucks more than a dirty old ho. So even if AMD overnight got 90% market share in servers and totally dominated the netbook segment, I would still call FX a total disappointment.

I have no interest in overpriced processors like Gulftown. I just want an affordable processor that's COMPETITIVE and overclocks nicely. AMD blew it, end of story.
 
I have no interest in overpriced processors like Gulftown. I just want an affordable processor that's COMPETITIVE and overclocks nicely. AMD blew it, end of story.

Note "affordably purchase" then note my shiny new 2600K in my sig! :D
 