I'd wager that the next iteration of Bulldozer will be significantly better than Bulldozer is today. What does that mean? Well, I don't know. As I said before, it will probably be at least as big an improvement as Phenom II was over the original Phenom. That doesn't say much, but when the AMD fanboys were prepped and ready for Phenom II to pay off and kill Intel's Penryn and Yorkfield processors, I knew better. I expected improvement then, just as I do from Bulldozer's successor. However, you can only take evolutionary steps so far. You don't make incremental adjustments and get revolutionary results.
As for the socket fiasco that is AM3+, I don't know. I prefer LGA-style sockets, and AMD does use them for the Opteron, but my issue isn't with the socket itself so much as the mounting holes and the distance to the DIMM slots. What they really need to do is go to a symmetrical four-hole design, and if they could match Intel on this precisely, allowing us to use the exact same thermal solutions for both camps, that would be fantastic. I'd like to see the distance between the DIMM slots and the CPU socket increased, but I don't know if that's possible without replacing HyperTransport. They could bypass the need to do so by emulating Sandy Bridge's approach with an integrated I/O controller, but I don't know if that's feasible. It's certainly possible, but financially it may not make sense for AMD to even try that route. Perhaps some kind of signal booster could be designed to let HT stretch farther? AMD may have investigated this and deemed it too costly, and that may well be the case. It is, after all, additional hardware, and margins in this business are already very thin.
As I understand it, the rumor is that AMD will build Bulldozer's successors around socket FM1. And AGAIN, AMD screwed up with FM1, as it has the exact same flaws that AM2, AM2+, AM3 and AM3+ all do. The devil is in the details, and AMD doesn't seem to care about those details. Again they had the opportunity to create a new design and failed to do so; they just changed the pin count of the socket itself. In all fairness, if compact self-contained water-cooling units take off the way heat-pipe coolers did five or more years ago, then it won't be a real problem anymore. So far we have every indication that this might happen, given the sheer volume of coolers following that basic design being released all the time. Am I a fan of this? No. Will it get the job done? Yes. AMD may have seen the writing on the wall and said, "Screw it, we'll let heatsink and fan makers worry about that." AMD may feel the approach they've taken with sockets is the nicer one with regard to upgrades and making customers happy. For the most part that seems to be the case, as everyone with an AM3 / AM3+ board seems to be really happy they've got upgrade options.
And if you still end up needing a new motherboard because you don't get the proper BIOS update from your motherboard vendor, then it's not AMD's fault; it's your board maker's fault. (You AM2+ guys should remember how this game is played.) Secretly, I'd bet AMD wants most of the motherboard vendors to avoid updating their BIOS to some degree, forcing people into buying new motherboards now that AMD is back in the chipset business. But this way they can save face and make the motherboard vendors the bad guys in all of this.
If Microsoft was able to give Apple a loan in 1997, I wouldn't be surprised if Intel did the same. Intel is so big that if AMD goes away, I'm quite sure the U.S. government and the FTC would ask Intel to split in half, as it would be a monopoly.
Intel's approach to the ring bus works very well for them, and AMD has already used one on the R520/R580/R600/RV670 series, so I think the expertise is there. I think a ring bus alone would help bring down the transistor budget on the uncore part, as 800+ million transistors seems like a little too much for the uncore alone. This chip is almost a Cypress GPU in terms of transistor budget and size.
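For perspective, here's a quick back-of-the-envelope sketch of those numbers. The ~2 billion total is the figure reported for the Orochi die at launch and the 800 million uncore figure is only my rough guess, so treat the output as illustrative, not an official breakdown:

```python
# Rough sanity check of the "uncore is too big" argument above.
# The totals are figures reported at the time and the uncore number is a guess,
# so the percentages are illustrative only.

bulldozer_total = 2_000_000_000   # transistor count reported for the Orochi (Bulldozer) die at launch
uncore_estimate = 800_000_000     # rough guess for the uncore's share
cypress_total   = 2_150_000_000   # Radeon HD 5870 (Cypress) transistor count

uncore_share = uncore_estimate / bulldozer_total
vs_cypress   = bulldozer_total / cypress_total

print(f"Estimated uncore share of the die: {uncore_share:.0%}")        # ~40%
print(f"Bulldozer vs. Cypress transistor budget: {vs_cypress:.0%}")    # ~93%
```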
I wonder why there's such a big difference between the AMD and Intel boards.
Intel is so big that if AMD goes away, I'm quite sure the U.S. government and the FTC would ask Intel to split in half, as it would be a monopoly.
I see this comment all the time, but a monopoly isn't illegal, just abusing it is. If AMD went away, it's not automatic that the government would step in and do something.
Didn't Intel recently give AMD an enormous sum of money? I figure AMD finally pumped out Bulldozer just to get it done with and is now using that money for its APU projects and to refine Bulldozer into something useful. The next iteration will actually be worthwhile.
AMD and ATI can share common components; it's much quicker to design processors this way, you don't need to QA as hard (as the program should have done that for you), and you don't need engineers who are as skilled to do it (as you have the process to make the things for you).
I think by handcrafting, he meant the organization and design of the transistors and how they're arranged on the die, not actually building them by hand.
Well... yeah, obviously. It's physically impossible to "build" transistors by hand at the nanoscale.
Nevertheless, organizing and laying out every single one of those 2 billion transistors seems unlikely. I'm guessing some small pieces were designed by hand, then duplicated many times.
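To get a feel for how far replication alone goes, here's a rough sketch. It assumes the classic 6-transistor SRAM cell and the FX-8150's 8 MB of L2 plus 8 MB of L3, and it ignores tags, ECC and redundancy, so it's only a ballpark:

```python
# Ballpark estimate of how many transistors come from replicating a single
# hand-designed SRAM bitcell across the caches. Assumes a 6T cell and the
# FX-8150's 8 MB L2 + 8 MB L3; tags, ECC and redundant rows are ignored.

MIB = 1024 * 1024
cache_bytes = (8 + 8) * MIB          # 8 MB of L2 plus 8 MB of L3
bits = cache_bytes * 8
transistors_per_bit = 6              # classic 6T SRAM cell

sram_transistors = bits * transistors_per_bit
print(f"~{sram_transistors / 1e6:.0f} million transistors from one replicated cell design")
```

That's roughly 800 million transistors, around 40% of the reported total, coming from a single cell laid out once and stamped out over and over, which is exactly the kind of duplication I mean.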
If Microsoft was able to give Apple a loan in 1997, I wouldn't be surprised if Intel did the same. Intel is so big that if AMD goes away, I'm quite sure the U.S. government and the FTC would ask Intel to split in half, as it would be a monopoly.
If Intel pulls a "1984 AT&T" kind of move, then yes they MAY be broken up. It all depends on who is running the FTC at that time.

Well, AT&T was broken up in 1984 not because it was a monopoly (which is legal), but because it was *abusing* its monopoly, which is illegal. But I do agree that the disappearance of AMD on the CPU front would lead to greater federal regulation and oversight.
I see this comment all the time, but a monopoly isn't illegal, just abusing it is. If AMD went away, it's not automatic that the government would step in and do something.
As far as I know, it's normal to screw up once per major architecture redesign. Intel has produced plenty of CPUs that turned out to be terrible.

It does make me wonder, though: where is all the money that AMD is earning being spent?
That would be really disappointing if AMD had decided to shift their focus from desktop processors to the mobile market.

I just realized that they may have decided not to compete with Intel in that regard anymore and instead opted to move more resources to Llano and other Fusion-based CPU/GPU chips and go after the mobile market with a vengeance. This also could explain why Bulldozer took so long to design and produce.
I think by handcrafting, he meant the organization and design of the transistors and how they're arranged on the die, not actually building them by hand.
What is AMD supposed to live on between now and the "few years" it will take to unscramble the automated mess they made of BD? Their market cap is less than 4% of Intel's, and many would argue that's mostly video cards. They had to use software to design the overbloated, overtransistored POS because they can't afford real R&D. With BD selling like turdcakes, GloFo unable to get 32 nm right let alone 22 nm, and their share price drooping continuously (it's 10% of what it was a few years ago), where exactly are BD and AMD going? * FLUSH *
As far as I know, it's normal to screw up once per major architecture redesign. Intel has produced plenty of CPUs that turned out to be terrible:
486 -> PPro
PPro -> Itanium
PIII -> NetBurst
Nehalem -> Larrabee
The problem IMHO is that AMD is a fabless design company trying to pretend it owns a foundry. They didn't have any business making big money bets on the early success of GF's 32nm process OR their untested new architecture. They should never sell a single piece of silicon at a loss in order to gain market share in a highly competitive market. They're going to have to scale back their spending and try to find a new niche in some growing but under-invested area.
It's going to be tough because there are already too many designs chasing after the same highly valuable fab space.
BD doesn't matter as much as you think; the worst that happened is that in six months they get another go at it. Even _if_ AMD had beaten the 2600K, the market share wouldn't have changed that much; it is just impossible for AMD to produce large amounts of anything on 32nm.
Their R&D is all right; the comments of the ex-engineer were aimed more at the latter stages of development, where they now rely on a program when they used to rely on engineers to finish the design.
Bulldozer is going forward. On other websites (XS), Chew spilled some of the beans on this: there wasn't enough there to justify doing a die shrink of Thuban (from a conversation between him and an AMD engineer).
In some benchmarks Bulldozer really does beat the 2600K, which is still a sign that not everything is bad. On the [H] forums there are just too many people who can't judge what it does well and what it doesn't, so it gets reduced to one word they can deal with, and that word tends to be "sucks," "failure," or some other expression their minds can handle.
Despite what you read here, it still seems that, within its intended framework, Bulldozer works well enough for companies like Cray.
Intel bet on being able to scale the clock speed past 5 GHz, and they never achieved anything beyond 3.8 GHz with any reliability.
Below cost, or below what Intel was selling them for? My understanding is that profits from the clones/K5/K6 basically saved AMD from going under as US government spending on semiconductors declined and the Japanese/Koreans forced memory prices down.

Until the Athlon64, AMD always sold their processors below cost; that was the only way they could move them.
OK, but back then AMD was a company that made things. Today, AMD doesn't make anything; they just design things. They aren't going to compete with Intel by producing designs that require parity or superiority in manufacturing process. If they had come out with a terrible integrated scalar/vector processor 2 or 3 years ago, it wouldn't have mattered, because the market was (and still is) underserved, both at the low end (mobile) and the high end (clusters/supercomputers).

In fact, performance (or lack thereof) wasn't the largest problem; it had more to do with the fact that AMD did not mass-produce chipsets.
FYI: That's not true. Intel ships a 4.4GHz Xeon today and has been for a while.
http://www.cpu-world.com/news_2011/2011071901_Xeon_X5698_is_shipped_in_HP_and_Dell_servers.html
xbitlabs is financed by Intel marketing; they've been publishing dirty-tricks "reviews" for years.
Yeah! It's a conspiracy! BD on just one core actually outperforms IBM Watson but it's Intel who's paying off the reviewers! Let's start an Occupy Intel protest! Freedom from Intel!
Who is to blame? Ruiz? Meyer? Both?
EDIT:
@aviat72. Thanks for your post. Very interesting.
I think most successful small companies break through because they're passionate about their products. It seems AMD has lost that, and if they don't take the right steps and start to fix it, they'll be reduced to an even smaller company after this.
Here's hoping their (somewhat) new CEO is passionate enough to see that.
AMD and ATI can share common components; it's much quicker to design processors this way, you don't need to QA as hard (as the program should have done that for you), and you don't need engineers who are as skilled to do it (as you have the process to make the things for you).
I'm assuming the biggest advantage is that, since so many common components are shared thanks to the automated process, combining them in a Fusion APU is easier than ever.
Welcome to the age of... corporate mentality. Reduce costs by cutting manpower, sell the branding not the product, and milk the company for bonuses before you get fired. That all works great when you have the largest portion of the market share, but when you are behind in the race, you need innovation and loyalty.
Well, AT&T was broken up in 1984 not because it was a monopoly (which is legal), but because it was *abusing* its monopoly, which is illegal. But I do agree that the disappearance of AMD on the CPU front would lead to greater federal regulation and oversight.
BTW, when this was first posted months ago it was dismissed as Intel troll propaganda and the revenge of an ex-worker.
As I have said repeatedly, I have no interest in servers or low-to-mid-end consumer CPUs. My sole interest is the highest-performing CPU I can affordably purchase. From that perspective, FX sucks more than a dirty old ho. So if AMD overnight got 90% market share in servers and totally dominated the netbook segment, I would still call FX a total disappointment.
I have no interest in overpriced processors like Gulftown. I just want an affordable processor that's COMPETITIVE and overclocks nicely. AMD blew it, end of story.