Will AMD's Bulldozer plow through Intel's Sandy Bridge?

Feels like a weird time for the CEO to leave. Fusion and the HD 6000 series just came out, and even a 10% clock-for-clock efficiency boost with BD would be a win because of the low per-integer-core cost. Desktop Fusion will also debut late this year. The future looks brighter than at any point in recent AMD history.
No one knows what happened yet, but the pooch was seen entering the restroom with an E.P.T. in hand and angrily mumbling something about Dirk Meyer.

:p
 
Feels like a weird time for the CEO to leave. Fusion and the HD 6000 series just came out, and even a 10% clock-for-clock efficiency boost with BD would be a win because of the low per-integer-core cost. Desktop Fusion will also debut late this year. The future looks brighter than at any point in recent AMD history.

I wonder if he has a personal reason? I hope it's not politics; AMD isn't that well off.

Maybe he is now busy quietly buying AMD stock with all his savings before its value quadruples ;)
 
If you ever want to know how a ship is doing, watch the rats. :D When they jump ship, it's usually best to follow. :cool:

Normally I would agree with you, but this does seem to be an odd time to do so. Still, it's disconcerting. SB hasn't turned into much (hell, I just went i7 myself), and they have some pretty good stuff coming down the line (better late than never?).
 
No one knows what happened yet, but the pooch was seen entering the restroom with an E.P.T. in hand and angrily mumbling something about Dirk Meyer.

:p

Apparently it was a disagreement with the board regarding the direction he wanted to take AMD. Something about the board thinking it wasn't profitable enough (see the AnandTech article).
 
Apparently it was a disagreement with the board regarding the direction he wanted to take AMD. Something about the board thinking it wasn't profitable enough (see the AnandTech article).
Yeah, a disagreement. Dirk hadn't finished his five-year plan to run AMD into the ground. ;)

AnandTech said:
The implication being that Dirk’s plan for AMD wouldn’t result in significant growth, establish market leadership and generate superior financial returns. The question is what was Dirk’s plan and what direction does AMD’s Board of Directors believe it should be headed in instead?

Anand was very kind in his op-ed, and pretty much ignored how late everything is and how aimlessly growth in core products and new markets was handled. AMD is in the worst position of any major processor manufacturer as far as tablet and handheld strategy goes. Dirk Meyer just wasn't a very good CEO. He inherited a mess, of course (largely of his own making, since he was the COO in charge of AMD's CPUs prior to that), but he couldn't even start to turn things around.
 
I do not see that article saying anything about six-core i7s. It just said that Bulldozer would have similar performance to the i7. It does not mention six cores or even SB.
 
LOL

0) No, Fuad is claiming "nearly" as fast (is it the 4m/8c or the 8m/16c version?)
1) It's Fuad.
2) It's a hope post.

The lack of any kind of detail is disturbing; it isn't worth anything even by Fuad's standards.

It wouldn't be surprising to hear that the desktop AM3+ 4m/8c BD is close to a 6c/12t i7 in something heavily threaded like rendering or encoding. Too bad 1/9/11 happened. Nevar forget! Most software, including games, doesn't load 4 threads, never mind 8. By the rosiest "up to" projections, BD is only about 10% faster per core than K10.5. Wanna see how BD performs vs. SB in demanding games? Be generous and add 10%. Then notice that 50fps + 5fps (for example) is still lower than 75fps. :p
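For fun, the arithmetic as a quick Python sketch (the fps numbers are just the hypothetical ones above, and the 10% uplift is the rosy per-core projection, not a measurement):

Code:
# Back-of-the-envelope projection using the hypothetical numbers above.
k10_fps = 50.0      # assumed K10.5 result in a CPU-bound game
sb_fps = 75.0       # assumed Sandy Bridge result in the same game
bd_uplift = 0.10    # the rosiest "up to" per-core projection over K10.5

bd_fps = k10_fps * (1 + bd_uplift)
print(f"Projected BD: {bd_fps:.0f} fps vs. SB: {sb_fps:.0f} fps")  # 55 vs. 75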
 
When they acquired ATI I was impressed with their forward thinking and bold move. But when they sold off Imageon and Xilleon I was shocked. I mean, WTF, I think most everyone knew the mobile market was going to blow up. Maybe it wasn't known it would blow up as much as it did, but still. At the time I thought the reason was that they didn't have the capital to invest in R&D across all the product categories they had, and they were already deep in debt. So they dumped what they figured was least important to their core business and kept what they thought would make money and cement the company. If the board is pissed now, then maybe they should have found the investors back then so they didn't have to sell off those parts. But hindsight is always 20/20.
 
There are a few games which do actually support 8 threads. However, GPU limitations come into play very quickly, negating the advantage of their multi-core performance. Lost Planet is a perfect example of that.
 
I do not see that article saying anything about six-core i7s. It just said that Bulldozer would have similar performance to the i7. It does not mention six cores or even SB.

The line right before the "similar to i7" one says performance comes close to the new six-core i7.
 
There are a few games which do actually support 8 threads. However, GPU limitations come into play very quickly, negating the advantage of their multi-core performance. Lost Planet is a perfect example of that.
I chose the word "load" carefully.

~10%-20% utilization of additional (real/logical) cores beyond 2 or 3 isn't much of a load. It is an impressive feat of optimization though. ;)

eta:
[image: "That's a lot of sand"]
 
http://www.fudzilla.com/processors/item/21512-bulldozer-to-come-close-to-core-i7

So Bulldozer is reported to be at least comparable to the six-core i7s.

The part of that story that most interests me is the little bit about yields. I had my worries about whether GlobalFoundries would get the 32nm high-k process right without too much fuss. Hopefully that means they can up the clock speed, since Bulldozer seems to be built for that. I've seen some places speculating that Bulldozer has a 50-stage pipeline and that the design relies heavily on predictors, so it needs high clock speeds to offset any misses.

A lot of information can be found here. It's a nice read.
http://www.realworldtech.com/page.cfm?ArticleID=RWT082610181333&p=1
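If the 50-stage rumor is anywhere near true, the reliance on predictors makes sense: a mispredict flushes roughly a pipeline's worth of work. A crude Python model (the depths, branch frequency, and accuracy are illustrative assumptions, not measured figures):

Code:
# Crude model: average branch cost per instruction grows with pipeline depth.
def branch_cost_per_instr(depth, branch_freq=0.20, accuracy=0.95):
    penalty = depth  # assume a mispredict flushes ~depth cycles
    return branch_freq * (1 - accuracy) * penalty

for depth in (14, 20, 50):  # shallow, middling, and the rumored BD depth
    print(f"{depth}-stage: ~{branch_cost_per_instr(depth):.2f} cycles/instr lost")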
 
AMD is in the worst position of any major processor manufacturer as far as tablet and handheld strategy goes. Dirk Meyer just wasn't a very good CEO. He inherited a mess, of course (largely of his own making, since he was the COO in charge of AMD's CPUs prior to that), but he couldn't even start to turn things around.
Uh, what? I thought that was the one market they were doing quite well in. Their low-power processors, the Fusion/Bobcat line, etc., look poised to be much stronger than Intel's offerings, unless I am totally lost on this topic.
 
Uh, what? I thought that was the one market they were doing quite well in. Their low-power processors, the Fusion/Bobcat line, etc., look poised to be much stronger than Intel's offerings, unless I am totally lost on this topic.
Fusion/Bobcat != tablet and handheld

Tablet and handheld devices tend to use much lower-power processors, because even Bobcat would produce too much heat/use too much energy in those applications.
 
Uh, what? I thought that was the one market they were doing quite well in. Their low-power processors, the Fusion/Bobcat line, etc., look poised to be much stronger than Intel's offerings, unless I am totally lost on this topic.
Handhelds need a processor at 1W or less. Bobcat doesn't scale down that far. A future version may, but it's not coming out any time soon.

Bobcat isn't low-power enough to go into tablets either without major losses in performance or horrible battery life. It would have few, if any, advantages, and CPU performance wouldn't be one of them.

eta: Idle power is also critical, and Bobcat isn't close to the 100mW range.
 
Fusion/Bobcat != tablet and handheld

Tablet and handheld devices tend to use much lower-power processors, because even Bobcat would produce too much heat/use too much energy in those applications.
They are starting to ship tablets with dual-core CPUs. I would think that the lower-wattage Zacate processors might fit well in higher-performance tablets and netbooks.
 
The iPad's ARM Cortex-A9 CPU has a TDP of 2W (and the Samsung Galaxy Tab's is ~3W, I believe), so putting even a 9W CPU in a tablet seems illogical. They really aren't meant for that.
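Rough battery math makes the point. A Python sketch assuming a hypothetical ~25Wh tablet battery and sustained load at the TDPs quoted above (pessimistic, but illustrative):

Code:
# Crude battery-life arithmetic with assumed numbers.
battery_wh = 25.0  # hypothetical tablet battery capacity
for name, tdp_w in (("Cortex-A9", 2), ("Galaxy Tab SoC", 3), ("9W x86", 9)):
    print(f"{name}: ~{battery_wh / tdp_w:.1f} h at sustained TDP")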
 
There are a few games which do actually support 8 threads. However GPU limitations come into play very quickly negating the advantage of their multi-core performance. Lost Planet is a perfect example of that.

Personally, I couldn't care less if BD took the gaming crown. It is gonna play games more than well enough if I'm willing to buy the graphics cards and monitors to go with it. What I am really hoping for is that it shines on the software that really needs the horsepower: 3D rendering, video editing/compositing, etc. With quad-channel RAM and a real-core advantage, I pick this to be BD's strong point. I'll take lower 1-2 thread performance for greater 6+ thread performance at a comparable price to the equivalent Intel offering.
I'm hoping AMD takes the workstation crown.
 
The iPad's ARM Cortex-A9 CPU has a TDP of 2W (and the Samsung Galaxy Tab's is ~3W, I believe), so putting even a 9W CPU in a tablet seems illogical. They really aren't meant for that.

Dirk Meyer gave a presentation in Q4 of 2010, IIRC, stating that Bobcat would scale down to 1W. He was probably referring to some unannounced chip, granted, but he seems to believe the architecture can do it.

But offhand I'd say you're right; x86 simply can't fit in that space.
 
If AMD sticks with 941 pins and does not offer a 1207+ pin socket for the desktop market, then I'll be moving to Intel. :(
I have to agree about the ho-hum 6000s being so humdrum. :rolleyes:
 
If you mean the 6000 series GPUs, you can't really fault ATI for that one. 32nm got cancelled and 28nm isn't ready, so they had to make some sacrifices.
 
Dirk Meyer gave a presentation in Q4 of 2010, IIRC, stating that Bobcat would scale down to 1W. He was probably referring to some unannounced chip, granted, but he seems to believe the architecture can do it.

But offhand I'd say you're right; x86 simply can't fit in that space.

Yeah, I'm just referring to the already-announced parts and their TDPs. I have no doubt that by slimming it down a bit, a Bobcat CPU could be very tablet-worthy. The current stuff is definitely a no-go, though.
 
If AMD sticks with 941 pins and does not offer a 1207+ pin socket for the desktop market, then I'll be moving to Intel. :(
I have to agree about the ho-hum 6000s being so humdrum. :rolleyes:
Why do you want more pins? More memory channels? You know there is almost no performance benefit right now in desktop apps for going past 2 memory channels, right? As long as AMD and Intel are willing to keep throwing huge amounts of L3 on these chips, it won't be an issue either, believe it or not. Only servers really need 4 or more channels.

Yeah, it'd be nice if the 6xxx IGPs were better, but they're already good enough for internet and desktop stuff. You won't be able to fit a high-performance GPU into that price range or TDP envelope right now or for a while. Discrete GPUs aren't going away for a long time.
 
If you mean the 6000 series GPUs, you can't really fault ATI for that one. 32nm got cancelled and 28nm isn't ready, so they had to make some sacrifices.

Yeah, none of the cards recently released (from Nvidia or AMD) have been earth-shattering, but they're good cards nonetheless.
 
Dirk Meyer gave a presentation in Q4 of 2010, IIRC, stating that Bobcat would scale down to 1W. He was probably referring to some unannounced chip, granted, but he seems to believe the architecture can do it.
I see no conflict with his December 27, 2010 statements. He said AMD would be sitting out tablets and handhelds for "the foreseeable future." Apparently that's not what the board wanted to hear, since he was canned two weeks later. It's possible that if he had just evaded that question, he would still be CEO.

Too bad, really. His place in processor history is pretty tarnished now. This is one of the two guys behind DEC's Alpha, and he led development of AMD's K7 and K8.

Much of the blame for AMD's downfall rests with Meyer, nearly as much as with Ruiz.

If I worked in AMD's server group, especially in upper management, I'd be pretty fearful of losing my job. Unwinding Meyer's damage isn't done yet.
 
Yes. My issue with these is wattage.:mad:
Still about on par with NVIDIA. As was said, they were planning on being able to die-shrink them, so it's hard to re-engineer the design after the fact. I think that once a 28nm version is available, we will see some serious performance and power improvements all at once.
 
If AMD sticks with 941 pins and does not offer a 1207+ pin socket for the desktop market, then I'll be moving to Intel. :(
I have to agree about the ho-hum 6000s being so humdrum. :rolleyes:

Why do you want more pins? That statement makes no sense.

I would think you would want more memory bandwidth, and you are going to see a significant increase in memory bandwidth. Servers will see 50%.

Think about it this way: there are two ways to get a 50% increase in memory bandwidth:

Add a third channel
Improve the memory controller

The first way sometimes works and sometimes does not (there are plenty of Intel benchmarks that show memory throughput being about the same between 2 and 3 channels).

But improving efficiency has a much better chance of giving you an increase in performance, and you can do it with 2 DIMMs. What if you bought 3 DIMMs for your 3-channel system and the memory throughput was the same? You can't guarantee that 3 channels will give better throughput, but I can GUARANTEE that 3 channels will increase your memory cost by 50%.
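The cost half of that argument is plain arithmetic; the bandwidth half isn't. A toy Python illustration (the DIMM price is a made-up placeholder):

Code:
# Cost is guaranteed to scale with channel count; bandwidth is not.
dimm_price = 30.0  # hypothetical price per DIMM

dual, triple = 2 * dimm_price, 3 * dimm_price
print(f"2 DIMMs: ${dual:.0f}, 3 DIMMs: ${triple:.0f}")
print(f"Cost increase: {triple / dual - 1:.0%}")  # always +50%
# Bandwidth gain from the 3rd channel: anywhere from ~0% to ~50%,
# depending on the workload and the memory controller.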
 
Why do you want more pins? That statement makes no sense.

I would think you would want more memory bandwidth, and you are going to see a significant increase in memory bandwidth. Servers will see 50%.

Think about it this way: there are two ways to get a 50% increase in memory bandwidth:

Add a third channel
Improve the memory controller

The first way sometimes works and sometimes does not (there are plenty of Intel benchmarks that show memory throughput being about the same between 2 and 3 channels).

But improving efficiency has a much better chance of giving you an increase in performance, and you can do it with 2 DIMMs. What if you bought 3 DIMMs for your 3-channel system and the memory throughput was the same? You can't guarantee that 3 channels will give better throughput, but I can GUARANTEE that 3 channels will increase your memory cost by 50%.

I have to agree. I couldn't care less about how many pins a socket uses; it doesn't really make any sense to be concerned with that unless you are an engineer. As for memory bandwidth, on the server side I'd like to see some improvement in overall bandwidth. On the desktop an increase would be nice, but as we can see comparing Intel and AMD processors, ~50% more bandwidth doesn't equal 50% more performance. That is to say, Intel processors with nearly 50% more memory bandwidth aren't 50% faster than AMD's processors with less. Even comparing Nehalem and Lynnfield, we don't see anywhere near a 33% (one-third) difference in performance, despite Nehalem having considerably more bandwidth than Lynnfield.
 
Think about it this way: there are two ways to get a 50% increase in memory bandwidth:

Add a third channel
Improve the memory controller

Large caches can also be tuned to mitigate bandwidth bottlenecks, both by allowing the memory controller to focus on burst operations, and by using the cache as an alternate source for long sequential reads.
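A toy Python model of the idea (the reuse model and sizes are rough assumptions, not a real cache simulator):

Code:
# Toy model: the fraction of reads served by a large last-level cache
# determines how much traffic the memory channels actually see.
def dram_traffic_bytes(total_bytes, working_set_mb, llc_mb):
    hit_rate = min(1.0, llc_mb / working_set_mb)  # crude reuse model
    return total_bytes * (1 - hit_rate)

reads = 10 * 2**30  # 10 GiB of reads against a 32 MB working set
for llc in (4, 8, 16):  # hypothetical L3 sizes
    gib = dram_traffic_bytes(reads, 32, llc) / 2**30
    print(f"{llc} MB L3 -> ~{gib:.1f} GiB reaches DRAM")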
 
I have a question about Bulldozer in high-performance scenarios.

They say the advantage of this over-provisioned, asymmetric multithreading is that you get higher IPC, or really more processing out of the same die real estate. That's great for processor "value," but is this the right approach for peak performance?

On a 4-module processor, how will it deal with 8 threads that are all fighting for floating-point crunching? Are 4 of the 8 threads subordinate and low-performing, or is it balanced within each module, or is there no advantage in that case to running 8 threads? Moreover, how do you know the OS won't assign two FP-heavy threads to the same module?
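On the last point: you don't, unless you pin them yourself. A sketch of what a user could do on Linux, assuming (hypothetically) that cores 0-1, 2-3, 4-5, 6-7 pair into modules that each share one FPU; these are standard affinity calls, not anything AMD or the OS scheduler does for you:

Code:
# Spread FP-heavy processes so each lands on a different module,
# assuming consecutive core pairs share a module's FPU (an assumption).
import os

fp_heavy_pids = []  # placeholder: PIDs of your FP-bound processes
for i, pid in enumerate(fp_heavy_pids):
    core = (i * 2) % os.cpu_count()    # first core of each module
    os.sched_setaffinity(pid, {core})  # one FP-heavy thread per module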
 
My similar question is: if all the stuff that supports one integer core can be beefed up a bit to support two integer cores with a small increase in die area, what about 3 or 4 or even more? What is the bottleneck in feeding instructions to the integer cores and routing the results, such that adding more integer cores no longer results in gains?
 
I have a question about Bulldozer in high-performance scenarios.

They say the advantage of this over-provisioned, asymmetric multithreading is that you get higher IPC, or really more processing out of the same die real estate. That's great for processor "value," but is this the right approach for peak performance?

On a 4-module processor, how will it deal with 8 threads that are all fighting for floating-point crunching? Are 4 of the 8 threads subordinate and low-performing, or is it balanced within each module, or is there no advantage in that case to running 8 threads? Moreover, how do you know the OS won't assign two FP-heavy threads to the same module?
That sounds more like a question about Hyper-Threading.
 