AMD Bulldozer Details Leaked From Gigabyte

I have a really hard time believing that AMD's BD turbo mode will go from 2.8 to 3.8 GHz, or 3.0 to 4.0 GHz, on desktops. (It would most likely have to bump its CPU voltage by 0.150-0.200 V just to keep those speeds stable.) :(

There wouldn't be anything surprising about turbo pumping in that many volts, after the insane turbo settings in the X6 series.
 
The site has now posted that the pics are fake.
OMG, shocking!!!!!!!11 :p

Update 2:
We have been contacted by GIGABYTE and they confirmed that the chat conversation and all the details are fake.

The problem isn't that it's fake, but that it's an obviously bad fake, just cobbled together from real leaks and official info. No one should have been fooled by it unless they wanted to be fooled. And that's just how it goes. ;)
 
I still love how all the AMD fanboys disregard anything showing Bulldozer losing to Sandy Bridge, but when there is an outrageous "leaked benchmark" that shows Bulldozer owning, it's instantly considered true. I really wonder who the fanboys here are :rolleyes:

I mean, calling pxc an Intel fanboy just because he understood that what these pictures showed was highly improbable is just low. In the end he was right. It was fake.
 
Come on guys, he is a skeptic ;) Just read his quotes.

Indeed ;) Finally someone noticed his quotes.

I hope it is coming out next month.
I hope the performance is true.

Lately I've had an itch for an all-AMD CrossFire setup, but I'm waiting to see what this brings.
I would do an AM3 board now, but the shaky support for Bulldozer has me thinking twice.
 
I don't understand all of this attachment to BD being godlike or crappy, from both sides. It is what it is; what you wish or hope happens won't change it. It's good to hope BD is competitive, god knows we need AMD around, but quit gouging each other's eyes out over what you want to happen.
 
At this point, do you guys honestly think Bulldozer will be faster than the CPU that was announced last week? IIRC, it'll be released this year.


No, Bulldozer can't compete with LGA 2011 in any way, due to one primary reason: the quad-channel DDR3 that LGA 2011 will be using. AMD is still limiting quad-channel DDR3 to the server market, which makes sense but also hurts them performance-wise. The cost of having to use quad-channel DDR3 would push Bulldozer well outside of being affordable for the large majority of consumers, and AMD can't afford that. Hopefully AMD decides to split the market and releases quad-channel DDR3 consumer boards some time in 2012 to actually compete with Ivy Bridge and the enthusiast market, but who knows.


I still love how all the AMD fanboys disregard anything showing Bulldozer losing to Sandy Bridge, but when there is an outrageous "leaked benchmark" that shows Bulldozer owning, it's instantly considered true. I really wonder who the fanboys here are :rolleyes:

I mean, calling pxc an Intel fanboy just because he understood that what these pictures showed was highly improbable is just low. In the end he was right. It was fake.

When have there been any leaks showing Bulldozer losing to Sandy Bridge? And don't say the last leaks that came out showing piss-poor performance, because the idiot that leaked them failed to mention the processor was stuck in power-saving mode (800 MHz) instead of clocking up during the benchmark. So far there haven't been any real benchmark leaks outside of the real numbers released for the quad-socket Opteron Bulldozer rig running F@H, which pretty much spanked anything that exists at this time.
 
No, Bulldozer can't compete with LGA 2011 in any way, due to one primary reason: the quad-channel DDR3 that LGA 2011 will be using. AMD is still limiting quad-channel DDR3 to the server market, which makes sense but also hurts them performance-wise. The cost of having to use quad-channel DDR3 would push Bulldozer well outside of being affordable for the large majority of consumers, and AMD can't afford that. Hopefully AMD decides to split the market and releases quad-channel DDR3 consumer boards some time in 2012 to actually compete with Ivy Bridge and the enthusiast market, but who knows.

Well, considering an i7 on 1156 is as fast as an i7 on 1366, I doubt having dual, triple, or quad channel will make a difference to 99.9% of desktop users. The processors just aren't utilizing that much memory bandwidth, probably partly because we are limited to one socket with 4-8 cores connected by an already-fast interconnect and a big shared L3 cache.
 
I don't understand all of this attachment to BD being godlike or crappy, from both sides
Don't forget that's amplified beyond reality by just a couple of people. ;) I mean, the most common wish I've read here is that BD competes with SB. The most common counter is that it won't. Each side has often presented reasons for believing either one. Pretty uncontroversial, I thought.

The other stuff comes from... well, I'll be nice. It's normally great to hear the arguments from either point of view. With the little info there is on BD, it usually becomes unproductive name-calling by the same couple of people.

FWIW, I have said that BD will probably be a decent server CPU, at least per socket with 2x8c processors.
 
Well, considering an i7 on 1156 is as fast as an i7 on 1366, I doubt having dual, triple, or quad channel will make a difference to 99.9% of desktop users. The processors just aren't utilizing that much memory bandwidth, probably partly because we are limited to one socket with 4-8 cores connected by an already-fast interconnect and a big shared L3 cache.


It makes a difference depending on what you are doing, but the performance difference between quad channel and dual channel makes the difference between dual and triple look like child's play. AMD proved it with the G34 socket and Magny-Cours, which is why Intel is pushing to beat AMD to the consumer market with quad-channel DDR3. The core limit isn't the problem at all, either; it really just comes down to what the user does. Either way, most benchmarks will favor the higher memory bandwidth, even though it only affects a small portion of applications in the real world, like rendering, photo editing, video editing, and some high-memory-usage games. Hell, just think what you could do if you made a RAM drive out of it with, say, 16 GB or even 32 GB of RAM.
 
Read the few posts above yours.

Ah... I did see that.
I just meant that I hope it is good or at least competitive.
Really, if I can get similar performance to my 920, I would switch just to have something different and go AMD for a while. If it is competitive with Sandy Bridge, well then, it's a no-brainer for me. I am itching for an all-AMD CrossFire gaming machine, and lord knows that today's games don't even need that horsepower.
My laptop does all of my "work"/surfing/whatever.
 
It makes a difference depending on what you are doing, but the performance difference between quad channel and dual channel makes the difference between dual and triple look like child's play. AMD proved it with the G34 socket and Magny-Cours, which is why Intel is pushing to beat AMD to the consumer market with quad-channel DDR3. The core limit isn't the problem at all, either; it really just comes down to what the user does. Either way, most benchmarks will favor the higher memory bandwidth, even though it only affects a small portion of applications in the real world, like rendering, photo editing, video editing, and some high-memory-usage games. Hell, just think what you could do if you made a RAM drive out of it with, say, 16 GB or even 32 GB of RAM.

Read some reviews :) Even for video editing / rendering / photo editing, dual vs. triple channel shows no difference on desktop chips. G34 and Magny-Cours is a different story; that's not a desktop setup, it's 12-16 cores per chip generally doing memory-intensive work, which is why you have 16, 32, 64, or 256 GB of memory in servers, and only 8-12 GB in even high-end desktops. (I have 12; the only time I ever got up near 8-10 GB was when I was running Vegas, Premiere, Encore, and Photoshop at once, rendering in one and editing in the others :p.)

Show me one instance of desktop use where triple channel shows a difference over dual.
 
No, Bulldozer can't compete with LGA 2011 in any way, due to one primary reason: the quad-channel DDR3 that LGA 2011 will be using.
What? If triple-channel memory offers no benefit, why do you think quad-channel memory will somehow be a huge benefit?

The vast majority of apps people use aren't even vaguely bandwidth limited and haven't been for years.

AMD proved it with the G34 socket and Magny-Cours
That is for server workloads!! Games, F@H, Firefox, movie editing, etc. will not be bandwidth-limited at all!!

which is why Intel is pushing to beat AMD to the consumer market with quad-channel DDR3
Intel is pushing it as a feature of their xxxxxtreme overpriced OC'ers' platform, to make the suckers who actually shell out for it think they are getting something worthwhile. That is it.
 
We're still arguing about this Intel vs. AMD crap even though it's off-topic, and the original topic is now a moot point because it was centered around a fake screenshot?

Triple-channel RAM is only useful for a few things, and none of them involve regular consumer use of a PC or even "extreme" gaming. Sandy Bridge is KING, and it doesn't have a triple-channel platform. Nobody cares about triple channel at home. Nobody.
 
That was in the post before the updates were added. It's a fake, dude. Deal.

Or it's not... But you seem pretty set on your opinion anyway. I was unsure of its authenticity before, but after the Gigabyte response I believe it to be real now (that, or I am just saying it to contradict you; either way). In another month or two we will see what the real score is. In the meantime, speculation is fun.
 
What? If triple-channel memory offers no benefit, why do you think quad-channel memory will somehow be a huge benefit?

Triple channel does not have much impact on 4-core / 8-thread processors; however, who is to say quad channel has little to no benefit on 8-core / 16-thread processors?
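For rough intuition, here's a back-of-the-envelope sketch of how the theoretical per-core share of memory bandwidth shrinks as core counts grow. The DDR3-1600 speed is just an assumed common configuration, not anything from this thread:

```python
# Back-of-the-envelope: theoretical peak bandwidth share per core.
# Assumes DDR3-1600: 1600 MT/s x 8 bytes (64-bit channel) = 12.8 GB/s per channel.
PER_CHANNEL_GBPS = 1600e6 * 8 / 1e9  # 12.8 GB/s

for channels in (2, 3, 4):
    total = channels * PER_CHANNEL_GBPS
    for cores in (4, 8):
        print(f"{channels}-channel, {cores} cores: "
              f"{total:.1f} GB/s total, {total / cores:.1f} GB/s per core")
```

Of course these are peak figures, not sustained ones, and they say nothing about whether the cores actually demand that much; they just show why an 8-core chip at least raises the question.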
 
If the workload remains the same, then quad channel will continue to have no benefit. There is no sign of anything coming that will fully stress dual-channel memory, much less triple or quad channel, on the desktop any time soon. It's a stupid waste of money, just like triple channel was.
 
If the workload remains the same, then quad channel will continue to have no benefit. There is no sign of anything coming that will fully stress dual-channel memory, much less triple or quad channel, on the desktop any time soon. It's a stupid waste of money, just like triple channel was.

So, when games, programs, and other things start using more than 4 GB of RAM, say 16 GB up to 128 GB, would quad channel be worth considering?
 
No. The amount of RAM a program uses won't necessarily mean it'll make proper use of, or need, triple- or quad-channel bandwidth.

Given that some of the most bandwidth-intensive programs on the desktop are games, which do most of the necessary work in GPU memory rather than main system RAM, and given the tendency of CPU manufacturers to toss huge amounts of L2/L3 cache on die these days, as well as the continuing improvements in RAM speed over time, it's possible you may never need anything more than dual-channel RAM on the desktop.
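If anyone wants to see what their own box actually sustains, rather than arguing theoretical peaks, here's a minimal sketch in the spirit of the STREAM copy benchmark, using NumPy arrays big enough to blow out the CPU caches (the array size and repetition count are arbitrary choices):

```python
import time
import numpy as np

# ~200 MB per array: far larger than any L2/L3 cache, so the copy
# has to stream through main memory instead of hitting cache.
N = 200_000_000 // 8
a = np.random.rand(N)
b = np.empty_like(a)

reps = 10
start = time.perf_counter()
for _ in range(reps):
    np.copyto(b, a)  # reads N*8 bytes from a, writes N*8 bytes to b
elapsed = time.perf_counter() - start

# Counts explicit read + write traffic only; write-allocate behavior
# can make the real bus traffic somewhat higher.
bytes_moved = 2 * N * 8 * reps
print(f"sustained copy bandwidth: ~{bytes_moved / elapsed / 1e9:.1f} GB/s")
```

On a typical dual-channel desktop this usually lands noticeably under the theoretical peak, which is rather the point being made above.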
 
If the workload remains the same, then quad channel will continue to have no benefit.

If the workload is the same, an 8-core will provide no benefit either; however, both AMD and Intel are bringing 8-core processors to the desktop market this year.
 
Triple channel does not have much impact on 4-core / 8-thread processors; however, who is to say quad channel has little to no benefit on 8-core / 16-thread processors?

Considering that a computer can only perform as fast as its slowest/weakest link, do you think that current memory bandwidth is holding ANYTHING back on the desktop? And how much would that memory bandwidth hold back an 8-core / 16-thread chip =)? We'll see, I guess.
 
If the workload is the same, an 8-core will provide no benefit either; however, both AMD and Intel are bringing 8-core processors to the desktop market this year.

Yes, but what does this have to do with anything? We're talking about bandwidth, not the dearth of multithreaded software that will actually make use of 4 or more cores/threads.
 
Yes, but what does this have to do with anything? We're talking about bandwidth, not the dearth of multithreaded software that will actually make use of 4 or more cores/threads.

I believe they go hand in hand. I mean, when we have software that can use 8 to 16 threads, I believe it will need more bandwidth than current dual-channel DDR3 can deliver. Although, to be honest, faster DDR3 or possibly DDR4 (on new platforms) will be available by then.
 
HyperTransport 3.1 (3.2 GHz, 32-pair): 409.6 Gbit/s = 51.2 GB/s
^ AMD's new version of HT for Bulldozer
PC3-17066 / DDR3-2133 (triple channel, 192-bit): 409.6 Gbit/s = 51.2 GB/s
PC3-17066 / DDR3-2133 (dual channel, 128-bit): 273.1 Gbit/s = 34.1 GB/s
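For what it's worth, those figures fall straight out of the usual peak-bandwidth arithmetic (transfer rate x bus width, with HT's two directions summed). A quick sketch to reproduce them:

```python
# Theoretical peak bandwidth = transfers per second * bus width in bytes.
def peak_gbps(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

# DDR3-2133, 64 bits per channel
print(f"dual channel (128-bit):   {peak_gbps(2133, 128):.1f} GB/s")  # ~34.1
print(f"triple channel (192-bit): {peak_gbps(2133, 192):.1f} GB/s")  # ~51.2

# HT 3.1: 3.2 GHz clock, double data rate, 32 bits each way, both directions summed
print(f"HT 3.1 aggregate:         {peak_gbps(3200 * 2, 32) * 2:.1f} GB/s")  # ~51.2
```

These are paper numbers; sustained throughput in real workloads comes in lower.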
 
You can go ahead and open up 4 or 5 or however many games or programs you like and fully load up an 8-thread CPU today, and bandwidth still won't be an issue before the CPU itself or another component (the GPU) becomes the bottleneck.

seronx: yes, they use HT as a bus between the northbridge and CPU in desktops, where it might actually have some benefit, since one or more video cards and other common devices can use lots of bandwidth. Most of the benefit is in the server space, where CPUs have two or more independent HT buses for badly needed fast inter-chip communication.
 
You can go ahead and open up 4 or 5 or however many games or programs you like and fully load up an 8-thread CPU today, and bandwidth still won't be an issue before the CPU itself or another component (the GPU) becomes the bottleneck.

seronx: yes, they use HT as a bus between the northbridge and CPU in desktops, where it might actually have some benefit, since one or more video cards and other common devices can use lots of bandwidth. Most of the benefit is in the server space, where CPUs have two or more independent HT buses for badly needed fast inter-chip communication.

Whoa... I just got mind-fuddled.

I always thought QPI and HT were related to memory xD

Well, now that I've got that out of the way......

I was confuzzled about how NB <--> SB, CPU <--> Mem, and CPU <--> NB fit together, but now I've got that fixed.

But technically there would be a bottleneck if the devices hanging off HT tried to ask for 40-50 GB/s of bandwidth from a dual-channel setup.
 
Yes, but they don't in the real world. There are all sorts of pathological cases you can construct to break almost anything, but if you don't actually encounter them in the real world, rarely or commonly, then they aren't worth mentioning except as curiosities.
 
I believe they go hand in hand. I mean, when we have software that can use 8 to 16 threads, I believe it will need more bandwidth than current dual-channel DDR3 can deliver. Although, to be honest, faster DDR3 or possibly DDR4 (on new platforms) will be available by then.
Not sure it's truly relevant, but there are people out there running F@H on 4P Magny-Cours rigs who see a big impact when not running them with the full quad channel, so there are definitely applications out there that feel the memory bandwidth when we're talking highly multithreaded workloads. The rub is that Magny-Cours is not a monolithic 12-core/quad-channel chip but two 6-core/dual-channel dies, so running "dual" channel is like running two CPUs with a single channel each. I wonder if a monolithic 12-core would see the same issue.
 
Not sure it's truly relevant, but there are people out there running F@H on 4P Magny-Cours rigs who see a big impact when not running them with the full quad channel, so there are definitely applications out there that feel the memory bandwidth when we're talking highly multithreaded workloads. The rub is that Magny-Cours is not a monolithic 12-core/quad-channel chip but two 6-core/dual-channel dies, so running "dual" channel is like running two CPUs with a single channel each. I wonder if a monolithic 12-core would see the same issue.
More than likely not, because the SRQ/crossbar would be able to mitigate bandwidth requests from the 12 cores, especially since the L3 cache would help reduce dependence on RAM requests.

Whoa... I just got mind-fuddled. I always thought QPI and HT were related to memory xD
Well, now that I've got that out of the way...... I was confuzzled about how NB <--> SB, CPU <--> Mem, and CPU <--> NB fit together, but now I've got that fixed; but technically there would be a bottleneck if the devices hanging off HT tried to ask for 40-50 GB/s of bandwidth from a dual-channel setup.

That would actually only happen if:
1. Graphics cards streamed data instead of caching it.
2. SSDs/HDDs had no kind of cache subsystem.
3. Nothing in the system at all was cached/precached.

In that situation, you would actually be more I/O- and CPU-execution-limited than bandwidth-limited. The answer would be to add more cores (look at Xeon 7k/E7, Magny-Cours, POWER6/7, and Itanium HPC systems).
 
Or it's not... But you seem pretty set on your opinion anyway. I was unsure of its authenticity before, but after the Gigabyte response I believe it to be real now (that, or I am just saying it to contradict you; either way). In another month or two we will see what the real score is. In the meantime, speculation is fun.
lulz, that will teach reality!

The image is an obvious fake. The information in the image is an obvious fake.
 
lulz, that will teach reality!

The image is an obvious fake. The information in the image is an obvious fake.

And do you have a Bulldozer that you have tested, to know it's fake?
 
And do you have a Bulldozer that you have tested, to know it's fake?
Do I need one if the hoaxer who made that image didn't have one either? :p

It might not really sink in, but the fake has many mistakes in it, even ignoring the benchmark scores. It's a complete fabrication.
 
If the workload remains the same, then quad channel will continue to have no benefit. There is no sign of anything coming that will fully stress dual-channel memory, much less triple or quad channel, on the desktop any time soon. It's a stupid waste of money, just like triple channel was.

Bashing triple channel on the internet never gets much notice, because X58 features triple channel, and X58 is "extreme." And all the "extreme" enthusiasts who own X58 are extra "extreme" for owning it, and derive "extreme" satisfaction from owning something that is "extreme." Bashing triple channel undermines that satisfaction, and that just isn't cool, bro.
 