5.1GHz Bulldozer OC on Air

When Bulldozer is released, people like yourself and AMD_Gamer are going to be held to much higher standards when it comes to fact-checking past posts, I can promise you.

Anyone who makes a declaration about how this product is going to perform will be held to a higher standard.

I prefer the wait and see approach myself.
 
In Supreme Commander 2 the AMD fanboys get segregated into their own cult, as no one who spends $200 on an Intel CPU wants these guys slowing down their game to AMD speed. AMD = sim score fail in CPU-intensive multi-core games like Supreme Commander and Supreme Commander 2. But if all you are doing is playing Plants vs. Zombies all day, I guess you're right in saying it really doesn't matter what CPU you get. It's your money.

BTW, bragging about an 87" screen and a 120" screen tells us nothing as long as the resolution is the same. That 120" screen won't be taxing you even 1% more than the 87".

I think you are right about Intel's CPUs being faster in games like SupCom etc. I will check into it, but it has to be run @ 1080p with 4-8x MSAA for me to compare Intel vs. AMD CPUs. I don't play those types of games. They simply lack enough action for my gaming tastes; if I wanted strategy I'd just play some Chessmaster and download some chess strategy books to read. I'm an FPS guy or Bethesda RPG guy, and I enjoy the adrenaline/action of those games. I'm not into boring games.

The 87-120" HD projector simply means that I will notice jaggies from no AA or a (laughable) 800p resolutions unlike most people here gaming on 20-30 inch lcd's. An HD projector becomes 3x the size of the average PC user, and the pixels are larger and therefore running 4-8X MSAA looks great on these screens, and eliminates jaggies. That is what I was trying to explain to everyone, not brag. When I get 5 or 6 HD projectors running simultaneously in AMD eyenfinity in a few years when they take over the GPU market, I'll start bragging :rolleyes: then you can try flaming me again, maybe I'll cry or the eyefinity setup will make me cry too. :D
 
Yeah, but the people who play SupCom 2 should be shunned by everyone. What an absolute piece of garbage compared to TA and SupCom.


Completely agree. SupCom 2 and SupCom 1 (since they both use the same CPU code in their engines) are not examples of how much better Intel processors are than AMD processors; they are an example of how multithreaded gaming was in its infancy. Instead of learning from the others who perfected THQ's original multithreaded code, THQ decided to keep using their outdated multithreaded code in SupCom "how to ruin a game series to make a few extra bucks" 2.
 
In a 1hr match, if the Intel CPU is rendering the game at +2 and the AMD CPU is rendering the game at -4, the Intel CPU clearly has more headroom. I'll take a 4-core i7 over a 6-core AMD any day of the week in Supreme Commander 1 or 2.

You guys should stop being such fanboys and just jump on the best hardware available at the time of purchase, rather than bending the truth and making excuses for your hardware decisions.

BTW teletran8, who are you kidding with that 720p projector? At that resolution I could be running a Celeron-A and still be in the fight. Not to mention, with all the blur that baby will produce, I doubt AA would even be a requirement at that point.
 
...You guys should stop being such fanboys and just jump on the best hardware available at the time of purchase, rather than bending the truth and making excuses for your hardware decisions...

Maybe there are people who don't just go for the 'best hardware at the time of purchase'. Maybe some people actually like to save money. If it costs a few frames in a game, who in the f*** really cares? I could see it if you were getting paid to play these games.
 
Exactly. If you are getting 60fps, what's it matter? I know if I can save $50 and still get 60fps, I'm gonna do that.
 
Actually, I agree. I like AMD because I have consistently saved hundreds of dollars, allowing me to stay on budget and have a finished computer rather than waiting for enough money to complete a rig. This may not be the case for those with more disposable income, but I appreciate being able to game for relatively cheap compared to my buddy who keeps bragging about his Intel CPU and his nVidia graphics. I love the fact that I am able to buy a CrossFire setup for less than $400, while he will have to spend $500-600 for SLI, which may only give a small 5-10% performance advantage. Therefore, in my mind, I am getting better price/performance. I believe the same will be true with Bulldozer. I think AMD is also making good strides with on-die CPU/GPU combos.

There will always be fanboys, but honestly, I couldn't care less. I've built decent systems with both Intel and AMD, but AMD has consistently been very versatile for me and extremely overclockable.
 
BTW teletran8, who are you kidding with that 720p projector? At that resolution I could be running a Celeron-A and still be in the fight. Not to mention, with all the blur that baby will produce, I doubt AA would even be a requirement at that point.

No man, you have your assumptions all wrong. Letting the bulb warm up for 5 minutes and then using the zoom slider and the lens dial will eliminate any blur >.<. That's like saying your LCD stinks or the colors are off at factory settings without having calibrated it at all.

Just because the projector's output resolution is rated @ 1280x720 doesn't restrict me from using 1920x1080 source content, or even higher content internally. The projector doesn't downscale a Blu-ray or a game at a 1080p resolution setting the way you think it does. Rather, it improves the visual color clarity of the 1280x720 output when the source content I use is always above 720p.

The way it works with the RGB color model is that the projector knows the exact color of a certain share of pixels 100 percent of the time. When it doesn't know the exact color of a given pixel, the DLP chip has to make an educated guess about what color to output from the pixels next to it, blending the shades of the known neighbors in the RGB color model. So by increasing the internal res in a game, or using a Blu-ray with more information, there is less guessing from the projector about pixel colors, because with a 1080p source it knows what color to pick a higher percentage of the time, and the image color detail becomes sharper. Yes, I know it's not technically 1080p on screen, but the internal res or source is 100% 1080p. It's basically a wicked sharp 720p projector using 1080p source content.
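
Here's a rough Python sketch of what I mean by the projector "guessing" from neighbors (just my own toy illustration of the averaging idea, not the actual DLP scaler). When a 1080p frame gets mapped onto the 720p panel, each output pixel averages the source pixels that land on it, which is effectively supersampling:

[code]
# Toy illustration: downscale a 1080p frame to 720p by averaging the
# source pixels that map onto each output pixel. NOT the projector's
# real scaler, just the "blend the known neighbors" idea.
def downscale(frame_1080, in_w=1920, in_h=1080, out_w=1280, out_h=720):
    sx, sy = in_w / out_w, in_h / out_h  # 1.5x in both directions
    out = [[(0, 0, 0)] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Source region covered by this output pixel
            x0, y0 = int(x * sx), int(y * sy)
            x1 = max(int((x + 1) * sx), x0 + 1)
            y1 = max(int((y + 1) * sy), y0 + 1)
            n = (x1 - x0) * (y1 - y0)
            # Average each RGB channel over the covered source pixels
            out[y][x] = tuple(
                sum(frame_1080[j][i][c] for j in range(y0, y1)
                    for i in range(x0, x1)) // n
                for c in range(3)
            )
    return out
[/code]

With a native 720p source there's nothing to average (each output pixel comes from exactly one source pixel), which is part of why the 1080p source ends up looking smoother on the same panel.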

Certain PlayStation 3 games look great on it, like Skate 3, while other games like Need for Speed: Hot Pursuit and Dirt 3 show the PS3's limitations, with no AA and jaggies. But run those same games on a PC over HDMI with 8xMSAA or 16xCSAA enabled, and NFS:HP and Dirt 3 look AMAZING! You cannot even notice jaggies unless you look really closely in between the car's spoiler; that's the only place I noticed them on my PC setup. Whereas on the PS3 I notice them all around the car and everywhere else.


Back on topic though:
The only time I have seen an SB get to 5.1GHz or better is with water cooling. So this engineering sample running a Bulldozer 8-core on the same AIR cooling solution I already have is very impressive to me. AND GET THIS! AMD is not happy with these ES chips! So there's a good bet that the official ones will be more stable or just flat-out clock higher than the one in the thread starter's video clip! I'm really excited, and I think the people who aren't excited are all just a bunch of AMD haters or benchmark buffoons.

An SB 2600K @ 5.0GHz water-cooled can't break 10 pts in CB 11.5, though it does pull a highly respectable 9.83, as seen in this video.
[ame="http://www.youtube.com/watch?v=UoZmzsWo_L4"]http://www.youtube.com/watch?v=UoZmzsWo_L4[/ame]

For me it's cool if AMD doesn't win every benchmark, because quite frankly most benchmarks don't have anything to do with how I use my computer anyhow. :p Sysmark / Hyper Pi anyone? :D

Yet I'm still quite sure the AMD FX-8130P will surpass the 2600K in the CB multi-processing benchmark with a 10+ score. AMD has even said it will be positioned better against Intel than today's Phenom X6s are.

And I know it will also be a great CPU for FPSes, first-person RPGs, or racing games (because my X3 has turned out to be awesome @ 4GHz! It's neck and neck with, or outperforms, an OC'd i7-920 in the applications I use. Now imagine an X8 @ 5+GHz lol! That thing will have some legs!). High-end gaming is one of the main reasons I own and love the PC platform. I see BD being a great investment for PC gamers who enjoy OCing, and for video enthusiasts on a budget who want the most out of it, like myself.

Whew! :rolleyes:
 
Benchmarks are useless, but you use them? Cinebench is yours. Funny thing is, SuperPi is great for testing single-threaded performance, where AMD goes down the drain. But yeah, anything where Intel wins is useless, and anything where AMD wins is the best benchmark ever.

BTW, why get a massive projector with a low resolution? If I were to get a projector, it would be one with a native 1920x1080 resolution. http://www.newegg.com/Product/Produ...75&IsNodeId=1&bop=And&Order=PRICE&PageSize=20
 
Provided this isn't fake: as I predicted, Bulldozer, with its longer pipeline, should be able to clock higher, which can give it the ability to make up in clock speed what it can't match in IPC.

That's what they said about the Pentium IV. You do realize that Sandy Bridge can clock that high too? Even if it couldn't go beyond 4.8GHz or so, a 200MHz or 300MHz clock speed advantage in favor of Bulldozer / Zambezi probably wouldn't be enough to overcome the IPC performance difference.
 
That's what they said about the Pentium IV. You do realize that Sandy Bridge can clock that high too? Even if it couldn't go beyond 4.8GHz or so, a 200MHz or 300MHz clock speed advantage in favor of Bulldozer / Zambezi probably wouldn't be enough to overcome the IPC performance difference.

This kinda reminds me of P4s. They clocked very high (although hot), like 4GHz if I remember correctly, back in the day. But a 3GHz Athlon 64 would laugh at them.
 
Anyone got any benchies comparing a 4GHz four-core PII vs a 4GHz 2500K? I would assume that would give us an idea of how far behind AMD is in IPC, and they are claiming what, a 20% improvement in BD over PII?

So then in theory:

Take PII score
add 20%
should = approximate BD score (four core)

Then see how far behind SB it is. We can then use that to project how much speed BD would have to make up in order to match SB. If it would take a 5ghz BD chip to match a 4.5ghz SB chip of equal cores, then BD may be ok if it clocks high without a lot of trouble.
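
To put rough numbers on that theory (all placeholder figures, assuming the roughly 25% per-clock lead people cite for SB over PII):

[code]
# Back-of-envelope sketch of the "PII score + 20% = BD score" theory above.
# All scores are placeholders, not real benchmark results.
pii_score = 100.0            # hypothetical 4GHz quad-core Phenom II score
sb_score = 125.0             # hypothetical 4GHz 2500K score (~25% faster per clock)
bd_score = pii_score * 1.20  # claimed ~20% uplift from PII to BD

gap = sb_score / bd_score    # remaining per-clock deficit
print(f"BD ~= {bd_score:.0f} pts, still {gap:.2f}x behind SB per clock")
# -> BD ~= 120 pts, still 1.04x behind SB per clock
[/code]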

If a 5ghz BD can equal a 4.5ghz SB in gaming applications, I think that would be a win. Especially if they can get that routinely out of the first generation product. Why is it a win? Because then I can do some e-peen flexing and say I have an 8-core processor at 5ghz :) And in reality, that would be pretty good for the first gen of a new architecture.

On the other side of things, I can't help but point out that these are not stock offerings (yet). The stock 2500K and BD clocks look much lower than what [H] users will get out of them. So maybe a 4.2GHz turbo BD is designed to compete with a 3.7GHz turbo Intel processor.
 
Benchmarks are useless, but you use them? Cinebench is yours. Funny thing is, SuperPi is great for testing single-threaded performance, where AMD goes down the drain. But yeah, anything where Intel wins is useless, and anything where AMD wins is the best benchmark ever.

BTW, why get a massive projector with a low resolution? If I were to get a projector, it would be one with a native 1920x1080 resolution. http://www.newegg.com/Product/Produ...75&IsNodeId=1&bop=And&Order=PRICE&PageSize=20

It's great for testing the single-threaded performance of processors made 5-10 years ago. The instructions it uses are ancient.

That's what they said about the Pentium IV. You do realize that Sandy Bridge can clock that high too? Even if it couldn't go beyond 4.8GHz or so, a 200MHz or 300MHz clock speed advantage in favor of Bulldozer / Zambezi probably wouldn't be enough to overcome the IPC performance difference.

Which IPC performance difference?
 
Some of us remember farther back than that. :)

As do I. ;)
Anyone got any benchies comparing a 4GHz four-core PII vs a 4GHz 2500K? I would assume that would give us an idea of how far behind AMD is in IPC, and they are claiming what, a 20% improvement in BD over PII?

So then in theory:

Take PII score
add 20%
should = approximate BD score (four core)

Then see how far behind SB it is. We can then use that to project how much speed BD would have to make up in order to match SB. If it would take a 5ghz BD chip to match a 4.5ghz SB chip of equal cores, then BD may be ok if it clocks high without a lot of trouble.

If a 5ghz BD can equal a 4.5ghz SB in gaming applications, I think that would be a win. Especially if they can get that routinely out of the first generation product. Why is it a win? Because then I can do some e-peen flexing and say I have an 8-core processor at 5ghz :) And in reality, that would be pretty good for the first gen of a new architecture.

On the other side of things, I can't help but point out that these are not stock offerings (yet). The stock 2500K and BD clocks look much lower than what [H] users will get out of them. So maybe a 4.2GHz turbo BD is designed to compete with a 3.7GHz turbo Intel processor.

I personally would call a 5GHz BD beating a 4.5GHz SB a loss, because what are the chances of 5GHz being a stable and sustainable overclock?
 
I personally would call a 5GHz BD beating a 4.5GHz SB a loss, because what are the chances of 5GHz being a stable and sustainable overclock?

It could. However, I believe this will be a difficult battle for AMD to win; I mean, more cores at higher frequency on less mature silicon. I know AMD has PD-SOI, but I am not convinced it gave AMD an advantage at 45nm. With that said, the i7 2600K is a 95W part and it includes video. The top-end BD will be >95W and won't have an integrated GPU, so to me it's possible, but still difficult to accomplish.
 
It could. However, I believe this will be a difficult battle for AMD to win; I mean, more cores at higher frequency on less mature silicon. I know AMD has PD-SOI, but I am not convinced it gave AMD an advantage at 45nm. With that said, the i7 2600K is a 95W part and it includes video. The top-end BD will be >95W and won't have an integrated GPU, so to me it's possible, but still difficult to accomplish.

I have this feeling that AMD is nearing the point where they may have to go back to bulk silicon. It just seems like SOI is not doing very well...
 
It's great for testing the single-threaded performance of processors made 5-10 years ago. The instructions it uses are ancient.

Then how come Intel doesn't have issues with it while AMD still does? 19 seconds on an X4 955 @ 3.6GHz versus 14 seconds on my stock (2.66GHz) W3520.
 
Then how come Intel doesn't have issues with it while AMD still does? 19 seconds on an X4 955 @ 3.6GHz versus 14 seconds on my stock (2.66GHz) W3520.

Who cares (besides you and maybe 1000 other people worldwide, if that)? Or is your argument going to be that performance in SuperPi is similar to some other workloads, and therefore its merit is proved?
 
Who cares (besides you and maybe 1000 other people worldwide, if that)? Or is your argument going to be that performance in SuperPi is similar to some other workloads, and therefore its merit is proved?

But everyone caring about Cinebench is OK now?

And again, SuperPi tests the single-threaded performance of a CPU, which is important to many people in determining how good the CPU is compared to other CPUs.
 
But everyone caring about Cinebench is OK now?

And again, SuperPi tests the single-threaded performance of a CPU, which is important to many people in determining how good the CPU is compared to other CPUs.

I didn't mention anything about Cinebench. But since you brought it up, yes, Cinebench is without question the better software. At least it represents a modern practical need. Those using SPi to test single-threaded performance do it for reasons beyond practicality. There is no legitimate workload it can simulate to determine a modern architecture's behavior in the things consumers buy processors for.
 
Anyone got any benchies comparing a 4ghz four core PII vs 4ghz 2500k? I would assume that would give us an idea of how far behind AMD is in IPC, and they are claiming what, a 20% improvement in BD from PII?

Anandtech CPU Benches

3.4GHz PII versus 3.3GHz 2500K. Benchmarks should scale mostly linearly with clocks, so I don't know that a 4GHz-to-4GHz comparison is necessary. Looks like the PII is 20% slower than SB in most of the tests, though; since 20% slower means SB is 25% faster, a 20% improvement only brings BD to about 96% of SB, so it might just about catch the 2500K, but it won't pass it.
 
I personally would call a 5ghz BD beating a 4.5ghz SB a lose because what are the chances of 5ghz being a stable and sustainable overclock.

I'm only saying it's a win if it does happen; maybe that wasn't clear enough in my post. IF they can pull it off on a 1st-gen product, then that would be great. Of course, if they cannot, then naturally it's a loss, but I didn't think that would need to be specifically stated.

So I'm still curious, how far behind is the AMD PII architecture at equal cores and clocks? The closest I could find would be a 980BE vs 2500K, since they both peak at 3.7GHz (turbo for the 2500K). According to this bench:

http://www.anandtech.com/bench/Product/362?vs=288

What would be the best benchmarks to compare? Looking at the Cinebench R10 single-thread test, the 2500K is 36% faster than the 980.

Comparing the 980 to the 910 (3.7GHz vs 2.6GHz):

http://www.anandtech.com/bench/Product/362?vs=85

The extra frequency boost increases the score by a crappy ratio (a 42ish% boost = only 30% more points).

So then, would BD need a 40% speed boost to make up the 36% deficit? I.e., would it end up needing to be significantly faster than a stock 2500K to equal its performance?

My math skills aren't the greatest, but I just came up with needing a 5.0GHz PII to match a 3.7GHz 2500K. Assuming a 20% increase in IPC over PII (could be pulling that out of my arse, but I swear I saw it somewhere), could a 4.2GHz BD then be roughly equal to a 3.7GHz 2500K?

Considering the rumored retail frequencies, that doesn't sound like the math would be too far off. Then again, if it's correct, it's going to take about a 5.2GHz BD to rival a 4.5GHz SB. And if SB goes to 5.0GHz easily, then BD will need to be about 5.8GHz to match, calculating at a rough 1.16:1 IPC ratio (SB:BD).

That's a lot of speculation on my part, based on no physical hardware to test, so I'm quite likely to be way off. But it took my mind off my regular job for a few minutes, which is nice :D
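
If anyone wants to check my math, here's the same back-of-envelope calc in Python (every input is a rough figure from this thread, not a measurement):

[code]
# Sketch of the clock-for-performance estimate above; all inputs are
# rough thread figures, not measurements.
sb_advantage = 1.36   # 2500K ~36% faster than a 980BE in CB R10 single-thread
bd_ipc_gain = 1.20    # rumored BD per-clock uplift over Phenom II
ipc_ratio = sb_advantage / bd_ipc_gain   # ~1.13 (rounded up to ~1.16 above)

for sb_clock in (3.7, 4.5, 5.0):
    bd_clock = sb_clock * ipc_ratio      # assumes perfectly linear clock scaling
    print(f"SB @ {sb_clock:.1f}GHz -> BD needs ~{bd_clock:.1f}GHz")
# SB @ 3.7GHz -> BD needs ~4.2GHz
# SB @ 4.5GHz -> BD needs ~5.1GHz
# SB @ 5.0GHz -> BD needs ~5.7GHz
[/code]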
 
Anandtech CPU Benches

3.4GHz PII versus 3.3GHz 2500K. Benchmarks should scale mostly linearly with clocks, so I don't know that a 4GHz-to-4GHz comparison is necessary. Looks like the PII is 20% slower than SB in most of the tests, though; since 20% slower means SB is 25% faster, a 20% improvement only brings BD to about 96% of SB, so it might just about catch the 2500K, but it won't pass it.

Just saw your post after posting mine. I looked at using the lower clock, but figured the 2500K turbos to 3.7GHz, so that would be the better (more reliable) mark to check. I also only picked the single-thread benchmark to emphasize IPC and what it would take to generate equal performance. If you feel the 3.4GHz PII would be a better route to measure, then please let me know why. Thanks for the post :D
 
This kinda reminds me of P4s. They clocked very high (although hot), like 4GHz if I remember correctly, back in the day. But a 3GHz Athlon 64 would laugh at them.

those were the days, oh well
 
It doesn't really matter anyway. I have an upgrade path to Bulldozer, but I'm not going to bother upgrading until something actually taxes my system. I hope BF3 does, but I doubt it will be too hard on my system. I had my 1055T clocked at 3.7GHz, but I've put it back down to stock now. For 1080p gaming, a 1055T and a 6950 max out everything, and Photoshop opens in a few seconds...

Unless they start bringing out some decent PC titles that push the hardware, it doesn't really matter if Bulldozer is slower or faster than SB; it's all overkill.
 
Why is all of this centered around single-threaded benchmarks when almost nothing is single-threaded anymore?

For an accurate look at performance, you have to take into account multi-threaded performance, which is going to be impacted not only by IPC, but also by the efficiency of inter-core communication, as well as how the cache is shared between the different cores.

As it is now, we really have no good idea of how BD is going to stack up against SB.

If AMD gets their RAM throughput up, as well as getting their IPC and inter-core communication efficiency up, then BD might just be what you want to go with.

One other thing to take into account: even though AMD chips tend to use higher voltage, they also have a good track record of running at much lower temps than Intel chips do... (I'm comparing PII to the first-gen i7 chips - try running an i7-920 at 3.2GHz on the stock cooler... ain't gonna happen. PII X2 unlocked to X4... no sweat.)
 
those were the days, oh well

Yeah, agreed. AMD had a massive lead back then. The A64 was eating P4s and Pentium Ds for breakfast, lunch, and dinner. But it seems they got lazy and just kept using the same architecture, thinking it would beat anything Intel had. I mean, I remember AM2 coming out and the 939 CPUs beating the AM2 CPUs (the FX-59, I think, on 939 beat the FX-60 or 62 on AM2 quite easily). AMD really could have kept dominating if they had come up with a better architecture back then.
 
Yeah, agreed. AMD had a massive lead back then. The A64 was eating P4s and Pentium Ds for breakfast, lunch, and dinner. But it seems they got lazy and just kept using the same architecture, thinking it would beat anything Intel had. I mean, I remember AM2 coming out and the 939 CPUs beating the AM2 CPUs (the FX-59, I think, on 939 beat the FX-60 or 62 on AM2 quite easily). AMD really could have kept dominating if they had come up with a better architecture back then.

It's cyclical. Intel was going to use the same core design till about now if they could have gotten Prescott as high as they thought they could (about 4.5GHz). AMD, on the other hand, didn't need to throw billions into the pot for a new architecture, and had already dropped their original successor to the Athlon 64 X2.

If K9 hadn't failed, you wouldn't have seen a K10 come out with so few tweaks in half the normal development time. It was a rush job to pass the time till BD hit, just like the CD was for Intel.
 
"I'm only saying it's a win if it does happen, maybe that wasn't clear enough in my post. IF they can pull it off on a 1st gen product then that would be great. Of course if they cannot then naturally it's a lose, but I didn't think that would need to be specifically stated."

"So I'm still curious, how far behind is AMD PII architecture at equal cores and clocks? Closest I could find would be a 980be vs 2500k since they both peak at 3.7ghz (turbo for the 2500k)...."


I think a new-gen desktop part from AMD can only be a win; it just begs the question, "how much of a win?"

If it's a good amount faster than the PII clock-for-clock, then it solidifies BD as a competitive product. If the gap between BD and Sandy Bridge is smaller than the one between PII and the LGA1366/1155 i7s, then at least they're headed the right way. AMD gained a lot of capital, or at least got themselves out of a lot of debt, thanks to their graphics side more than by selling a lot of cheap PIIs, so they might have been able to improve BD a little.

I take leaked benchmarks with a grain of salt; until I see legit performance numbers and a price range, it means nothing yet. It might not be the fastest, but it's definitely not going to be slow :)

It's also rumored that some BD-based AM3+ CPUs may be backwards compatible with the latest 890-chipset AM3 motherboards through a BIOS update; if this holds true, it will be nice not to have to upgrade my whole system.

AMD has headed towards affordability of late in both the CPU and GPU areas. If it's expensive, it will be fast as heck, and if it's slower but affordable, that doesn't dictate outright failure. We will see.
 