Inside AMD: Sources Say AMD Withdrew from BapCo Because Bulldozer Can't Compete

I dunno man. All I know is that, historically, when companies sit on specs like this with a release impending, it hasn't been good. The most recent example for me was Nvidia's 480/470 line. By contrast, IIRC, Intel was showing sneak peeks of Nehalem well before release. I remember seeing the Computex shot of 8 threads.

Just sayin...


Well, the Fermi we got may not have been what we (or Nvidia) had hoped for, but the GTX 480 was still the fastest GPU on the market when it launched.

So this may or may not be a valid comparison.

AMD also has a long history of being tight lipped before official launches.
 
They showed it off at E3, and even had a press event there (I believe), but no benches. They even created a kind of corny teaser site.

Yeah, I'm not going to give details, and I can't prove anything. I'm only going by what my foggy memory remembers, and it holds a big damn chunk of computer history. I just mean that I always seem to remember more "leaks" and shots of products that kick ass versus hearing next to nothing about products that, well, don't do so well. I feel like it's more of an intangible pattern than anything else.
 
Zarathustra[H];1037430272 said:
AMD also has a long history of being tight lipped before official launches.

I'll give you that one. For a large part of that history in my head, I was only paying attention to Intel. I sure hope he's wrong, though.
 
Zarathustra[H];1037430272 said:
Well, the Fermi we got may not have been what we (or Nvidia) had hoped for, but the GTX 480 was still the fastest GPU on the market when it launched.

So this may or may not be a valid comparison.

AMD also has a long history of being tight lipped before official launches.

AMD's Eyefinity launch is a great example. They had a real game-changer and NOBODY knew a thing about it until it launched. Going by the logic of "if it's good there will be more leaks," everyone and their mother should have known about Eyefinity. AMD just doesn't leak stuff the way many other companies do.
 
And going back even further, there was the HD 4000 series, which they were pretty tight-lipped about performance-wise, and it caused Nvidia to drop prices like mad on launch day.
 
All this hope... waiting to be crushed. :p Benchmark day should be fun.
 
All this hope... waiting to be crushed. :p Benchmark day should be fun.

You need to let up on the anti-AMD stuff. I'm not hoping, I'm being realistic. This rumor sounds like bullcrap to me because it doesn't square with the FACTS that we actually know. Do I think Bulldozer is going to kill Sandy Bridge? No. But do I think it's only going to match Phenom's IPC? Also no! Come on, we have no hard facts on performance here. Let's reserve judgment until we have facts, not unsubstantiated bullcrap spewed by some biased FUD generator. Speculation is getting out of hand. AMD will release benchmarks and data when they are ready.
 
All this hope... waiting to be crushed. :p Benchmark day should be fun.

AMD does sort of have a history (at least more of a history than Intel) of keeping things under wraps before launches, so it's harder to tell just what's going on. Intel's engineering samples in particular tend to tell a much better tale of actual retail performance than AMD's ever seem to. Honestly, I'm not all that optimistic for Bulldozer. Beating out, or even matching, Sandy Bridge seems like a stretch. Still, I don't see any reason to blindly support a shady article written by a dubious author.

I'm on schedule to overhaul my system around the time Bulldozer should be out, so we'll see what happens. My last two builds have been Intel, so I've been itching to make an AMD rig. Hopefully Bulldozer delivers. If not, no skin off my nose, I'll finish off the Intel hat trick. It's confusing to me why so many people seem so eager to dismiss a product they haven't even seen.
 
Meh, people can have their opinions; in the end it doesn't matter. If someone wants to say BD will be crap then cool, good for them. If it doesn't come with proof then I'm not going to waste time debating whether their opinion is good or not.

Personally I think we should be seeing some real-world info in about a month if the [H] event goes well, and if BD isn't available commercially until September then so be it. If the BD numbers suck next month then most people who want to upgrade will just go with SB.

By September we may actually see some previews of IB performance, so in turn those who may have upgraded to BD will instead wait on IB. If IB is to launch in March 2012 then that's what, 6 months from BD?

For me personally, chances are good that if BD sucks I will either go SB or IB depending on when I can afford it. I am hoping BD will be worth getting but if it takes an 8-core BD to equal a 2500k and does not surpass it significantly, then it won't be worth the extra $100 to me.

It really all comes down to two numbers: Performance and Price.
 
How 'bout we run SYSmark on the best A8 Fusion and see how poorly it fares against Intel's current 2-core/4-thread i3? We can agree that Fusion is designed for 90% of users, the mainstream users, with a better balance of CPU and GPU performance, right? When the i3 trumps the Fusion chip by 50%, you'll see why AMD left.

Nevermind, I see it's about Bulldozer. It could very well be true.
 
It just makes no sense for them to design something barely better than a Phenom II and charge a premium for it (they say the 8-core will be over $300). I just don't see it.
 
It just makes no sense for them to design something barely better than a Phenom II and charge a premium for it (they say the 8-core will be over $300). I just don't see it.

Not knowing how much IPC has improved, I still say it can make sense. There will be more cores, higher frequencies, and better turbo; it should be competitive with the i7 2600K in lightly threaded applications (using the more advanced turbo and higher frequencies) and should easily beat the i7 2600K in applications with 6+ threads.
 
How 'bout we run SYSmark on the best A8 Fusion and see how poorly it fares against Intel's current 2-core/4-thread i3? We can agree that Fusion is designed for 90% of users, the mainstream users, with a better balance of CPU and GPU performance, right? When the i3 trumps the Fusion chip by 50%, you'll see why AMD left.

Nevermind, I see it's about Bulldozer. It could very well be true.

I don't think it's so much the CPU:GPU ratio in the benchmarks as that AMD and Nvidia seem to think the HD 3000/2000 is overrepresented. Someone else already mentioned this once? Makes sense to me.
 
It's been... 8 years now (I think? Someone correct me if I'm wrong), but AMD was fairly tight-lipped about the Athlon 64's performance, if I recall. Then they released it and everyone was shocked at how much better its performance was vs. the Pentium 4.
 
It's been... 8 years now (I think? Someone correct me if I'm wrong), but AMD was fairly tight-lipped about the Athlon 64's performance, if I recall. Then they released it and everyone was shocked at how much better its performance was vs. the Pentium 4.

When "Hammer" was being released, we knew about AMD64 and the on-die memory controller. We had no idea about anything else, like IPC or clocks, until the NDA was lifted.

I remember because I was anticipating "Hammer" like I am anticipating "Bulldozer." Funny thing is, I also remember Hammer being delayed by a quarter a few times, and all the Intel trolls saying pretty much the same stuff we see today.

lol read some of the comments about Hammer from 2003: http://techreport.com/discussions.x/4090
 
When "Hammer" was being released, we knew about AMD64 and the on-die memory controller. We had no idea about anything else, like IPC or clocks, until the NDA was lifted.

I remember because I was anticipating "Hammer" like I am anticipating "Bulldozer." Funny thing is, I also remember Hammer being delayed by a quarter a few times, and all the Intel trolls saying pretty much the same stuff we see today.

lol read some of the comments about Hammer from 2003: http://techreport.com/discussions.x/4090
To be honest, no matter how well AMD does anything, you're always going to have those who rag on AMD just because they're AMD and not Intel/Nvidia. I have no problem with the people who aren't sure of the performance, but I do with those who just run with the rumors and say, "Bulldozer is going to suck!" and "AMD is going to fail because they suck!"

On the other side, the people who say "Bulldozer is going to be 50%+ faster than SB!" are just handing the anti-AMD crowd more ammo to fire...

Has AMD been late before? Yes, several times. Has their lateness caused the company to implode? No. I can wait till July/September. If Zambezi is a flop, I'll still buy an AM3+ board, because I won't have to change much around.
 
Like all great myths, there is probably some grain of truth to this.

However, I still think it has more to do with Llano and the Fusion products than with Bulldozer. The complaint made by AMD insinuates that the GPU-accelerated benchmarks are not factored into the final score. Bulldozer does not have an integrated GPU yet, and that version may not be out until the next version of SYSmark is released.

Nvidia also left but declined to comment as to why. They can no longer build integrated video chipsets for Intel, and since SYSmark is used by businesses and governments to establish baseline performance for desktop machines that rarely get upgraded discrete GPUs, there is zero reason for them to be part of BapCo if it doesn't help sell their products.

Via also left, but their CPUs underperform so badly that their action only fans the flames on both sides, as possible proof either that Intel's performance is overwhelming or that Intel bullies everyone else and gets its way. Via's chips rarely end up in desktops anyway, since they're preferred for low-power embedded applications, so SYSmark doesn't help sell their chips either.

Honestly, I feel that Nvidia's and Via's silence only shows they were looking for any excuse to leave BapCo, and AMD's departure was that excuse. In the end, all three of them win by making Intel look like the bad guy.
 
When "Hammer" was being released, we knew about AMD64 and the on-die memory controller. We had no idea about anything else like IPC
False. AMD was demonstrating low-clock-speed Hammer systems for anyone who would listen, way before release (the first appearance was at IDF, running A0 silicon at 800 MHz). I still have a magazine with it on the cover, where it was tested nearly a year in advance. This is Anand's testing on hardware provided by AMD two months before Opteron started shipping: http://www.anandtech.com/show/1099/4

Just several months ago AMD did the same thing with Bobcat. Months before it was out, AMD allowed independent testing.

And over the years AMD eagerly provided pre-release desktop hardware to sites (particularly with the first Athlon as it coincided with the rise of hardware review sites)... back when it was in the performance lead.

I wish it were more complex to analyze, but AMD provided access to pre-release hardware when it had a winner and didn't when it had a loser.
 
I wish it were more complex to analyze, but AMD provided access to pre-release hardware when it had a winner and didn't when it had a loser.
Can you clarify this? Engineering Samples are pre-release hardware also, and those are always sent out.
 
Can you clarify this?
Context is CPU availability to review sites/reviewers (mentioned at least twice in that post!), places where the public can get an official, and often competently done, whiff of performance.
 
False. AMD was demonstrating low-clock-speed Hammer systems for anyone who would listen, way before release (the first appearance was at IDF, running A0 silicon at 800 MHz). I still have a magazine with it on the cover, where it was tested nearly a year in advance. This is Anand's testing on hardware provided by AMD two months before Opteron started shipping: http://www.anandtech.com/show/1099/4

Just several months ago AMD did the same thing with Bobcat. Months before it was out, AMD allowed independent testing.

And over the years AMD eagerly provided pre-release desktop hardware to sites (particularly with the first Athlon as it coincided with the rise of hardware review sites)... back when it was in the performance lead.

I wish it were more complex to analyze, but AMD provided access to pre-release hardware when it had a winner and didn't when it had a loser.

I'm not expecting a launch until early September, so we're still more than a couple months out.

I do think what they choose to show (and not show) at the upcoming HardOCP event will be rather telling though. If they're still keeping it under serious wraps at that time, then it may be time to worry.

The only thing AMD could stand to gain by hiding a winner this long is giving Intel less time to formulate a new timetable. It seems plausible that if Bulldozer roughly matches Sandy Bridge without exactly blowing it away, they'd be a bit nervous tooting their own horn months in advance, knowing Intel could push up its newer tech given such advance warning, which AMD wouldn't have a response to.
 
AMD/Nvidia should make their own "not for profit" benchmarking program, and see why Intel doesn't want to join :p
 
I do think what they choose to show (and not show) at the upcoming HardOCP event will be rather telling though. If they're still keeping it under serious wraps at that time, then it may be time to worry.
IMO that's just going to be a gaming-oriented event, where CPU performance won't matter much. See: pretty much any modern [H] head-to-head CPU gaming evaluation. AMD isn't dumb in that regard and seems to have chosen a "perfect" event. ;)

Intel's and AMD's strategies are already set. Pricing for any models, or launch dates to counter new products, can be adjusted with relative ease. Intel's OEM-only i3-2125/2130 CPUs seem made to counter Llano, inasmuch as HD Graphics 3000 can compete against the 400-SP GPU in Llano, which is 50-100% faster than HD Graphics 3000. Prior i3 models used the half-performance (6 EU vs. 12 EU) HD Graphics 2000 GPU.

AMD's refusal to give a launch date (btw, hello "early Summer," we're in it now) may very well give Intel less time to aim some mid-level i5 at BD FX, but the BD performance in the leaks doesn't look competitive at the high end at all in typical tasks, which will translate to bad review scores in desktop apps. Expect it. ;)

But at least BD FX now has a range of dates when it should become available assuming the B2 stepping fixes AMD's concerns with performance.
 
IMO that's just going to be a gaming-oriented event, where CPU performance won't matter much. See: pretty much any modern [H] head-to-head CPU gaming evaluation. AMD isn't dumb in that regard and seems to have chosen a "perfect" event. ;)

Intel's and AMD's strategies are already set. Pricing for any models, or launch dates to counter new products, can be adjusted with relative ease. Intel's OEM-only i3-2125/2130 CPUs seem made to counter Llano, inasmuch as HD Graphics 3000 can compete against the 400-SP GPU in Llano, which is 50-100% faster than HD Graphics 3000. Prior i3 models used the half-performance (6 EU vs. 12 EU) HD Graphics 2000 GPU.

AMD's refusal to give a launch date (btw, hello "early Summer," we're in it now) may very well give Intel less time to aim some mid-level i5 at BD FX, but the BD performance in the leaks doesn't look competitive at the high end at all in typical tasks, which will translate to bad review scores in desktop apps. Expect it. ;)

But at least BD FX now has a range of dates when it should become available assuming the B2 stepping fixes AMD's concerns with performance.

I said early September.

The leaks are all from engineering samples. While Intel's engineering samples often correlate closely with final benchmarks, AMD's often don't. Forget Intel; if those benchmarks were remotely indicative of final performance, AMD should be more worried that their own Phenom IIs would outperform Bulldozer at half the cost, because they would. So either those early steppings aren't very reflective of final product performance, or AMD has lost their minds.
 
If you expect samples produced weeks before launch not to be representative of final products, well... swampland or desert oceanfront property, I have deals on both just for you.

As for the other part, you said it. ;)
 
If you expect samples produced weeks before launch not to be representative of final products, well... swampland or desert oceanfront property, I have deals on both just for you.

The samples currently out there (at least the supposedly benched ones) are older than a few weeks. There's no way AMD could go from B0 to B2 stepping in that span of time.

Found this amusing:

http://techreport.com/discussions.x/19061

Look at some of the reactions to the SB ES.
 
If you expect samples produced weeks before launch not to be representative of final products, well... swampland or desert oceanfront property, I have deals on both just for you.

As for the other part, you said it. ;)

None of the ES samples we've seen leaked benchmarks for (assuming they're real) were produced "weeks before launch." We're still potentially over two months from launch, and those B0 samples were likely made several months or more ago.

I've already said I'm not overly optimistic for Bulldozer, but I'm not going to dismiss it based on shady benchmarks from "who knows what's been changed for testing purposes" engineering samples.
 
None of the ES samples we've seen leaked benchmarks for (assuming they're real) were produced "weeks before launch." We're still potentially over two months from launch, and those B0 samples were likely made several months or more ago.
B1 is just a few weeks old. B2 is apparently due next month and is a metal layer tweak of B1 to improve power consumption at higher speeds.

It's doubtful that B0 is *that* old, having been preceded by the A steppings, with samples starting in late 2010 IIRC. It was the latest silicon motherboard manufacturers had while Computex was going on less than a month ago (some complained about not receiving B1 when they heard AMD had a new stepping during Computex).

And I guess it really depends how you define launch, if that's what is supposedly happening next month. It's a secret or something. ;)

What's funny about accepting your version of what happens historically (I disagree with your view; the poor BD performance we see is what will launch, supposedly at higher clock speeds [another thing I think won't be fully met at launch]) is that it should scare any potential server customers away. That's a huge problem for a couple of reasons, especially since the BD design is foremost a server CPU. Stay far away from a chip that isn't ready this close to launch, with final silicon not going through system validation for the usual many-months period. Last time that happened, it was a total [s]Barcelona[/s], I mean disaster.

We'll have to wait to know if unseen hope prevails over what we have seen. :p Count me in with those not buying into "marketing fluff and kool-aid".
 
I do think what they choose to show (and not show) at the upcoming HardOCP event will be rather telling though. If they're still keeping it under serious wraps at that time, then it may be time to worry.
IMO that's just going to be a gaming oriented event, where CPU performance won't matter much. See: pretty much any modern [H] head to head CPU gaming evaluation. AMD isn't dumb in that regard and seems to have chosen a "perfect" event. ;)
We are talking about a [H]ardOCP event with members from this very forum showing up to attend. I'm willing to bet at least one intrepid individual will take it upon themselves to try to squeeze some info out of those display machines if AMD doesn't give up the goods altogether.

All someone would have to do is load up a flash drive with some benchmark programs (Cinebench, wPrime, anything) and CPU-Z, and take some screenshots. *hint*hint* ;)
 
B1 is just a few weeks old. B2 is apparently due next month and is a metal layer tweak of B1 to improve power consumption at higher speeds.

It's doubtful that B0 is *that* old, having been preceded by the A steppings, with samples starting in late 2010 IIRC. It was the latest silicon motherboard manufacturers had while Computex was going on less than a month ago (some complained about not receiving B1 when they heard AMD had a new stepping during Computex).

And I guess it really depends how you define launch, if that's what is supposedly happening next month. It's a secret or something. ;)

What's funny about accepting your version of what happens historically (I disagree with your view; the poor BD performance we see is what will launch, supposedly at higher clock speeds [another thing I think won't be fully met at launch]) is that it should scare any potential server customers away. That's a huge problem for a couple of reasons, especially since the BD design is foremost a server CPU. Stay far away from a chip that isn't ready this close to launch, with final silicon not going through system validation for the usual many-months period. Last time that happened, it was a total [s]Barcelona[/s], I mean disaster.

We'll have to wait to know if unseen hope prevails over what we have seen. :p Count me in with those not buying into "marketing fluff and kool-aid".

I can't say I've seen much on B1. IIRC, the only thing that's come out as being "maybe" B1 was that Obvrosky site, which I have a hard time trusting based on the things he's posted over time.

I really don't keep up with the server side of this chip, but I don't think it's been having nearly as many problems. Pretty much all of the issues I've heard of deal with the client chip. The server chips seem to be having a much smoother time, although it's hard to tell for sure, since there isn't as much talk about them.

I'm not sure where you get the idea I'm drinking the kool-aid. I've already said, multiple times, that I'm not that optimistic about Bulldozer. I just don't think it's good practice to practically go on a crusade against a chip that's not even out yet, based entirely on rumors posted on sites that lack any real credibility.
 
All this happened before, with Hammer. And when it came out, it took Intel years to catch up. BD is not going for high IPC. It has a longer pipeline; it's supposed to be capable of reaching higher clocks than Sandy because of this design. Even the broken B0 stepping hit 6 GHz on LN2 at almost 2 V. That's what AMD is fixing.

We think IPC is everything because of how bad Netburst was. IPC is not everything. If, for the sake of argument, BD has 20% less IPC but is capable of reaching 50% higher clocks, BD will be faster than SB despite the lesser IPC.
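The arithmetic behind that trade-off is easy to sanity-check. A minimal sketch in Python, using the poster's hypothetical 20%/50% numbers (these are not measured Bulldozer or Sandy Bridge figures):

```python
# Single-thread throughput scales roughly with IPC x clock frequency,
# so a lower-IPC design can still win on total throughput if it clocks
# high enough. Numbers below are the poster's hypotheticals.

def relative_throughput(ipc, clock):
    """Return throughput relative to a baseline of IPC = 1.0, clock = 1.0."""
    return ipc * clock

baseline = relative_throughput(ipc=1.0, clock=1.0)   # stand-in for SB
lower_ipc = relative_throughput(ipc=0.8, clock=1.5)  # -20% IPC, +50% clock

print(lower_ipc / baseline)  # 0.8 * 1.5 = 1.2 -> 20% faster overall
```

The break-even point is just where the clock gain cancels the IPC loss: with 20% less IPC, anything above 1/0.8 = 25% higher clocks comes out ahead.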

Better IPC does not mean a faster chip. SPARC had better IPC than x86 for a long time, but that didn't make it faster. Also check out IBM's z196 CPU, which runs at 5.2 GHz stock.

Going for higher clocks and less IPC is a legitimate design decision that can lead to competitive performance. Posting that BD will have less IPC and therefore can't compete with Intel is kind of dumb.

I don't know if BD will be faster than SB. But no one knows. The fact that AMD delayed the chips to fix the clock/voltage situation with the stepping is a good sign. Barcelona was released without such a delay, TLB bug and all. PXC can't help it, though; to him even Llano is a giant disappointment.

The track record is on AMD's side. This year alone they've released Zacate and Llano, and both are successful products.
 
I see high clock speeds in retail CPUs as a danger, since with the technological barriers we have in silicon, it usually means much smaller overclocking headroom.

Well, apart from the prehistoric times of the Northwood Pentium 4 at 2.4 GHz vs. the AMD Barton 2500+ ;)
 
I see high clock speeds in retail CPUs as a danger, since with the technological barriers we have in silicon, it usually means much smaller overclocking headroom.

Well, apart from the prehistoric times of the Northwood Pentium 4 at 2.4 GHz vs. the AMD Barton 2500+ ;)

A shorter pipeline, or more features to increase IPC, concentrates heat in the core, which is a limiting factor for overclocking in high-IPC designs. BD should be better able to keep heat down, also thanks to the power-gating tech AMD is using (we've seen some of that goodness in Llano already). I think BD should overclock well, provided AMD fixes the voltage/stepping issues.
 
I'm not sure where you get the idea I'm drinking the kool-aid.
I didn't mean to imply that. I was just stating that I agreed with the engineer's quote, using it again. It's just such an awesome dig at Nigel Dessau and (ahem) others at AMD who serve up the marketing fluff and kool-aid by the tanker to an eager, but dwindling, audience.

Not all sources providing leaks so far are garbage. IMO it's pretty easy to filter out most of the fakes and the information that has no believability. The quality of information then vastly improves, painting a picture of what BD is and isn't at the time it was tested. But lumping it all together and dismissing it as generally unreliable is just as wrongheaded as accepting positive fakes while rejecting what is obviously real testing that shows poor performance.

It's not a "crusade" against anything to analyze the available information. If the real tests showed great performance, I would praise BD and have higher hopes for the product at launch. It's just disappointing to some people who saw BD as a product that has to be competitive, and who see any criticism as "being against" the product. That's not how it works. The competitiveness of the product has to do with what is produced, not what it should ideally be.

We'll have to see how BSA (Bulldozer Scalable Architecture) works out in servers this time. ;) Magny Cours wasn't able to stop K10's server share slide, despite multiple deep price cuts.
 
It's just disappointing to some people who saw BD as a product that has to be competitive, and see any criticism as "being against" a product.

How can anyone criticize/praise a product that has no official benches?
 
How can anyone criticize/praise a product that has no official benches?
I wish this stuff was parody. :p

Is this like an uncertainty-principle thing? If a benchmark is "official" it's real, and if it's "unofficial" it's fake? You cannot tell if the actual benchmark is real without inspecting a permission slip, which collapses the system. OK. So if someone happens upon a (then-)current CPU and tests it (for reals!), those results are meaningless. OK. But if someone poorly 'shops together a "screenshot" that shows it > *, that's believable because... just because. OK.

I think there's a good term to describe this: AMD exceptionalism. Nothing about silicon manufacturing, benchmarking or reality applies. OK.
 
Is this like an uncertainty principle thing?

Indeed it is, as you yourself acknowledge. Unless you're saying that you know for a fact (based on all the evidence presented) that Bulldozer is going to be a fail.

If a benchmark is "official" it's real and if it's "unofficial" it's fake? You cannot tell if the actual benchmark is real without inspecting a permission slip, which collapses the system.

I think you understand that the term "official" refers primarily to final reviews/opinions, since they will be the best indicator of final performance on a "finished" product. Or are you saying that "unofficial" sources benching an unfinished product (who may or may not have an agenda) are more reliable?


OK. So if someone happens upon a (then) current CPU and tests it (for reals!), those results are meaningless. OK.

Except that I never said that (for reals!) the current benches are fake, or that the final product won't be a disappointment. No, what I said is that we should refrain from coming to a conclusion on an unfinished product. There's a difference. Personally, I'm more optimistic than not, but not omniscient enough to claim otherwise.

But if someone poorly 'shops together a "screen shot" that shows it > *, that's believable because... just because. OK.

Show me where I indicated any of this.

I think there's a good term to describe this: AMD exceptionalism. Nothing about silicon manufacturing, benchmarking or reality applies. OK.

Tell me about this "reality" you're referring to. Tell me what's illogical about waiting for "final" benchmarks.
 
Like all great myths, there is probably some grain of truth to this.

However, I still think it has more to do with Llano and the Fusion products than with Bulldozer. The complaint made by AMD insinuates that the GPU-accelerated benchmarks are not factored into the final score. Bulldozer does not have an integrated GPU yet, and that version may not be out until the next version of SYSmark is released.

Nvidia also left but declined to comment as to why. They can no longer build integrated video chipsets for Intel, and since SYSmark is used by businesses and governments to establish baseline performance for desktop machines that rarely get upgraded discrete GPUs, there is zero reason for them to be part of BapCo if it doesn't help sell their products.

Via also left, but their CPUs underperform so badly that their action only fans the flames on both sides, as possible proof either that Intel's performance is overwhelming or that Intel bullies everyone else and gets its way. Via's chips rarely end up in desktops anyway, since they're preferred for low-power embedded applications, so SYSmark doesn't help sell their chips either.

Honestly, I feel that Nvidia's and Via's silence only shows they were looking for any excuse to leave BapCo, and AMD's departure was that excuse. In the end, all three of them win by making Intel look like the bad guy.


You make it sound like they just bailed because all of their products are crappy, while in reality the CPUs and GPUs might do very well; even Via has some solutions that are "okay."

No company needs to be ashamed of the performance gap; going AMD, you get better price/performance...

Intel has been skewing benchmarks since forever. Yet the industry really doesn't care too much and moves on; some people know this to be true.

What seems more of a problem is that the benchmark doesn't hold any value anymore due to a one-sided approach. Now, I've got to say that benchmarks are pretty useless anyway, but in this case it serves no purpose to be called a benchmark anymore; just rename it the Intel Cheerleading Program...
 
How about some humor to lighten the mood...

dozer20fire.jpg
 