Barcelona @ 1.6GHz benched by Dailytech!

X2 was on the market well before the PD was. The PD was a direct result of the X2. It was slapped together in such a rush because they didn't already have something to compete.

P4D might have come about because of X2, but it was still earlier by about a month. Don't make stuff up.
 
I believe the Pentium D may have been announced earlier, but you could definitely buy X2s in stores much earlier than you could Pentium D processors. (Not to mention they outperformed the Pentium D by a lot, as you all know.)
 
gawd oh gawd, WHY did i buy stock in AMD? worst idea EVER.

I have made every mistake in the book when it comes to stock trading. I've lost money. I haven't really started making money yet but I found a cool book called "Rule the Freakin' Markets" by Michael Parness. I also like "Stock Market Wizards" by Jack Schwager.

One of the things I find odd, though, is that sometimes you'll find opportunities to switch between long and short in a matter of days depending on the situation. Even Apple was once a real dog of a stock, but the stock still went all over the place (which means you could have tried trading all those fluctuations).

I still haven't made money trading but like I said, I recently discovered the two books I just mentioned and I think they're great.
 
Overclocking potential should not even be considered in server chips. If you are dumb enough to overclock a server, you might as well go shoot yourself now.

I'm not meaning to suggest OCing the servers or server chips. But the server and desktop chips are based on the same core AFAIK, or at least an extremely similar one. So performance will be similar.

So it stands to reason that if they're having trouble getting the clocks up on these, there isn't much headroom currently, and that will translate into a similar situation for the desktop chips.



And about AMD doing the Intel thing with two dies on one chip: they would have to do some pretty fancy things, seeing as how the memory controller is on the die. I'd imagine it's not quite as simple as it would be for Intel, as they would have to figure out a way to share it or have two working together well...
 
No it wasn't. I could buy an X2 about 2 months before the PD became available.

Hah, I worked on the P4D and I remember the exact day it came out publicly, and there was a big bunch of hoorah about coming to market before the X2. Quit with the revisionist history.

Do you recall store availability? I doubt it, since there is no way you actually bought a P4D.
 
The Pentium Ds came out in late May 2005. The X2s were released over a month later.

It's trivial to verify. YAEBCFDC
 
The Pentium Ds came out in late May 2005. The X2s were released over a month later.

It's trivial to verify. YAEBCFDC

Actually, the Pentium D's were available at OEMs only in May. You wouldn't have bought one, unless, of course, you were a llama.

So it's quite possible that MANY people here owned X2s before anyone else here had a pentium D.

*EDIT* After actually taking the time to read through both the AMD and Intel forums here at HardForum for the months of May and June 2005, I found these two posts. They are the earliest posts I could find of a person giving a date of RECEIPT of a Pentium D and an X2.

Intel Pentium D
http://www.hardforum.com/showpost.php?p=1027784449&postcount=1
Athlon X2
http://www.hardforum.com/showpost.php?p=1027817501&postcount=12

Pentium D = June 10th.

Athlon X2 = June 15th.

5 days later and it kicked the shit out of the Pentium D. Awesome. Too bad Barcelona is a year later than C2D and is getting the shit kicked out of it.
 
Actually, the Pentium D's were available at OEMs only in May. You wouldn't have bought one, unless, of course, you were a llama.
Or part of the tiny 90%+ of the market that gets CPUs in OEM systems. :p Systems with the Pentium D were available at the end of May.

But it does seem that X2s were available before they officially launched, making the Pentium D available only 2 weeks before the X2 instead of over a month. Correction noted.
 
Funny thing is that Intel really didn't do shit for the past year.
Core2 chips really aren't any faster than they were at introduction.
Main difference is that they only introduced the 3 GHz and 1333 FSB models in the Xeon-line at first, and they brought them to the desktop later, but obviously that's not a technical achievement, it's all the same Core2 technology, just on different platforms.
They just kept cutting prices to keep the pressure on AMD.

So really, it looks like Intel dealt out a blow a year ago, and is just sitting back and watching AMD lying flat on its back, unable to fight back at all.
Intel still has the ace of clockspeed up its sleeve, as all of us Core2 overclockers know.
And that's just on 65 nm, who knows what 45 nm will bring.

In a way it's just pathetic... Intel lashes out once, and AMD is completely demolished. Intel isn't even *trying* at the moment.
Funny how so many people thought that Intel was behind on technology because the Pentium 4 didn't perform as well as the Athlon64.
Back then people thought I was crazy when I pointed out that the Pentium 4 is really a marvel of technology, because it is capable of such high clockspeeds and Intel sells them quite cheaply considering the die-size. Sure, the performance wasn't there... but the technology was. They had very high yields, excellent reliability, and low cost.
All they needed was a design that could turn their 65 nm process into performance. A design they chose to perfect rather than rush.

Well, the thing is Intel really didn't need to do anything. They knew they had a very good lineup of chips and that they could spend the extra time tweaking their 45 nm lineup, which is looking mighty fine BTW. There is no pressure on them now, and they know this, so why rush out a chip?

They are starting to roll out their quad-core line slowly (please don't start that native shit), and the budget processors (the 800 FSB ones). Soon enough we will see the 45 nm chips, and if the benchmarks we see every so often live up to expectations, they will have a successor to the current 65 nm processors.

Their roadmap is looking pretty good too, and they have a busy schedule ahead of them. I really hope AMD performs well, or this is bad for them and for us.
 
True, they got beaten to the market by the Pentium D, but the Pentium D was never really much competition to the X2 performance-wise, and the Pentium D coming out first didn't hurt AMD much, if at all, in the long run.

That's not the point though.
The Pentium D was a dualcore version of the Pentium 4.
The Pentium 4 was already not much competition to the Athlon64...
What AMD should have learned however, is how well the MCM-design worked.
In terms of a Pentium 4 with two cores, it did deliver quite good performance, so AMD should already have figured out that MCM works. It should have been their cue to start working on MCM quadcores. But they were apparently not paying a lot of attention to their competitors.
 
Besides, I don't think that K10 will be that big a deal anyway. It was never supposed to be. Will it beat Conroe? YES.

How is it ever going to beat Conroe? It's slower clock-for-clock, and it will run at least 500 MHz slower.

That is the sole reason why AMD bought ATi..... K10 was a failure..... They needed a highly parallel SIMD architecture, ATi had one...... Fusion will be the end result...... Fusion is what K10 was supposed to be.....

No it's not... Fusion is a gimmick that has little or no use in daily desktop computing usage. It's a niche-product.
The reason why AMD bought ATi is because big OEMs like Dell demand that AMD delivers complete platforms, not just CPUs.
 
Actually, Fusion is likely going to be AMD's biggest money maker. The biggest GPU manufacturer is not ATI or nVidia, it's Intel... Why? Because 90% of computer users have a basic desktop machine with onboard video powered by "Intel Extreme Graphics". Fusion is AMD's version of that, so they will be selling quite a bit of it to OEMs, but I digress... This is about Barcelona, and thus far it's looking like a very expensive failure.
 
And about AMD doing the intel thing with two dies on one chip. They would have to do some pretty fancy things seeing as how the memory controller is on the die. I'd imagine isn't not quite as simple as it would be for Intel to do it as they would have to figure out a way to share it or have two working together well...

Just remove one memory controller and join all the cores at the crossbar.
All the technology is already there, because of the whole HT/NUMA architecture. It's trivial to 'fool' cores to think they don't have any local memory at all, and always use a remote memory controller.
 
Actually, Fusion is likely going to be AMD's biggest money maker. The biggest GPU manufacturer is not ATI or nVidia, it's Intel... Why? Because 90% of computer users have a basic desktop machine with onboard video powered by "Intel Extreme Graphics". Fusion is AMD's version of that, so they will be selling quite a bit of it to OEMs, but I digress... This is about Barcelona, and thus far it's looking like a very expensive failure.

I don't think so.
Intel's Extreme Graphics are so popular because they're extremely cheap, get the job done okay, and are not bad in a laptop for battery life.
In other words, they're great for the corporate/office world.
Fusion is not just a cheap and simple graphics solution. It's an attempt to bring the incredible amount of floating point processing power closer to general purpose computing. In other words, completely unlike Extreme Graphics.

And most people simply don't need that processing power (just like most people don't buy quadcores, SLI/CrossFire setups and PhysX cards). They'll take whichever is cheapest, which will probably be Extreme Graphics for years to come.
 
I love it how zealots come in and proclaim disaster at one bench and say we told you so. It's like, OH NO, there's a TB patient 800 miles away from me... I had better don my breathing mask to protect myself. I smell paranoia.

In other news: buying a company such as ATI and keeping timelines/operations intact cannot be an easy task. I will admit, though, AMD/ATI need to produce results... and quickly.
 
Me wonders if we should call this "Cinebench day"

Is it coincidence, or is it paid? I think it's paid.

Cinebench is a very good representation of what I and many others use my computer for, and so I consider this an excellent performance measurement. Enough of your stupid conspiracy theories.
 
I believe the Pentium D may have been announced earlier, but you could definitely buy X2s in stores much earlier than you could Pentium D processors. (Not to mention they outperformed the Pentium D by a lot, as you all know.)

Cost a good arm and a leg too.

BTW, if Nvidia and ATI IGPs aren't compelling enough today to take market share, and are already leaps and bounds ahead, how will that make Fusion any different? I recall AMD recently dodging the question of whether there will be cost savings from integrating the GPU/CPU. They said it will be "same cost", leading me to believe that they aren't going to pass the savings on to OEMs, and will justify that with energy savings or whatever.

http://www.techreport.com/etc/2005q3/idf/index.x?pg=3
Don't forget that Intel had a glued-together GPU/CPU years ago, like AMD will do, and has hinted at Nehalem supporting an on-die IGP.
 
+1

One bench from one site with one non-real world bench tool?
No need to light torches just yet.

Remember that AMD day a while back? Where members from this forum were invited as well? And AMD demonstrated Cinebench as well, on their new system?
And that there was an AMD-goon covering up the part of the screen where the results appeared?
This is what they didn't want us to see.
I'm sure the people who were present can testify that the test indeed took closer to 27 seconds than 17 seconds to complete, even though they may not have seen the exact results.

Other than that, Cinema4D is very much a real-world application.
 
I'm not meaning to suggest OCing the servers or server chips. But the server and desktop chips are based on the same core AFAIK, or at least an extremely similar one. So performance will be similar.

So it stands to reason that if they're having trouble getting the clocks up on these, there isn't much headroom currently, and that will translate into a similar situation for the desktop chips.

We have seen in the past that server chips can be drastically different than desktop chips, even when based on the same architecture. Remember Opterons versus X2?

And about AMD doing the Intel thing with two dies on one chip: they would have to do some pretty fancy things, seeing as how the memory controller is on the die. I'd imagine it's not quite as simple as it would be for Intel, as they would have to figure out a way to share it or have two working together well...

QFT! And people call me crazy when I say FSB is still a superior architecture.
 
Besides, I don't think that K10 will be that big a deal anyway. It was never supposed to be. Will it beat Conroe? YES. Will it save AMD? NO. I still think Barcelona was a last-ditch effort to hold them over, when the original K10 failed to deliver. That is the sole reason why AMD bought ATi... K10 was a failure... They needed a highly parallel SIMD architecture, ATi had one... Fusion will be the end result... Fusion is what K10 was supposed to be...

My only thing is, will AMD even last that long? Let's face it, even if Barcelona delivered on its hype, Penryn was more than likely going to massacre it anyway. Add to that, Fusion doesn't look to serve any practical application (outside of gaming, possibly) for a desktop user. Further down the road, maybe, but once again we are running with the assumption that AMD has all this precious time to see its vision through.

It looks promising for low-budget IGP PCs and low/mid-range budget laptops, but I dunno.

Also, AMD isn't winning any fans over with the whole being late to the party and then having a lackluster gift. If you're going to arrive at a party fashionably late, you'd better damn well make sure your shit is fashionable.
 
We have seen in the past that server chips can be drastically different than desktop chips, even when based on the same architecture. Remember Opterons versus X2?



QFT! And people call me crazy when I say FSB is still a superior architecture.

I wouldn't say FSB is superior, but I would say it's most certainly a more flexible configuration.
 
I wouldn't say FSB is superior, but I would say it's most certainly a more flexible configuration.

Dan, I was wondering, what are your feelings about how Intel arrives at Computex and shows off its next processor with an actual bench, while AMD has its own San Francisco brouhaha but keeps everything completely secret?

Anand, of course, laid his cards out on the table about how much he dislikes the secrecy of AMD and the frank openness of Intel. I am curious as to how you and Brent feel about it, and do you think that AMD is making big mistakes by keeping things too quiet?

Because frankly, when I read about Barcelona over the past few days, it seriously brings up terrible recent memories of the R600.
 
QFT! And people call me crazy when I say FSB is still a superior architecture.

Yup, it's just like MCM vs 'native' multicore.
In theory a native design could be better, but only if both the design and the actual software take advantage of it.
In other words, it's not a guarantee for superior performance.

NUMA can be an advantage, IF your architecture actually gives you more bandwidth than a single controller. In the best case (each core only uses its own controller), AMD's design does this.
But in the worst case (all cores are accessing memory via remote controllers), it is worse than Intel's solution.

The FSB gives you a centralized system, which means you can do very efficient bus arbitration, and you will automatically get good performance in any application.
Therefore, if the FSB and the memory controller are fast enough, it is always a better system than AMD's decentralized system.
It's just that it's easier to split the resources over multiple parts. Somehow a lot of people were brainwashed into thinking NUMA is a good thing. It's not. It's hell for programmers, and it's hell for users if their software isn't NUMA-aware. And most software isn't.
Basically it has no place on the desktop. It's only interesting for (virtual/zoned) servers, where you basically treat a single system like a cluster of computers, each with its own CPU and memory, and keep them isolated.
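The best-case/worst-case argument above can be put in rough numbers. A minimal sketch, with purely illustrative latencies (none of these figures are measured; the node count and nanosecond values are assumptions, chosen only to show the shape of the trade-off):

```python
# Illustrative model of NUMA vs. a centralized (FSB-style) memory system.
# All latency values are hypothetical, not measurements of any real CPU.

SOCKETS = 4
LOCAL_NS = 60    # assumed latency to a core's own memory controller
REMOTE_NS = 120  # assumed latency via a remote controller (extra hop)
FSB_NS = 90      # assumed fixed latency through a shared bus/controller

def numa_avg_latency(local_fraction):
    """Average latency when `local_fraction` of accesses hit local memory."""
    return local_fraction * LOCAL_NS + (1 - local_fraction) * REMOTE_NS

best = numa_avg_latency(1.0)           # NUMA-aware software: all accesses local
worst = numa_avg_latency(0.0)          # pathological case: all accesses remote
naive = numa_avg_latency(1 / SOCKETS)  # NUMA-unaware: pages spread randomly

print(f"NUMA best case:  {best:.0f} ns")   # 60 ns
print(f"NUMA worst case: {worst:.0f} ns")  # 120 ns
print(f"NUMA naive:      {naive:.0f} ns")  # 105 ns
print(f"FSB (uniform):   {FSB_NS} ns")     # 90 ns for everyone
```

Under these made-up numbers, NUMA-aware software beats the centralized design, but NUMA-unaware software (the common case on the desktop) does worse than it, which is exactly the point being argued.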
 
I don't think so.
Intel's Extreme Graphics are so popular because they're extremely cheap, get the job done okay, and are not bad in a laptop for battery life.
In other words, they're great for the corporate/office world.
Fusion is not just a cheap and simple graphics solution. It's an attempt to bring the incredible amount of floating point processing power closer to general purpose computing. In other words, completely unlike Extreme Graphics.

And most people simply don't need that processing power (just like most people don't buy quadcores, SLI/CrossFire setups and PhysX cards). They'll take whichever is cheapest, which will probably be Extreme Graphics for years to come.

Obviously you have not seen any of the preliminary Fusion benchmarks. If you had, you'd know it is in no way comparable to a dedicated solution. It IS, in fact, a cheap onboard graphics solution like Intel's, it's just slightly better as far as performance is concerned.

And please stop posting about things you obviously have no clue about, like FSB and HT... FSB vs HT has nothing to do with Intel's performance lead right now; they have better performance because they have a better core, simple as that. Just look at multi-core NetBurst-based Xeon CPUs and you'll see how the FSB started becoming a bottleneck. HT has yet to become a bottleneck in a multi-socket platform. It wasn't until Core was introduced, which is far more efficient in its bandwidth usage than NetBurst, along with the increased FSB clock speed, that the bottleneck was eliminated. Your Intel fanboyism is just as bad as Duby's AMD fanboyism.
 
My only thing is, will AMD even last that long? Let's face it, even if Barcelona delivered on its hype, Penryn was more than likely going to massacre it anyway. Add to that, Fusion doesn't look to serve any practical application (outside of gaming, possibly) for a desktop user. Further down the road, maybe, but once again we are running with the assumption that AMD has all this precious time to see its vision through.

It looks promising for low-budget IGP PCs and low/mid-range budget laptops, but I dunno.

Also, AMD isn't winning any fans over with the whole being late to the party and then having a lackluster gift. If you're going to arrive at a party fashionably late, you'd better damn well make sure your shit is fashionable.

I've already covered your concerns here. AMD screwed up with K10. Because of that, they are several years behind. Fusion will be the ultimate replacement for K10.

I think of Fusion in the same way that I think of amd64. It requires software support. Software designers will have to write their code to take advantage of it. It will do wonders for interpreted environments, and will eventually saturate compiled environments.

Most people will still have a discrete GPU. As such, the on-die GPU will sit idle, and that processing capacity is just too tempting. That is what K10 was supposed to be. That is what Fusion will be.
 
Your Intel fanboyism is just as bad as Duby's AMD fanboyism.

Please don't put Scali on the same playing field with me. In Scali's mind, Intel is literally god. I have the good sense to understand that AMD is a company. Unlike Scali, I have the good sense to admit that I'm a fan.
 
I can't help but wonder, if the benchmarks were favorable, whether some people would be touting them in a completely different tone.

AFAIK, one thing you have to remember is that K8 actually performs very well and scales very well on Cinebench, more so than some other applications:

http://www.techreport.com/reviews/2007q2/intel-v8/index.x?pg=9

Basically the FX-74 can almost keep up with the QX6800 and the 6000+ is just a hair behind the X6800.
 
I can't help but wonder, if the benchmarks were favorable, whether some people would be touting them in a completely different tone.

AFAIK, one thing you have to remember is that K8 actually performs very well and scales very well on Cinebench, more so than some other applications:

http://www.techreport.com/reviews/2007q2/intel-v8/index.x?pg=9

Basically the FX-74 can almost keep up with the QX6800 and the 6000+ is just a hair behind the X6800.
That isn't good. If Cinebench is a benchmark that typically favors AMD architecture, other situations where AMD typically struggles might be even worse.
 
My friend had one of these, a 180 MHz variant. I thought his was superior until we ran some primitive benchmarks. I had a Pentium 100 MHz.

I worked at Comp USSR back when they used to sell the Media GXs. Then a little later, when I left sales and became a full-time service technician, I had to build a system that had one in it. What a pile of crap. That's about all I have to say on the subject.
 
/waits for real and official benchmarks

Indeed, and why?

After getting shafted in video forums by bogus benchies for so long, I'm willing to take the waiting stand. Corporates showing their own benches have always been a bit suspicious; I will wait for the review sites' version of the truth.
 
And please stop posting about things you obviously have no clue about, like FSB and HT... FSB vs HT has nothing to do with Intel's performance lead right now; they have better performance because they have a better core, simple as that. Just look at multi-core NetBurst-based Xeon CPUs and you'll see how the FSB started becoming a bottleneck. HT has yet to become a bottleneck in a multi-socket platform. It wasn't until Core was introduced, which is far more efficient in its bandwidth usage than NetBurst, along with the increased FSB clock speed, that the bottleneck was eliminated.
Did he say that Core is outperforming K8 because of the FSB vs HT? He was speaking in terms of how the two systems were designed. He was praising the flexibility of Intel's FSB from an architectural level. You are speaking of HyperTransport and FSB in terms of how they impact benchmarks. Whose words betray a deeper understanding here?
Your Intel fanboyism is just as bad as Duby's AMD fanboyism.
The personal attack was unnecessary. State your opinion and move on. Nobody's going to pat you on the back for being able to call out a fanboy.
Please don't put Scali on the same playing field with me. In Scali's mind, Intel is literally god. I have the good sense to understand that AMD is a company. Unlike Scali, I have the good sense to admit that I'm a fan.
If you don't have something nice to say...
 