The Worst CPUs Ever Made

Megalith

ExtremeTech has assembled a list of the worst CPUs ever – that is, processors that were not only poorly positioned or slower than expected, but fundamentally broken. These include Intel’s Itanium, which was meant to replace x86 but ultimately limped along due to incompatibilities, AMD’s power-hungry Bulldozer, which failed to meet performance expectations, and Sony’s Cell Broadband Engine, which was only “phenomenally good in theory.”

Plenty of people will bring up the Pentium FDIV bug here, but the reason we didn’t include it is simple: Despite being an enormous marketing failure for Intel and a huge expense, the actual bug was tiny. It impacted no one who wasn’t already doing scientific computing and the scale and scope of the problem in technical terms was never estimated to be much of anything.
 
Not a bad list. Prescott definitely belongs, and the sad thing was that the engineers on the ground knew, way in advance, that it would be a hot mess. But management listened instead to marketing, who thought the stupid consumer would confuse and conflate higher clock speed with higher performance.
 
Ah yes, the Pentium 4 Prescott. Commonly referred to as the heat pump on a chip.
Thing is... that was when I built my 1st PC and also when I started using Gentoo (FACEPALM).

The CPU running hot was one thing, but the god-awful compile times to build the entire OS were beyond painful. For quite some time I put the long compile times down to thermal throttling or inexperience in configuring a Gentoo stage1 install.
Years later I learnt about the long pipeline and the impact it had on the sort of workloads I was doing... double facepalm.
 
Cell was a fucking joke, Sony hyped that thing like the second coming, a lottery win and a free handjob for every customer that bought a PS3. Remember, it was initially going to be the CPU, the GPU and the audio chip; then they got samples back, realised they had been smoking the Ken Kutaragi crack and went running to Nvidia for a bastardized 7800 GPU dubbed the "RSX Reality Synthesizer" :rolleyes:. All I remember about it was that the majority of developers absolutely hated coding for it as it was a total pain in the ass; only a few like Naughty Dog really got their heads around it and extracted all the power they could out of it.
 
Cell was a fucking joke, Sony hyped that thing like the second coming, a lottery win and a free handjob for every customer that bought a PS3. Remember, it was initially going to be the CPU, the GPU and the audio chip; then they got samples back, realised they had been smoking the Ken Kutaragi crack and went running to Nvidia for a bastardized 7800 GPU dubbed the "RSX Reality Synthesizer" :rolleyes:. All I remember about it was that the majority of developers absolutely hated coding for it as it was a total pain in the ass; only a few like Naughty Dog really got their heads around it and extracted all the power they could out of it.
Back in 2006, that CPU curb-stomped every x86/x86-64 CPU out there (remember, x86 dual-cores were still very new that year) and while it lacked integer performance, it had immense floating-point performance that no general-purpose CPU could even begin to match.
This is why, despite the PS3 having a modified NVIDIA Series 7 7800 GTX GPU installed, which was itself not capable of PhysX (the Series 8 G80 GPUs started that), the PS3 still had PhysX functionality thanks to the SPE units in the Cell CPU.

Also, Sony didn't make the Cell by themselves; it was a PowerPC-ISA CPU co-developed with and manufactured by IBM, who also used it (the full, non-cut version) in many servers up until it was discontinued in 2012.
So obviously, for 6+ years, it was good enough to stay on the market in both game consoles and servers.

From the article:
but Cell was far better at multimedia and vector processing than it ever was at general purpose workloads
Seriously, no shit.
It had one PPU (a simple in-order PowerPC core) for general-purpose computation, and the 6-8 SPE units were all specifically meant for floating-point computation.

Also from the article:
It's quite difficult to multi-thread the CPU to take advantage of its SPEs (Synergistic Processing Elements) and it bears little resemblance to any other architecture.
Actually, it would be most similar to modern GPGPU designs; it really was, in a way, a stepping stone between standard CPUs and GPGPUs in terms of functionality, and today it would be much more similar to a modern APU than anything else.
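
To make the "hard to multi-thread" point concrete, here's a rough C sketch of the programming model. This is my own toy example, not real Cell/libspe code, and every name in it is made up: the main core hands out fixed-size chunks, and each SPE-like worker has to explicitly stage data through a small private buffer before it can touch it, which is exactly the manual data movement that tripped developers up.

/* Toy illustration only -- NOT real Cell/libspe code; every name here is
 * made up. It mimics the model that made Cell hard to program: a main core
 * dispatches fixed-size chunks, and each SPE-like worker must explicitly
 * copy ("DMA") data into a small private local store before working on it. */
#include <pthread.h>
#include <stdio.h>
#include <string.h>

#define N       (1 << 20)                      /* total floats to process           */
#define LS_SIZE (64 * 1024)                    /* pretend 64 KB usable local store  */
#define CHUNK   (LS_SIZE / sizeof(float))      /* floats per staged chunk           */
#define WORKERS 6                              /* PS3 games got 6 usable SPEs       */

static float input[N], output[N];

struct job { size_t start, end; };

static void *spe_like_worker(void *arg)
{
    struct job *j = arg;
    float local[CHUNK];                        /* stand-in for the SPE local store  */

    for (size_t base = j->start; base < j->end; base += CHUNK) {
        size_t n = (j->end - base < CHUNK) ? j->end - base : CHUNK;
        memcpy(local, input + base, n * sizeof(float));    /* "DMA in"              */
        for (size_t i = 0; i < n; i++)                     /* the FP-heavy kernel   */
            local[i] = local[i] * 2.0f + 1.0f;
        memcpy(output + base, local, n * sizeof(float));   /* "DMA out"             */
    }
    return NULL;
}

int main(void)
{
    pthread_t t[WORKERS];
    struct job jobs[WORKERS];
    size_t per = N / WORKERS;

    for (int w = 0; w < WORKERS; w++) {
        jobs[w].start = (size_t)w * per;
        jobs[w].end   = (w == WORKERS - 1) ? N : (size_t)(w + 1) * per;
        pthread_create(&t[w], NULL, spe_like_worker, &jobs[w]);
    }
    for (int w = 0; w < WORKERS; w++)
        pthread_join(t[w], NULL);

    printf("output[0] = %f\n", output[0]);     /* 0 * 2 + 1 = 1.0 */
    return 0;
}

On a normal cached CPU you'd just loop over the array; on Cell you had to think in these little staged chunks, keep six-plus of them busy, and overlap the data movement with the math to get the headline FLOPS, which is also why it maps so naturally onto how GPGPU kernels are written today.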
 
The list is a bit narrowly focused.

A more accurate title for the article would have been "Worst x86-like CPUs, built between 1993 and the present, ever made".

Computing has a much larger history than the x86 architecture, and there were plenty of terrible chips to choose from in the 70s and 80s that were not x86-based.
 
The list is a bit narrowly focused.

A more accurate title for the article would have been "Worst x86-like CPUs, built between 1993 and the present, ever made".

Computing has a much larger history than the x86 architecture, and there were plenty of terrible chips to choose from in the 70s and 80s.
I agree, and while there were two PowerPC CPUs in there, there are a LOT more that could have been listed for various reasons.
 
the long pipeline
Ah yes, the extremely long 31-stage pipeline. Having an execution pipeline that long played holy hell with the branch predictor, which more often than not failed in its predictions, and every failure forced the processor to stop and reload the execution pipeline. If I remember correctly, that refill cost dozens of processor cycles each time, which added up to a hell of a lot of processor cycles being wasted. In the end, because of the extremely long pipeline, the Pentium 4 Prescott ended up spending nearly half its time stopping and reloading due to branch prediction failures, which of course resulted in piss-poor performance.
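
If anyone wants to feel what a mispredicted branch costs, here's a quick C toy of my own (not from the article, and numbers will vary by CPU and compiler; a modern compiler at -O2 may even turn the branch into a cmov, so check the assembly or build at -O1): the same loop runs over random data and then over sorted data, and the only difference is how often the branch predictor guesses wrong. On a 31-stage Prescott every wrong guess meant refilling that whole pipeline.

/* Toy branch-misprediction demo -- my own example, not from the article.
 * The loop body is identical in both passes; only the predictability of
 * the "data[i] >= 128" branch changes. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 22)

static int cmp_byte(const void *a, const void *b)
{
    return *(const unsigned char *)a - *(const unsigned char *)b;
}

static double time_pass(const unsigned char *data)
{
    clock_t t0 = clock();
    volatile long sum = 0;
    for (int rep = 0; rep < 20; rep++)
        for (int i = 0; i < N; i++)
            if (data[i] >= 128)        /* ~50/50 branch: hopeless to predict on random data */
                sum += data[i];
    (void)sum;
    return (double)(clock() - t0) / CLOCKS_PER_SEC;
}

int main(void)
{
    unsigned char *data = malloc(N);
    if (!data) return 1;
    for (int i = 0; i < N; i++)
        data[i] = (unsigned char)(rand() & 0xFF);

    double unsorted = time_pass(data);
    qsort(data, N, 1, cmp_byte);       /* sorted -> the branch becomes almost perfectly predictable */
    double sorted   = time_pass(data);

    printf("random: %.2fs   sorted: %.2fs\n", unsorted, sorted);
    free(data);
    return 0;
}

On most machines the sorted pass is several times faster even though it does exactly the same arithmetic; scale that per-miss penalty up to Prescott's pipeline depth and it's easy to see where the performance went.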
 
Cell was a fucking joke, Sony hyped that thing like the second coming, a lottery win and a free handjob for every customer that bought a PS3. Remember, it was initially going to be the CPU, the GPU and the audio chip; then they got samples back, realised they had been smoking the Ken Kutaragi crack and went running to Nvidia for a bastardized 7800 GPU dubbed the "RSX Reality Synthesizer" :rolleyes:. All I remember about it was that the majority of developers absolutely hated coding for it as it was a total pain in the ass; only a few like Naughty Dog really got their heads around it and extracted all the power they could out of it.
The CPU was pretty awesome and on paper would have been awesome for gaming, but Sony was asking too much of developers and apparently not providing sufficient developer tools to aid in development until very late.

MS played it safe with their triple-core(?) CPU + GPU approach.
 
Posting this regarding the Cyrix CPU, and it is actually quite entertaining!


Been a fan of his channel for ages. It's how I was introduced to the notion that there are more shitty CPUs to choose from than simply the x86 variety.
 
I disagree with the CELL, it was an awesome CPU; the main problem was programmers got lazy and became crybabies.

Ah yes, those damn lazy devs. They just should have created tools for a new type of processor from scratch as well as developing for other platforms! SOOO lazy. Or, you know, it wasn't worth all the extra effort.
 
The CPU was pretty awesome and on paper would have been awesome for gaming, but Sony was asking too much of developers and apparently not providing sufficient developer tools to aid in development until very late.

MS played it safe with their triple-core(?) CPU + GPU approach.

They "played it safe" but at the same time it was embraced more by developers, cell was just horribly awkward to get the most out of. And the way sony hyped the thing didn't help matters. There's very few ps3 games that leveraged the cpu to its fullest, the uncharted series are probably the poster boys in that respect, but Sony talked about it like every game was going to be substantially better on ps3 due to the cell, and it didn't even remotely work out like that.
 
Maybe it's just me, but the Pentium II in that "cartridge" format was just horrible... maybe it performed OK, but damn, the way it installed.
 
When even juanrga admits it's a disaster, you know it's true.
That 10nm laptop chip Intel released: quad cut to dual, no iGPU, defects for Africa and lower clocks than the old part... Enough said.
 
They "played it safe" but at the same time it was embraced more by developers, cell was just horribly awkward to get the most out of. And the way sony hyped the thing didn't help matters. There's very few ps3 games that leveraged the cpu to its fullest, the uncharted series are probably the poster boys in that respect, but Sony talked about it like every game was going to be substantially better on ps3 due to the cell, and it didn't even remotely work out like that.
That’s all true and I’m agreeing with you. Sony was hell-bent on locking development to just the PS3, too, and not helping the development community.

There was also talk that IBM didn’t want a lot of their tools handed out to game developers, which just shows Sony wasn’t totally prepared for it themselves.

Sony took a lot of chances with the PS3, and it really backfired. Being forced into bed with XDR and an apparently terrible contract with Nvidia.
 
Maybe it's just me, but the Pentium II in that "cartridge" format was just horrible... maybe it performed OK, but damn, the way it installed.
Yea, the cartridge was stupid; it was to help with airflow and simplify OEM builds. Still, the processors worked well, except for the ones with external cache...
 
Yea, the cartridge was stupid; it was to help with airflow and simplify OEM builds. Still, the processors worked well, except for the ones with external cache...
To my knowledge, all of the Pentium II CPUs had external cache.
Even the Pentium II Xeon & Celeron had an external cache, though it ran at the full CPU speed instead of at half like the normal Pentium II.
 
Cough cough. 486sx
Not sure what was wrong with that one, and even though it lacked the FPU, an 80487SX FPU could be added, or it could be upgraded to an 80486DX later on if the FPU was needed.
This was done as a cost-cutting measure for home or consumer based systems, or even embedded systems that only needed the integer functions.

Why do you think it was a bad or poor CPU?
 
To my knowledge, all of the Pentium II CPUs had external cache.
Even the Pentium II Xeon & Celeron had an external cache, though it ran at the full CPU speed instead of at half like the normal Pentium II.
Did the P2s have external cache? I thought only the Celeron did? Oh well, it was a horrible way to cut costs though.

Edit: OK, I see my error, so it was added to the cartridge to simplify motherboard production. That explains that, plus it was the L2, which hadn’t been added to the core yet.
 
Pentium D 820

I had a Gateway back in the day with one of these CPUs. Not only was it bad, it had so many problems, with mobos going bad and being replaced, and it ate 3 HDDs. The whole system built around this CPU was horrible, and these were $800+ machines.
 
Clarksdale, Intel's first attempt with an integrated GPU.
I built a 1U where I Dremeled the fans into the sides of the chassis, all ready for this great DAW I was going to build.
Total loser CPU.
The GPU got priority and interrupted audio streaming so badly that it became my surfing PC.

I was the only guy I knew who bought one of those POS.
 
Clarksdale, Intel's first attempt with an integrated GPU. ...
I was the only guy I knew who bought one of those POS.
Shoulda waited for the last train.

[And for you kids, that's a song reference.
But it's kind of meant to refer to the risk of trying to stay on the "bleeding edge" of technology.]
 
Not sure what was wrong with that one, and even though it lacked the FPU, an 80487SX FPU could be added, or it could be upgraded to an 80486DX later on if the FPU was needed.
This was done as a cost-cutting measure for home or consumer based systems, or even embedded systems that only needed the integer functions.

Why do you think it was a bad or poor CPU?

It was a regressive step. The 386 DX/SX differed in the memory interface. The 486 DX/SX differed in the FPU. Other 486 clones like the AMD DX2 had an FPU. It just seemed inconsistent at a time when chips needed consistency.

I feel much better about many of the other CPUs on the list.
 
The CPU was pretty awesome and on paper would have been awesome for gaming, but Sony was asking too much of developers and apparently not providing sufficient developer tools to aid in development until very late.

MS played it safe with their triple-core(?) CPU + GPU approach.
This was my takeaway too. If you're gonna drop a chip like that on developers, you'd better be willing to support it and walk them through the nuances. I thought it was a very cool chip. If I remember correctly, didn't we use to run F@H and Linux on the PS3? I felt like that console generation was fun with the PowerPC stuff.
 
To my knowledge, all of the Pentium II CPUs had external cache.
Even the Pentium II Xeon & Celeron had an external cache, though it ran at the full CPU speed instead of at half like the normal Pentium II.

The PIIs all had external cache, which was usually 512K (if memory serves) but ran at, I believe, 1/2 or 2/3 speed. The first version of the Celeron had no L2 cache at all, but the second version (Celeron A) had on-die cache in the amount of 128K. However, the Celeron cache ran at full speed. In most cases a Celeron A at the same clock speed as a PII would perform better than the PII as long as what was running was larger than the 512K cache on the PII. The higher speed and lower latency of the cache on the Celeron A was a huge boost compared to the PII.

The Celeron A design basically became defunct in the PIII era, as the cache was on-die for the PIII as well as the Celerons, and the PIIIs had at least twice as much. At that time the Celerons still ran on a 66MHz bus while the PIIIs were on 100 and 133. You could still find the occasional PIII-based Celery model, which was decent because you'd be able to overclock the piss out of it, raising the clock speed considerably, which left the smaller cache as the only downside.
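
If you want to see where your own cache cliff sits, here's a rough C sketch of my own (a crude toy, not a rigorous benchmark, and hardware prefetchers will smooth the curve somewhat): it walks buffers of increasing size with pseudo-random accesses and prints the time per access, which jumps once the working set stops fitting in a given cache level. That trade-off between a small full-speed cache and a big half-speed one is exactly what the Celeron A vs PII argument was about.

/* Rough working-set probe -- my own toy, not a rigorous benchmark.
 * Time per access jumps once the working set no longer fits in a cache level. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    const size_t max = 8 * 1024 * 1024;                 /* probe up to 8 MB */
    volatile unsigned char *buf = calloc(max, 1);
    if (!buf) return 1;

    for (size_t size = 16 * 1024; size <= max; size *= 2) {
        const long iters = 20L * 1000 * 1000;
        clock_t t0 = clock();
        size_t idx = 0;
        unsigned char sink = 0;
        for (long i = 0; i < iters; i++) {
            sink ^= buf[idx];
            /* cheap LCG keeps the accesses pseudo-random within the buffer;
             * size is a power of two, so the mask stays in bounds */
            idx = (idx * 1103515245u + 12345u) & (size - 1);
        }
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
        printf("%7zu KB : %.2f ns/access (sink=%u)\n",
               size / 1024, secs * 1e9 / iters, (unsigned)sink);
    }
    free((void *)buf);
    return 0;
}

Run it on anything modern and you'll see the steps at your L1, L2 and L3 sizes; on a Celeron A vs a PII the interesting step would have been at 128K vs 512K, with the latency on either side of it going the other way.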
 