The Top 5 Best CPUs of All Time

I would like to throw in the E8400; for gaming at the time, it really set the bar for performance. Similar for the Q6600 and Q9550.
 
The Q6600 was an amazing CPU with a great deal of longevity to it.
Hey, the Q6600 still powers the second most powerful desktop in this house! There aren't any CPUs I can think of produced before it that would still be viable ten years later when maxed out with 8 GB of DDR2 and a much later graphics card like a GTX 760, even if my i7-4770K setup runs circles around it in more demanding games. The exponential performance leaps year after year started plateauing after that.

If anything, what might hold the Q6600 back most is simply lack of SSE4, in much the same way that Athlon XP systems are much less viable than their Pentium 4 contemporaries due to lack of SSE2. When software starts requiring instruction sets the CPU doesn't physically have, well, that's one way to sell new hardware. (I actually ran into this problem trying to use my i1Display Pro colorimeter, of all things; the i1Profiler software requires SSE2. Hey, even my retro rigs deserve accurate ICC profiles for their monitors!)
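Since "does this old CPU have the instruction set the software demands" is exactly the check that decides whether a Q6600 or Athlon XP box is still viable, here is a minimal sketch of how you might test for it. This is Linux-specific (it parses `/proc/cpuinfo`; on other OSes you'd reach for a library such as py-cpuinfo instead), and the flag names follow the kernel's conventions (`sse2`, `sse4_1`, `sse4_2`):

```python
import os

def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of feature flags the first CPU core advertises.

    Parses the 'flags' line of /proc/cpuinfo, e.g.
    'flags : fpu ... sse sse2 ssse3 sse4_1 sse4_2 ...'
    """
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

if __name__ == "__main__" and os.path.exists("/proc/cpuinfo"):
    flags = cpu_flags()
    # A Kentsfield Q6600 would report sse2/ssse3 but not sse4_1;
    # an Athlon XP would lack even sse2.
    for flag in ("sse2", "sse4_1", "sse4_2"):
        print(f"{flag}: {'yes' if flag in flags else 'no'}")
```

In compiled code the same gate is usually a startup check (GCC/Clang offer `__builtin_cpu_supports("sse2")`), which is why software can hard-require an instruction set and refuse to run at all on older silicon.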
 
In no particular order (I guess technically I did list them chronologically):
Motorola 68000, Celeron 300A, Athlon XP 2500+, C2Q 9650, and i7 2600K (just because it overclocked a helluva lot better than later generations I've tried out).
 
I was a kid, but I remember seeing a big difference comparing a standard Pentium to a Pentium MMX.
 
I was a kid, but I remember seeing a big difference comparing a standard Pentium to a Pentium MMX.

MMX is fine, it was just overhyped by the marketing dept. It did help in the areas it was actually designed to address, but the marketing flaks made it sound like it would ZOMG MAKE UR MODEM TEH FASTERZ and all manner of silliness.
 
I was a kid, but I remember seeing a big difference comparing a standard Pentium to a Pentium MMX.

At the same clocks in an application that wasn't designed to support it, there were zero gains from MMX. The biggest reason why people saw a difference was that the Pentium MMX was offered in speeds beyond the standard Pentium, which topped out at 200MHz. I owned a Pentium MMX 233. Somehow I ended up with a Pentium MMX 166MHz chip in my CPU collection. I never used that CPU; it was just in the box with the other CPUs I've collected over the years.
 
NexGen Nx586. AMD acquired them and was able to leapfrog Intel for a short period.

AMD never leapfrogged Intel during this period. Even the K6, which was based on the Nx686, didn't do that. It could win against the older P54C Pentiums and even against the Pentium MMX (P55C) in many, but not all, applications. It was certainly no match for the Pentium Pro despite AMD's marketing claims. The K6-2 was also no Pentium II killer. It was the same story with the K6-III. It's not until the Athlon that we see AMD really give Intel a run for its money.
 
AMD never leapfrogged Intel during this period. Even the K6, which was based on the Nx686, didn't do that. It could win against the older P54C Pentiums and even against the Pentium MMX (P55C) in many, but not all, applications. It was certainly no match for the Pentium Pro despite AMD's marketing claims. The K6-2 was also no Pentium II killer. It was the same story with the K6-III. It's not until the Athlon that we see AMD really give Intel a run for its money.

No one was using a Pentium Pro at home. It was slow as shit for home use due to the poor 16-bit application speed.
 
At the same clocks in an application that wasn't designed to support it, there were zero gains from MMX. The biggest reason why people saw a difference was that the Pentium MMX was offered in speeds beyond the standard Pentium

You're wrong. The MMX Pentiums had double the L1 cache (and 4-way compared to 2-way on non-MMX) and boosted performance in general compared to non-MMX versions. Good grief, did [H] hire a 19-year-old as motherboard editor?
 
At the same clocks in an application that wasn't designed to support it, there were zero gains from MMX. The biggest reason why people saw a difference was that the Pentium MMX was offered in speeds beyond the standard Pentium, which topped out at 200MHz.

You're wrong. The MMX Pentiums had double the L1 cache (and 4-way compared to 2-way on non-MMX) and boosted performance in general compared to non-MMX versions. Good grief, did [H] hire a 19-year-old as motherboard editor?

Why the attitude?

His statements of "no gains from MMX" (the instructions) and that Pentium MMX chips were faster mainly due to clock increases are both correct. If you'd like to clarify that there were some increases in addition to clocking, just add that without being rude.
 
No one was using a Pentium Pro at home. It was slow as shit for home use due to the poor 16-bit application speed.

I was. :) I ran dual Pentium Pro 180s overclocked to 200MHz. Honestly, it depended on what you were doing. As you pointed out, 16-bit application support was worse than on the standard Pentium, but a lot of times you couldn't tell outside of benchmarks. I usually ran Windows NT 4.0 for workstation tasks and general computing, and Windows 9x when playing games.

You're wrong. The MMX Pentiums had double the L1 cache (and 4-way compared to 2-way on non-MMX) and boosted performance in general compared to non-MMX versions. Good grief, did [H] hire a 19-year-old as motherboard editor?

You're obviously the one who's been triggered by my statement and coming across as being ultra young.

Yes, there were differences between the older P54C Pentiums and the P55C Pentiums with MMX. However, my statement still stands. The biggest reason for the Pentium (P55C) being faster was clock speed. I've owned damn near the entire Pentium family back when they were current. There were a lot of things that contributed to or took away from overall system performance that we don't contend with today. The motherboard you used back then mattered, as that's where the L2 cache resided. L2 cache wasn't even present on some motherboards. I can't tell you how many systems I serviced back then that had empty COAST slots. You had Intel and a plethora of non-Intel chipsets, some of which had cache implementations that were more suited to non-Intel CPUs.

Again, outside of benchmarks you could rarely tell the difference between a Pentium 166MHz and a Pentium MMX 166MHz CPU doing the same tasks unless the application was optimized to use MMX instructions. Some people actually look at more than just benchmarks, and in the real world the difference wasn't huge in a lot of cases. This was only a big deal in applications like Photoshop which were updated to leverage MMX instructions. Of course, most people back then held on to systems for 3 to 5 years. It was likely that someone buying a Pentium MMX based system was coming from a Pentium 75MHz system or worse. In such cases, it wasn't the L1 cache changes that made the new system so much faster. It was clock speed. If you were like me and had a good motherboard which could support a P55C Pentium properly, you could do a drop-in upgrade of the Pentium MMX 233 and replace that Pentium 200MHz processor. That's what I did. Guess what? Again, the jump in performance was largely due to clock speed. Frankly, it wasn't one of the most mind-blowing upgrades I've done over the years either.
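For anyone who never wrote against it: the kind of operation MMX accelerated was packed integer math, e.g. the PADDUSB instruction, which adds 8 unsigned bytes at once with saturation (clamping at 255 instead of wrapping around). That's exactly the inner loop of a Photoshop brighten filter, which is why optimized apps gained and everything else didn't. A tiny illustrative sketch (pure Python emulating the semantics of one such instruction, not actual MMX code):

```python
def paddusb(a, b):
    """Emulate MMX PADDUSB: add two 8-byte vectors with unsigned saturation.

    The real instruction did all 8 byte-adds in one operation; scalar x86
    code had to do them one at a time, which is why only software rewritten
    to batch its data this way saw a speedup from MMX.
    """
    assert len(a) == len(b) == 8
    return bytes(min(x + y, 255) for x, y in zip(a, b))

# Brighten 8 pixels by 60, clamped at 255 rather than wrapping past it.
pixels     = bytes([10, 200, 250, 0, 128, 255, 30, 100])
brightness = bytes([60] * 8)
print(list(paddusb(pixels, brightness)))
```

Unchanged scalar code got none of this, so at equal clocks an MMX-unaware binary ran the same on a P55C as on a P54C (cache differences aside), which matches the point above.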
 
1366 I'd say needs to go down as one. Showed real quick that AMD's still belonged on XP and stuff, lol.
 
IMHO, you can pretty much rewrite this list after Spectre and Meltdown now. All those CPUs will be seen as "those from back then, when they all had that terrible bug..."

The CPU era will start over, before & after Spectre
 
IMHO, you can pretty much rewrite this list after Spectre and Meltdown now. All those CPUs will be seen as "those from back then, when they all had that terrible bug..."

The CPU era will start over, before & after Spectre

I don't think it does. It was a Google project that discovered this issue was there. This isn't an exploit that was found by the dick bags who write viruses and partake in DoS attacks. To my knowledge, it's never been used, and for the moment the exploit is more of a theoretical problem than a practical one. This will certainly change as things progress. However, I'm not one for revisionist history. I'm certainly not going to be the one writing it. The Pentium 60/66MHz CPUs had a huge bug as well. Still, I don't think they were any less influential or groundbreaking from a design perspective. Intel replaced those CPUs and corrected the issues on newer designs. Replacement isn't possible here. However, newer designs will eliminate these issues going forward. The influence, performance, market saturation, penetration, and value any of these CPUs have provided doesn't really change, since these exploits cover just about everything. It's a fundamental problem with how the logic works on these parts.

Let me put this another way. A 1967 Camaro is a death trap compared to modern vehicles. Are these cars any less awesome today? I don't think that they are. We simply view them in a different light than we would have when they were current production. That's the nature of all products be it music, entertainment, hardware, or even software. Those diecast Voltron toys of the 1980's were made with lead based paint. I think they are still awesome. I just wouldn't lick them or put them in my mouth.

Spectre and Meltdown haven't actually been used prior to the disclosure of the existence of these exploits. It doesn't retroactively change how these processors were perceived or how they functioned during their time on the market. The processors that are still in production and actively being sold on the market may be seen in a different light before too long. I can agree with that. There are certainly implications of future lists being evaluated with this in mind and AMD's Ryzen and Threadripper may be looked at more fondly than they otherwise would be as a result of this. That said, the Core i5 2500K is 7 years old and is still a great overclocker and serves many people well to this day. It and the 2600K have served well beyond a normal operational life span and deserve no less than their existing place on the list. Even if you stopped using a 7 year old CPU right now for fear that this exploit might be used against it, wouldn't you say you got your money's worth out of it? To me, it's no different than if my 2600K died tomorrow and I was forced to replace it the next day. I still got over 7 years of hard use out of it. That's a better value than I've seen out of any other CPU in history.

I agree with the idea that we may very well think about CPUs in a pre-Spectre/Meltdown and a post-Spectre/Meltdown context. This is certainly true of CPUs that bridge the gap between pre- and post-discovery of the exploits. Frankly, these issues could affect Core 2 Duo / Quad CPUs which are still in use, and I would pretty much say: "Who cares?" That CPU is so old you probably should have moved on by now anyway. Most of the CPUs on this list, even if they are affected, are so old it wouldn't make any difference whether they were vulnerable to these exploits or not.
 
Frankly, these issues could affect Core 2 Duo / Quad CPUs which are still in use, and I would pretty much say: "Who cares?" That CPU is so old you probably should have moved on by now anyway. Most of the CPUs on this list, even if they are affected, are so old it wouldn't make any difference whether they were vulnerable to these exploits or not.
What kind of reasoning is that? There are thousands if not millions of Core 2 era builds out there running perfectly, especially in offices. Upgrade for the sake of upgrade is not something most people do. I'm running 2D CAD work on one and there's barely any difference between it and Sandy Bridge. Pair them with an SSD and they are fast enough for the majority of average PC users.
 
Oh wow...this reminds me of my first PC back in 1988.
I really wonder what happened within AMD after Athlon 64 series...
 
What kind of reasoning is that? There are thousands if not millions of Core 2 era builds out there running perfectly, especially in offices. Upgrade for the sake of upgrade is not something most people do. I'm running 2D CAD work on one and there's barely any difference between it and Sandy Bridge. Pair them with an SSD and they are fast enough for the majority of average PC users.

I've worked a lot of places as a contractor. I don't see too many 9- or 10-year-old machines still in service for desktop, office, or server work.

Home users might do this, but it's not being done at most businesses.
 
I've worked a lot of places as a contractor. I don't see too many 9- or 10-year-old machines still in service for desktop, office, or server work.

Home users might do this, but it's not being done at most businesses.
There's a world outside the USA, too.
 
There are tons of POS PCs out there w/ C2Ds and XP still chugging along in the retail world. ;)
 
Itanium 9760 could have been the best. Eventually history will prove that we are all wrong and VLIW + SIMD is the king.
 
Dan, I can agree with your scope of things. I loved my 2600K over ANY other CPU I've owned or worked with, and for sure it deserves a top-rank position on your list, zero doubt. The before/after era makes the most sense; a paradigm change is bound to happen.


I don't agree with one particular thing. I would not insist that Spectre or Meltdown have not been used. They may have been, in some books that we all will never have access to: FSB, Mossad, NSA, etc.

Since you can't track the attack, IIRC, there is no way of telling whether it has been or not. To be safe, assume it has been known for years to circles that would never ever admit they knew it, for obvious reasons.


I dare say there are at least a handful of such bugs surrounding us; we are just not aware of them yet.
 
There's a world outside the USA, too.

Fair enough, but in the U.S. at least, this isn't as big an issue as you might imagine. Businesses rely on support from hardware and software vendors to make sure their businesses run. You can't do that effectively with ancient hardware that won't handle newer OSes as older operating systems are phased out of vendor support contracts. Ancient hardware can oftentimes be prohibitively expensive to maintain beyond its original or extended warranty period, and this too has problems. That said, I've seen legacy hardware just about everywhere I've worked, but rarely is any of it 10 years old or more. I've certainly seen cases of that, but it's usually in government rather than the private sector.

Dan, I can agree with your scope of things. I loved my 2600K over ANY other CPU I've owned or worked with, and for sure it deserves a top-rank position on your list, zero doubt. The before/after era makes the most sense; a paradigm change is bound to happen.


I don't agree with one particular thing. I would not insist that Spectre or Meltdown have not been used. They may have been, in some books that we all will never have access to: FSB, Mossad, NSA, etc.

Since you can't track the attack, IIRC, there is no way of telling whether it has been or not. To be safe, assume it has been known for years to circles that would never ever admit they knew it, for obvious reasons.


I dare say there are at least a handful of such bugs surrounding us; we are just not aware of them yet.

It is certainly possible that Spectre and Meltdown have been used. I will concede that point. However, we know of no documented incidents or cases where this has happened. If it has, it didn't make the news and probably wasn't a widespread use of either or both exploits. For the home user and gamer, it's been a non-issue up until now.
 
The Q6600 G0 stepping I had OCed like a madman, and it lasted me a long time.
Hey, mine's still going strong after ten years! Try that with any CPU released before it, even though people like Dan here give the vibe with statements like "That CPU is so old you probably should have moved on by now anyway." that we should just toss 'em in the landfill and run out and buy Ryzen or Coffee Lake this instant, regardless of whether or not it's affordable or even sensible.

Granted, it's no longer my mainline machine due to the need for better single-threaded performance, but my little bro has no complaints gaming on it... well, unless he spawns a ridiculous number of cars in BeamNG.drive, props in Garry's Mod or anything else infamous for being demanding on single-threaded CPU performance, where you most certainly will feel the difference between Kentsfield and Haswell, let alone all the post-Skylake stuff.

That's partly why I moved on about four or five years ago (can't believe the 4770K is that old already), but the other big reason was that the rest of the family frankly deserved something far better than the aging Athlon XP 1800+ that I had initially displaced with the Q6600. The poor Athlon XP was already painfully slow after just three to five years, but a Core 2 Quad still feels responsive enough in daily use after ten! Some might say that modern multi-core CPUs are finally good enough for general computing, while others might lament their relative lack of speed increases and games like Trespasser/Doom 3/Half-Life 2/FEAR/Crysis/Star Citizen/etc. to take advantage of that.

The only particular issue I have with the ol' Q6600 box is that even after strapping custom water-cooling to it, I can't get 3.6 GHz to run effectively. It's either unstable, or by cramming roughly 1.45V into the CPU, stable enough to run, but slower in actual use than the 3.0-3.2 GHz range. Motherboard power delivery not good enough under load, perhaps?
 
Hey, mine's still going strong after ten years! Try that with any CPU released before it, even though people like Dan here give the vibe with statements like "That CPU is so old you probably should have moved on by now anyway." that we should just toss 'em in the landfill and run out and buy Ryzen or Coffee Lake this instant, regardless of whether or not it's affordable or even sensible.

Granted, it's no longer my mainline machine due to the need for better single-threaded performance, but my little bro has no complaints gaming on it... well, unless he spawns a ridiculous number of cars in BeamNG.drive, props in Garry's Mod or anything else infamous for being demanding on single-threaded CPU performance, where you most certainly will feel the difference between Kentsfield and Haswell, let alone all the post-Skylake stuff.

That's partly why I moved on about four or five years ago (can't believe the 4770K is that old already), but the other big reason was that the rest of the family frankly deserved something far better than the aging Athlon XP 1800+ that I had initially displaced with the Q6600. The poor Athlon XP was already painfully slow after just three to five years, but a Core 2 Quad still feels responsive enough in daily use after ten! Some might say that modern multi-core CPUs are finally good enough for general computing, while others might lament their relative lack of speed increases and games like Trespasser/Doom 3/Half-Life 2/FEAR/Crysis/Star Citizen/etc. to take advantage of that.

The only particular issue I have with the ol' Q6600 box is that even after strapping custom water-cooling to it, I can't get 3.6 GHz to run effectively. It's either unstable, or by cramming roughly 1.45V into the CPU, stable enough to run, but slower in actual use than the 3.0-3.2 GHz range. Motherboard power delivery not good enough under load, perhaps?

My Q6600 held 3.4GHz its whole life. I remember having to rig a fan over the NB. I even lapped mine to a mirror-like finish. It'd prob still be going strong today if not for the motherboard taking it down along with it. After the Q6600 I moved to the 3570K and now to a 7700K.
 
My first lust was an AMD Athlon 1000MHz CPU sitting on a Best Buy display shelf! I wanted to buy it so bad! Medal of Honor: Allied Assault was the game at that time!
 
My favorite processor was the Core 2 Duo E4300. Dirt-cheap processor. Only 1.8GHz, but highly overclockable. Mine ran at 3.3GHz, and it was a big improvement over the dumpster fire Pentium D 820.
 
To me, the i7-920 and its family (Bloomfield) remain ahead of the 2600K in terms of historical importance.

It was the first jump to the i3/i5/i7 family. And it was an overclocker's dream, easily getting pushed over 4GHz. The i7-2600K is certainly a faster chip, but the 920 always screamed 'overclock me' from the day it came onto the enthusiast scene in 2008.


Another chip they skipped mentioning was the Pentium III. At that time, 1GHz was a huge landmark jump in the consumer market. I'll always remember feeling so proud to have the first 1GHz machine on the block.

Depending on what you were running before, the new P3 doubled (from 500MHz) or even tripled (from 333MHz) the previous clock speed of your Pentium II chip.

Of course, it was overkill at the time and ran some programs too fast, so you had to find a way to 'slow' things down using emulators. But whatever; with 1000MHz, it was the Ferrari of consumer CPUs at the turn of the 21st century (released in March of 2000).

What is your favorite Intel chip, and what do you dream of upgrading to in the future?
 
You could say the P3, but at the same time the AMD Athlon was trading blows in performance and was much cheaper. That was when competition was totally a win-win for us consumers. I also agree with the Core i7 920, as the X58 platform brought official SLI support as well.
 
Wish I kept my 2600K. Gamed with it for 4 years until replacing it when Skylake came out. Not 100% sure the 6700+ is noticeably better.
 
Wish I kept my 2600K. Gamed with it for 4 years until replacing it when Skylake came out. Not 100% sure the 6700+ is noticeably better.

In games, the CPU really isn't much better. Outside of that the platform has a lot more to offer in the realm of I/O performance. I can tell the difference between a modern system and something like a 2600K system. That's probably only because I happen to have them both side by side on my test bench. Or rather, I have had them side by side a few times.

If I had to do a blind test with a 2600K system overclocked to 4.5GHz+ and a newer system using the same graphics card, I'd probably be able to tell the difference, but I wouldn't sit here and tell you the newer one is worth the cost.
 
I would argue that the Intel DX2-66 should be on that list along with the original Core series...
 
Ok, so a few things missed:
  • Original Slot A Athlons, significantly faster at lower clocks than competitors for everything - I had a 600MHz, it was paradigm shifting.
  • c't 286 - the only 286 which could swap into and out of protected mode without a hard reset, showing Intel how it was done
  • Am386DX-40 - who needs a 486 anyway?
  • Motorola 68k series - Amiga anyone?
  • Intel Q6600 - making quad core mainstream
  • Ryzen - architecturally very balanced, highest possible throughput per x64 core ever created
  • i7-5775C - integrated GPU with RAM on the chip, extremely high performance/watt on the desktop (higher than most things out even now)
 