I've been out of the hardware world for a while, but I remember that in the '90s CPU speeds were:
100 MHz
133 MHz
166 MHz
200 MHz
...
Every single release was +33 MHz. How does that work? You're telling me every single breakthrough in CPUs just happened to be an additional 33 MHz, exactly in step with Moore's law?
Are they trying to squeeze money out of us, or has there ever been a giant breakthrough in CPUs that went beyond Moore's law, such as a big jump from 1 GHz to 2 GHz, the kind of leap you'd expect to happen once in a while?