I don't think anyone predicted how hard it would be to overcome the power wall as you crank up the clocks. Ultimately, that meant the end of the NetBurst and 10 GHz chip roadmaps.
The clock is still ticking on the first post to somehow attribute this to a failing in the current administration.
The comments from that article are priceless; reading people's predictions from 10 years ago is both fascinating and hilarious.
"The point is, within ten years, we won't be using silicon-based computers. They'll be made obsolete by DNA/protein type bio-computers or maybe molecular computers."
this is obama's katrina
I've seen reference to a 6.5 GHz Phenom II X4 by an end-user, and 7 GHz by AMD themselves.
And an 8 GHz Pentium 4.
I just love the first comment, though. "If 10 GHz is the best that Intel can do by 2011, AMD or somebody else is going to eat their lunch."
I do agree with the "980X counts as 20 GHz", though. At the time the article was written, they were both misunderstanding Moore's Law and not taking into account multiple cores. For what the prediction meant (even if they didn't know it), we've exceeded it by a factor of 2.
Intel didn't quite hit the front-side bus number, though. They used the NetBurst 100 MHz * 4 as a basis, which is 3.2 GB/s. The fastest QPI is 25.6 GB/s, not quite 10x.
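To sanity-check that arithmetic, here's a minimal sketch in Python; the 64-bit (8-byte) bus width is my assumption, the other figures are the ones quoted above.

```python
# Rough check of the bus-bandwidth figures quoted above.
base_clock_hz = 100e6        # NetBurst-era FSB base clock
transfers_per_clock = 4      # quad-pumped bus -> 400 MT/s
bus_width_bytes = 8          # 64-bit data path (assumed)

fsb_gbps = base_clock_hz * transfers_per_clock * bus_width_bytes / 1e9
qpi_gbps = 25.6              # fastest QPI figure cited above

print(f"FSB bandwidth: {fsb_gbps:.1f} GB/s")            # 3.2 GB/s
print(f"QPI / FSB ratio: {qpi_gbps / fsb_gbps:.1f}x")   # 8.0x, short of 10x
```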
From May 2004: "It does beg the question, however, whether the Thief games will survive the march of progress. CPUs may hit five gigahertz by the end of this year and will almost surely reach 10 GHz by 2007."
What I find more interesting are the comments by the readers way back in 2000. Most comments were that 2010 was going to be all great and stuff for computers, but from what I can see, most of it is still the same ol' same ol', just a little faster.
I know people don't look at multiple cores as additive, but I would say that with the introduction of the first Core 2 Quad processor, the much-loved Q6600, Intel matched and fulfilled that prediction in spite of it being just a 2.4 GHz processor natively (OK, the 4 x 2.4 = 9.6 GHz of potential effective processing power isn't 10 GHz, but dammit, it's close enough).
^EDIT: Well, almost QFT. As I said, by a strict interpretation of Moore's Law we've just nearly met it. Applying the same law to processing power, we may very well have exceeded it.
Rather than additive, I'd say it's just a natural progression. 1 GHz today does a heck of a lot more than 1 GHz did 10 years ago.
Well, if you count that there are now 6 cores running at 3.33 GHz, in highly threaded applications that is like one core at 20 GHz.
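For anyone who wants the "add the clocks" shorthand from the last few posts written out, here's a minimal sketch; it only holds for work that scales perfectly across all cores, which is exactly the caveat people raise below.

```python
# "Adding the clocks": cores * clock, which is only meaningful for work
# that scales perfectly across every core (it rarely does).
def aggregate_ghz(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz

print(aggregate_ghz(4, 2.4))    # Q6600: 9.6 "effective GHz"
print(aggregate_ghz(6, 3.33))   # 3.33 GHz hexacore: ~20 "effective GHz"
```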
The comments on that article are absolutely ridiculous. This one is actually 10% correct, though.
"Personally, I think there will be problems making microprocessors go as fast as Moore's law would predict, due to RF interference generated by the ever shorter wavelengths of the data pulses internal to the processor, and due to excessive heat. My ideal view of a future computer would be massive multiprocessing say, multiple 10-GHz chips each with their own memory and bus, working in parallel or each of them running different programs when I am multitasking."
Seems to me just about every architecture is becoming less efficient than the previous one (don't take that out of context); however, scalability seems to be increasing.
Sometimes this multi-core business feels like one of those stupid "Buy 3, pay for 2" deals that some stores run in an attempt to sell stuff people don't really want...
During the "MHz wars", you could upgrade from a 500 MHz Pentium III to a 1.2 GHz Athlon, and everything would run over twice as fast, including poorly optimized games and applications. Now you're much more dependent on developers writing good, multi-threaded code...
You can't blame the technology for developers being lazy. The MHz wars were also largely marketing bullshit. We enthusiasts now know (or should) that superior architecture means far more than higher clock speeds.
3 GHz i7 > 10 GHz single core, any day of the week.
I would like to see a comparison of past flagship chips vs current chips. Disable all but one core on an i7 and cut the speed down to match each chip.
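Until someone runs that clock-matched comparison, a back-of-the-envelope sketch is the best I can offer; the IPC numbers here are made up purely to illustrate why a slower, wider core can win.

```python
# Single-thread performance is roughly IPC (instructions per cycle) * clock.
# The IPC values below are illustrative placeholders, not measured numbers.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

hypothetical_p4_10ghz = relative_perf(ipc=0.5, clock_ghz=10.0)  # NetBurst-style core
i7_3ghz = relative_perf(ipc=2.0, clock_ghz=3.0)                 # modern wide core

print(hypothetical_p4_10ghz, i7_3ghz)  # 5.0 vs 6.0: IPC can trump raw clock
```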
but will it run Crysis?
You still don't add the clocks.
True, but would the single 10 GHz processor perform calculations faster than the hexacore 3.33 GHz processor? Hard to say, but I'd give the edge to the multicore processor, given of course that the program properly took advantage of multi-threading. As long as the computational performance is there, the clock speed is secondary, IMO.
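To put rough numbers on that "hard to say", here's a minimal Amdahl's-law sketch; it uses clock speed as a stand-in for single-core throughput (so it ignores IPC differences) and treats the parallel fraction as a free parameter.

```python
# Amdahl's-law sketch: effective throughput of N cores at a given clock
# when a fraction p of the workload can run in parallel.
def effective_ghz(cores: int, clock_ghz: float, p: float) -> float:
    return clock_ghz / ((1.0 - p) + p / cores)

for p in (0.5, 0.9, 0.99):
    print(f"p={p:.2f}: 6 x 3.33 GHz ~ {effective_ghz(6, 3.33, p):.1f} GHz-equivalent")
# Roughly 5.7, 13.3 and 19.0 GHz-equivalent: the hexacore only clearly beats
# a 10 GHz single core when the code is highly threaded.
```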
It is more the other way around. NetBurst died because they couldn't ramp up the clock. At 10 GHz, NetBurst still would have been pretty awesome.