Has Moore's Law stalled?

Wiseguy2001

I was going through some boxes yesterday and found the box and OEM heatsink for my i7 2700K, and it was dated 2010! :eek:

I looked at CPUs online, and prices have hardly changed, and nothing out there offers anything like the performance increase of going from a Q6600 to a 2700K.

Same goes for GPUs too: they STILL haven't moved past 28nm, which has been out for more than two years now and will probably be closer to three before we see 20nm GPUs.

Guess I'm old (30) and used to the constant product revamps of the last decade or so.
 
I'm about 30 as well. I think we have slowed down on technology only because we don't have the demand for it right now like we did in the '90s and 2000s. Things will probably speed up again next cycle because of 4K and 8K, which will require at least double the horsepower for gaming and work-related tasks.
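For a rough sense of how big that jump actually is, here's a quick back-of-the-envelope sketch of the raw pixel counts (standard 1080p/1440p/4K/8K figures, ignoring frame rate and effects):

# Rough pixel-count comparison: how much more raw work higher
# resolutions demand per frame, all else being equal.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")

4K is four times the pixels of 1080p and 8K is sixteen times, so "at least double the horsepower" is, if anything, conservative.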

But yea, 30 isn't too old, I still feel like a teenager and still act like one!
 
Regarding modern computing technology, Moore's law is an antiquated idea, and I think it's flawed in the way it's portrayed to and understood by a lot of people (not saying you, OP). In its purest interpretation (transistor count increase vs. time), there's no physical reason to believe the trend could hold indefinitely without hitting some asymptote; that being the case, where are we relative to that asymptote?
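For reference, that "purest interpretation" is just exponential growth. A minimal sketch of the projection, starting from the commonly quoted ~2,300 transistors of the Intel 4004 in 1971 and treating the doubling period as a parameter:

# Moore's-law-style projection: transistor count doubling every
# `period` years from a chosen starting point (4004 figures assumed).
def projected_transistors(year, start_year=1971, start_count=2300, period=2.0):
    return start_count * 2 ** ((year - start_year) / period)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):,.0f}")

The point of the asymptote question is that nothing in that formula knows about atom sizes, leakage, or cost per wafer, so it obviously can't hold forever.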

But even if it were possible to keep shrinking transistors (or increasing the count per area), is there a commercial reason for tech companies to keep funding expensive R&D to churn out another refresh when there's little benefit from it on the software/services side for the end customer (meaning ordinary consumers)?

Not to mention the observation could just be a self-fulfilling prophecy, stated as it was by one of the founders of the world's largest IC company. There's no natural law that says transistor count per IC should double every two years, and the original formulation ignored the idea that new IC tech would also become more efficient (rather than just cramming more transistors into an IC).

In case it wasn't coming through in my tone, I've found the idea of Moore's law to be meaningless, especially now that computing tech has advanced, in terms of practical use, well beyond anything Moore's law really considered or applied to.
 
Moore's law has kind of been stalled for a while. The only reason transistor counts kept doubling even up to 2010 was multiple cores being slapped on the same die. Right now the extra silicon is going to iGPUs, so in that sense, yes, it's stalled even more; but at the same time, besides throwing in extra cores there seems to be little they can do at this point. They keep talking about IPC improvements being the focus from here on out, but the IPC gains over the past three years have been laughable. The time will come when improvements are needed on the CPU front for future software, probably with gaming leading that charge. Right now things are gravy, but that will end eventually if nothing is done about it.
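To put a rough number on why "throwing extra cores" runs out of steam, here's a minimal Amdahl's-law sketch; the 75% parallel fraction is just an assumed example, not a measured figure for any real workload:

# Amdahl's law: overall speedup from N cores when only a fraction
# `p` of the workload can actually run in parallel.
def amdahl_speedup(cores, p):
    return 1.0 / ((1.0 - p) + p / cores)

p = 0.75  # assumed parallel fraction for a hypothetical desktop workload
for cores in (1, 2, 4, 8, 16):
    print(f"{cores} cores: {amdahl_speedup(cores, p):.2f}x")

Even with unlimited cores the speedup caps out at 1/(1-p), which is why doubling transistor counts by adding cores doesn't feel like the old single-thread gains.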

Silicon's limits have been known for a while; the only thing that has kept changing, constantly and surprisingly, in recent years is the ability to keep shrinking the process.

Pricing hasn't changed (and won't anymore) because AMD is no longer a competitor in anything but the low-end market. Bulldozer/Fusion was the disaster people thought it would be. AMD's only hope of climbing back into the market is for the entire world to fully embrace Unified Memory, and even then it would take another ten years to see those fruits.

Everyone who isn't Intel is struggling to get to 20nm and below, and even Intel, with its overambitious roadmap, is struggling with 14nm, let alone 10nm or below. Right now we're just forced to wait until silicon dies out as the mainstream technology and something else presents itself. The industry will stall big time if it can't find a workable, mass-producible alternative sooner rather than later.
 