Moore's Law Ending Soon?

CommanderFrank

Intel’s ex-chief chip architect must be a psychic of sorts. He looked into his crystal ball and predicted the demise of Moore’s Law within this decade.

“Let’s at least face the fact that [Moore's Law] is an exponential, and there cannot be an exponential that doesn’t end,” he said. “You can’t have it.”
 
I tend to agree but think it will just change or morph into other areas... Desktop CPUs seem to have stalled, but we are seeing surges in mobile power efficiency, which HAS to continue to support the move to everything mobile and small.
 
The laws of physics still hold sway. Mobile performance has improved rapidly over the last half decade because it had been neglected before and because we were on larger process nodes, so there was a lot more performance to wring out.

That will not continue for much longer, given the very low TDP limits phones and tablets are required to have.

If anything, an end of Moore's Law would hit small form factors harder than large ones, because there is much less room for cooling and because even if they work out cooling for hungrier parts, you're then limited by battery capacity, which improves much more slowly.
 
It's pretty obvious to most people that Moore's law isn't going to stay around forever. How soon it will end, I think we'll have to wait and see.
 
You have to remember quantum computing is in its infancy.
 
Moore's Law changing into something else isn't necessarily a bad thing ... since Moore's Law only specified the doubling of transistors on an integrated circuit in a fixed time period (18 months), the concept could continue with a longer time period (36 months, perhaps), or a new law could form around using the existing circuits more efficiently, or around more incremental increases to the circuit count (a 50% reduction in voltage every 18 months, a 50% reduction in TDP every 18 months, a 50% reduction in total power consumption every 18 months, a doubling of battery life every 18 months, etc.) ...
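A quick, purely illustrative sketch of that "longer doubling period" idea (not from the article): under any pure doubling law the growth factor over t months is 2^(t / doubling period), so stretching the period from 18 to 36 months shrinks a decade's gain from roughly 100x to roughly 10x.

# Illustrative only: growth over a decade under different doubling cadences.
def growth_factor(elapsed_months: float, period_months: float) -> float:
    """Multiplier after elapsed_months if the metric doubles every period_months."""
    return 2 ** (elapsed_months / period_months)

decade = 120  # months
for period in (18, 24, 36):
    print(f"doubling every {period} mo -> ~{growth_factor(decade, period):.0f}x in 10 years")
# doubling every 18 mo -> ~102x in 10 years
# doubling every 24 mo -> ~32x in 10 years
# doubling every 36 mo -> ~10x in 10 years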

Modern CPUs and GPUs already have significant capabilities integrated into them ... I would like to see the focus moving forward be on better transfer speeds, lower heat, better energy efficiency, etc., rather than just trying to put every single function on one chip (enabling the doubling, but at the price of heat and efficiency) ... Moore's Law could also just shift to a different arena, where ARM chips might be able to continue the doubling as the tablet and smartphone markets mature ;)
 
Moore's law was pretty silly from the get-go; why was there a paper written about an observation? Ehh, whatever, in the early 1900s you could get a PhD by describing a simple RC circuit too. However, the fact that the industry has used Moore's law as a guide for its pace seems to me, if anything, to have hampered the industry: why put in 4x the number of transistors when we only have to do 2x!
 
They've been beating the drum of the death of Moore's Law almost since its conception.

Seems like the engineers are always pulling another rabbit out of the hat to keep transistors shrinking. Though it's costing them untold billions, I'd have to say that there's no other ongoing engineering project that has had as much of an impact on humanity as the development of the microprocessor. With so much invested, and so many minds working on the problem (ironically, with the help of the microprocessor itself), I'm sure they'll find a way.
 
I think this has less to do with Moore's law failing and more to do with the competitive nature of the chip industry failing. Intel has been late with die shrinks and piss-poor with each IC change; it's supposed to be 2 years for each, but now it's just tick-tick-tick-tick....tock
 
I feel like it's already dead. But it seems like every time it dies, the definition gets tweaked.

I feel like things have been slowing down greatly since the Core 2 Duo came out. The last major bump was Sandy Bridge, but even that didn't feel like as big of a bump over the first-gen i5/i7 chips as going from the P4 to Core did. Ever since Sandy Bridge we have had nothing but mediocre improvements, and Sandy Bridge has already been out longer than the 2 years it supposedly takes for the "doubling in chip performance" dictated by Moore's Law.

Whenever this happens, people tweak the definition to make it fit reality after the fact. I think all cores are counted cumulatively for the purposes of Moore's Law, even though simply adding more cores doesn't automatically increase real-world performance, with severe diminishing returns past the first few cores. Failing that, they would probably try to count the theoretical compute capability of the onboard GPU, etc., even though outside of Quick Sync it's pretty much going to be useless for most people.

When that fails, they will probably claim the combined processing power of all the servers powering a cloud service as the continuation of Moore's law, or something else equally stupid.
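The diminishing-returns point can be made concrete with Amdahl's law (not something the poster cites, just the standard rule): if only a fraction p of a workload is parallel, n cores give at most a 1 / ((1 - p) + p/n) speedup. A minimal sketch, assuming an arbitrary p = 0.9:

# Amdahl's law: speedup on n cores when only a fraction p of the work is parallel.
# p = 0.9 is an arbitrary example workload, not a measured figure.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for n in (1, 2, 4, 8, 16, 64):
    print(f"{n:3d} cores -> {amdahl_speedup(0.9, n):.2f}x")
#   1 cores -> 1.00x
#   2 cores -> 1.82x
#   4 cores -> 3.08x
#   8 cores -> 4.71x
#  16 cores -> 6.40x
#  64 cores -> 8.77x   (the ceiling is 10x no matter how many cores you add)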
 
The number of predictions that Moore's Law will end will double every 18 months.
 
P4s were kinda crappy chips from the get-go. The Athlon 64 chips walked all over them without even trying. Sure, you could clock the P4s super high, but the IPC was horrid, as was the memory throughput.

As for the on-die memory controller of the Athlon 64, it would run circles around DDR2 Intel-based systems until you got a bit past DDR2-800.

The Core 2 chips had a lot higher IPC than the old Athlon 64 chips, but the memory controller was still sucky since it wasn't on-die.
 
My understanding of Moore's Law is that it didn't guarantee a doubling of performance, only a doubling of the transistor count ... I don't think those two things are synonymous (unless I am misunderstanding something about Moore's Law) ... doubling the transistors might only give you a 20% bump, but that doesn't violate Moore's Law :confused:
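That gap between transistor count and performance even has a common rule of thumb attached to it, Pollack's Rule (not mentioned above, added here for reference): single-thread performance tends to scale roughly with the square root of the transistor budget spent on a core. A minimal sketch:

import math

# Pollack's Rule (rough rule of thumb, separate from Moore's Law):
# single-thread performance ~ sqrt(transistors spent on the core).
def pollack_perf_gain(transistor_ratio: float) -> float:
    return math.sqrt(transistor_ratio)

print(pollack_perf_gain(2.0))  # ~1.41: doubling transistors buys ~41% more performance
print(pollack_perf_gain(4.0))  # ~2.00: it takes ~4x the transistors to double performance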
 
However, the fact that the industry has used Moore's law as a guide for its pace seems to me, if anything, to have hampered the industry: why put in 4x the number of transistors when we only have to do 2x!

I've always felt the same way -- that Moore's was only 'Law' as a consequence of human action and pacing, not because of some natural, inherent limit on how much circuitry can fit on X amount of area...
 
Well, there is a definite end to MOSFET scaling, and it doesn't take a PhD to understand that. Something will follow MOSFET technology, and some smart guy will get credit for recognizing patterns of innovation in whatever that is. Moore's Law (will be) dead! Long live whatever comes next!

BTW, Moore's Law is about the number of transistors, not the performance increases of chips, although the two are somewhat related (CPU performance generally improves as transistor count increases). Moore's Law is still currently on track, but Intel may have few other foundries following it as the shrinks continue (the IBM/GLF/Samsung "Common Platform" consortium will not have a full 14/16nm node; it's a 20/22nm hybrid whose scaling isn't even halfway to a true 14/16nm).
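For a rough sense of why a 20/22nm-class hybrid gives up most of the density gain, here is an idealized sketch (purely illustrative geometric scaling; real processes, and especially marketing node names, don't track this cleanly):

# Idealized density scaling: if every linear dimension shrinks from old_nm to
# new_nm, area per transistor shrinks by the square of the ratio. Purely
# illustrative; real nodes don't follow their marketing names this exactly.
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(f"20nm -> 14nm full shrink: ~{ideal_density_gain(20, 14):.1f}x density")  # ~2.0x
print(f"22nm -> 16nm full shrink: ~{ideal_density_gain(22, 16):.1f}x density")  # ~1.9x
# A "14/16nm" node that keeps 20nm-class metal pitches recovers little of this.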
 
I think about 2-3 times per year some douche makes a prediction that Moore's Law is ending soon, and each time, not long after such a prediction is made, another breakthrough is made that extends it.

These are becoming so tiring, just like the demise of the desktop PC and the end of PC gaming and the fall of consoles and blah blah blah blah S T F U media.
 
https://news.uns.purdue.edu/html4ever/001208.Lundstrom.Mooreslaw.html

Purdue University engineers have new information contradicting the most dire predictions about the imminent demise of Moore's Law, a general rule that is central to the evolution and success of the computer industry.

Yeah. That article was written in 2001 :D
 
Look, we've still got to make chips with carbon nanotubes, so we've got a few more years of the law left.
But yeah, at some point we're going to hit a size where we cannot shrink anymore, and chips will start to get bigger and bigger until it'd be insane to put them into devices.
I think we don't need the law anymore because most of our current PCs can do anything we throw at them without lagging or stuttering. We multitask freely, and the only reason to get better hardware is if you're a gamer.
 
I think people mostly attack it because it has the word 'law' in it; it should be Moore's Prediction... then no one would write about it.
 
The guy is not imaginative at all. As long as nothing hinders the progress of technology (such as all-out nuclear war and the extermination of humans), how can there not be better and better computer chips and such?
 