Moore’s Law Is Showing Its Age

Manufacturing issues and delays with Intel's 10-nanometer technology have called Moore's Law into question once again.

Intel, which tends to brag about shrinking circuitry on its chips more quickly than competitors, disclosed Wednesday that for a second time it would miss a two-year schedule for making more compact chips. The Silicon Valley icon went further than projecting another one-time delay; it set a new definition of what people should expect in the future.
 
Well, all good things must end.

The good news is that processors may keep their value for a longer period of time.

Also, this is likely to stimulate R&D in alternatives to silicon, such as carbon nanotubes.
 
This is GREAT news to me and fits in perfectly with what I want to spend. For the last 15 years I've spent so much money upgrading to the latest and greatest when most of the time, I could have made those computers last twice as long.

If they don't have new CPUs out, I can't spend any money.

I have a new 5820 and DDR4 and love it. I would like to keep it for at least the next 12 - 18 months.
 
Well, for consumers, Intel can still increase die size (closer to Xeons) or launch a new SKU aimed at gamers with discrete graphics that allocates little to no space to an IGP.
 
Oh bullshit... If AMD was competitive in the CPU market there would be 10nm parts shipping in September. :p
 
This is GREAT news to me and fits in perfectly with what I want to spend. For the last 15 years I've spent so much money upgrading to the latest and greatest when most of the time, I could have made those computers last twice as long.

If they don't have new CPUs out, I can't spend any money.

I have a new 5820 and DDR4 and love it. I would like to keep it for at least the next 12 - 18 months.

There's also diminishing returns to consider.

I used the same CPU for nearly 6 years. My time between upgrades was always under 2 years before that.

I upgraded to the near maximum CPU my motherboard could support, from an Athlon II x4 to a Phenom II 1055t.

And you know what? I didn't even need to do it. Sure, Turbo is handy. 6 cores is beastly. But I barely use all of it.

I can easily see myself going another 5 years on this CPU.

That's 10 years on one generation of a CPU.

And that's before you even consider that a lot of games and apps still don't make use of multithreading all that well, if at all.

We're well past the days when a 100MHz overclock means anything. When 100MHz is 10% of your overall speed, great... but not so much when it's 2.5% of your overall speed (quick math below).

We've been at or approaching this great plateau for a while now.
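Quick math on that point, assuming a 1GHz baseline for the old days and a 4GHz baseline for today (both illustrative figures, not any particular chip):

```python
# The same 100MHz overclock, expressed as a fraction of two assumed base clocks.
def relative_gain_percent(base_mhz: float, overclock_mhz: float = 100.0) -> float:
    return overclock_mhz / base_mhz * 100.0

for base in (1000, 4000):  # 1GHz vs 4GHz, hypothetical baselines
    print(f"{base} MHz base: +100 MHz is a {relative_gain_percent(base):.1f}% bump")
# 1000 MHz base: +100 MHz is a 10.0% bump
# 4000 MHz base: +100 MHz is a 2.5% bump
```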
 
Still on my i5-750 clocked at 4GHz... over 5 1/2 years. It is most definitely the longest I have ever gone without upgrading my entire system. Video cards are another story...
 
Well, all good things must end.

The good news is that processors may keep their value for a longer period of time.

Also, this is likely to stimulate R&D in alternatives to silicon, such as carbon nanotubes.

Carbon nanotubes can only shrink the conductor. That won't help shrink gate size. The last gate-size shrink they did wasn't even a shrink; it was just a vertical gate. They are approaching atomic limits. At a certain size, silicon just becomes a conductor and not a semiconductor. Quantum tunneling is observed in the electrons and you get excess gate leakage. Controlling that with different materials is the key.
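For what it's worth, a rough illustration of why leakage blows up as things shrink: the textbook WKB estimate has tunneling probability falling off exponentially with barrier width, so every bit of insulator you give up costs orders of magnitude in leakage. A minimal sketch with illustrative numbers (1 eV barrier, bare electron mass, no bias), not real process data:

```python
import math

# Textbook WKB estimate for tunneling through a rectangular barrier:
# T ~ exp(-2 * kappa * d), with kappa = sqrt(2 * m * phi) / hbar.
# All numbers below are illustrative, not tied to any real gate stack.
HBAR = 1.054571817e-34   # J*s
M_E = 9.1093837015e-31   # kg (bare electron mass)
EV = 1.602176634e-19     # J per eV

def tunneling_probability(width_nm: float, barrier_ev: float = 1.0) -> float:
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

for d in (2.0, 1.0, 0.5):  # nm; thinner barrier -> exponentially more leakage
    print(f"{d} nm barrier: T ~ {tunneling_probability(d):.1e}")
```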
 
I agree. I have had my 3930K since it came out and I feel no need to upgrade it yet, which is good and kinda sucks. I want a new rig, but it's hard to go spend a few grand when the new stuff makes little difference in gaming. So I threw new GPUs in it and called it a day.
 
Computers may finally be turning into appliances like a refrigerator or microwave: a device that just works for years without much thought put into it by the owner.
 
Computers may finally be turning into appliances like a refrigerator or microwave: a device that just works for years without much thought put into it by the owner.

The same is happening with phone style computers. We seem to have reached the high water mark.
 
http://phys.org/news/2015-07-year-massless-particle-next-generation-electronics.html
An international team led by Princeton University scientists has discovered [evidence of] Weyl fermions, an elusive massless particle theorized 85 years ago. The particles could give rise to faster and more efficient electronics because of its unusual ability to behave as matter and antimatter inside a crystal, according to new research.

The researchers report in the journal Science July 16 the first observation of [evidence of] Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity in electronics, and thus greater power, especially for computers, the researchers suggest.
Before sfsuphysics comes in here: they didn't actually find Weyl fermions, they found traces left by Weyl fermions, which could still prove extremely useful.
 
Moore's Law is a self-fulfilling prophecy, not so much a law. It became a design goal in the industry long ago, so the "law" has held more or less true for much longer than it would have otherwise.
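To make the "design goal" concrete, here is a minimal sketch of what the usual reading of Moore's Law (transistor counts roughly doubling every two years) projects; the starting count and time span are illustrative numbers, not real chip data:

```python
# Hypothetical projection under the common "doubling every ~2 years" reading
# of Moore's Law. Inputs are illustrative, not actual chip specs.
def projected_transistors(start_count: float, years: float,
                          doubling_period_years: float = 2.0) -> float:
    """Transistor count after `years`, doubling every `doubling_period_years`."""
    return start_count * 2 ** (years / doubling_period_years)

# e.g. a notional 1-billion-transistor chip projected 10 years out (~32x):
print(f"{projected_transistors(1e9, 10):.2e}")  # -> 3.20e+10
```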
 
It's a law, so it must be part of nature... THEY ARE GOING AGAINST NATURE... now they will be punished accordingly :D
 
Seeing as "Moore's Law" was never meant to exist in the first place, it doesn't really matter. The media took a comment that Gordon Moore made offhand about the near future and extrapolated that into a theory that is totally impossible.
 
That's why we have stacking, but so far that's only being used in memory chips for next-gen GPUs.
 
Oh bullshit... If AMD was competitive in the CPU market there would be 10nm parts shipping in September. :p

I don't think there's any lack of motivation for Intel to advance their process tech as fast as they can. I believe Intel is still trying to compete in the mobile segment, where they face plenty of strong competition. Being the first to do 10nm would give them an advantage.
 
Carbon nanotubes can only shrink the conductor. That won't help shrink gate size.

That is why the new "in" material is black phosphorus, or phosphorene. It shares some properties with graphene (the high conductivity) but, unlike graphene, it is a semiconductor. Graphene is a semimetal; you know what that means, but for others who don't, it means graphene is a great conductor at pretty much any energy level, which makes it impossible to turn off efficiently. Phosphorene is an amazing conductor only in certain bands and is thus better suited.

This still won't help shrink gate size but will massively increase the frequency at which they can work.
 
Computers may finally be turning into appliances like a refrigerator or microwave: a device that just works for years without much thought put into it by the owner.

The same is happening with phone style computers. We seem to have reached the high water mark.

Pretty much this for me. Haven't quite figured out if it's because I am getting older and don't really care, or if it's actually the technology. I assume it's both.

I am still running an i7 920... so 5 years. Though I have thought about upgrading recently.

I updated my HTC One (M7) recently, not due to performance (well, the camera sucked) but more due to a carrier/signal change.

Games still push hardware, but with "consoles" much less than "in the good ol' days". Now you can easily get a good gaming experience with lesser hardware; you just won't get the graphics candy, which most disable when playing online/competitive anyway.
 
I used to build a new machine every year. Sometimes less.

I haven't built one in 2-3 years now. I've upgraded video cards and hard drives, but I've seen zero compelling reason from Intel to upgrade since.

I started with an i920. It took a LONG time to need to upgrade that. Then it was on to a first-gen Sandy Bridge (2600K), then to a second-gen Sandy Bridge (forget the CPU, but as usual the then sweet-spot processor).

I tend to upgrade my gaming machine and rotate the old one to desktop duty.
 
This is GREAT news to me and fits in perfectly with what I want to spend. For the last 15 years I've spent so much money upgrading to the latest and greatest when most of the time, I could have made those computers last twice as long.

If they don't have new CPUs out, I can't spend any money.

I have a new 5820 and DDR4 and love it. I would like to keep it for at least the next 12 - 18 months.

Au contraire, mon frère.

This has nothing to do with putting more processing cores on a chip, etc., etc. Nor does it touch on the still largely untapped ground of parallel computing. There is more than enough headroom left in increasing computing power to keep us buying new stuff for some time yet.
 
When has AMD ever mattered? Even in the Athlon 64 days Intel just leaned on OEMs to stick with their crappy Pentium 4s or else.

Really? You have any idea how many servers are still out there running on Opterons? At my last job we had several HP blade servers; some stacks were Intel but just as many were AMD.

Go to Best Buy or look online, see how many laptops are still being sold with AMD CPUs.

Here are laptops on Best Buy:
Intel (2,068)
AMD (269)

Here are desktops on Best Buy:
Intel (1,675)
AMD (382)

Laptops at Tiger Direct:
Intel (999)
AMD (107)

Desktops at Tiger Direct:
Intel (861)
AMD (225)

If your idea of "AMD mattering" is that they have to surpass Intel, then maybe you are right. But if your idea of AMD mattering has more to do with their accomplishments and their effect on the industry, then I don't get your statement.

AMD has mattered for a very long time now. If AMD didn't matter then there wouldn't be a server running with their CPUs inside.
 
Oh, and when it comes to servers on Tiger Direct, it's Intel 37, AMD 10.
 
If your idea of "AMD mattering" is that they have to surpass Intel, then maybe you are right. But if your idea of AMD mattering has more to do with their accomplishments and their effect on the industry, then I don't get your statement.

AMD has mattered for a very long time now. If AMD didn't matter then there wouldn't be a server running with their CPUs inside.

You're missing my point. What I'm saying is that even in the Athlon 64/Opteron glory days Intel didn't cut prices on their awful NetBurst products. Intel completely dominates the market and sets the prices; AMD is simply too small to matter. Even when AMD did something innovative, like integrating the memory controller, Intel could afford to wait and do the same thing years later.

Not knocking AMD; I'm a big fan of their products. I'm just tired of the 'if only AMD was more competitive' line with regard to Intel's pricing or performance.
 
I am using an i7-2600K from 2011 and unless there's a major technology shift requiring a different CPU for new apps I don't plan to upgrade this until it literally burns out. It does what I need to do just fine.
 
When has AMD ever mattered? Even in the Athlon 64 days Intel just leaned on OEMs to stick with their crappy Pentium 4s or else.

AMD matters because we'd all be using those power-hungry, inefficient and insufficient P4s otherwise.

Anything that keeps Intel more honest matters. You prove that point with your own comment.
 