Intel Predicts Moore's Law to Last Another 10 Years

CommanderFrank

We’ve all lived with it for 50 years and have come to rely on it, but Moore’s Law may be coming to the end of its run, according to Intel’s Mark Bohr. Bohr, a senior fellow at Intel, predicts that Moore’s Law will probably only last another 10 years.

Bohr predicted that Moore's Law will not come to an abrupt halt, but will morph and evolve and go in a different direction, such as scaling density by the 3D stacking of components rather than continuing to reduce transistor size.
 
You could make processors infinitely faster so long as you're willing to increase chip die size, but that will increase power consumption and heat, something that people aren't willing to deal with in tablets and smartphones. The desktop however can pile on the power and heat. Size is also not an issue on the desktop.

There isn't a big difference between a 386 and a 4770K in size. Intel wouldn't want to make bigger chips to increase performance, because it would reduce their profits. They want to make better use of the silicon, or whatever material they use now. Unless Intel goes quantum computing, the chips will have to get bigger and hotter.

[Image: Intel i386DX-25]

[Image: Intel Core i7-4770K]
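For a rough sense of scale, here's a back-of-the-envelope comparison; the transistor counts (~275,000 for the original i386, ~1.4 billion for a quad-core Haswell like the 4770K) and the 1.5 µm vs. 22 nm process figures are approximate numbers used purely for illustration:

```python
# Rough comparison, using approximate figures (assumed for illustration):
# ~275,000 transistors in the original i386 vs. ~1.4 billion in a
# quad-core Haswell like the 4770K.
i386_transistors = 275_000
haswell_transistors = 1_400_000_000

ratio = haswell_transistors / i386_transistors
print(f"The 4770K packs roughly {ratio:,.0f}x more transistors")
# -> roughly 5,091x more transistors in a package of broadly similar size

# Almost all of that comes from the process shrink (1.5 um -> 22 nm),
# not from a bigger die:
shrink = 1500 / 22
print(f"Linear shrink ~{shrink:.0f}x, implied area scaling ~{shrink**2:,.0f}x")
```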
 
The desktop however can pile on the power and heat.

The GeForce GTX 480 might disagree. While the issue is obviously more acute with handheld, battery-driven devices, heat and power are still very big concerns on the desktop. Heat, power draw and noise are elements in every discrete GPU review on this forum, for instance.
 
Using the wrong word. For "evolve":

to change or develop slowly often into a better, more complex, or more advanced state : to develop by a process of evolution

That's the proper definition of it.
 
That's ok, because console ports will ensure that gamers don't get to feel the full benefit.
 
I agree in a way but they were also considerably less complex.

You mean more complex, right? You used to have to install RAM chips individually, add ISA sound, video and NIC or modem cards, manually set IRQs, and possibly a SCSI CD-ROM or tape drive.

Today it's so much easier.
 
You mean more complex, right? You used to have to install RAM chips individually, add ISA sound, video and NIC or modem cards, manually set IRQs, and possibly a SCSI CD-ROM or tape drive.

Today it's so much easier.

More complicated from an end-user perspective, sure. But today's computing devices are far more complex technologically: thousands of times faster, with millions of times more memory and storage, running far more complex software connected in real time to equally complex machines doing who really knows what most of the time.
 
You do know Moore's Law, right? Of course you can increase die size, but the law is that every two years the number of transistors in a circuit doubles. Fabs get more efficient, but they also cost more to operate; it's surprising quantum computing hasn't arrived yet. Making the die bigger is working backwards. We're at 22nm, and they're already struggling to keep up the pace every two years with the tick-tock plan.
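To make the doubling concrete, here's a minimal sketch of what "doubles every two years" implies, assuming the 1971 Intel 4004 (~2,300 transistors) as the starting point; the figures are for illustration only:

```python
# Illustrative projection of "transistor count doubles every two years",
# starting from the Intel 4004 (~2,300 transistors, 1971) as an assumed
# baseline.
start_year, start_count = 1971, 2_300
doubling_period = 2  # years

for year in range(start_year, 2014, 6):
    doublings = (year - start_year) / doubling_period
    print(f"{year}: ~{start_count * 2 ** doublings:,.0f} transistors")
# By 2013 the projection is in the billions, which is roughly where
# high-end CPUs and GPUs actually landed.
```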
 
From a previous post of mine...


http://slashdot.org/story/11/01/04/1922239/45-years-later-does-moores-law-still-hold-true

In 1965, an article in "Electronics" magazine by Gordon Moore, the future founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? Actually, nowhere in the article did Moore actually spell out that famous declaration, nor does the word 'law' even appear in the article at all.

http://www.foxnews.com/tech/2011/01/...law-hold-true/

"Yes, it still matters, and yes we're still tracking it," said Mark Bohr, Intel senior fellow and director of process architecture and integration. The company is certainly one reason Moore's Law has remained in the public's mind: A section on Intel's website details the law, explaining that "his prediction, popularly known as Moore's Law, states that the number of transistors on a chip will double about every two years. Intel has kept that pace for over 40 years, providing more functions on a chip at significantly lower cost per function."

Bohr told FoxNews.com that doubling the number of chips is far less important these days than making them smaller, which has other tangible benefits for consumers.

"Moore’s law isn’t tracking exactly, but the spirit of the law is still alive in that the dies are still shrinking, and CPUs become more and more capable every 12-18 months or so," said Joel Santo Domingo, lead analyst, desktops at PCMag.com. His former boss agrees.

"I did the math, and while it’s not exactly doubling every two years, it’s pretty close," agreed Michael Miller, the award-winning math geek and former editor in chief of PCMag.com.

Maybe, as Johnny Depp said in the Pirates of the Caribbean movies, it's really more of a guideline?

"Semiconductor chips haven't actually tracked the progress predicted by Moore's law for many years," said Tom Halfhill, the well respected chip analyst with industry bible the Microprocessor Report.

Halfhill is quick to note that Moore's law isn't truly a scientific law, merely "an astute observation." In fact, since Gordon Moore made his observation in '65, the law has been modified and manipulated to fit the actual progress of semiconductors to such an extent that it can arguably be said to have predicted nothing.

It's also been so frequently misused that Halfhill was forced to define Moron's Law, which states that "the number of ignorant references to Moore's Law doubles every 12 months."

Halfhill wrote a paper for "The Microprocessor Report," published in December of 2004, which debunked the connection between Moore's Law and reality. In it, he noted that Moore's Law was more like Bode's law, an observation by early astronomers that each planet in our solar system is roughly twice as far from the sun as the planet in the next inner orbit.

"Modern astronomers don't expect the distances between planets to add up exactly, and they don’t expect other solar systems to conform to the same rules," Halfhill explained. Likewise, engineers don't really require the latest generation of computer chips to exactly meet Moore's Law either.

In fact, to make it track more closely to actual transistor counts, he proposed Epstein's amendment, named after a fellow editor at "The Microprocessor Report," which adds a leveling factor that accounts for the law of diminishing returns.

Halfhill is quick to point out that the Law is meaningless -- but the idea that computing keeps relentlessly advancing, that's what's really important.
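Michael Miller's "I did the math" line above is easy to sanity-check. A quick sketch, assuming ~2,300 transistors for the 4004 in 1971 and a ~1.4-billion-transistor quad core in 2013 as the two endpoints (illustrative figures, not his actual data):

```python
import math

# Sanity check of "pretty close to doubling every two years", using two
# assumed data points: the 4004 (~2,300 transistors, 1971) and a
# ~1.4-billion-transistor quad core in 2013. Illustrative figures only.
y0, n0 = 1971, 2_300
y1, n1 = 2013, 1_400_000_000

years = y1 - y0
doublings = math.log2(n1 / n0)
print(f"{doublings:.1f} doublings over {years} years")
print(f"Effective doubling period: {years / doublings:.2f} years")
# -> about 19 doublings in 42 years, i.e. roughly 2.2 years per doubling
```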
 
Translation: Intel will drag things out to last another 10 years to maximize revenue.
 
You mean more complex, right? You used to have to install RAM chips individually, add ISA sound, video and NIC or modem cards, manually set IRQs, and possibly a SCSI CD-ROM or tape drive.

Today it's so much easier.

The only thing on a computer back then that was plug and play was the computer's power cord. :D
 
The GeForce GTX 480 might disagree. While the issue is obviously more acute with handheld, battery-driven devices, heat and power are still very big concerns on the desktop. Heat, power draw and noise are elements in every discrete GPU review on this forum, for instance.

I have a 470 and noticed it was running hot. I installed the EVGA software and noticed the fan spun at 50% speed, max. I upped that to 65% for all but the highest tiers, where it runs at 75%, and now the card has trouble breaking 135°F.

Fermi was known for heat problems, but from my experience that was also because people were being too stingy about ramping up the fan.
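What the EVGA tweak above amounts to is a steeper fan curve: mapping GPU temperature to a higher fan duty cycle. A minimal sketch, with made-up thresholds for illustration (not real EVGA defaults), plus the 135°F figure converted to Celsius:

```python
def fan_duty(temp_c: float) -> int:
    """Map GPU temperature (deg C) to a fan duty cycle (%).
    Thresholds/percentages are made up for illustration, not EVGA defaults."""
    curve = [(0, 50), (65, 65), (80, 75)]  # (temp threshold in C, fan %)
    duty = curve[0][1]
    for threshold, percent in curve:
        if temp_c >= threshold:
            duty = percent
    return duty

# The 135 F mentioned above, in Celsius:
temp_c = (135 - 32) * 5 / 9
print(f"135F is about {temp_c:.0f}C -> fan at {fan_duty(temp_c)}%")
# -> 135F is about 57C -> fan at 50%
```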

You could make processors infinitely faster so long as you're willing to increase chip die size, but that will increase power consumption and heat, something that people aren't willing to deal with in tablets and smartphones. The desktop however can pile on the power and heat. Size is also not an issue on the desktop.

Not quite that simple. Like RAM traces, chips have to be laid out more carefully the faster they get, otherwise you run into timing issues. You could theoretically make a chip much larger, but then the chip would also run into the pesky problem of the speed of light being the limit on how fast data can travel from one end of the chip to the other. We're closer to that than you realise, hence building up with 3D gates/transistors and multiple layers of etching.
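To put some rough numbers on the speed-of-light point: assuming on-chip signals propagate at about half the vacuum speed of light (a ballpark assumption, not a measured figure), here's how far a signal can get in one clock cycle:

```python
# Back-of-the-envelope: how far can a signal travel in one clock cycle?
# Assumes on-chip signals move at roughly half the vacuum speed of light,
# which is a ballpark figure, not a measured one.
SIGNAL_SPEED_M_PER_S = 0.5 * 3.0e8

for clock_ghz in (1, 4, 10):
    cycle_s = 1 / (clock_ghz * 1e9)
    reach_mm = SIGNAL_SPEED_M_PER_S * cycle_s * 1000
    print(f"{clock_ghz} GHz: signal covers ~{reach_mm:.0f} mm per cycle")
# -> ~150 mm at 1 GHz, ~38 mm at 4 GHz, ~15 mm at 10 GHz.
# A large die is on the order of 20-30 mm across, so at multi-GHz clocks
# a signal can barely cross the chip in one cycle; one more reason to
# stack dies instead of just making them wider.
```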
 
The GeForce GTX 480 might disagree. While the issue is obviously more acute with handheld, battery-driven devices, heat and power are still very big concerns on the desktop. Heat, power draw and noise are elements in every discrete GPU review on this forum, for instance.

The way we cool GPU chips is asinine. We throw huge amounts of copper heatpipe and two to three fans at a graphics card, all to maintain a 70°C to 80°C temperature, while a water cooling solution would use a single water block with a single fan, take dramatically less space, and maintain a 50°C to 60°C temp.

And if we pile on the chips like supercomputers do, you don't need to worry about temperature so long as each individual chip is effectively cooled, like an R9 295X2 or Xeon machines with dual chips.
 