what's intel?

It used to be the world's largest chip maker, until it was split into <pretentious Madison Ave generated name> and CPU Holdings Inc. in the Apple (1976-2029, RIP)-brought anti-trust settlement.
I think this more appropriately belongs in GenMay...
That's a great video, but they've been saying that Moore's Law will break down within a decade for probably over 15 years. Every time it's in jeopardy, some new piece of technology goes from research to production and we're given another 5-7 years of headroom.
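For a rough sense of what "another 5-7 years of headroom" actually buys, here's a back-of-the-envelope sketch in Python. The 2-year doubling period is the textbook assumption (the real cadence has varied), and the numbers are purely illustrative:

```python
# Back-of-the-envelope Moore's Law arithmetic (illustrative numbers only).
# Assumes the classic ~2-year doubling period; the real cadence has varied.

DOUBLING_PERIOD_YEARS = 2.0

def scale_factor(years: float, doubling_period: float = DOUBLING_PERIOD_YEARS) -> float:
    """Transistor-count multiplier after `years` of Moore's Law scaling."""
    return 2.0 ** (years / doubling_period)

# Each "5-7 years of headroom" from a new process trick buys roughly:
for headroom in (5, 7):
    print(f"{headroom} years of headroom -> ~{scale_factor(headroom):.1f}x more transistors")

# And the 15+ years of doomsaying mentioned above has already covered:
print(f"15 years -> ~{scale_factor(15):.0f}x")
```

That's about a 6-11x transistor budget per reprieve, and roughly 180x over the 15 years people have spent predicting the end.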
When you have tens of billions of dollars of potential revenue at stake, you find out how to cheat physics. It's almost as if Intel has found a way to show that economics is stronger than physics.
I'd be willing to bet that platter drives will be end of life by then.
All I know is that in the year 2042 another person will make the prediction that Moore's Law is doomed and can no longer continue, and on my deathbed I'll choke out a laugh through an oxygen mask and say to the sexy nurse giving me a towel bath, "Hahuh, I heard that one before!"
I would be willing to bet silicon NAND flash-based SSDs will have been EOLd by then as well (actually long before then). The big question is what storage technologies will replace both.
Would we see huge gains in both efficiency and performance?
Would the 8bits-in-a-byte idea change for home computers?
Yes / No. Those two aren't mutually exclusive, even in the short term. The second-generation Atom core on 22nm coming next year will leapfrog even ARM, mostly due to the lead in process technology. (However, it won't challenge ARM in most segments... ARM has advantages in customizability and lower price which are hard to overcome.)
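On the "lead in process technology" point: to first order, transistor density scales with the inverse square of the feature size. The node figures below are illustrative assumptions (22nm Intel vs. typical 32nm/40nm competitor nodes of the time), and real density depends on much more than the headline number:

```python
# Rough first-order density comparison between process nodes.
# Density ~ 1 / (feature size)^2 is a crude idealization; real layouts
# (fin pitch, metal pitch, cell libraries) change the picture a lot.

def density_ratio(node_a_nm: float, node_b_nm: float) -> float:
    """How many times denser node_a is than node_b, to first order."""
    return (node_b_nm / node_a_nm) ** 2

# Illustrative: Intel's 22nm vs. contemporary 32nm and 40nm competitor nodes.
for competitor_nm in (32, 40):
    print(f"22nm vs {competitor_nm}nm -> ~{density_ratio(22, competitor_nm):.1f}x density")
```

Even by this crude measure, a one-node lead is worth roughly 2-3x in density, which is a lot of slack to spend on either power or performance.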
Last year, a research paper pointed out that the total instruction execution capability per second in 2007 on general-purpose computers was "in the same ballpark area as the maximum number of nerve impulses executed by one human brain per second." Both serve very different purposes, of course, and each is specialized in the methods it uses to solve problems.
There are a lot of differences that go beyond specialization. For one, compared to a computer chip, the human brain is nigh-infinitely parallel.

Well, yeah, but that's a separate point. The methods computers in the far future will use to process general tasks will also likely be different from both the brain and current computers. I mentioned the non-general-purpose quantum computer as an example.
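To put the "same ballpark" comparison in concrete terms, here's an order-of-magnitude sketch. The neuron and synapse counts, the rate ceiling, and the 2007 worldwide compute figure are all rough assumptions for illustration, not numbers taken from the paper itself:

```python
# Order-of-magnitude comparison: one brain's nerve impulses vs. humanity's
# 2007 general-purpose compute. All figures below are rough assumptions
# for illustration only, not numbers taken from the cited paper.

NEURONS = 1e11           # ~100 billion neurons
SYNAPSES = 1e14          # ~100 trillion synapses
MAX_RATE_HZ = 1e3        # ballpark upper bound on firing/event rate

WORLD_IPS_2007 = 6.4e18  # assumed worldwide instructions/s figure for 2007

brain_spikes = NEURONS * MAX_RATE_HZ   # ~1e14 spikes/s (per-neuron view)
brain_events = SYNAPSES * MAX_RATE_HZ  # ~1e17 events/s (per-synapse view)

print(f"brain, max spikes:          ~{brain_spikes:.0e} /s")
print(f"brain, max synaptic events: ~{brain_events:.0e} /s")
print(f"2007 world compute:         ~{WORLD_IPS_2007:.0e} instructions/s")
# The per-synapse figure lands within a couple orders of magnitude of the
# compute figure -- "the same ballpark" on a log scale.
```

The parallelism point also falls out of the same arithmetic: those ~1e17 events/s come from ~1e14 slow units firing at kHz rates, whereas a CPU gets its throughput from a handful of cores running at GHz.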