The year is 2042 -- Your best prediction for Intel's best chip

aphexcoil

Where will the lithography be? What exotic technologies are in use? Has Intel replaced silicon wafers with something else? How fast are the chips? How many cores do they have? What's the power draw per gigaflop?

My prediction: MoS2 transistors with reprogrammable gate arrays and a maximum performance of 10 petaflops with a power draw of 10 watts per petaflop. Four "main cores" with hundreds of mini "programmable" cores. The chip will use 0.5 nm transistors.
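For scale, the arithmetic behind that prediction is worth spelling out: 10 petaflops at 10 watts per petaflop works out to a 100 W chip delivering 100 TFLOPS per watt. Here is a minimal back-of-the-envelope sketch of the comparison; the 2011 baseline figures (roughly 100 gigaflops at a 95 W TDP for a high-end desktop CPU) are ballpark assumptions, not quoted specs.

```python
# Back-of-the-envelope check of the prediction above. The 2011 baseline
# numbers are rough assumptions (roughly a high-end desktop CPU of the
# time), not official figures.

predicted_flops = 10e15              # 10 petaflops peak
watts_per_petaflop = 10              # stated power draw
predicted_watts = (predicted_flops / 1e15) * watts_per_petaflop  # 100 W total

baseline_flops = 100e9               # assumed ~100 gigaflops for a 2011 desktop chip
baseline_watts = 95                  # assumed ~95 W TDP

predicted_eff = predicted_flops / predicted_watts  # FLOPS per watt
baseline_eff = baseline_flops / baseline_watts

print(f"Predicted 2042 chip: {predicted_watts:.0f} W, {predicted_eff / 1e12:.0f} TFLOPS/W")
print(f"Assumed 2011 chip:   {baseline_watts} W, {baseline_eff / 1e9:.1f} GFLOPS/W")
print(f"Implied efficiency gain: ~{predicted_eff / baseline_eff:,.0f}x")
```

Even spread over three decades, that is roughly five orders of magnitude of efficiency improvement, which makes the 10-watts-per-petaflop figure the boldest part of the prediction.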
 
I think this more appropriately belongs in GenMay...
 
Before that, revolutionary processors will appear on the horizon. It will be "Quantum Computers" that totally change the way the computer industry works. "As traditional silicon computers advance, they are getting nearer and nearer to their physical limit, beyond which they will be unable to advance any further."
 
I think in 20 years we will see massive integration into a System on a Chip (SoC). Everything will just be modules that plug into a backplane.

I know we want to think things will be wild in 20 years. But looking back 20 years, computers, as a general rule of thumb, have not changed much.
 
It's hard to tell where they're going to be in 5 years, even for themselves (10 GHz, anyone? :D). I think the quote "Any sufficiently advanced technology is indistinguishable from magic" sums things up pretty nicely.
 
I think in 20 years we will see massive integration into a System on a Chip (SoC). Everything will just be modules that plug into a backplane.

I know we want to think things will be wild in 20 years. But looking back 20 years, computers, as a general rule of thumb, have not changed much.

You're looking at the wrong time period; there has been no paradigm shift in the last 20 years. Go back to the late '50s through the late '70s to see the real picture, and I think it's even more dramatic considering how small the computing industry was back then compared to now.
 
2042, I'm not sure. But I know for sure that in 2142 the ice caps expand and nations fight over the last remaining ice-free lands. Battlemechs are used in a war between the EU and the PAC. Large Titans with vulnerable power cores hover over the battlefields. That ice age will have been started by the carbon reductions from Intel's integration of the GPU onto the CPU die. We have no idea what a gigantic mess Intel has started here.
 
Won't we just be plugging cords into our heads that have 80 GB hard drives in them?
 
To me this is not an easy prediction. I expect silicon to finally hit the wall engineers have been talking about for decades. That could make further CPU improvements impossible, turning the CPU into a commodity item anyone can make (since everyone else will have time to catch up to Intel). Although quantum computing, or even some other technology, could take over.
 
Won't we just be plugging cords into our heads that have 80 GB hard drives in them?

No, it's all going to be wireless.

Also:

PSU: A miniature fusion reactor, so computers will no longer need to be plugged in.
Monitor: Phased out completely and replaced with 3D holograms.
CPU: Optional full AI, indistinguishable from another human being when enabled.
KB/Mouse: Phased out and replaced with human thought.

And all of this is going to fit in a tiny 3-inch cube, which opens up a tiny hole at the top (for the 3D hologram monitor) when turned on via your mind. OK, maybe not 2042, but at some point way in the future...
 
Minority Report :D
 
You'll have to license it from Comcast, since they'll have bought up everything by then. You'll only be able to perform 250 GFLOPs per month, or they'll disconnect you from the power grid and send Comcast-Northrop drones to kill you.
 
2042, I'm not sure. But I know for sure that in 2142 the ice caps expand and nations fight over the last remaining ice-free lands. Battlemechs are used in a war between the EU and the PAC. Large Titans with vulnerable power cores hover over the battlefields. That ice age will have been started by the carbon reductions from Intel's integration of the GPU onto the CPU die. We have no idea what a gigantic mess Intel has started here.

And Core 2 Duo Extreme billboards. C2D billboards everywhere.
 
30 years... For the home user, maybe 8-12 CPU cores, a replacement for HT, 2-4 on-die graphics cores, and some further integration of components onto the processor instead of the motherboard, running at 5 GHz.
 
It's going to be some badas* DNA bio chip.

You are gonna feel a bit bad overclocking it, because you know some tiny creatures inside will have to work overtime.
 
It's going to be some badas* DNA bio chip.

You are gonna feel a bit bad overclocking it, because you know some tiny creatures inside will have to work overtime.

Overclocking = putting the nano-hamsters ("namsters" if you will) on speed and dangling a nano-sized carrot in front of them?
 
It'll probably be some kind of CPU based on the layout of brain cells: a bunch of electron transmitters and receptors. And it WILL run Crysis. Finally.

Albeit at only 30 fps max, with no AA/AF...
 
I think in 20 years we will see massive integration into a System on a Chip (SoC). Everything will just be modules that plug into a backplane.

I know we want to think things will be wild in 20 years. But looking back 20 years, computers, as a general rule of thumb, have not changed much.

What hasn't changed is the ways computers are used - a computer plus mouse, keyboard and monitor. Even the SoC that plugs into a backplane of peripheral sockets will be the same. Until the interface changes significantly - touch screens, voice input, mind control - that's unlikely to ever change.

But I think that computers have changed a lot, and with the very thing you talk about - integration. We no longer have network cards. Business-class computers no longer have video or sound cards. No more fans on chipsets. Computers in office settings (which, including laptops, make up the vast majority of computers in existence) are much more reliable than the systems being built just 10 years ago, with all of those add-on cards and extra fans to break down.

Then there are LCD monitors, which have been revolutionary, despite doing exactly the same job as before. We have larger, brighter, clearer displays and you don't get a hernia lifting them.

I think the trend will continue, with smaller and smaller PCs, rather than the suitcase sized units sitting on the floor in every cubicle. Fanless computers should become the norm within a couple of years and the energy savings of these more efficient PCs will be significant.
 
The way Intel is going with their IGPU performance, how long before there aren't even any GPUs below the mid-range?
 
The way Intel is going with their IGPU performance, how long before there aren't even any GPUs below the mid-range?

You should be looking at where AMD is going.

One of the leaked pieces of information has the top mobile Trinity chip performing just below that of the desktop GT 550.

However, GPU tech isn't standing still either. I think what we will see is the disappearance of the ultra-low-end cards (GT 430 level in terms of pricing). I don't think we'll see mid-range cards disappearing anytime soon, especially with the power and cooling constraints on CPUs.
 

That's a great video, but they've been saying that Moore's Law will break down within a decade for probably over 15 years. Every time it's in jeopardy, some new piece of technology goes from research to production and we're given another 5-7 years of headroom.

When you have tens of billions of dollars of potential revenue at stake, you find out how to cheat physics. It's almost as if Intel has found a way to show that economics is stronger than physics.
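To put the thread's 2042 horizon in concrete terms, here is a minimal sketch of what straight Moore's Law scaling (a doubling every 24 months) would imply. The ~1-billion-transistor 2011 starting point is an assumed ballpark for a high-end desktop CPU of the time, not a quoted figure.

```python
# Minimal sketch: project transistor counts under straight Moore's Law
# scaling (doubling every 24 months). The 2011 starting count of ~1 billion
# transistors is an assumed ballpark, not an official figure.

start_year = 2011
start_transistors = 1e9
doubling_period_years = 2

for year in (2022, 2032, 2042):
    doublings = (year - start_year) / doubling_period_years
    projected = start_transistors * 2 ** doublings
    print(f"{year}: ~{projected:.2e} transistors per chip "
          f"({doublings:.1f} doublings)")
```

Stretch the doubling period to three years instead of two and the 2042 figure drops by a factor of roughly 35, which is why those extra 5-7 years of headroom matter so much.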
 
My prediction: no home computers, just centralized computing hubs (supercomputers) you access from home.
 
It's hard to tell where they're going to be in 5 years, even for themselves (10 GHz, anyone? :D). I think the quote "Any sufficiently advanced technology is indistinguishable from magic" sums things up pretty nicely.

I have a pretty good bead on 5 years. 10 years is more of the crapshoot timeline.
 
In 30 years, I'll be 61 and need a C2D just for my dick. Ok really signing out, just couldn't help that one, made me laugh.
 
In 30 years, computers will be pretty much the same, except they'll be worth less than a gallon of milk because we'll all be living in a radioactive post-nuclear-war wasteland.
 
In 30 years, computers will be pretty much the same, except they'll be worth less than a gallon of milk because we'll all be living in a radioactive post-nuclear-war wasteland.

Not me, I'm keeping my ass in the Vault! And I'm taking my computer with me so I can play Fallout 1, 2, 3, and NV over and over again to keep reminding myself why I'm keeping my ass in the Vault.
 
My prediction: All the predictions are wrong. You cannot predict technology that far out.
 
Quantum computing, though what it would look like I have no idea.

The D-Wave looks like this:

[image: d-wave-quantum-computer.jpg]

Granted, it's not the whole thing, and it isn't technically a quantum computer either.

We might be using some diamond-like multi-layer graphene structure.
 
Cutting-edge consumer processors in 2042? :eek: I have a hard time imagining anything past the middle of the next decade.

I mean, I can see the end of scaling in atomic gate structures, then possibly the move upward to "true" 3D chips, then some brilliant smarty-pants developing a new computing paradigm useful for general computation that finally replaces the pimped-out concepts of the 70-year-old theories we have now, with the hardware following a few years later. Then people will be debating whether the replacement is better than the old stuff at running Crysis Centauri. ;)
 
Well, here's a slightly easier question, then: how much longer do you think Moore's Law will hold up?
 