By 2020, You Could Be Using an Optical SuperComputer

I assume I will be able to connect it to my jetpack for navigation?
 
I remember a lot of discussion about this in the late '80s and early '90s when I was an EE student. 20+ years later, it still has not happened.
 
Six years ago 1366 CPUs came out, and today's CPUs haven't changed all that much. And now you're telling us that six years from now we will have a light-speed computer?
 
Intel or AMD would never push something like this so soon (even if they had the capability)...too much money to be made by baby-stepping the performance of their CPU offerings with each new release in terms of slight IPC and clock-speed bumps. If they were to release a CPU that operates at the speed of light, then where can they really go from there that would make people want to upgrade again?

More cores? That's fine, as long as the software can utilize them all.
More IPC? How much real-world difference is a small IPC increase going to make when the clock is already running at light speed?

Just because it could be feasible to mass-produce such a processor in 6 years is no guarantee that we will see it happen in 6 years, if ever in our lifetime.
 
*yawn*... I'm pretty sure electricity "travels" at light speed already... or near it...
 
It still does not solve the foundational problem: the "parallel management" issues.
 
I remember a lot of discussion about this in the late '80s and early '90s when I was an EE student. 20+ years later, it still has not happened.
Boy, you're telling me. I have to navigate my jetpack manually and it's a bitch. When Google's self-flying jetpacks come out, I'm totally jumping into one.
 
Yeah, just wait until a bit flip happens on your data collection sensor! It's a cool idea, because essentially you're evading the branch predictor in calculations by running them all at once, but the system seems weak at the collection point.

*yawn*... I'm pretty sure electricity "travels" at light speed already... or near it...

Electrons in circuit components have a very low drift velocity, and electrical traces suffer reflections, compared to a pure optical solution with an undamaged pathway.
 
Hell, in college (back around 2000) I remember my prof talking about asynchronous CPUs Intel had in their test labs that ran at up to 7GHz using 33% less power, since none of it was wasted keeping them on the clock.
I also remember laser CPUs being discussed as the future due to lower power and less heat.
What have we seen since then?
Not a damn thing, more of the same.
If Intel and AMD build these supercomputers, they'll use them to design more of the same thing we've already got.
 
These will fit great into everyone's flying car as well!

Love how their marketing crew made sure to use "Professor Einstein" as narrator to lend it that touch of gravitas.
 
Where are Intel's Light Peak fiber optic cables that we were all supposed to be using by now?
 
Could be fun buying special math co-processors for your computer like in the old days.
 
I do not buy it at all. By 2020 we will have better versions of what we already have.

This. Corporations are always looking for the cheapest way to profit. Why alter the pot if it boils all the same?

This might see usage inside a college, but I doubt we'll have anything special by 2020.
 
According to the article, it will cost $3,500 a year in energy. At the average US rate of $0.122 per kWh, and assuming the computer runs 24/7, you're looking at about 3,275W. In reality their estimate is probably an average, so the peak wattage is probably way higher than that. This isn't a machine that will be running in a consumer's home any time soon. Not that the article says that...it's just talking about size.
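
In Python, the back-of-envelope math looks like this (using the article's $3,500/year figure, the $0.122/kWh rate, and the 24/7 assumption above):

    # Back-of-envelope: convert an annual energy bill into average power draw.
    annual_cost_usd = 3500.0     # article's claimed yearly energy cost
    rate_usd_per_kwh = 0.122     # average US electricity rate
    hours_per_year = 24 * 365    # assumes the machine runs 24/7

    kwh_per_year = annual_cost_usd / rate_usd_per_kwh   # ~28,689 kWh
    avg_watts = kwh_per_year / hours_per_year * 1000    # ~3,275 W

    print(f"{kwh_per_year:,.0f} kWh/year -> {avg_watts:,.0f} W average draw")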
 
It's been a glacial pace the last 10 years. Every 4 years I would upgrade and my clock speed would quintuple. With a 386SX-20 in 1991, I went to a 100MHz Pentium in 1995 (a case I kept as my main rig for 17 years), a 500MHz Pentium III in 1999, and a 2.53GHz P4 in 2003....We know NetBurst turned out to be NetBust, and there are other metrics besides clock speed for measuring processing power, but had the trend continued, we'd have had 12.5GHz in 2007, 62.5GHz in 2011, and next year in 2015, 312.5GHz. That would have put the first desktop terahertz processor in our hands by 2019....5 years from now! (Trend math sketched after this post.)

If you had told me back in 2003, after seeing the first 3GHz processors, that it'd be another 10 years before seeing the first stock 4GHz (overclocking aside), I'd have thought that was the most nutso thing I'd heard, on a level of Family Radio rapture end-of-the-world insanity. So computers running on light? HAHA no....even by 2030 I'd be surprised; too bad I'll hafta wait a long time to tell them "told ya so!" Put them alongside the bubble memory and holographic storage that were supposed to give us 50GB drives in 1993. Maybe hair cloning will also come out around the same time this does.
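
For what it's worth, the strict quintupling-every-4-years trend from the post above works out like this (a hypothetical Python sketch of the extrapolation, starting from the 500MHz Pentium III in 1999):

    # Hypothetical extrapolation: clock speed quintuples every 4 years.
    clock_ghz = 0.5                # 500 MHz Pentium III, 1999
    for year in range(1999, 2020, 4):
        print(f"{year}: {clock_ghz:g} GHz")
        clock_ghz *= 5
    # Output: 0.5 (1999), 2.5 (2003), 12.5 (2007), 62.5 (2011),
    # 312.5 (2015), 1562.5 GHz (2019) -- past the terahertz mark.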
 
If this is really a feasible option, it's possible that ARM or Samsung could invest in it.
 
I always find it fascinating that people expected clock rates to continue going up...completely ignoring the fact that, like everything else, it isn't a linear process. You get difficulties at higher frequencies (of anything) that have NOTHING to do with direct efficiency, heat, etc., like crosstalk interference. It's simply a limitation of physics. When you've seen things posted in the news about 10 GHz chips in labs, they're not talking about chips hitting 10 GHz as we know them, but rather switching frequencies for things like communications chips. It's a common mistake to equate the two, but really...if you were expecting 12.5 GHz in 2007, then you knew nothing of physics, electronics, or computing.

I hate to take that tone with it but as a computer engineer it just bugs me to NO END hearing people say the same thing about "frequency dearth" for the past 5-10 years when no one in the field ever expected or stated chips would achieve a certain frequency.
 
Intel or AMD would never push something like this so soon (even if they had the capability)...too much money to be made by baby-stepping the performance of their CPU offerings with each new release in terms of slight IPC and clock-speed bumps. If they were to release a CPU that operates at the speed of light, then where can they really go from there that would make people want to upgrade again?

More cores? That's fine, as long as the software can utilize them all.
More IPC? How much real-world difference is a small IPC increase going to make when the clock is already running at light speed?

Just because it could be feasible to mass-produce such a processor in 6 years is no guarantee that we will see it happen in 6 years, if ever in our lifetime.

You are right to some extent, although there will be a performance war of some form as AMD tries to reclaim the performance crown and more business.
But if both CPU giants stall progress when the technology is readily available (or easy enough to develop further), some other business or startup will steal a march on them.

Once it is viable, we will get it.
Price is another matter though :p
 
Electrons in circuit components have a very low drift velocity, and electrical traces suffer reflections, compared to a pure optical solution with an undamaged pathway.
But the speed of the electrons is irrelevant; the signal which moves electrons closer to where they are used travels at close to the speed of light.
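
Rough numbers for both effects (a sketch in Python; the 1A current and 1mm² copper wire are illustrative assumptions, not from the thread):

    # Electron drift velocity vs. signal propagation speed in a copper wire.
    n = 8.5e28        # free-electron density of copper, per m^3
    q = 1.602e-19     # elementary charge, coulombs
    area = 1e-6       # 1 mm^2 cross-section, in m^2 (assumed)
    current = 1.0     # 1 A of current (assumed)

    v_drift = current / (n * q * area)   # ~7.3e-5 m/s: the electrons crawl
    v_signal = 0.7 * 3e8                 # ~2.1e8 m/s: the field propagates
                                         # at a large fraction of c

    print(f"drift: {v_drift:.1e} m/s, signal: {v_signal:.1e} m/s")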
 
I think the thing that saddens me is that, at my present rate, in 2020 I will still be using my AMD Thuban 1055T. :(
 
Anyone else want to start placing bets on what will happen first? Social Security will run dry before these computers are manufactured in decent numbers.

Can't wait for the first overclock attempt. Somebody is going to turn that thing into an awesome laser death show.
 
I do not buy it at all. By 2020 we will have better versions of what we already have.

Slightly better at the rate we're going. Seems we've been going extremely slow since the Core i series was released. Minor improvements, but nothing huge. If you're running an overclocked 2600K or similar, you're still fine with what's out today.
 
So in the future all computers will have lots of blinking lights?
Just like predicted in every old movie?
 
Slightly better at the rate we're going. Seems we've been going extremely slow since the Core i series was released. Minor improvements, but nothing huge. If you're running an overclocked 2600K or similar, you're still fine with what's out today.

I say this is a result of the 22nm process not being an improvement over the 32nm process at frequencies over 4GHz. AMD has a similar problem: their 28nm bulk process is not better than their 32nm SOI at the same frequencies.
 
So..... I now know there is no chance that I'll be using an optical supercomputer by 2020. Are these predictions ever right?! After 30 years of reading them: rarely, if ever.
 