By 2020, You Could Be Using an Optical SuperComputer

Babbster

[H]ard|Gawd
Joined
Jan 13, 2006
Messages
1,434
I assume I will be able to connect it to my jetpack for navigation?
 

drescherjm

[H]F Junkie
Joined
Nov 19, 2008
Messages
14,938
I assume I will be able to connect it to my jetpack for navigation?

I remember a lot of discussion about this in the late 80s early 90s when I was an EE student. 20+ years later it still has not happened..
 

rocketr2

Gawd
Joined
Mar 16, 2007
Messages
770
Six years ago 1366 CPUs came out, and today's CPUs haven't changed all that much. And now you're telling us that six years from now we will have a light-speed computer?
 

DejaWiz

Fully [H]
Joined
Apr 15, 2005
Messages
21,587
Intel or AMD would never push something like this so soon (even if they had the capability)...too much money to be made by baby-stepping the performance of their CPU offerings with each new release via slight IPC and clock speed gains. If they were to release a CPU that operates at the speed of light, then where could they really go from there that would make people want to upgrade again?

More cores? That's fine, as long as the software can utilize them all.
More IPC? How much real-world difference is a small IPC increase going to make when the chip is already running at an insane, light-speed clock?

Just because it could be feasible to mass-produce such a processor in 6 years is no guarantee that we will see it happen in 6 years, if ever in our lifetime.
 

ShadowVlican

Limp Gawd
Joined
Nov 12, 2006
Messages
420
*yawn*... i'm pretty sure electricity "travels" at light speed already... or near it...
 

allenpan

[H]ard|Gawd
Joined
Jul 27, 2005
Messages
1,724
It still does not solve the foundational problem: the "parallel management" issues.
 

Babbster

[H]ard|Gawd
Joined
Jan 13, 2006
Messages
1,434
I remember a lot of discussion about this in the late 80s early 90s when I was an EE student. 20+ years later it still has not happened..
Boy, you're telling me. I have to navigate my jetpack manually and it's a bitch. When Google's self-flying jetpacks come out, I'm totally jumping into one.
 

Farkle

Lurker
Joined
Jan 1, 2007
Messages
1,608
Yeah, just wait until a bit flip happens on your data collection sensor! It's a cool idea, because essentially you're evading the branch predictor in calculations by running them all at once, but the system seems weak at the collection point.

*yawn*... i'm pretty sure electricity "travels" at light speed already... or near it...

Electrons in circuit components have quite a low drift velocity, and electrical paths suffer reflections compared to a pure optical solution with an undamaged pathway.
 

SGA76

[H]ard|Gawd
Joined
Jul 23, 2013
Messages
1,955
Hell, in college (back around 2000) I remember my prof talking about asynchronous CPUs Intel had in their test labs that ran up to 7 GHz with 33% better power efficiency, since none of it was wasted keeping the chip on the clock.
I also remember laser cpus being discussed as the future due to less power and less heat.
What have we seen since then?
Not a damn thing, more of the same.
If Intel and AMD build these super computers, they'll use them to design more of the same thing we've already got.
 
Joined
Nov 13, 2006
Messages
3,322
These will fit great into everyone's flying car as well!

Love how their marketing crew made sure to use "Professor Einstein" as narrator to lend it that touch of gravitas.
 

BladeVenom

Supreme [H]ardness
Joined
Jun 29, 2005
Messages
7,707
Where are Intel's Light Peak fiber optic cables that we were all supposed to be using by now?
 

Cheetoz

[H]ard|Gawd
Joined
Mar 3, 2003
Messages
1,972
Could be fun buying special math co-processors for your computer like in the old days.
 

Godmachine

[H]F Junkie
Joined
Apr 7, 2003
Messages
10,472
I do not buy it at all. By 2020 we will have better versions of what we already have.

This. Corporations are always looking for the cheapest way to profit. Why alter the pot if it boils all the same?

This might see usage inside of a college but I doubt we'll have anything special by 2020.
 

Maxx

[H]ard|Gawd
Joined
Mar 31, 2003
Messages
1,648
According to the article, it will cost $3500 a year in energy. At the average US cost of $0.122 per KWH and assuming the computer is run 24-7, you're looking at 3275W. In reality their estimate is probably an average and thus the wattage is probably way higher than that. This isn't a machine that will be running in a consumer's home any time soon. Not that the article says that...it's just talking about size.
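The wattage math above can be sanity-checked in a few lines (a quick sketch using the figures stated in the post: $3,500/year at the US average of $0.122/kWh, running 24/7):

```python
# Rough check of the average power draw implied by the article's
# stated $3,500/year energy cost (figures taken from the post above).
annual_cost_usd = 3500.0
price_per_kwh = 0.122      # assumed average US electricity price
hours_per_year = 24 * 365  # running 24/7

annual_kwh = annual_cost_usd / price_per_kwh       # ~28,689 kWh/year
average_watts = annual_kwh / hours_per_year * 1000  # kW -> W

print(round(average_watts))  # -> 3275
```

As the post notes, that 3275 W is an average implied by the annual cost; peak draw would be higher.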
 

Synful Serenity

[H]ard|Gawd
Joined
Jun 4, 2004
Messages
1,256
It's been a glacial pace the last 10 years. Every 4 years I would upgrade and my clock speed would quintuple. From a 386SX-20 in 1991, I went to a 100MHz Pentium in 1995 (a case I kept as my main rig for 17 years), a 500MHz Pentium III in 1999, and a 2.53GHz P4 in 2003....We know NetBurst turned out to be NetBust, and there are other metrics besides clock speed for measuring processing power, but had the trend continued, we'd have had 12.5GHz in 2007, 50GHz in 2011, and next year in 2015, 250GHz. That would have put the first desktop terahertz processor in our hands by 2019....5 years from now!
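The quintupling-every-4-years trend above can be sketched in a couple of lines (hypothetical extrapolation, taking a round 2.5 GHz as the 2003 starting point; the post's in-between figures are rounded a bit differently):

```python
# Extrapolate the ~5x-every-4-years clock trend described in the post,
# starting from a round 2.5 GHz in 2003.
clock_ghz, year = 2.5, 2003
while year < 2019:
    year += 4
    clock_ghz *= 5
    print(year, clock_ghz, "GHz")
# 2019 lands around 1562.5 GHz -- the "desktop terahertz processor"
```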

If you had told me back in 2003 after seeing the first 3GHz processors, that it'd be another 10 years before seeing the first stock 4GHz (besides OC), I'd have thought that would be the most nutso thing I'd heard, on a level of Family Radio rapture end-of-the-world insanity. So computers running on light? HAHA no....even by 2030 I'd be surprised, too bad I'll hafta wait a long time to tell them "told ya so!" Put them alongside the bubble memory and holographic storage that was supposed to give us 50GB drives in 1993. Maybe hair cloning will also come out around the same time this does.
 

silk186

[H]ard|Gawd
Joined
Feb 26, 2008
Messages
1,623
If this is really a feasible option, it's possible that ARM or Samsung could invest in it.
 

Maxx

[H]ard|Gawd
Joined
Mar 31, 2003
Messages
1,648
I always find it fascinating that people expected clock rates to continue going up...completely ignoring the fact that, like everything else, it isn't a linear process. You get difficulties at higher frequencies (of anything) that have NOTHING to do with direct efficiency, heat, etc., like crosstalk interference. It's simply a limitation of physics. When you've seen things posted in the news about 10 GHz chips in labs, they're not talking about chips hitting 10 GHz as we know them but rather switching frequencies for things like communications chips. It's a common mistake to equate the two, but really...if you were expecting 12.5 GHz in 2007 then you knew nothing of physics, electronics, or computing.

I hate to take that tone with it but as a computer engineer it just bugs me to NO END hearing people say the same thing about "frequency dearth" for the past 5-10 years when no one in the field ever expected or stated chips would achieve a certain frequency.
 

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
20,134
Intel or AMD would never push something like this so soon (even if they had the capability)...too much money to be made by baby-stepping the performance of their CPU offerings with each new release via slight IPC and clock speed gains. If they were to release a CPU that operates at the speed of light, then where could they really go from there that would make people want to upgrade again?

More cores? That's fine, as long as the software can utilize them all.
More IPC? How much real-world difference is a small IPC increase going to make when the chip is already running at an insane, light-speed clock?

Just because it could be feasible to mass-produce such a processor in 6 years is no guarantee that we will see it happen in 6 years, if ever in our lifetime.

You are right to some extent, although there will be a performance war of some form as AMD tries to reclaim a performance crown and more business.
But also, if both CPU giants stall progression when it is readily available (or easy enough to develop further), some other business or startup will steal a march on them.

Once it is viable, we will get it.
Price is another matter though :p
 

sfsuphysics

[H]F Junkie
Joined
Jan 14, 2007
Messages
15,259
Electrons in circuit components have quite a low drift velocity, and electrical paths suffer reflections compared to a pure optical solution with an undamaged pathway.
But the speed of the electrons is irrelevant; the signal, which moves electrons closer to where they are used, travels at close to the speed of light.
 

Dekoth-E-

Supreme [H]ardness
Joined
Mar 23, 2010
Messages
7,599
I think the thing that saddens me is that in 2020, at my present rate, I will still be using my AMD Thuban 1055T. :(
 

jon666

Limp Gawd
Joined
Feb 5, 2013
Messages
241
Anyone else want to start placing bets on what will happen first? Social Security will run dry before these computers are manufactured in decent numbers.

Can't wait for the first overclock attempt. Somebody is going to turn that thing into an awesome laser death show.
 

Ur_Mom

Fully [H]
Joined
May 15, 2006
Messages
20,595
I do not buy it at all. By 2020 we will have better versions of what we already have.

Slightly better at the rate we're going. Seems we've been going extremely slow since the Core i series was released. Minor improvements, but nothing huge. If you're running an overclocked 2600K or similar, you're still fine with what's out today.
 
Joined
Nov 12, 2012
Messages
44
So in the future all computers will have lots of blinking lights?
Just like predicted in every old movie?
 

drescherjm

[H]F Junkie
Joined
Nov 19, 2008
Messages
14,938
Slightly better at the rate we're going. Seems we've been going extremely slow since the Core i series was released. Minor improvements, but nothing huge. If you're running an overclocked 2600K or similar, you're still fine with what's out today.

I say this is a result of the 22nm process not being an improvement over the 32nm process at frequencies over 4GHz. AMD has a similar problem: their 28nm bulk process is no better than their 32nm SOI at the same frequencies.
 

Starcrossed

2[H]4U
Joined
Sep 27, 2008
Messages
2,160
So..... I now know there is no chance that I'll be using an optical supercomputer by 2020. Are these predictions ever right?! After 30 years of reading them: rarely, if ever.
 