10GHz, impossible dream?

IMO, no. Unless there's some miraculous breakthrough in heat-reduction technology, I doubt we'll get past 5GHz. But really, who needs that kind of speed?
 
"Who needs more than 640 Kb of RAM?"

Will we see 10GHz? Maybe, maybe not, but rest assured that computing power will continue to increase. We may not see the huge leaps and bounds we're used to seeing every 3-4 years, but we'll continue to see steady, measurable growth in personal computing power.
 
Just because clock speed doesn't increase doesn't mean performance won't. My 3700+ San Diego is very close to a 30-second SuperPi run, and that's on air cooling. In the first quarter of this year it took a water-cooled FX to achieve a 30-second 1M SuperPi run; now I have a chip that I traded $200 worth of used parts for that can do it.

Will we see a 10GHz chip? Probably not, and certainly not anytime soon. 10GHz is about the limit of current transistor technology; witness the double-pumped adders in Prescott keeping the vast majority of those chips, even the vapor-cooled ones, under 5GHz. Will we see a chip that delivers performance equal to the hypothetical 10GHz processor? Certainly. There are many ways to deliver better application-level performance besides simply raising clock speed.
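To put rough numbers on that last point: useful work per second is roughly IPC times clock, so a lower-clocked, higher-IPC design can match a higher-clocked one. Here's a quick back-of-the-envelope sketch in Python; every figure in it is an illustrative guess, not a measurement of any real chip:

```python
# Toy model: performance ~ instructions-per-clock * clock frequency.
# All IPC and clock numbers below are made-up illustrations.

def instrs_per_second(ipc, clock_ghz):
    return ipc * clock_ghz * 1e9

deep_pipeline  = instrs_per_second(ipc=0.8, clock_ghz=3.8)  # NetBurst-style
short_pipeline = instrs_per_second(ipc=1.3, clock_ghz=2.6)  # K8-style

print(f"high clock, low IPC : {deep_pipeline:.2e} instructions/s")
print(f"low clock, high IPC : {short_pipeline:.2e} instructions/s")

# By this crude measure, a hypothetical 10GHz chip at 0.8 IPC is matched
# by a ~6.2GHz chip at 1.3 IPC: the same "10GHz performance", no 10GHz clock.
```

Wider issue, bigger caches, and more cores all raise effective IPC without touching the clock.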
 
There are already microprocessors that run at 10GHz, but they aren't something you would want to run in a PC. As the people above me said, GHz != processing power.

And you don't necessarily have to have an FX-55 on water to get a 30s pi run. My 3500+ Clawhammer does it too.
 
Will we get the performance?
Yes.

Will we see 10GHz chips?
Probably not.

I expect we'd move away from clock cycles completely instead (i.e. bio-computing).
 
I hope we do move on to something more than "GHz" and see something new. I like experimenting with new hardware; I wouldn't mind having something new to learn.
 
Sure, it's possible: four 2.5GHz cores, lol.

AMD X4 10000, $2000 [add to cart]

Hehe :D
 
Tons of the P4 670s reach 7GHz on triple cascades :O

Intel said they would have 20GHz by 2008; I read it on their website when I was researching for a report.

They also said it would have a billion transistors and 1nm SOI.

Of course, Moore's law isn't really holding true anymore.
 
One thing this industry has taught me is to never say "never".....

But - everyone above is right. MHz doesn't always equate linearly to processing power. There are tremendous advances possible in both hardware and software to get more done per clock cycle. I've even read about "clockless" processing where the system is literally static at idle but can move as fast as physics allows when there's work to be done.....twist on that one for a while.....:cool: Also, optical computing looks very interesting; data is literally moving and being handled at the speed of light; laser light, actually.

Cheers - B.B.S.
 
Didn't some university get a transistor to run at 640GHz awhile back? I don't think it was silicon-based.
 
Obi_Kwiet said:
Didn't some university get a transistor to run at 640GHz awhile back? I don't think it was silicon-based.

Yeah, it was a gallium arsenide-based transistor, IIRC.
 
I think we've hit our speed ceiling. That's why we see the emphasis on multi-core procs now. The next part of the evolution, IMHO, is more and more software designed to use more and more processors.

Look at SLI. That doesn't have to be the limit. I foresee quad GPUs on a dual-socket mobo with two quad-core procs, gaming software designed to utilize all of it, and people complaining when the next gen of proc is going to be ready.
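For what it's worth, the software side of that prediction is easy to sketch. Here's a minimal example of splitting work across however many cores the OS reports, using Python's standard multiprocessing module; the per-chunk work is just a stand-in:

```python
# Minimal sketch of "software designed to use more and more processors":
# divide an embarrassingly parallel job across every available core.
from multiprocessing import Pool, cpu_count

def render_chunk(chunk_id):
    # Stand-in for real per-core work (physics, AI, a slice of a frame...).
    return sum(i * i for i in range(1_000_000))

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(render_chunk, range(cpu_count()))
    print(f"finished {len(results)} chunks on {cpu_count()} cores")
```

The hard part, of course, is that game workloads aren't embarrassingly parallel, which is exactly why the software has to be *designed* for it.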
 
mrhemmy said:
Of course, Moore's law isn't really holding true anymore.

AFAIK Moore's law refers to the integration of components, not raw transistor count. With sooo many things being integrated lately, I think it's "holding up" just fine for the time being.
 
BlindedByScience said:
...optical computing looks very interesting; data is literally moving and being handled at the speed of light; laser light, actually.

Something in that same vein...

http://www.sciencedaily.com/releases/2005/09/050928081542.htm
 
10GHz could be possible, but not necessary. One thing is clear, though: processors will get faster and faster and faster. Maybe someday we won't even NEED processors. :eek:

Maybe someday we won't even need computers. :eek:
 
The optical theory is fantastic; the only drawback is the ability to make nano-scale parts for these chips. However, something I learned about a year ago was DNA chips. DNA is measured in angstroms and is much smaller than current transistors. A teardrop of DNA could be hundreds of times faster than current supercomputers, getting computers into the high terahertz and terabytes of disk storage. If you know anything about DNA, then you will quickly see that the capacity for information storage is enormous in such a small space (typically the size of one cell). Here's a read on it:

http://computer.howstuffworks.com/dna-computer.htm

Look to this (or optical) as the future, as Moore's law will be pretty much worthless in about a decade or less.
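The storage capacity isn't quite "infinite", but the back-of-the-envelope is still staggering: at roughly 2 bits per nucleotide and about 330 g/mol per nucleotide in a strand, one gram of DNA could in theory hold hundreds of exabytes. A quick sanity check (theoretical ceiling only, ignoring all practical encoding overhead):

```python
# Theoretical DNA storage density: ~2 bits per nucleotide,
# average nucleotide mass ~330 g/mol within a DNA strand.
AVOGADRO = 6.022e23                      # nucleotides per mole
bits_per_gram = 2 * AVOGADRO / 330
exabytes_per_gram = bits_per_gram / 8 / 1e18
print(f"~{exabytes_per_gram:.0f} exabytes per gram (theoretical ceiling)")
```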
 
BlindedByScience said:
One thing this industry has taught me is to never say "never".....
I've even read about "clockless" processing where the system is literally static at idle but can move as fast as physics allows when there's work to be done.....t

Cheers - B.B.S.

Asynchronous chips are fairly useful (and fairly common) for applications where the output is continuous: digital signal processing, radio-frequency generators, etc. But they aren't very practical when you need to pick discrete values out of an output. In an ADD or multiply, for instance, the lower-order bits complete the operation well before the higher-order bits do. Picking the 'correct' value, after all the output bits have settled on their final state for the current operation, is extraordinarily difficult. That's why we synchronize the output to a clock signal: you define the max time an operation can take and read the output after that time, and only after that time, thus guaranteeing the output has reached its final state.
(Keep in mind that things like age, voltage sag, temperature, and random variation all affect the exact amount of time an operation takes, so highly accurate timers or even built-in delays on a read are not sufficient to ensure accurate results.)
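To make the settling problem concrete, here's a toy Python model of an n-bit ripple-carry adder; the gate delay is a made-up unit, but the shape of the result is the point: the carry has to ripple through every bit position, so the high-order sum bits aren't trustworthy until long after the low-order ones.

```python
# Toy settling-time model for an n-bit ripple-carry adder. GATE_DELAY
# is an arbitrary illustrative unit; what matters is that sum bit i
# can't be trusted until the carry has rippled up to position i.

GATE_DELAY = 1.0  # time units per full-adder stage (illustrative)

def worst_case_settle_times(bits=8):
    """Worst-case time at which each sum bit is guaranteed stable."""
    return [(i + 1) * GATE_DELAY for i in range(bits)]

print(worst_case_settle_times())
# -> [1.0, 2.0, ..., 8.0]. A clocked design sidesteps the mess by
#    latching the output only after the worst case (8.0 here), which is
#    exactly the "read only after the max time" rule described above.
```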


It's certainly possible (or will be, before silicon leaves us as the dominant semiconductor material) to build a 10GHz desktop CPU. It's probably not practical to do so, and it likely won't be as efficient (in performance per watt or performance per dollar, after cooling) as more modest clock speeds with highly parallel designs.
I would not be at all surprised to see Intel run with the idea of the rapid execution engine, where the simplest instructions complete in half a clock cycle: most of the CPU runs at, say, 3GHz, but simple logic operators are handled at 6GHz and the load/store units operate at maybe 1.5GHz.
 
upriverpaddler said:
I think we've hit our speed ceiling. That's why we see the emphasis on multi-core procs now. The next part of the evolution, IMHO, is more and more software designed to use more and more processors.

Look at SLI. That doesn't have to be the limit. I foresee quad GPUs on a dual-socket mobo with two quad-core procs, gaming software designed to utilize all of it, and people complaining when the next gen of proc is going to be ready.

No, I don't think this is true.

We see the emphasis on dual graphics cards and dual cores and such not for speed, but for throughput. Big difference.

As for DNA computers: I don't know much about the technology, but I do understand that the human body is far more complex than a computer. One strand of DNA can hold reams of information, and the memory capacity of the brain is incredible. But I'm wondering: if we ever get to the point of using DNA processors, will the speed and power of the processor even be measured in Hz?
 
Damn, that's insane, organic material inside chips.

That'd be funny if we had to refrigerate our chips to keep them from rotting, lol.
 
To be honest, you can't really say what we will or won't invent. Sure, it's possible there will be a 10GHz CPU, but who can guarantee we will live to see it? Right now we're in the computer age; everything revolves around computers nowadays. To say there will never be this or never be that is just a half-assed statement, because every day something new is being converted to computer control. I'm sure anyone who went to the Belmont horse track a year ago and visits now will see that all the vendors have been replaced with computers. Eventually we may actually have something that *requires* a CPU with a 10GHz clock speed. Who knows?
 
BlindedByScience said:
I've even read about "clockless" processing where the system is literally static at idle but can move as fast as physics allows when there's work to be done.....

I could have sworn that several years back, around the PPro days, Intel actually developed a clockless Pentium. They never released it, since the performance wasn't there for the cost of manufacturing and everything.

 
xdkimx said:
Damn, that's insane, organic material inside chips.

That'd be funny if we had to refrigerate our chips to keep them from rotting, lol.

Cooking would be more accurate :p If the chips were clean and sealed, there's no way for anything to get in there to rot them.

Be funny, though, having the organic material of an improperly sealed chip actually catch a REAL virus :eek:
 
Unknown-One said:
Be funny, though, having the organic material of an improperly sealed chip actually catch a REAL virus :eek:

Oh crap, my CPU got VD from Slutz.com.
Coming soon: Zalman CPU condoms.
 
Then there's always quantum computing in the future. I remember reading about how they got a quantum computer to factor the number 15 a while ago. A quantum computer will be able to factor a 1000-digit number in minutes, as opposed to universe life cycles :p
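That number-15 demo was Shor's algorithm, and the quantum speedup lives entirely in one step: finding the period of a^x mod N. Here's a classical walk-through of the recipe for N = 15, with the period found by brute force; that loop is the part that blows up exponentially for big N, and the part a quantum computer does fast:

```python
# Classical sketch of Shor's factoring recipe for N = 15. The brute-
# force order-finding loop is the step a quantum computer replaces;
# everything else is cheap classical arithmetic.
from math import gcd

def factor_via_period(N=15, a=7):
    r = 1
    while pow(a, r, N) != 1:   # find the order r of a modulo N
        r += 1
    if r % 2:
        return None            # odd period: retry with a different base a
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    return p, q

print(factor_via_period())     # -> (3, 5)
```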
 
Did anyone think we'd see a man walk on the moon? Think of the computer as a horse and buggy; cars replaced that. Come on, you think there are limits? You only limit yourselves. Things will only get better. If they don't, quit posting in here, because you will all own the best stuff and have it modded to the fullest.
 
The day they figure out how to minimize leakage currents is the day we start seeing the MHz war again. Until then... no joy, just more transistors and optimized feature sets. Leakage current is a huge deal right now across all manufacturers. Hell, even the newest dsPIC controllers we use consume 5x more current than they did 10 years ago with no clock. They are working on it, though; it'd be nifty to see the solution (read as: lots of free lunches from vendors... mmmm... pizza).
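A crude power model shows why leakage caps the clock race: dynamic (switching) power scales like C*V^2*f, you have to raise the voltage to hit higher clocks, and leakage burns watts even at idle. All the constants below are illustrative guesses, not numbers from any real part:

```python
# Back-of-envelope CPU power: dynamic = C_eff * V^2 * f, plus static
# leakage = V * I_leak. Every constant here is an illustrative guess.

def cpu_power(freq_ghz, volts, c_eff_farads=15e-9, i_leak_amps=15.0):
    dynamic = c_eff_farads * volts**2 * freq_ghz * 1e9  # switching power
    leakage = volts * i_leak_amps                       # burns even at idle
    return dynamic, leakage

for f, v in [(2.0, 1.1), (3.8, 1.4), (10.0, 1.9)]:  # voltage rises with clock
    dyn, leak = cpu_power(f, v)
    print(f"{f:>4.1f}GHz @ {v:.1f}V: {dyn:5.1f}W dynamic + {leak:4.1f}W leakage")

# The hypothetical 10GHz line lands in the hundreds of watts, which is
# why the industry went multi-core at modest clocks instead.
```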

-tReP
 
sc00ter said:
IMO, no. Unless there's some miraculous breakthrough in heat-reduction technology, I doubt we'll get past 5GHz. But really, who needs that kind of speed?

Hell, 5 years ago I didn't think we'd get past 2GHz :( and look at where we are now. Wait a couple of years; something might just happen.
 
I don't see it as being that likely.

Not because it isn't possible, but because it doesn't make any sense.

Intel has learned the hard way that top clocks don't automatically equate to top performance. AMD is beating them about the face with a lower-clocked, lower-power, higher-performance product even as we speak. :)

We're on the turn of the wheel where we go back towards distributed computing and away from the big-box strategy. Multi-core is just another facet of that move.
 
Agreed, but are you saying that even in 2055 there won't be any 10GHz chips?
 
There will be 10GHz chips far before 2055; there will be 10GHz chips by 2008. I predict 1THz chips by 2025.

The fact is, with computers we will always find ways to make them faster and more powerful. Unbreakable physical limitations don't exist; we just don't have the ideas yet to melt through them.

And clock speed is important: the Pentium M is a more powerful chip clock-for-clock than a Pentium 4, but the fastest Pentium 4s kill the fastest Pentium Ms. Eventually, clock speed does come into play.
 
The organic material you're thinking of is not protein (so the chips don't cook); it's the nucleic acid bases that form DNA/RNA. There are also no cellular proteins to aid in replication and such, so a virus could never infect your computer, even somehow. Funny thought, though :)
 
upriverpaddler said:
Look at SLI. That doesn't have to be the limit. I foresee quad GPUs on a dual-socket mobo with two quad-core procs, gaming software designed to utilize all of it...

Hate to say I told ya so:
http://www.tomshardware.com/motherboard/20051004/index.html

I'm not saying it's impossible. I just think it's impractical and not cost-effective for consumers.
 
Bump, in case anyone has more info on this.

I think if they go with more nanotech in the future, moving light would be much easier than it is now, rather than electricity.
 
The new Intels on 65nm are getting good OCs to almost 5GHz. With our technology's growth we will definitely see something that can perform at what we now measure in GHz. Even if they change to some other technology, the speed of what we would think of as 10GHz will be there. I really doubt the people first coming out with PCs in the '70s, or whenever it was, running at 6MHz thought 1GHz was something that could ever be accomplished.

I highly doubt 10GHz is out of our range of performance, in terms of what something like that could accomplish from our perspective of speed. It may eventually end, because things can only get so real and so fast. Once we can run everything at the best quality we can create with perfect ease, then we may have to think of something new, or just downsize everything. Granted, none of us may ever see that, but it will happen some day, if everyone on this planet doesn't blow each other up first.
 
I'd rather have half the memory latency than twice the clock speed.

I'm not so excited about dual-core machines. For the dollar, I'm finding them too expensive so far. While they'll appeal to people who naively think you can just sum the clock speeds of the involved processors to get a measure of effective computing power, the problem is that each of those processor cores is splitting the bandwidth of the memory interface and might be competing with the other for cache.
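The latency preference is easy to justify with a simple CPI-with-stalls model: memory stall time is fixed in nanoseconds, so doubling the core clock just makes the core wait more cycles, while halving the latency shrinks the wait directly. A sketch with made-up workload constants:

```python
# Simple runtime model: time = instrs * (base CPI + stall cycles) / f.
# Stall cycles = mem refs/instr * miss rate * latency_ns * f_GHz, since
# a fixed nanosecond latency costs more cycles at a higher clock.
# All workload constants below are made-up illustrations.

def runtime_s(freq_ghz, mem_latency_ns, instrs=1e9, base_cpi=1.0,
              miss_rate=0.02, mem_refs_per_instr=0.3):
    stall_cycles = mem_refs_per_instr * miss_rate * mem_latency_ns * freq_ghz
    return instrs * (base_cpi + stall_cycles) / (freq_ghz * 1e9)

print(f"baseline        : {runtime_s(2.5, 100):.2f}s")  # 1.00s
print(f"2x clock        : {runtime_s(5.0, 100):.2f}s")  # 0.80s
print(f"1/2 mem latency : {runtime_s(2.5, 50):.2f}s")   # 0.70s
# With these numbers, halving memory latency beats doubling the clock.
```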
 
BlindedByScience said:
I've even read about "clockless" processing where the system is literally static at idle but can move as fast as physics allows when there's work to be done.....

"Clockless"? Sweet! Can I come work in the lab with the team developing that?

I expect we'll see 10GHz CPUs or better in 5 years or so. Will the need be there? Yeah; scientists will no longer have to buy time on a supercomputer to run their simulations, and games will be awesome!

 
Just wait for quantum computing.
The performance will be beyond anything you can rationally imagine.
 