How the 'Memristor' Could Revolutionize Electronics

CommanderFrank

Are you ready for a revolution in computing that took 37 years for technology to catch up with the original concept? Hewlett Packard has been hard at work on the memristor, a device that can and will change the way computers work. HP plans to introduce 'The Machine', a computer based on memristor technology, in 2020.

Memristor technology is a candidate for this crucial step: it could mean the end of the silicon era, giving us lower power consumption, the ability to process more information, increased data storage and completely new logic patterns for our computers.
 
Also lasers, fiber optics, carbon nanotubes, etc. I'll get excited when it's available for my PC.
 
Memristors, she points out, function in a way that is similar to a human brain: "Unlike a transistor, which is based on binary codes, a memristor can have multi-levels. You could have several states, let's say zero, one half, one quarter, one third, and so on, and that gives us a very powerful new perspective on how our computers may develop in the future," she told CNN's Nick Glass.

Such a shift in computing methodology would allow us to create "smart" computers that operate in a way reminiscent of the synapses in our brains.

Free from the limitations of the 0s and 1s, these more powerful computers would be able to learn and make decisions, ultimately getting us one step closer to creating human-like artificial intelligence.
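To put rough numbers on the multi-level idea: the amount of data a cell can hold grows with the logarithm of how many states it can reliably distinguish. A minimal sketch (plain Python, nothing memristor-specific):

[code]
import math

def bits_per_cell(levels: int) -> float:
    """Bits a single cell stores if it can reliably
    hold `levels` distinct states."""
    return math.log2(levels)

# A binary transistor cell: 2 states -> 1 bit.
# A hypothetical 4-level memristor cell: 2 bits in the same space.
for levels in (2, 4, 8, 16):
    print(f"{levels:>2} levels -> {bits_per_cell(levels):.0f} bits per cell")
[/code]

So a cell with four stable states holds twice what a binary cell does, and sixteen states would quadruple it; the catch is telling the levels apart reliably.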

I take it that everything will have to be re-written? Seems like a shitload of work.
 
I remember the initial news report when the first memristor was made. Couldn't really wrap my mind around how it could help increase computing power though.
 
My favorite memristor feature: it's always going to change computing within 10 years, and it's been going to for twenty years straight!

Way to go memristor! :rolleyes:

The memristor's childhood friend, Next Generation Nano-Silicon Anode, could not be reached for comment; it was too busy doubling lithium-ion battery capacity and reducing charge times to 30 seconds within five years.
 
Memristors, she points out, function in a way that is similar to a human brain: "Unlike a transistor, which is based on binary codes, a memristor can have multi-levels. You could have several states, let's say zero, one half, one quarter, one third, and so on, and that gives us a very powerful new perspective on how our computers may develop in the future," she told CNN's Nick Glass.

Wait, so it's MLC Flash?

We've already had dozens of technologies that can operate at different voltage levels, including transistors. While it's easier to think of these things as "digital," in the end everything is analog (and sometimes that's the only way to increase performance).

You can probably get a whole lot of distinct logic levels out of a memristor, but at the end of the day the number of steps will be finite. We will still live in a digital world, because it's easier.

Speaking of making life more difficult, wasn't the huge selling point of memristors that we could use the same storage for working memory and persistent storage, forcing us to rewrite software so that there is no RAM?
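On the "no RAM" question: the programming model people describe for universal memory looks a lot like memory-mapping, where data structures live directly in persistent storage instead of being loaded and saved. A rough analogy using today's tools (Python's mmap; the file name is made up for the demo):

[code]
import mmap
import os

# Treat a file as directly addressable memory -- a crude stand-in
# for how persistent memristor storage might replace the
# load-from-disk / save-to-disk cycle.
PATH = "counter.bin"  # hypothetical demo file

if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * 8)  # reserve 8 bytes

with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), 8)
    count = int.from_bytes(mem[:8], "little")
    mem[:8] = (count + 1).to_bytes(8, "little")  # update "in place"
    mem.flush()  # true non-volatile memory wouldn't need this step
    mem.close()

print(f"run count: {count + 1}")  # the value survives between runs
[/code]

Nothing would strictly have to be rewritten, but software that assumes a RAM/disk split would be leaving the main benefit on the table.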
 
I thought that Intel was working on this a couple years ago.

It was closer to three years ago, but I found an article discussing it: http://www.technologyreview.com/view/428235/intel-reveals-neuromorphic-chip-design/

Intel was working with memristors and lateral spin valves to create what they called a neuromorphic chip, which was supposed to mimic the way the human brain works. Hmmm, sounds familiar (to be fair, it sounds like HP has been working on it for longer).
 
I take it that everything will have to be re-written? Seems like a shitload of work.

This technology wouldn't be for normal computers, at least not anytime soon.

It's more for AI and supercomputer-type stuff, where everything is written from scratch anyway.
 
I take it that everything will have to be re-written? Seems like a shitload of work.

There may be new languages to take advantage of these technologies, but only the low-level machine code would need to be written from scratch. From there, they would make compilers for all the common languages, and the apps should in theory run with a recompile.
 
Free from the limitations of the 0s and 1s, these more powerful computers would be able to learn and make decisions, ultimately getting us one step closer to creating human-like artificial intelligence.

I don't see how this is possible. Is this analogous to saying that once we came up with the decimal system, we were no longer limited by the binary system? You can still represent the same number; it just takes more digits. And the 1 and 0 still represent the basic true/false concept that humans base almost all decisions on.
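That intuition checks out: changing the base changes how many digits you need, not what you can represent. A quick illustration (the to_base helper is hypothetical, written just for this example):

[code]
def to_base(n: int, base: int) -> str:
    """Render n in an arbitrary base -- same value either way,
    just fewer digits as the base grows."""
    digits = "0123456789abcdef"
    out = ""
    while n:
        n, r = divmod(n, base)
        out = digits[r] + out
    return out or "0"

n = 2015
for base in (2, 4, 10):
    print(f"base {base:>2}: {to_base(n, base)}")
# base  2: 11111011111  (11 digits)
# base  4: 133133       (6 digits)
# base 10: 2015         (4 digits)
[/code]

A four-level memristor digit would carry two bits per symbol, which is a density win, not a new kind of computation.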

I take it that everything will have to be re-written? Seems like a shitload of work.

So I would say this is going to require a change to the way we visualize logic. Computer code is the result of humans trying to put our own thought process into a machine. For it to change would mean our thought process would have to change. Every language I have ever looked at is linear; that's probably the limitation more than anything.
 
Wait, so it's MLC Flash?

We've already had dozens of technologies that can operate at different voltage levels, including transistors. While it's easier to think of these things as "digital," in the end everything is analog (and sometimes that's the only way to increase performance).

A "floating" voltage on a transistor is not ideal. It is subject to decay and error detection. Where as if you have something that can sit at a definent value, it is much easier to work with. When using the floating voltage on NAND, you have to put in a good deal of error detection because there is the chance that the voltage will not be exactly where you want it. Now you have to decide is this voltage in no mans land because it's a high voltage of the low voltage or low voltage of a high voltage?

For example if you have 3 states 1, 1.5, 2. If you get a 1.25, is it supposed to be a 1 or a 1.5? If you just have a value of 1 and 2, then a value of 1.25 is easy to figure out.
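That trade-off is easy to show in code: a reader that snaps a measured voltage to the nearest defined level and refuses to guess when the reading lands in no man's land. A sketch with made-up numbers (levels and margin invented for illustration):

[code]
LEVELS = [1.0, 1.5, 2.0]  # nominal cell voltages (from the example above)
MARGIN = 0.15             # how far a reading may drift (invented)

def decode(reading: float):
    """Snap a measured voltage to the nearest nominal level,
    or return None for an ambiguous read."""
    nearest = min(LEVELS, key=lambda lvl: abs(lvl - reading))
    if abs(nearest - reading) > MARGIN:
        return None  # no man's land: error correction would kick in here
    return nearest

print(decode(1.05))  # 1.0  -- comfortably inside the margin
print(decode(1.25))  # None -- exactly between 1.0 and 1.5
print(decode(1.95))  # 2.0
[/code]

Pack in more levels and the margins shrink, so every extra bit per cell costs you noise tolerance.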
 
A "floating" voltage on a transistor is not ideal. It is subject to decay and error detection. Where as if you have something that can sit at a definent value, it is much easier to work with. When using the floating voltage on NAND, you have to put in a good deal of error detection because there is the chance that the voltage will not be exactly where you want it. Now you have to decide is this voltage in no mans land because it's a high voltage of the low voltage or low voltage of a high voltage?

For example if you have 3 states 1, 1.5, 2. If you get a 1.25, is it supposed to be a 1 or a 1.5? If you just have a value of 1 and 2, then a value of 1.25 is easy to figure out.

True: since the state of the memristor doesn't fade over time, you really can just set it and forget it. With NAND flash, you have to predict the decay state over time, and the longer it's been since you've written the cell (and the more times you've written to it), the less lifetime you have.

You're also right that since the "value" does not decay like trapped transistor charge, it will be the same 10 years from now, allowing for much tighter logic levels (i.e. massively higher data density). Or even the possibility of truly analog storage (assuming we don't find some unanticipated decay mechanism during these years of research).

Thanks for that insight, I guess I wasn't thinking very clearly last night : )
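The retention difference is easy to picture with a toy model: treat trapped charge as decaying exponentially and ask how long before a stored level drifts out of its read window. All numbers here are invented for illustration:

[code]
import math

V0 = 1.5           # programmed cell voltage (toy value)
TAU = 10.0         # leakage time constant in years (invented)
STEP = 0.5         # spacing between adjacent logic levels
MARGIN = STEP / 2  # drift past this and the read becomes ambiguous

def voltage_after(years: float) -> float:
    """Trapped charge leaking away exponentially -- a stand-in for
    NAND-style decay. An ideal memristor state would stay at V0."""
    return V0 * math.exp(-years / TAU)

# How long until the stored level has drifted out of its window?
years = 0.0
while V0 - voltage_after(years) < MARGIN:
    years += 0.1
print(f"read becomes ambiguous after ~{years:.1f} years of leakage")
[/code]

With these toy numbers the cell goes ambiguous in under two years, which is why NAND needs heavy error correction; a state that truly doesn't drift dodges the whole problem.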
 
Everyone wants a yes, no and maybe logic gate, but no one has gotten a practical version to work yet. They work on paper, but when it comes time to build the silicon, it always ends up as a series of yes and no: basically a faster mousetrap, not a smarter one.

Should be interesting to see how long it takes before they figure out whether this is another Russian doll or the next step in chips... plus they still have to shrink it down once it does what they want it to.
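For what it's worth, the "yes, no and maybe" gate already exists on paper as three-valued (Kleene) logic; the hard part is the silicon, not the math. A software version is trivial (using Python's None as "maybe"):

[code]
# Kleene three-valued logic: True, False, and None as "maybe".

def t_and(a, b):
    if a is False or b is False:
        return False          # anything AND no is no
    if a is True and b is True:
        return True
    return None               # maybe

def t_or(a, b):
    if a is True or b is True:
        return True           # anything OR yes is yes
    if a is False and b is False:
        return False
    return None               # maybe

def t_not(a):
    return None if a is None else not a

print(t_and(True, None))  # None -- "yes AND maybe" is still maybe
print(t_or(True, None))   # True -- "yes OR maybe" is definitely yes
print(t_not(None))        # None
[/code]

Which is the point above: the logic itself has worked on paper for decades; building a gate that physically holds the third state is what keeps failing.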
 