Should Robots Feel Pain?

HardOCP News

Researchers from the University of Cambridge have put together a twelve-minute documentary that asks whether or not robots should feel pain. If you ask me, robots should feel pain all the time. What fun would it be torturing a robot if it didn't feel pain? Duh.
 
They need a sense of anger also. And vengeance. Could you imagine a Samuel L. Jackson-inspired robot?

 
Researchers from the University of Cambridge have put together a twelve-minute documentary that asks whether or not robots should feel pain. If you ask me, robots should feel pain all the time. What fun would it be torturing a robot if it didn't feel pain? Duh.

This man watched that British series Humans, I think.
 
I prefer they feel underappreciated, overworked and unloved.
 
They should ABSOLUTELY feel pain, at least the ones with "AI". If they just make a roomba feel pain, that is messed up. If it is on a T-100, then yes, yes it should. Maybe that'll instill some humanity in it and prevent Skynet from putting us into the Matrix.
 
If bots feel pain, they can become intimidated and defend themselves when they get self-aware.
 
Should robots feel pain? Isn't not feeling pain just one of the many advantages of wanting robots to eventually (far into the future) take on tasks within our society? Patrol, firebots, sentries, combat, pilots, lovebots?

They have sensors that can tell them something is too hot, wet, or heavy, etc. So when they have a low battery, they instead feel tired? lmfao. GTFO and go get bullied, beaten up, and your heart broken so you can learn to feel whatever it is you're missing, because trust me, they are missing something, asking questions like these.

This stinks of millennials posing overcomplicated, emotional, and unnecessary questions.

Now, I'm just in a bad mood. I just realized something. I'm not going to be around to unbox a new Lovebot that, btw ... can feel all the pain it wants :)
 
Pain is a mechanism that animals feel that tells them to stop doing the stupid thing that's hurting them. I'd imagine robots would just get a flag that alerts them of damage. It's not the same thing.

Before you like the idea of not feeling pain, just remember that some people are born without the ability to feel pain. Those people are constantly hurting themselves because they can't feel pain.

 
yes "pain" as a tool/sensory is ideal so you dont have a robot go walk off a cliff or into lava... or in some cases you need them to do it so override?
 
If you ask me, robots should feel pain all the time. What fun would it be torturing a robot if it didn't feel pain? Duh.

This is a double-edged sword. If you have a robot feel pain, it can see you as the cause of that pain. What happens when it decides to eliminate that pain...
 
Again, this is a case of people projecting human emotion and feeling onto a machine. A robot can't feel pain the same way a biological animal does.

But it needs to have sensors and circuits that tell it when something it's doing is damaging it. And the best feature of robots is that they can "feel" pain without it inhibiting them in any way. It's as if pain for you were a red-flag popup in the corner of your eye that reads "left hand temperature over normal threshold, you should stop what you're doing immediately." And instead of flinging away the container full of hot oil and spilling it all over the people around you in the kitchen, you can put it down quickly and safely.
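Something like that could look as simple as this rough Python sketch (the sensor reading, threshold value, and action names are all made up for the example, not any real robot's API):

# Rough sketch of the "red flag, then act deliberately" idea above.
# The sensor reading, threshold, and action names are purely illustrative.

HAND_TEMP_LIMIT_C = 60.0  # hypothetical damage threshold for the hand

def react_to_hand_temperature(temp_c, holding_something):
    """Flag the problem, but pick a controlled response instead of a reflex fling."""
    if temp_c <= HAND_TEMP_LIMIT_C:
        return "continue"
    print(f"WARNING: hand temperature {temp_c:.1f} C over {HAND_TEMP_LIMIT_C} C threshold")
    if holding_something:
        return "set_load_down_safely_then_withdraw"
    return "withdraw_hand"

print(react_to_hand_temperature(75.0, holding_something=True))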
 
NO, they should never "feel" anything, just sense things. Big difference.
 
Uh, nobody actually knows how sensory experience or qualia works. We take it for granted since we see lights and colors, hear things, and have tactile sensations. We can describe how these things are turned into nerve impulses and sent to different parts of the brain. It gets kind of muddled after that.
 
The real question is: do we want to build servants, partners, or our replacements? And plan accordingly. Why imbue a machine intended to be a servant with the ability to suffer?
 
I'm picturing some depressed alcoholic robot going on a bender because his robot wife cheated on him and took the kids. Yeah maybe pain is not needed for them.
 
Well, whether they should or not, they won't. It is just a system of sensors to keep the device functioning properly or to alert another entity that it needs repair.

If you want to THINK they feel something then that is on you.
 
Again, this is a case of people projecting human emotion and feeling onto a machine. A robot can't feel pain the same way a biological animal does.

But it needs to have sensors and circuits that tell it when something it's doing is damaging it. And the best feature of robots is that they can "feel" pain without it inhibiting them in any way. It's as if pain for you were a red-flag popup in the corner of your eye that reads "left hand temperature over normal threshold, you should stop what you're doing immediately." And instead of flinging away the container full of hot oil and spilling it all over the people around you in the kitchen, you can put it down quickly and safely.
Computers have had IRQs (Interrupt Requests) for as long as I've been following them. I imagine a machine having pain would work the same way. Pain would be something damaging enough that the machine should take immediate action to relieve it unless there are overriding factors.
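Roughly how I picture that in code (a toy Python sketch, nothing real; the class, event, and flag names are invented, not any actual robotics API):

import threading

# Toy sketch of the interrupt analogy: a damage signal preempts the current
# task unless an overriding factor is in effect. All names here are invented.

class PainInterrupt:
    def __init__(self):
        self.override_active = False           # "overriding factors" (e.g. emergency mission)
        self.damage_event = threading.Event()  # stands in for the IRQ line

    def raise_pain(self, source):
        print(f"damage signal from {source}")
        self.damage_event.set()

    def service(self):
        """Called each pass of the main control loop."""
        if self.damage_event.is_set() and not self.override_active:
            self.damage_event.clear()
            return "abort_current_task_and_withdraw"
        return "continue"

ctrl = PainInterrupt()
ctrl.raise_pain("left gripper thermal sensor")
print(ctrl.service())  # abort_current_task_and_withdraw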

Oh and for those partner bots:
 
Pain is a mechanism that animals feel that tells them to stop doing the stupid thing that's hurting them. I'd imagine robots would just get a flag that alerts them of damage. It's not the same thing.
Depends on how its programmed.

Pain is useful because it's a "hard-wired" stimulus that we can react to instinctively without investing any thought. It may be desirable on complicated robots to do the same and imbue them with these types of "ouch" stimulus responses, where the only thinking they have to do is whether to override the stimulus. That way, if a robot cuts into itself or touches something too hot, it can stop without having to interpret the data, think of a solution, and prioritize the problem.

After all, if this is useful for the human brain, it should be useful on the far more primitive computers we have today. I've seen estimates that the human brain is still dozens of times faster than the most powerful supercomputer when it comes to calculations per second. We just don't appreciate how many calculations there are in interpreting our super-ultra-high-definition vision alone, for example, while also hearing, smelling, breathing, digesting, regulating our temperature, feeling a billion nerves in our skin, and doing a million other little things simultaneously.
 
It needs to be within reason; don't build it to be so sophisticated that it has an emotional response. Once that happens, you unleash the "Skynet" effect and it wants to eliminate the source of pain, i.e. humans.
It would be helpful for preventative maintenance, but anything beyond telling itself to shut down or move away from the source that is causing damage is too scary. We want AI to be advanced, but we don't want the Terminator complex to arise.
 
Pain bot maaaaan! All he feels is pain!

 
Of course they should feel pain. Pain is just a damage signal. It's just a question of whether or not it should ever be overwhelming like it is with humans. If a robot feels its hand was just slammed in the car door, should it freak out, drop the groceries in its other hand, and start screaming profanities while jerking around frantically? Maybe it should dial down the response signal, carefully set down the groceries, and open the car door. It's what we would do if we were advanced enough. Intense, freak-out levels of pain were a way to make sure mammals kept trying to escape the grasp of predators, I'm guessing. That's less useful for us now, and not at all what we would want robots to do.
 
Computers have had IRQs (Interrupt Requests) for as long as I've been following them. I imagine a machine having pain would work the same way. Pain would be something damaging enough that the machine should take immediate action to relieve it unless there are overriding factors.
Interrupt requests have nothing to do with the concept of human pain.
You're still giving the machine human properties. The only way it has those is if you build it to have them. And I don't see any circumstances where you'd need a machine that specifically experiences the equivalent of human pain. It's pointless. There are more effective and easier ways to do an emergency shutdown. You can design the machine to safely shut off with the least damage to property and life. Like the table saws designed to shut off when the blade contacts flesh.
 
Interrupt requests have nothing to do with the concept of human pain.
You're still giving the machine human properties. The only way it has those is if you build it to have them. And I don't see any circumstances where you'd need a machine that specifically experiences the equivalent of human pain. It's pointless. There are more effective and easier ways to do an emergency shutdown. You can design the machine to safely shut off with the least damage to property and life. Like the table saws designed to shut off when the blade contacts flesh.
Those safety table saws have a circuit that does one thing, and that is to activate its protection when the blade touches flesh, and it overrides everything else. It's too black and white to be used for a robot, where we would want it to proceed regardless of pain in a life-or-death situation. A little red flag somewhere in a memory location is nice, but that doesn't help if some other task is monopolizing the cpu/brain. That's why you need something that triggers a major alert across more of the robot's systems that hey, shit's goin' down and we need to respond immediately. Pain seems like a pretty good word for this behavior. This is mostly semantics, but I think it's useful semantics.
 
Interrupt requests have nothing to do with the concept of human pain.
You're still giving the machine human properties. The only way it has those is if you build it to have them. And I don't see any circumstances where you'd need a machine that specifically experiences the equivalent of human pain. It's pointless. There are more effective and easier ways to do an emergency shutdown. You can design the machine to safely shut off with the least damage to property and life. Like the table saws designed to shut off when the blade contacts flesh.
Sure, but the advantage of the sensation of pain is that it's a protection mechanism that is so FLEXIBLE.

You can program in a thousand different emergency shutdown procedures for different scenarios you think might cause damage to the unit, but you'd probably forget something, and also limit what that artificial intelligence can do to avoid that in the future. A flexible learning robot, after all, would have to be motivated by something, and pain is a (negative) motivation factor. It can then be flexible in balancing motivations to minimize the risk of pain while maximizing pleasure (with pleasurable things being goals it's designed to accomplish).

So let's say, for example, that you have a trash robot. It can learn from previous experience what causes pain and try to minimize it to achieve its pleasurable goal of collecting and disposing of garbage. So if there's acid or hot lava messing up its limbs, it may find creative ways to avoid having the pain (damage) receptors activated, by laying some kind of platform over the harmful surface or shielding its limbs so it can reach the trash it wants to pick up.

It could even surprise us with all kinds of solutions it thinks up on its own while working for us.
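If you wanted to sketch that in code, it might look something like this toy Python example (the actions, reward numbers, and learning rule are all invented to illustrate the idea, not how a real trash robot would be trained):

import random

# Toy sketch: "pain" as negative reward, finishing the job as positive reward.
# Actions and numbers are invented purely for illustration.

ACTIONS = ["walk_straight_over_acid", "lay_platform_first", "shield_limbs_first"]
REWARD = {
    "walk_straight_over_acid": 5 - 10,  # trash collected (+5) but limbs damaged (-10)
    "lay_platform_first":      5 - 1,   # small time cost, no damage
    "shield_limbs_first":      5 - 2,   # slightly bigger cost, no damage
}

value = {a: 0.0 for a in ACTIONS}  # learned estimate of each action's payoff
ALPHA = 0.5                        # learning rate

for _ in range(200):
    # Mostly exploit what has been learned so far, sometimes explore.
    action = random.choice(ACTIONS) if random.random() < 0.1 else max(value, key=value.get)
    value[action] += ALPHA * (REWARD[action] - value[action])

print(max(value, key=value.get))  # ends up avoiding the painful route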
 
I sense injuries. The data could be called "pain."

Nah. When you get hurt, you try to stop the pain. For a robot being hurt by a person, stopping the pain would mean removing the person from the equation.
 
Well, whether they should or not, they won't. It is just a system of sensors to keep the device functioning properly or to alert another entity that it needs repair.

If you want to THINK they feel something then that is on you.

And I think this man has not watched the British series, Humans.
 
Sure, but the advantage of the sensation of pain is that it's a protection mechanism that is so FLEXIBLE.
Pain itself is not flexible. It's a signal that is sent to your brain. The sensation of pain is created in the brain, and it only works because of the intelligence. A robot doesn't have that kind of intelligence, so it can only make something of the pain if it was programmed to do something. And in that case it's pointless to program the robot to behave exactly like a human experiencing pain. It's just not safe. Imagine a huge cargo-handling robot experiencing overwhelming pain that overrides any other directives, like it would in a human, and it starts thrashing around frantically.

You can program in a thousand different emergency shutdown procedures for different scenarios you think might cause damage to the unit, but you'd probably forget something, and also limit what that artificial intelligence can do to avoid that in the future. A flexible learning robot, after all, would have to be motivated by something, and pain is a (negative) motivation factor. It can then be flexible in balancing motivations to minimize the risk of pain while maximizing pleasure (with pleasurable things being goals it's designed to accomplish).
What you're talking about is pre-programmed scenarios and heuristics. That's an entirely different thing. Pain is just a signal sent from nerve endings to the brain. Robots can have the same thing, but how they process that signal has to be different from how humans do.

So let's say, for example, that you have a trash robot. It can learn from previous experience what causes pain and try to minimize it to achieve its pleasurable goal of collecting and disposing of garbage. So if there's acid or hot lava messing up its limbs, it may find creative ways to avoid having the pain (damage) receptors activated, by laying some kind of platform over the harmful surface or shielding its limbs so it can reach the trash it wants to pick up.
And that's machine learning. It's still not human pain. The fact of the matter is that human pain works, but it's far from a perfect or optimal solution. There are tons of things created by evolution that are far from optimal, or even outright foolish, things that an engineer would never design that way. And that's why it's pointless to try to mimic human pain in machines when we can do better.

It could even surprise us with all kinds of solutions it thinks up on its own while working for us.
And that's AI territory right there. But I'm sceptical about that. I don't think an AI could ever create something that is beyond the reasoning of the creator of the AI, since the AI itself is well within the reason of its creator.
 
Pain itself is not flexible. It's a signal that is sent to your brain. The sensation of pain is created in the brain, and it only works because of the intelligence. A robot doesn't have that kind of intelligence, so it can only make something of the pain if it was programmed to do something.
You seem hung up on the emotional aspect of pain. Studies have shown that crabs, which have an extremely basic nervous system without even a TRUE brain, can not only experience but remember something similar to the experience of "pain" and avoid that negative stimulus.

So think of it as programming software to avoid stimulus that we tell it is negative. On its most fundamental level, pain is a negative stimulus. If you experience X-stimulus, avoid it, or balance it against a competing Y-stimulus which is positive (or what we call pleasurable). http://news.bbc.co.uk/2/hi/uk_news/northern_ireland/7966807.stm

If even such basic organisms with quasi-brains can sense, remember, and avoid negative stimuli like pain, then surely an AI can be programmed with similar behavior as well. A crab is not even close to being sentient, and it's highly unlikely they experience emotions, so clearly a response to and avoidance of "pain" doesn't require higher-level intelligence.
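In code terms, that crab-level behavior is about this complicated (a bare-bones Python sketch; the shelters, numbers, and "shock" are stand-ins I made up for the setup in the study linked above, not its actual data):

# Bare-bones sketch of "feel the shock, remember it, avoid that option next time,"
# balanced against a competing positive stimulus. All values are made up.

ATTRACTION = {"shelter_A": 1.0, "shelter_B": 0.8}   # innate preference (positive stimulus)
aversion   = {"shelter_A": 0.0, "shelter_B": 0.0}   # learned negative associations

def choose_shelter():
    # Pick whichever option's attraction most outweighs its remembered "pain".
    return max(ATTRACTION, key=lambda s: ATTRACTION[s] - aversion[s])

def experience_shock(shelter, intensity=1.0):
    # Remembering the negative stimulus strengthens avoidance of that option.
    aversion[shelter] += intensity

print(choose_shelter())         # shelter_A at first (higher innate attraction)
experience_shock("shelter_A")   # gets shocked in shelter A
print(choose_shelter())         # now prefers shelter_B, avoiding the remembered stimulus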
 