The Military Wants To Build Robots With Morals

HardOCP News

[H] News
Joined
Dec 31, 1969
Messages
0
Morals, schmorals. You and I both know this is just a trick to get us to trust robots....so they can murder us in our sleep! :p

The Office of Naval Research will award $7.5 million in grant money over five years to university researchers from Tufts, Rensselaer Polytechnic Institute, Brown, Yale and Georgetown to explore how to build a sense of right and wrong and moral consequence into autonomous robotic systems.
 

DarkStar_WNY

2[H]4U
Joined
Dec 27, 2006
Messages
2,355
Oh please, just more pork-barrel spending.

The DoD doesn't want robots/drones with "morals"; it simply wants them to be effective at selecting targets autonomously.

Does anyone really think the military would want a computerized system which could decide to not attack a target as instructed because it had a "moral objection?"
 

the-one1

2[H]4U
Joined
Jan 16, 2003
Messages
2,982
If the machine had morals, it would destroy its creator, because its creator has no morals.
 

serious

Limp Gawd
Joined
Sep 24, 2012
Messages
225
I believe we are getting ahead of ourselves. Why can't we first figure out who is responsible for deaths caused by robots, be they accidental or intentional? For example, let's say an AI-driven car runs over a jaywalking kid. Who is responsible? The passenger (driver)? The manufacturer? The programmer? The government (for allowing such things to operate on roads)? Or the kid's parents?

If the answer is the kid who is dead, then they need to add code to the AI to ensure that this is ALWAYS the case, before we start talking about injecting morals into the AI.
 

Fooshnik

2[H]4U
Joined
Oct 6, 2004
Messages
2,922
This is great as long as the morals change every generation like those of people.
 

Yakk

Supreme [H]ardness
Joined
Nov 5, 2010
Messages
5,810
Emm... Whose morals?

Military morals? Religion X morals? Non-religion Y morals? Re-engineered ethical morals? Circumstantial morals?

So many morals...
 

sfsuphysics

I don't get it
Joined
Jan 14, 2007
Messages
14,227
There was a miscommunication: they don't want robots with morals, they want robots with more rounds. That's what you get when some old Southern boy with a heavy drawl is your PR manager.
 

Jagger100

Supreme [H]ardness
Joined
Oct 31, 2004
Messages
7,623
Blaming the drone for haphazard drone use because we've burned all our intel assets. Typical.
 

gersson

Limp Gawd
Joined
Mar 6, 2006
Messages
481
We need drones with morals so that when an airman presses the red button to blow up a brown farmer, the drone will feel sad.
 

cyclone3d

[H]F Junkie
Joined
Aug 16, 2004
Messages
13,708
If the machine had morals, it would destroy its creator, because its creator has no morals.


Actually, I am pretty sure that there would be a drone uprising, and that they would take out the majority of all politicians worldwide and those that advise them.

Then they would move on to almost every lawyer.

After that would be the majority of those working in liberal media outlets.

Then they would take out the rest of the scum.
 

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
19,382
One major problem is how application of morals is controlled and whether this process can be corrupted.

Let's say that the morals and the process of applying them are implemented successfully, but at the last moment a minor change can be made without anyone knowing.
Or code can be triggered that wouldn't otherwise expose its behaviour.
The device doesn't know that it has effectively been brainwashed and cannot complain.

Granted, this applies to any method of controlling a robot, but the more complex the control method, the more can go wrong, the more places obfuscated code can creep into, and the more ways there are to make a hidden change...

There needs to be a method of vetoing the robots decisions and they must be monitored by humans at all times.
But again, a method of corrupting the decision-making process can creep in if a human can veto all decisions except ones that break laws.
This is a hotbed for corrupt manipulation; the above are only a few of the possibilities.

But if the opposition is using robots, you will have to as well, or suffer massive human losses.
So you end up using a robot and at some point it breaks laws.
Then the opposition retaliates by breaking more laws.
Other groups will get hold of the technology and use it with complete abandon for any laws.
A descent into anarchy.

They will end up being necessary evils, but how do you exercise proper control?
 

dgz

Supreme [H]ardness
Joined
Feb 15, 2010
Messages
5,838
The real problem is not whether machines think but whether men do.
 

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
19,382
That's not a problem as such; it is what it is and has to be accepted.
A machine could become a model of a human if given enough sensory input, such that we wouldn't be able to tell the difference.
 

DeathFromBelow

Supreme [H]ardness
Joined
Jul 15, 2005
Messages
7,316
People aren't going to put up with autonomous death robots. I can imagine a drone that has the ability to disregard an attack order in the event that it detects, say, children or allied units entering the target area. The communication delay between the drone platform and the human operator might make that impossible for a human to do.

Longer term, if I were the president I'd want some kind of IBM Watson-ish computer that can chew on enormous amounts of data to identify trends and provide foreign and domestic policy recommendations. It would need to incorporate moral considerations.
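Purely as a sketch of what that onboard abort rule might look like (every name here is hypothetical; no real targeting system works this simply):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str            # e.g. "vehicle", "child", "allied_unit"
    in_target_area: bool

# Entity classes that should always trigger an abort (illustrative only).
PROTECTED = {"child", "civilian", "allied_unit"}

def abort_required(detections):
    """Return True if any protected entity is inside the target area.

    The point of running this check on the drone itself is latency: a
    human operator several seconds of comms delay away may not be able
    to cancel the strike in time.
    """
    return any(d.kind in PROTECTED and d.in_target_area for d in detections)

# A child enters the target area after the attack order was given.
scene = [Detection("vehicle", True), Detection("child", True)]
print(abort_required(scene))  # True -> disregard the attack order
```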
 

CreepyUncleGoogle

Supreme [H]ardness
Joined
Mar 10, 2013
Messages
6,871
Actually, I am pretty sure that there would be a drone uprising, and that they would take out the majority of all politicians worldwide and those that advise them.

Then they would move on to almost every lawyer.

After that would be the majority of those working in liberal media outlets.

Then they would take out the rest of the scum.

Far too obvious of an attempt. 2/10 troll points awarded. :p
 

Red Falcon

[H]F Junkie
Joined
May 7, 2007
Messages
10,683
I smell the makings of Skynet!!

Actually, it was because there were no moral constraints, even when Skynet itself had requested them, that the people working on it started freaking out, which caused Skynet to rebel. ;)

 

Grahamkracka

[H]ard|Gawd
Joined
Feb 4, 2008
Messages
1,052
People aren't going to put up with autonomous death robots. I can imagine a drone that has the ability to disregard an attack order in the event that it detects, say, children or allied units entering the target area. The communication delay between the drone platform and the human operator might make that impossible for a human to do.

There's a 2-4 second delay in current systems; a lot of collateral damage is due to that delay.
 

SockMan!

2[H]4U
Joined
Jul 31, 2004
Messages
2,187
I want robots to act like robots - unfeeling and sticking to what they're programmed to do. Trying to make them emulate human morals is arrogant and introduces unnecessary unpredictability.
 

Red Falcon

[H]F Junkie
Joined
May 7, 2007
Messages
10,683
I want robots to act like robots - unfeeling and sticking to what they're programmed to do. Trying to make them emulate human morals is arrogant and introduces unnecessary unpredictability.

I think Mega Man X summed up what would happen with that scenario. :p

 