• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Emergency shutdown of traitorous war robots

Bill Morris

Commodore
Ground-crawling U.S. war robots armed with machine guns, deployed to fight in Iraq last year, reportedly turned on their human masters almost at once. The rebellious machine warriors have been retired from combat pending upgrades.

The revelations were made by Kevin Fahey, U.S. Army program executive officer for ground forces, at the recent RoboBusiness conference in America.

Speaking to Popular Mechanics, Fahey said there had been chilling incidents in which the SWORDS* combat bot had swivelled round and apparently attempted to train its 5.56mm M249 light machine-gun on its human comrades.

"The gun started moving when it was not intended to move," he said.

The full article is here:
http://www.theregister.co.uk/2008/04/11/us_war_robot_rebellion_iraq/


SW.jpg
 
The robots are not coded with the three laws because they're designed to kill people, so other than a flaw in its computer software, there's really nothing to worry about as far as I can see.
 
^As a 'war-drone' -- or whatever -- it can't be given the 3 laws. That would defeat its raison d'etre: to kill people. I read all the comments, though. Some were pretty hilarious. However, there was a link to an incident in South Africa where an automatic cannon (?) went off by itself without its handler's order. Supposedly it was a software glitch. Some soldiers were killed and many others wounded. This is why humans, flawed as we are, are techno-phobic when it comes to war robots. (Or police robots.) The software is written by humans. The hardware is manufactured by humans. Both are deployed in less than ideal conditions. That could lead to bad things happening if the robot is independent of a handler and allowed to shoot at will.

If I understand correctly, the forward, backward and sideways motion of the machine should be independent of the gun. The sight should be linked to the gun. (This may be an oversimplification, but I'm basing it on a humanoid-like idea.) I was taught to imagine there was a string attached between the tip of my nose and the tip of the gun. That way my eyes were always going in the business direction of the gun. My legs operate independently. Hopefully this is not too stupid a comment. But as I've watched my son master gaming, it occurred to me that could be the next direction high-tech war takes. And, yes, it makes me shudder.
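The "string from nose to gun tip" idea above can be put in toy-model form: drive commands and aim commands go down separate channels, and the sight is slaved to the gun so one aim command moves both. This is a minimal sketch, not how SWORDS actually works, and every name in it is made up for illustration.

```python
# Hypothetical sketch: chassis motion is independent of the gun,
# while the sight and gun share a single aim channel.

class WarBot:
    def __init__(self) -> None:
        self.chassis_heading = 0.0   # direction of travel, degrees
        self.gun_azimuth = 0.0       # where the gun points, degrees
        self.sight_azimuth = 0.0     # where the camera/sight points, degrees

    def drive(self, heading: float) -> None:
        # Moving the chassis never touches the gun or the sight.
        self.chassis_heading = heading

    def aim(self, azimuth: float) -> None:
        # Sight is slaved to the gun: one command moves both together,
        # so the operator's view always matches the barrel.
        self.gun_azimuth = azimuth
        self.sight_azimuth = azimuth

bot = WarBot()
bot.aim(90.0)
bot.drive(180.0)
# Driving away did not disturb the aim, and sight still tracks the gun.
assert bot.chassis_heading == 180.0
assert bot.sight_azimuth == bot.gun_azimuth == 90.0
```

The point of the slaved sight is exactly the "string" rule: the gun can never point somewhere the operator isn't looking.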
 
The main reason we can't give robots the three laws is, of course, that even though we may be able to distill a notion of what an "object" is into software, and maybe even write software that more or less distinguishes between inanimate objects and human beings, there's no way we can get robots to adequately understand the concept of "harm", let alone whether their actions or inactions are causing harm to human beings.
 
The main reason we can't give robots the three laws is, of course, that even though we may be able to distill a notion of what an "object" is into software, and maybe even write software that more or less distinguishes between inanimate objects and human beings, there's no way we can get robots to adequately understand the concept of "harm", let alone whether their actions or inactions are causing harm to human beings.

Agreed.


^As a 'war-drone' -- or whatever -- it can't be given the 3 laws. That would defeat its raison d'etre: to kill people.

Is that not what I said?


Yes. And robotic warfare takes out the ugliness of war. See quotes by the late President Dwight D. Eisenhower... he nailed it quite well.
 
All of this has happened before and all of this will happen again. :borg:

Only because the robot's dial-up connection to Skynet was inferior. Give it ten or twenty years. Otherwise we would simply be saying: this has happened. :)
 
There is nothing wrong with it. The robot was just attempting to kill hostile forces, ergo Humans.
 
There is nothing wrong with it. The robot was just attempting to kill hostile forces, ergo Humans.

If the robots have AI learning algorithms, that may very well be true.

Makes me wonder what rules the robots are programmed to follow. Enemy forces, friendly forces, civilians, how do they distinguish one from another?
 
Even if U.S. troops and allies were issued combadges to emit signals for the robots not to target, sooner or later the enemy would capture a combadge and perhaps order up a supply of them.

And what happens if a killer robot goes missing on the battlefield? Oops!!
 
Even if U.S. troops and allies were issued combadges to emit signals for the robots not to target, sooner or later the enemy would capture a combadge and perhaps order up a supply of them.

That's what encryption and rotating codes are for.
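The "rotating codes" point above is the same idea behind time-based one-time passwords: robot and combadge share a secret and each independently derives a short code from the current time window, so a captured badge's code expires when the window rolls over. A minimal sketch, with all names hypothetical and the scheme deliberately simplified (real IFF systems are far more involved):

```python
# Hypothetical rotating-code sketch, TOTP-style (HMAC over a time
# interval counter). Not a real IFF protocol.
import hmac
import hashlib
import struct

def rotating_code(secret: bytes, interval_index: int) -> str:
    """Derive a short code from a shared secret and a time-interval
    counter (e.g. int(time.time()) // 30). Robot and combadge compute
    this independently; no code is ever transmitted in advance."""
    msg = struct.pack(">Q", interval_index)
    digest = hmac.new(secret, msg, hashlib.sha256).digest()
    return digest[:4].hex()

def is_friendly(secret: bytes, presented: str, now_index: int) -> bool:
    # Accept the current window and the previous one, to tolerate
    # small clock drift between robot and badge.
    return presented in (rotating_code(secret, now_index),
                         rotating_code(secret, now_index - 1))

secret = b"shared-secret"
assert is_friendly(secret, rotating_code(secret, 100), 100)      # current window
assert is_friendly(secret, rotating_code(secret, 99), 100)       # drift tolerance
assert not is_friendly(secret, rotating_code(secret, 50), 100)   # stale capture
```

A replayed code from a badge captured an hour ago fails the check; the enemy would need the secret itself, not the badge's last output.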
 
Even if U.S. troops and allies were issued combadges to emit signals for the robots not to target, sooner or later the enemy would capture a combadge and perhaps order up a supply of them.

That's what encryption and rotating codes are for.

Cut off from his unit and able to survive off the land for several weeks, Corporal Redshirt encounters a friendly robot and hopes his no-target code is still valid.
 
At this point these robots wouldn't be deployed autonomously, so that wouldn't be a factor.

But even if it were, it's a simple matter to add emergency codes, which are valid for longer *and* which call in all available backup when they're detected (to avoid misuse by the other side).
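The emergency-code scheme described above separates "don't shoot" from "don't shoot, and something is wrong": the long-lived code still marks the bearer as friendly, but any use of it summons backup, which makes replaying a captured one a bad bargain for the enemy. A minimal sketch under those assumptions, with all names invented:

```python
# Hypothetical check combining short-lived normal codes with
# long-lived emergency codes that trigger a backup call.
from dataclasses import dataclass

@dataclass
class CodeCheck:
    friendly: bool      # hold fire?
    call_backup: bool   # alert nearby units?

def check_code(code: str,
               current_codes: set[str],
               emergency_codes: set[str]) -> CodeCheck:
    if code in current_codes:
        # Normal rotating code: quietly treated as friendly.
        return CodeCheck(friendly=True, call_backup=False)
    if code in emergency_codes:
        # Emergency code stays valid much longer, but every use
        # summons all available backup to the bearer's position.
        return CodeCheck(friendly=True, call_backup=True)
    return CodeCheck(friendly=False, call_backup=False)

# Corporal Redshirt, weeks out of contact, falls back on his
# emergency code: friendly, and help is on the way.
result = check_code("mayday-7", {"rot-42"}, {"mayday-7"})
assert result.friendly and result.call_backup
```

The deterrent is the backup call itself: an enemy replaying a stolen emergency code would be inviting every unit in range to their position.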

Besides, if these things were autonomous, one would presume they'd have the analytic capabilities to distinguish between humans carrying a gun and humans with hands up, at the least.
 
Even if U.S. troops and allies were issued combadges to emit signals for the robots not to target, sooner or later the enemy would capture a combadge and perhaps order up a supply of them.

That's what encryption and rotating codes are for.

Cut off from his unit and able to survive off the land for several weeks, Corporal Redshirt encounters a friendly robot and hopes his no-target code is still valid.

At which point it will say [austrian muscleman accent]"Come with me if you want to live"[/austrian muscleman accent], shoot Corporal Redshirt, grab his dog tags with embedded data chip and his PDA, look at them, fall in love, and roll off into the sunset, happily. Robots need love too!!
 