
Monday, February 23, 2009

The Laws of Robotic Killers

With more and more U.S. military operations being handed over to robotic killers, the Office of Naval Research has been tasked with drafting an advisory report on how to handle our mechanized soldiers. While Isaac Asimov’s famous three laws of robotics (don’t harm people, obey orders, protect yourself) would be fine for some namby-pamby pacifist mech, they’re crap for any cybernetic grunt sent to kill as many enemies as possible. The ONR suggests implementing a “warrior code” in all future military robots. Basically, we need to program our ‘bots with some form of ethics in order to prevent a Terminator-style uprising, but we can’t tell them not to hurt anyone. It’s a tricky challenge. And even trickier are the human-side ethics issues surrounding robot soldiers. If, for example, one of them goes rogue and starts killing civilians, who’s to blame: the robot, the programmers, or the military leaders? Well, I say these are tricky issues, but I’m sure they’ve been hashed out ad nauseam by our nation’s hacky sci-fi writers. More details here.
