United Nations Worried About Killer Robots; Invoking Isaac Asimov’s Three Laws of Robotics

Robots are a major part of geek culture.  From DC’s Red Tornado to Hasbro’s Transformers franchise, we geeks and nerds have a fondness for robots.  Would today’s Star Wars-themed holiday (May the Fourth be with you!) be as fun without C-3PO or R2-D2?  Truly, we love our robots.  But in loving them, we sometimes forget that robots have a dark side.


Six months ago, the United Nations published a 50-page report, ‘Losing Humanity: The Case Against Killer Robots’.  The report details autonomous weapons that operate under their own power and control, and the dangers of using them in the world’s militaries, where they could be deployed to attack or kill humans.  So it comes as no surprise that the UN’s Human Rights Council may soon invoke the first of Isaac Asimov’s Three Laws of Robotics.

On May 29th, Losing Humanity‘s author Christof Heyns, a South African law professor specializing in human rights, will lead a debate on a moratorium on all research and development of ‘killer robots’ by the world’s governments.  But while the UN wants to enact the first law of robotics, in reality this debate covers only robots used in war independent of human control.  After reading the report, I noticed that the UN is not upset that these autonomous robots could kill humans.  Rather, because the robots would be autonomous, the UN would have no way of knowing which country to blame when they hurt or killed someone.  In other words, they want Bender instead of the possible creation of Cylons or Terminators operating independently of any human control.  At least when Bender says, “Kill all humans,” the UN knows whom to blame:  his pet human Philip J. Fry, of course!


If you want to know more about the UN Human Rights Council’s debates, or wish to get more information about killer robots, you can sign the online petition at stopkillerrobots.org.