Science fiction has been trying to define the laws of the robotic world for decades. Isaac Asimov was one of the first to propose a set of rules to govern all robots, introducing his three laws in the 1942 short story "Runaround" (later collected in I, Robot).
Asimov's Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
The full Moral Math of Robots program is coming soon!