Isaac Asimov: The Three Laws of Robotics (1942)

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

  2. A robot must obey orders given it by human beings except where such orders conflict with the first law.

  3. A robot must protect its own existence as long as such protection does not conflict with the first or second law.

Is this outdated, or does it still apply? What do you think?

Asimov later added a "Zeroth Law" that takes precedence over the other three:

  0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

Quite a few papers have been written about the four laws. They turn out to be very difficult to implement in practice.