1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. :o)
Are these laws outdated, or do they still apply? What do you think?