How Asimov's Three Laws Ran Out of Steam
An anonymous reader writes "It looks like AI-powered weapons systems could soon be outlawed before they're even built. While discussing whether robots should be allowed to kill might seem like an obscure debate, robots (and artificial intelligence) are playing ever-larger roles in society, and we are figuring out piecemeal what is acceptable and what isn't. If killer robots are immoral, then what about the other uses we've got planned for androids? Asimov's three laws don't seem to cut it, as this story explains: 'As we consider the ethical implications of having robots in our society, it becomes obvious that robots themselves are not where responsibility lies.'"
Re:Actually, it's four laws, not three (Score:5, Funny)
There is a 0th law...
Ah, yes. Good old "A robot shall take no action, nor allow other robots to take action, that may result in the parent company being sued."