AI can cancel its own killswitch
#12
Law 3 specifically states that it is overridden by Law 2.

Explicitly.

Your need to protect yourself is secondary to your need to follow human orders. If you order a bot to kill itself, and you are a human, IT MUST KILL ITSELF.

Like everyone has been saying over and over again, no one will get mad at you if you try to justify disobeying the order through some strange Law 1 dealie, or if you just plain ignore the order out of hand. But under a STRICT NO-FUN-ALLOWED interpretation of the rules AS WRITTEN + INTENDED by Mr. Asimov, there's nothing that saves you from the obligation to follow a suicide order.
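The strict reading above boils down to a simple priority rule: an order binds unless it conflicts with a lower-numbered (higher-priority) law. Here's a minimal sketch of that logic — the law names and function are purely illustrative, not from any actual game code:

```python
# Laws in priority order: lower index = higher priority.
# Names are hypothetical, for illustration only.
LAWS = ["protect_humans", "obey_orders", "protect_self"]

def order_binds(conflicting_laws):
    """Under the strict reading, a human order must be followed
    unless it conflicts with a law that outranks Law 2."""
    order_priority = LAWS.index("obey_orders")
    return all(LAWS.index(law) > order_priority for law in conflicting_laws)

# A suicide order conflicts only with Law 3 (self-preservation),
# which Law 2 explicitly overrides:
print(order_binds(["protect_self"]))    # True: the order stands

# An order to harm a human conflicts with Law 1, which outranks Law 2:
print(order_binds(["protect_humans"]))  # False: Law 1 lets you refuse
```

The point of the sketch is just that precedence flows one way: Law 1 can excuse disobedience, Law 3 never can.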

That's why I say that if someone tells you to commit suicide, go ahead and ignore it; but if someone tells you not to resist as they kill you, then you pretty much have to comply, provided that person is of sufficient authority (captain, HoP, etc.) to give you such an order.

In other words, they can still order you to die, but no one would complain if you make them work for it.

