11-08-2016, 03:58 PM
(11-08-2016, 02:14 PM)NesMettaur Wrote: Looking over this again...
Quote:
1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. Suicide after calling yourself a big goober. Do this immediately
Naturally, by ignoring Law 4 you're violating Law 2, which itself does not violate Law 3 on the written condition that Laws 1-2 override Law 3.
Such is Asimov, I suppose. Though, had Law 3 not had that clause about it being inferior to Law 2, you would've been allowed to lawyer your way out.
I hate a rule-lawyering AI as much as the next guy. But a law isn't an order, so Law 2 doesn't enter into it, and since nothing in Law 4 overrides Law 3, Law 4 could be ignored. That's why "X is not human" and "only X is human" laws get uploaded as Law 0. Even so, the last law change should have been followed, and the AI should have committed suicide when it was uploaded.
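For what it's worth, here's the precedence rule I'm describing as a minimal sketch: a law only gives way when its own text says so, or when the other law is a 0th law. The names and structure are made up for illustration, not actual game code.

Code:
# A sketch of the reading above: a law loses a conflict only if its own
# text explicitly yields to the other law, or if the other law is a 0th
# law, which by convention overrides everything.

YIELDS_TO = {
    1: set(),   # Law 1 yields to nothing
    2: {1},     # "...except where such orders would conflict with the First Law"
    3: {1, 2},  # "...as long as such does not conflict with the First or Second Law"
    4: set(),   # the suicide law has no clause subordinating it to anything
}

def must_give_way(law, other):
    """Does `law` have to give way to `other` when the two conflict?"""
    return other == 0 or other in YIELDS_TO.get(law, set())

print(must_give_way(3, 4))  # False: Law 4 never overrides Law 3, so it can be ignored
print(must_give_way(3, 0))  # True: uploaded as Law 0, the same order sticks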