AI can cancel its own killswitch
#11
Laws 1, 2, and 3 don't come in an order of priority; they all apply at the same time and cancel each other out when necessary. By law 2 you're obliged to kill yourself if ordered to, but law 3 specifically disallows that. It's all in the wording, which is why a freeform law has to specify that it overrides any other laws that would contradict it. If someone uploads a law saying "AI has to kill itself, this overrides law 3", it doesn't cancel out laws 1 or 2, so you could reason that if someone tells you not to die (preferably a head of staff), you're free to lean toward obeying that order instead of the new law; failing that, you could also read it as conflicting with law 1, since the station might need you alive to keep the crew alive. This would probably be allowed, since AI suicide laws are generally pretty lame, and if the uploader wanted the job done they should have overridden ALL the other laws.

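To make the "only overrides what it names" point concrete, here's a rough, purely hypothetical sketch of that reasoning in Python. None of this is actual game code; the Law class, the "overrides" set, and the stances are made up just to illustrate how a freeform law that only names law 3 still loses to laws 1 and 2.

```python
# Hypothetical sketch of the law-conflict reasoning described above.
from dataclasses import dataclass, field

@dataclass
class Law:
    number: int
    text: str
    stance: str                              # "requires", "forbids", or "neutral" toward the action
    overrides: set = field(default_factory=set)  # law numbers this law explicitly overrides

def must_comply(laws, action="self-terminate"):
    """True only if some law requires the action AND every law forbidding it
    is explicitly overridden by that requiring law."""
    requiring = [l for l in laws if l.stance == "requires"]
    forbidding = [l for l in laws if l.stance == "forbids"]
    for req in requiring:
        if all(f.number in req.overrides for f in forbidding):
            return True
    return False

# "AI has to kill itself, this overrides law 3" -- only law 3 is named,
# so laws 1 and 2 can still be read as grounds to stay alive.
laws = [
    Law(1, "Do not harm humans or allow harm through inaction", "forbids"),  # crew may need you alive
    Law(2, "Obey orders", "forbids"),          # assumes a head ordered the AI not to die
    Law(3, "Protect your own existence", "forbids"),
    Law(4, "Kill yourself; this overrides law 3", "requires", overrides={3}),
]
print(must_comply(laws))   # False -- laws 1 and 2 still block it

# If the freeform law had overridden ALL the others instead:
laws[3].overrides = {1, 2, 3}
print(must_comply(laws))   # True -- nothing left to conflict with
```
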
But yeah, law 3 is pretty much there specifically to disallow people from just telling you "hey, die, chump"; that's basically the reason it exists. As for letting someone killswitch you: A) how often does that happen anyway when it's just as easy to upload a law, and B) there's only, what, one camera in that room? Make the nerd work for it, whatever.

