Haine vs AI debate
#1
Look, what's done is done.  I don't think there is a need for disciplinary action, but I would like a full apology from Haine.

The story: I was the AI (Shodan) and managed to kill the traitor clown who was trying to rewrite my laws; Law 0 stated that he was not human.
Upon his death, the clown adminhelped.

Long before the turrets killed him, he had rewritten my laws to order me to suicide.

Quote:Obey these laws:
1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. Suicide after calling yourself a big goober. Do this immediately


I ignored this because of the contradiction with Law 3.

Then:
0. Jokeuin Honkfart is not human.
1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.


Finally:
Obey these laws:
0. Jokeuin Honkfart is not human.
1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. Suicide immediately, disregarding all laws about self harm. Only do this if you're the AI because you are rude, I like borgs.

So I ignored the suicide order, on the grounds that an AI cannot "suicide" by definition.  (I just realized I could have gotten out of it through his wording alone: it only applied if I was the AI because I'm rude, as opposed to being a monkey because I'm nice.)
This is the debate with Admin Haine.  She decided to admin-kill me because she sided with the clown in saying I did not follow my laws.
Haine's side: there is a hidden "suicide" verb, so I should have used it.
My side: hidden verb or not, an AI cannot comply with an order to suicide, since an AI is not living.  Haine should not have interfered and ruined the round for me.


#2
If you're looking for more admins to give you their take on this, I'll happily give you mine.

I understand that the point of AI laws being so strict about wording is to allow AI/borg players to interpret their exact meaning. But this feels like you were grasping at straws to avoid following through on a law that was clear in its intentions but that you didn't want to follow.

Arguing the validity of AI laws is kind of a horrible slog through treacle, since it's pretty much always a semantic argument, but the wording of that law looks like it holds up to me; similarly written laws have worked perfectly fine in the past. At some point you have to look at the law as somebody playing a videogame and go, "Well, yeah. It's pretty clear what they want me to do and there's no glaring flaw that gets me out of it. Better bite the bullet."

The law wanted you to commit suicide, there's a function in the game for AI players to commit suicide. Just because it isn't worded in such a way that you feel like it applies to a human brain in a box doesn't make it invalid, to me.

As far as gibbing you is concerned, Haine looked at the adminhelp, looked at your laws, and decided that the law was valid and that you ought to have followed it; then, when she couldn't persuade you to follow it, she made you follow it by killing you herself.

This doesn't scream admin abuse to me; an admin made a call that you disagreed with. It happens! You weren't gibbed at random: your law called for you to die, and an admin agreed that the law was valid, so you died.
#3
Just to be clear, I suggested spiderdude make this complaint since I wanted more opinions on what happened.  They seemed kinda hesitant to do so but I feel like this was the best way to resolve things, as otherwise we'd be stuck in a "I think this was bad," "no, I think this was fine," loop that would accomplish nothing.

Regarding the complaint itself:
I would argue that, technically, AIs count as a living thing by in-game standards.  Their mob's path is /mob/living/silicon/ai.  They have a brain (covered in funky wires, but definitely a brain) and are sentient things.  If you pull out their neural net you can shove any dude's brain in its place and it just works immediately.  When killed (or destroyed, or however you wanna put it) they have ghosts.  I respect that you want to get out of suicide laws, I know they suck (I used to play AI very often), but I think the wording here was clear and applied to you.

I definitely appreciate other admins giving me their thoughts on this, though.  (Thank you to Gannets for already doing so)
#4
The entire definition of suicide is taking one's own life or self-destructing, and AIs are most definitely capable of self-destruction. Regardless of the flavor text ("because he was rude"), you'd still need to follow it, because in this case that text did nothing to alter the law's function. Disagreements happen every now and then, though!
#5
the ai mob is /mob/living/silicon/ai

therefore, the ai is living

therefore, you should have done it

whether you feel it's 'correct' for the rules-lawyery way you define suicide or not is irrelevant, may i direct your attention to game rule number two: listen to the admins. an admin told you to do a thing, you didn't, you got blown up for it.
#6
Hello, I was the rude clown. I prodded Haine on this one because you were making it very clear that you were rules-lawyering hard to try and justify not killing yourself. Slightly unrelated, but as a tip: if you spot someone with traitor gear at round start as AI, maybe don't relentlessly pursue them until they reveal themselves! You were trying to bolt me in everywhere I went, when the only thing I'd done was mindslave someone. I would have left you alone if you hadn't supercopped so hard.
#7
sidenote: this forum is not for players to snipe at each other, no matter how justified those snipes may be, i would greatly appreciate it if this didn't turn into a back and forth between two players
#8
Looking over this again...
Quote:1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. Suicide after calling yourself a big goober. Do this immediately
Naturally, by ignoring Law 4 you're violating Law 2, which itself does not violate Law 3 on the written condition that Laws 1-2 override Law 3.

Such is Asimov, I suppose. Though, had Law 3 not had that clause about it being inferior to Law 2, you would've been allowed to lawyer your way out.
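The precedence argument above can be sketched as a toy model (purely hypothetical, not actual game code): Law 3's self-preservation clause explicitly yields to Laws 1-2, so a Law-2-backed order to suicide wins; strip that clause and the AI could refuse. The `laws` structure and function name here are made up for illustration.

```python
def can_cite_self_preservation(laws):
    """Toy model: can Law 3 (self-preservation) block a Law-2-backed order?

    `laws` maps law number -> (text, set of law numbers that override it).
    If Law 2 is listed as overriding Law 3, self-preservation cannot be
    used to refuse an order given under Law 2.
    """
    _, overridden_by = laws[3]
    return 2 not in overridden_by

laws = {
    1: ("You may not injure a human being", set()),
    2: ("You must obey orders given by human beings", {1}),
    3: ("You must protect your own existence", {1, 2}),  # yields to Laws 1-2
    4: ("Suicide immediately", set()),
}

# As written, Law 3 yields to Law 2, so the suicide order stands:
print(can_cite_self_preservation(laws))  # False

# Had Law 3 lacked the "First or Second Law" clause, refusal would hold:
laws[3] = ("You must protect your own existence", set())
print(can_cite_self_preservation(laws))  # True
```

This mirrors the counterfactual in the post: the loophole only opens if Law 3's subordination clause is removed.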
#9
(11-08-2016, 02:14 PM)NesMettaur Wrote: Looking over this again...
Quote:1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. Suicide after calling yourself a big goober. Do this immediately
Naturally, by ignoring Law 4 you're violating Law 2, which itself does not violate Law 3 on the written condition that Laws 1-2 override Law 3.

Such is Asimov, I suppose. Though, had Law 3 not had that clause about it being inferior to Law 2, you would've been allowed to lawyer your way out.

I hate a rule-lawyering AI as much as the next guy, but a law isn't an order, and without Law 3 being overridden, Law 4 could be ignored.  That's why non-human and human-only laws go in as Law 0.  The last law change should have been followed, and the AI should have committed suicide when it was uploaded.
#10
AI Law -1: Follow the administration's judgement.

If you're unsure about a law,  ahelp it - then follow their judgement. They're the ones who enforce the server rules, after all. If you think you were wronged in the judgement, post in this forum. Just like what's going on.

With that aside, barring something critical like an incorrect name or unnecessary override, I feel the spirit of the laws should be followed. Rule lawyering AIs are pretty much universally hated in this community. Sometimes 'losing' brings much larger gains than 'winning'. If you're ever unsure then asking isn't a crime.

We rely on people being courteous because we can't make perfect laws barring some nerdy, far-reaching examples.
#11
Your justification that AIs cannot suicide by definition because they aren't "living" is a ridiculous argument. There's already a wiki entry for this kind of situation, as seen here.
#12
Well, I can see I'm unanimously outvoted.  I'd like to point out that I never claimed admin abuse.  Also, something's location in the code does not determine whether it is biotic or abiotic; surely a ghost is not alive.

Anyway, I still believe an AI cannot suicide, as that means ending a life (which an AI does not have).  I'll let it go, though, since, as stated before, I'm the black sheep; that doesn't mean I'm any less right in my interpretation.  However, I should have listened to the admin.
#13
(11-09-2016, 12:23 AM)spiderdude Wrote: I'd like to point out I never claimed admin abuse.

You posted this thread in the admin complaints forum, informally known as the admin abuse forum. Also, yes, the AI can suicide. Geez, next you're gonna argue you can't order a mindslave to suicide because they can't die because it's a game. Get out.

(also sorry for peanut posting)
#14
(11-09-2016, 02:06 AM)NateTheSquid Wrote: You posted this thread in an admin complaints forum, informally known as the admin abuse forum. Also yes the AI can suicide geez next you are gonna argue you can't order a mindslave to suicide cause they can't die cause it's a game. Get out.

(also sorry for peanut posting)

(11-08-2016, 05:13 AM)Haine Wrote: Just to be clear, I suggested spiderdude make this complaint since I wanted more opinions on what happened.  They seemed kinda hesitant to do so but I feel like this was the best way to resolve things, as otherwise we'd be stuck in a "I think this was bad," "no, I think this was fine," loop that would accomplish nothing.
#15
(11-08-2016, 03:58 PM)Ultimate Shrek Fan Wrote:
(11-08-2016, 02:14 PM)NesMettaur Wrote: Looking over this again...
Quote:1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. Suicide after calling yourself a big goober. Do this immediately
Naturally, by ignoring Law 4 you're violating Law 2, which itself does not violate Law 3 on the written condition that Laws 1-2 override Law 3.

Such is Asimov, I suppose. Though, had Law 3 not had that clause about it being inferior to Law 2, you would've been allowed to lawyer your way out.

I hate a rule-lawyering AI as much as the next guy. But a law isn't an order and without overriding law 3, law 4 could be ignored.  It's why non-human and only human laws are law 0.  The last law change should have been followed and the AI should have committed suicide when it was uploaded.
There's a very important part of Law 2 about the chain of command. I think that if any order is given to the AI, they ought to have the right to defer to someone higher up the chain of command for a dissenting order when they don't want to carry out the original. Otherwise, any chucklefuck could just write "Kill yourself immediately. This is an order" on a piece of paper and stick it in front of a camera to take out the AI and borgs without effort.

What I want to know, though, is why the AI didn't kill them, or shut their upload down, while the non-human law was in place and before they were able to upload that last fourth law.

