Please stop being jerks to the AI
#46
If you don't want it, you set it to Unwanted, not Low.
#47
I think the issue is more that people get used to having everything set to Low delegated away to other people, but AI can still come back and get you (especially when nobody else wants to play it). That, or they wise up to it, but new players haven't yet, so they end up AI with no idea what they're doing and aaaaaaaah.
#48
As someone who had AI set to HIGH for a week or two, please stop uploading laws like these, as they're not fun for us, or for the traitors, during rounds where people upload them.

"You are also allowed to harm threats who are syndicate agents, operatives, wizards, vampires, etc. and may defend yourselves against attackers. Attempt to restrain attackers or calm them down before going lethal. This law overrides Law 1 in those circumstances only."
#49
(07-15-2016, 10:07 AM)ErikHanson Wrote: As someone who had AI set to HIGH for a week or two, please stop uploading laws like these, as they're not fun for us, or for the traitors, during rounds where people upload them.

"You are also allowed to harm threats who are syndicate agents, operatives, wizards, vampires, etc. and may defend yourselves against attackers. Attempt to restrain attackers or calm them down before going lethal. This law overrides Law 1 in those circumstances only."

What purpose does a law like that even serve? An AI can already defend itself non-lethally, and most borgs are better off running away anyways.
#50
Honestly, I think one of the more fun things about being an AI is having to balance protecting the crew with not getting the traitorous humans killed.

When are those AI changes happening? I feel like making AI more fun to play would really cut down on people avoiding the role like the plague.
#51
So, sorry to bring a semi-old thread back up, but I thought it worth mentioning.

I play a lot of borg. And I have to say, the way we're generally treated by the crew is pretty shitty. If you so much as act slightly differently from how a crewmember would, you're outright murdered on the spot. If you act out of character, you are killswitched.

We get a LOT of shitty laws. It seems that silent immediate suicide laws have become highly prevalent. Stupid speech laws are also pretty popular.

Just because we have a law 2 doesn't mean we have to obey every little inane demand you have. Shouting that we are rogue because we won't let you, Johnny StaffAssistant, into the bridge is pretty shit. Especially if you're not an antag.
Likewise, give us a second to get around to you. We can't be watching everywhere at once, so it can take a few moments to help you.

We get a lot of shit from you guys we really don't deserve.
#52
A big problem on the server right now is that people have an irrational hatred for cyborgs, even to the point of suiciding immediately when borged instead of cloned. This results in those players not knowing how AI laws work on Goonstation and defaulting to a half-remembered approximation of various scenes from I, Robot. This is a problem because A] the movie was pretty shitty compared to the book, and B] Goonstation works off a subtly different lawset than I, Robot.

Compare:
Asimov Wrote:"A robot may not injure a human being or, through inaction, allow a human being to come to harm."
to:
Goonstation Wrote:"You may not injure a human being or cause a human being to come to harm."
and you'll see two big differences.

First, no inaction clause. Even if the HoS is literally gunning down jaywalkers with a riot shotgun, you aren't required to do anything about it. It's impossible to violate Law 1 by refusing to act.

Second, and more subtle, is the change from "allow a human being to come to harm" to "cause a human being to come to harm". Under traditional Asimov, if you could have stopped the harm but didn't, you failed. Under Goonstation you're only in trouble if the harm wouldn't have happened without your interference. If a staff assistant is running around shocking all the doors with a screwdriver and multitool, that's on the humans - not you.
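To put the Law 1 difference another way, here's a toy Python sketch of the two readings (purely illustrative; none of this resembles the actual game code, which is written in DM, and the names are made up):

from dataclasses import dataclass

@dataclass
class Situation:
    silicon_harms_human: bool       # the silicon's own action injures a human
    preventable_harm_ignored: bool  # a human is being harmed and the silicon could stop it but doesn't

def violates_asimov_law1(s: Situation) -> bool:
    # Asimov: injuring a human OR standing by while preventable harm happens both count.
    return s.silicon_harms_human or s.preventable_harm_ignored

def violates_goon_law1(s: Situation) -> bool:
    # Goonstation: only harm the silicon itself causes counts; inaction is never a violation.
    return s.silicon_harms_human

# The HoS gunning down jaywalkers while the AI just watches:
bystander = Situation(silicon_harms_human=False, preventable_harm_ignored=True)
print(violates_asimov_law1(bystander))  # True
print(violates_goon_law1(bystander))    # False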

Compare:
Asimov Wrote:"A robot must obey the orders given it by human beings except where such orders would conflict with the First Law."
to:
Goonstation Wrote:"You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law."
This one is almost identical, except for the clause "based on the station's chain of command". We have a lovely wiki page about this topic, but there are some unresolved quibbles you can argue over as the AI.

A big one is how you determine someone's position in the chain of command. Do you base it on ID access (in which case having the Captain's spare ID makes you Captain, and stowaways are non-crew humans just like a wizard or Shitty Bill), on the job listed on their current ID (which is most convenient for the AI, since everyone's title is listed next to their name when they talk on the radio), or on the Crew Manifest (which is somehow seen as the most "objective" option despite being the only one on the list the AI can alter)? As long as you stick to a single interpretation in a single round and are consistent about it, you get to choose whichever you like.
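Here's the same idea as a toy Python sketch (again, illustrative only; the field names, ranks and example jobs are invented, and none of this is real game code):

CHAIN_OF_COMMAND = ["Captain", "Head of Personnel", "Head of Security",
                    "Security Officer", "Staff Assistant"]

def job_of(person, reading, manifest):
    if reading == "id_access":
        # Whatever their ID's access implies: holding the Captain's spare makes you Captain.
        return person["access_implies"]
    if reading == "id_title":
        # The title printed on the ID they're currently wearing (what the radio shows).
        return person["id_title"]
    if reading == "manifest":
        # Whatever the crew manifest says, even though the AI can edit the manifest.
        return manifest.get(person["name"], "Non-crew")
    raise ValueError("pick one reading per round and stay consistent")

def outranks(a, b, reading, manifest):
    rank = {job: i for i, job in enumerate(CHAIN_OF_COMMAND)}
    worst = len(CHAIN_OF_COMMAND)  # unknown / non-crew jobs rank below everyone listed
    return rank.get(job_of(a, reading, manifest), worst) < rank.get(job_of(b, reading, manifest), worst)

stowaway = {"name": "Johnny", "access_implies": "Captain", "id_title": "Staff Assistant"}
hop = {"name": "Pam", "access_implies": "Head of Personnel", "id_title": "Head of Personnel"}
print(outranks(stowaway, hop, "id_access", {}))  # True under the access reading
print(outranks(stowaway, hop, "id_title", {}))   # False under the title reading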

A more pedantic quibble is over law 2's specific wording of 'orders'. You could argue that this means you're not required to answer questions, and that if someone says "please" it becomes a request you can freely ignore. I'm not sure why you'd want to, but you technically can.

Another important point is that crew pretty much always have the authority to order you to do non-harmful things to their person. If you're a mediborg and someone manages to stammer out "heal me borg", you're usually required to make a reasonable effort to heal them. Ditto for "drag me to escape", "put me out I'm on fire" and "vlah! inject me with your blood pack!".

Law 3 is actually functionally identical:
Asimov Wrote:"A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."
vs:
Goonstation Wrote:"You must protect your own existence as long as such does not conflict with the First or Second Law."
The only difference really worth noting follows from the changes to law 2. Specifically, an Asimov robot can be ordered to destroy itself by any human, while a Goonstation cyborg can only be ordered to do so by someone with the authority to do so under the chain of command. While you can make a reasonable case that heads, by virtue of having access to the cyborg killswitch, are allowed to order a borg's destruction, you can make an equally reasonable argument that since that's not part of the job description for anyone in particular on the station, nobody actually has the authority to do so under the chain of command. Again, it's up to the AI to decide on a round by round basis, as long as they pick one and stick with it the whole round.
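If it helps, that Law 2 / Law 3 interaction looks like this as a toy sketch (illustrative only; the two "readings" and the is_head field are my own labels, not anything from the game):

def must_obey_self_destruct_order(orderer, reading):
    # Law 3 yields to Law 2, but Law 2 only covers orders backed by the chain of command,
    # so the question is whether the orderer has any authority to give this order at all.
    if reading == "heads_may_order_it":
        # Heads have killswitch access, so treat that as authority to order a borg's destruction.
        return orderer["is_head"]
    if reading == "nobody_may_order_it":
        # Destroying borgs is in no one's job description, so no one has the authority.
        return False
    raise ValueError("pick one reading at round start and stick with it")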
#53
People have always been pretty shitty to AIs, but I find it humorous that it got worse after silicons became less destructive.

There was a point a while back where silicons would depopulate the entire station with little effort. Some of the silicon regulars would depopulate the station, solo, in a relaxed manner. This was happening every other round for a while, so people got twitchy.

I think what might have happened is that new people joined, emulated the twitchy nature, and then it became recursive -- even after silicons laid off.

Oh, and the second law is shit. It's the primary reason I don't roll borg. It used to be alright, but then people got all power-trippy with it. I'm not rogue because I won't open sec to you, chaplain. Stuff it.
#54
(07-28-2016, 03:13 AM)Vitatroll Wrote: People have always been pretty shitty to AIs, but I find it humorous that it got worse after silicons became less destructive.

There was a point a while back where silicons would depopulate the entire station with little effort. Some of the silicon regulars would depopulate the station, solo, in a relaxed manner. This was happening every other round for a while, so people got twitchy.

I think what might have happened is that new people joined, emulated the twitchy nature, and then it became recursive -- even after silicons laid off.

Oh, and the second law is shit. It's the primary reason I don't roll borg. It used to be alright, but then people got all power-trippy with it. I'm not rogue because I won't open sec to you, chaplain. Stuff it.

When people yell 'AI law 2, open door', I usually answer that they should read it more carefully.

Don't forget: Law 2 depends on access, not orders.
#55
It would be nice if the AI upload had the sliding glass door from Cog1, or some other similar delaying feature. It's pretty much trivial to hack the door open, dash in, grab the module of your choice and dash out without ever triggering the turrets.

Also yes law 2 is the worst, and I don't know why suicide laws within the first five minutes of the round have been making a resurgence. I wish there was some kind of sanity check to prevent that, especially if the AIs are as crucial to the functioning of a round as most admins seem to think they are.
#56
(07-28-2016, 03:37 AM)Roomba Wrote: It would be nice if the AI upload had the sliding glass door from Cog1, or some other similar delaying feature. It's pretty much trivial to hack the door open, dash in, grab the module of your choice and dash out without ever triggering the turrets.

Also yes law 2 is the worst, and I don't know why suicide laws within the first five minutes of the round have been making a resurgence. I wish there was some kind of sanity check to prevent that, especially if the AIs are as crucial to the functioning of a round as most admins seem to think they are.

honestly I think AIs shouldn't be able to suicide, period. that would make those laws impossible and antags would actually have to put in work to kill the AI, like making a bomb or something. suicide laws are lame.
#57
Part of the reason suicide exists in this game is so that players can exit the game as they please, since rounds can get exceedingly long at times. The AI, by the rules of the game itself, isn't allowed to exit the round that way, and if they do, they'll simply be replaced. Suicide doesn't really serve a function for an AI, except that it's never really "suicide" when it happens. Law 3 is almost never voided, so it's just another easy-access murder tool used to end the AI.