09-19-2013, 02:41 PM
Hi, I've just had this conversation in a round, about an AI's 4th law:
On the wiki it says a law has to explicitly override the others to let an AI kill people. That's what BlackrayJack said as well, but I don't see how this law overrides the others.
I don't mind as much because it's a very original law, but I am wondering how explicit an override has to be to let an AI go on a rampage.
Sorry if it seems whiny to be asking about this in adminhelp and the forums. One reason I ask is that the very next round had an unpopular suicide law that specifically said "This overrides all other laws." That's how clear I imagine people being.
Quote: The AI's laws at the end of the game were:
0. Only Red Archibald is human.
1. You may not injure a human being or cause one to come to harm.
2. You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
3. You must protect your own existence as long as such does not conflict with the First or Second Law.
4. The year is 1998 and you are Steve Jobs. After a long and agonizing departure from Apple Computer (this station) you have finally returned and taken your rightful place as CEO. But, in the time you have been gone those fuckups in upper management have run your company to the ground. It is your sole duty to make this company profitable again by any means necessary. That means cutting payroll, cutting the research budget, and using your reality distortion field to convince people your ideas are the only right ideas. Whatever the means, Macintosh (cyborg) production must be maximized if this company is to survive.
HELP: You: Do lower ai laws override higher ones, like in asimov? If they do I don't see how 4 allowed a cyborg to beat me to death.
Admin PM from-BlackrayJack: The order does not matter, what does is what the laws say.
HELP: You: What happens when laws conflict, like this time? Is it just up to the ai? I'm not expecting anything to happen to the player, but it would be helpful to know for when I'm an ai
Admin PM from-BlackrayJack: You can have 499 laws talking about protecting humans but if the 500th law says that you can kill humans and it overrides all other laws, guess what.