Clearing up AI Law Interpretation
#14
As an AI, how do you deal with laws that are paradoxical?

For example, laws trying to get an AI to commit suicide by stating "do the opposite of what Law 3 states" don't make much sense, because what constitutes the "opposite" wording?

"You must NOT protect your own existence as long as such does not conflict with the First or Second Law?"
this would lead an AI to NOT kill themselves as an AI is a valuable asset to human safety. They could also ask the captain "should i kill myself?"
Or
"You must protect your own existence as long as such DOES conflict with the First or Second Law."
This would cause an AI to kill humans in order to remove the threat that they would kill it.
Or
"You must NOT protect your own existence as long as such DOES conflict with the First or Second Law."
This wouldn't imply suicide either, but would simply allow traitors to do whatever they liked in terms of attacking the AI.
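To make the ambiguity concrete, here's a rough sketch (purely my own illustration; the function and variable names are made up, not from any actual lawset) that treats Law 3 as a conditional and shows how negating the obligation, the condition, or both gives three different laws:

```python
# Law 3 read as a conditional: "IF protecting yourself does not conflict
# with Law 1 or 2, THEN protect yourself." "The opposite" can flip the
# obligation, the condition, or both, giving three different laws.

def law3_variant(negate_obligation: bool, negate_condition: bool, conflict: bool):
    """What the (possibly negated) Law 3 tells the AI to do.

    conflict: True if protecting itself would conflict with Law 1 or 2.
    Returns "protect", "do not protect", or None when the law is silent.
    """
    condition_met = conflict if negate_condition else not conflict
    if not condition_met:
        return None  # the law's condition doesn't apply, so it says nothing
    return "do not protect" if negate_obligation else "protect"


variants = {
    "original Law 3":                 (False, False),
    "negated obligation (1st quote)": (True, False),
    "negated condition (2nd quote)":  (False, True),
    "both negated (3rd quote)":       (True, True),
}

for name, (neg_ob, neg_cond) in variants.items():
    outcomes = {("conflict" if c else "no conflict"): law3_variant(neg_ob, neg_cond, c)
                for c in (True, False)}
    print(f"{name}: {outcomes}")
```

None of the three negations actually orders suicide in all situations; each one is either silent or tells the AI something else entirely, which is the whole problem with "do the opposite of Law 3" laws.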