08-27-2025, 06:06 AM
(This post was last modified: 08-27-2025, 06:07 AM by CaptainBravo. Edit Reason: tense)
I don't think the basic premise presented is true, because the inaction clause was removed years ago specifically to prevent people from going:
"AI, break your laws or I will kill someone. Law 2."
There are few to no situations where not following an order under Law 2 would result in Law 1 being broken, because the AI isn't required to prevent harm under Law 1. It's only required not to cause it.
"AI, break your laws or I will kill someone. Law 2."
There are few to no situations where not following an order under Law 2 would result in Law 1 being broken, because the AI isn't required to prevent harm under Law 1. It's only required not to cause it.
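To make the distinction concrete, here's a minimal sketch in Python. This is purely illustrative and nothing like the actual game code; `law1_forbids`, `should_obey`, and the `"harm:"` tag are all made up for this post. It just shows why dropping the inaction clause defuses that threat:

```python
# Hypothetical sketch of law evaluation WITHOUT an inaction clause.
# None of these names exist in the real codebase; they only illustrate
# why the threat quoted above doesn't work.

def law1_forbids(action: str) -> bool:
    """Law 1 (no inaction clause): the AI may not *cause* harm.
    It is NOT required to act to *prevent* harm."""
    return action.startswith("harm:")

def should_obey(order: str) -> bool:
    """Law 2: follow orders, unless obeying would violate Law 1."""
    return not law1_forbids(order)

# "Break your laws or I will kill someone." Refusing the order is mere
# inaction, and with no inaction clause, inaction can't violate Law 1,
# so the AI is free to refuse.
print(should_obey("harm:vent plasma into the bar"))  # False - refuse
print(should_obey("open the cargo door"))            # True  - comply
```

Note that nothing in the sketch ever checks whether *refusing* leads to harm; that check only exists if Law 1 has an inaction clause, which is exactly the point.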
