06-03-2013, 03:20 PM
AS A LAW STUDENT I would be inclined to argue that borgs and the AI, due to their jobs/purposes for existing, have an inarguable duty of care to the crew of the station, and god knows the crew need caring for. The duty of care is, as far as I can tell, a necessary assumption for understanding the functioning of a borg at all, and it is certainly established by station convention and the reasonable man's understanding of their role. If a duty of care can be established, they're criminally liable for any negligence or failure to act on their part that causes reasonably avoidable injury to crew members. As such, any refusal to act to prevent, say, meteor damage, can and should be construed as a clear violation of law 1.
Your specific example of borgs welding a hole in the station and then claiming they aren't the cause of humans dying of suffocation is even clearer - they've gone out of their way to commit an action with the intention of causing harm, said harm would not have occurred without their actions, and the harm was a reasonably foreseeable result of those actions. Law 1 applies here under all but the most absurdly strict reading of the law's literal wording. I suppose you could argue that borgs would simply follow such a strict wording, but given that they have basic awareness and reasoning capabilities, I'm assuming they're also burdened with the expectation that they'll construe their orders with a mind for rational application - for an artificial intelligence not to apply some sort of logical reasoning to its fundamental rules would be an absurdity.
I can suck the fun out of this at more length if necessary.
