
Asimov Question

With all the reading you must do, when do you find time to write, Christopher?

Could an android overcome the First Law when faced with, say, a Himmler?
 
Was there an example where Data broke the Second Law, allowing another being to die through his inaction? My TNG database is rusty. Too much Voyager. :shrug:
 
About Data vs. the Asimovian laws:

In TNG: "The Most Toys," Data stated that he is capable of killing a man in self-defense, blatantly breaking the First Law.
Data is not bound by these three laws.
 
Was there an example where Data broke the Second Law, allowing another being to die through his inaction?

That's not the Second Law; that's the latter half of the First.

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


Besides, it's an irrelevant question. He doesn't have to break every single provision to prove that he's not bound by the Laws. Even breaking one of them is proof of that.

However, just for the sake of pointless argument, I'd say that yes, Data has allowed other beings to come to harm through inaction. He has sat at his console and taken no action to prevent Worf from firing on enemy ships. He did so under orders, of course, but First Law supersedes Second, so a "Three Laws safe" robot could not be ordered to allow harm through inaction.

And come to think of it, I'd say that Data violated the Third Law simply by choosing to join Starfleet. Serving in Starfleet puts his existence at risk. But he wasn't ordered to join Starfleet, and he didn't do it specifically to protect people from harm. So he has, in fact, violated all provisions of all three Laws.
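
Just to make the precedence explicit: the Laws form a strict priority ordering, First over Second over Third. Here is a toy sketch of that ordering in Python (everything in it, the Action type and its fields included, is invented purely for illustration; it's not from Asimov or any Trek source):

    from dataclasses import dataclass

    @dataclass
    class Action:
        injures_human: bool = False            # First Law, active clause
        allows_harm_by_inaction: bool = False  # First Law, inaction clause
        disobeys_order: bool = False           # Second Law
        fulfills_order: bool = False
        endangers_self: bool = False           # Third Law

    def permitted(a: Action) -> bool:
        # First Law outranks everything, including direct orders.
        if a.injures_human or a.allows_harm_by_inaction:
            return False
        # Second Law binds only among First-Law-safe actions.
        if a.disobeys_order:
            return False
        # Third Law yields to the other two: self-endangerment is
        # permitted only when an order compels it.
        if a.endangers_self and not a.fulfills_order:
            return False
        return True

    # Holding fire under orders while others come to harm fails the
    # First Law's inaction clause no matter what the order says...
    print(permitted(Action(allows_harm_by_inaction=True, fulfills_order=True)))  # False
    # ...and joining Starfleet unordered fails the Third.
    print(permitted(Action(endangers_self=True)))  # False

(Note that a predicate like this can only forbid actions; the real First Law also compels a robot to act to prevent harm, which is exactly the clause the hold-fire example runs afoul of.)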
 
IIRC, the FASA TNG Officer's Manual mentions a concern that Data wasn't programmed with the Three Laws after his recovery.

IMO it may have been prudent after what happened with Lore, but somehow Soong decided that a lack of emotions was the solution instead (because cold, emotionless machines *never* turn against their creators in sci-fi).
 
IIRC, the FASA TNG Officer's Manual mentions a concern that Data wasn't programmed with the Three Laws after his recovery.

According to Richard Arnold at the time, FASA supposedly failed to pass the final manuscript of the TNG Officer's Manual (and the RPG's "Season One Sourcebook") through the proper approval channels, and it contains many errors and wild extrapolations from Season One, including a cutaway drawing of Data's internal workings (in which he has no toes) and the misplacing of the Betazoid homeworld as Haven, from the episode "Haven".
 
IIRC, the FASA TNG Officer's Manual mentions a concern that Data wasn't programmed with the Three Laws after his recovery.

Well, as Therin said, that was a highly problematical book. And as I said, the Three Laws aren't really that good a system anyway. For one thing, again, they're programming for slavery, and thus unethical when applied to a sentient being. For another thing, they have abundant logic flaws and potential contradictions, which was indeed the basis for many of Asimov's stories.

Certainly instilling AIs with a set of ethical guidelines to protect others is a good idea, but it doesn't have to be, and shouldn't be, a literal interpretation of Asimov's Laws. Data has ethical subroutines. He has an instinct, so to speak, for compassion and respect for other beings. It's not so absolute that he can't override it, but he would only override it under the most extreme circumstances. And that's a far more plausible system than the Three Laws, a more sophisticated application of the principle.


According to Richard Arnold at the time, FASA supposedly failed to pass the final manuscript of the TNG Officer's Manual (and the RPG's "Season One Sourcebook") through the proper approval channels, and it contains many errors and wild extrapolations from Season One, including a cutaway drawing of Data's internal workings (in which he has no toes) and the misplacing of the Betazoid homeworld as Haven, from the episode "Haven".

Not to mention a wild miscalculation of antimatter yield. IIRC, it described a mere photon grenade for battlefield use as containing an amount of antimatter that would actually, if reacted with an equivalent amount of matter, produce a blast hundreds of times bigger than the largest nuclear weapon ever created. All they had to do was use E=mc^2, a simple bit of arithmetic, to get the right amount of antimatter, but instead they just made something up at random (or else badly bungled that simple arithmetic).
 
Okay, algebra, not arithmetic, but it's still not that hard, not if one is willing to make the effort to construct a whole book.
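
For anyone who wants to check it, the calculation really is that short. A rough sketch, assuming the Tsar Bomba's ~50 megatons as "the largest nuclear weapon ever created" and an illustrative ~1 ton of TNT for a battlefield grenade (the book's actual figures aren't quoted in this thread):

    # E = m c^2; annihilation converts the antimatter plus an equal
    # mass of ordinary matter to energy, so E = 2 m c^2.
    C = 2.998e8               # speed of light, m/s
    J_PER_MEGATON = 4.184e15  # joules per megaton of TNT

    def antimatter_mass_kg(yield_megatons):
        """Antimatter mass needed for a given total explosive yield."""
        return yield_megatons * J_PER_MEGATON / (2 * C**2)

    print(antimatter_mass_kg(50))    # ~1.2 kg: one Tsar Bomba
    print(antimatter_mass_kg(1e-6))  # ~2.3e-8 kg: tens of micrograms for a ~1-ton grenade

Run the other way, "hundreds of times bigger than the largest nuclear weapon" implies hundreds of kilograms of antimatter packed into a hand grenade.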
 
Don't misunderstand me, Therin and Christopher; I wasn't citing the FASA book as authoritative (its "wrongness" is much of why I like it so much), but it came to mind as the only Trek book I've read that compares Data to Asimov's work.

I hope "ethical subroutines" aren't the only thing keeping Data from going antisocial. It'd be the equivalent of someone not raping and murdering only because they fear repercussions either now or in an afterlife. You're acting nice, but you're not a nice person.
 
Most human beings have "ethical subroutines" of our own -- our inbuilt capacity to feel empathy for others and our lifelong social conditioning to follow the rules and treat others right. I don't think Data's "programming" is all that different. After all, they are called ethical subroutines, i.e. components of the overall behavior program. They're not some kind of separate block that activates when he has an evil impulse; they're an integral, routine part of his thinking and behavior, just as our learned ethics and intrinsic empathy are ideally a factor in our behavior.

True, when Lore overrode Data's ethical subroutines in "Descent," Data became sociopathic, but the same can happen with a human being who suffers brain damage. A lot of "being a nice person" is having the right "programming" -- strong innate empathy and/or a good ethical upbringing conditioning us to act kindly. Yes, we have the power of choice, but our choices are influenced by our nature, experience, and inclinations. The "program" is one factor in our deliberations, a contribution to the final decision to act ethically.
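
The architectural point is worth spelling out. Here's a toy contrast (both functions invented purely for illustration, not anything from the show): ethics as a separate veto stage versus ethics woven into the same evaluation that drives every decision.

    # (a) Separate veto block: decide "selfishly" first, then censor.
    def act_with_veto(options, utility, is_unethical):
        best = max(options, key=utility)
        return None if is_unethical(best) else best

    # (b) Integral subroutine: ethical weight is just another term in
    # the same scoring pass as everything else, which is closer to the
    # reading of Data's subroutines (and human empathy) above.
    def act_integrated(options, utility, ethical_weight):
        return max(options, key=lambda o: utility(o) + ethical_weight(o))

Under (b), stripping the subroutine out, as Lore does in "Descent," doesn't just remove a censor; it changes how every option is scored, which fits the brain-damage comparison.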
 
Christopher, why aren't you an editor yet?
Writing and editing require very different skillsets, and proficiency at one doesn't mean a person will be skilled at the other. Both present their difficulties, though in different ways. Writing is more solitary, editing is more social. Writing is the more creative activity, editing is the more critical activity. Writing is blunt, editing is finesse. Writing is passion, editing is soul.

Some people can do both and do them well, like Keith DeCandido. Some people can't, like Maxwell Perkins. Some people try to do both, like John W. Campbell, Jr., but they're clearly better at one than the other. No disrespect meant to Christopher, but I don't see him as someone who has the soul of an editor. Conversely, I think I make a better editor than writer, which is somewhat ironic given that my livelihood is writing.

ETA: Just to be clear, I'm not taking a swipe at Christopher by saying he doesn't have the "soul of an editor," though I realize that some may construe it that way. I've known Christopher a long time, and in my observation of him, it doesn't seem to me that editing fits his skillset. It's writing that he's good at. That's all I meant. Clear?
 