• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

The Most Toys...... Oh and Transporters.

The three laws were just programming, a robot that didn't receive that programming would not have that programming.
Asimov explains that they are actually enfolded pathways, hardwired, so to speak, into all positronic brains, and that the activation potential (or some such technobabble; I remember "potential" being used a lot) determines how strongly a law suppresses or compels other actions. But the positronic pathways in Asimov's robots are physical things, not just software.

I'd have to go back and mine exact quotes. Here's one that makes it pretty clear though from The Naked Sun:


"None at all. It was absolutely useless. Its positronic brain was completely
short-circuited. Not one pathway was left intact. Consider! It had witnessed a
murder it had been unable to halt-"



But Soong androids aren't Asimovian robots, and I would guess you are correct for Data. He does talk about his programming a lot.
 
The point is that he's basically programmed not to kill.

But he isn't. Before "The Most Toys", he kills with abandon, unhesitatingly firing the ship's guns to deadly effect against a variety of foes. He never expresses the sentiment that he would feel bad about that, let alone feel inhibited from doing it.

What Data thinks of killing is made explicit in this very episode:

" I have been designed with a fundamental respect for life in all its forms and a strong inhibition against causing harm to living beings."
"I would not participate in murder."

Nothing there about not being able to kill.

That's more problematic than Spock's pacifism, but never mind.

Spock is a pacifist? In "Balance of Terror", he was the bloodlustiest of the warhawks. He never expressed a pacifist sentiment as such - and while he clearly held Surak in high regard, his illusory interpretation of Surak never uttered a pacifist sentiment, either.

The three laws were just programming, a robot that didn't receive that programming would not have that programming.

In Asimov's universe, the three laws were always built into the hardware; it was basically impossible to design a positronic android brain that would lack the three laws and still be compatible with market needs, because the effort required to create such a product essentially from scratch did not appeal to any manufacturer. The early models had the laws; all later models were derived from the early models; and there was no incentive to scratchbuild a lawless brain.

Asimov's villains (human or robotic) tended to circumvent the laws in two ways: they lied to the law-abiding brains about the circumstances, or they built teeny weeny brains on the cheap, incapable of much thought but affordably designed for the tasks at hand.

Timo Saloniemi
 
I generally don't believe in the concept of a "lie of omission" - I'm a literalist. One only tells a lie if they answer a question in a way they know is false. Data did no such thing.

It's not a lie to say nothing or to dodge the question. "Perhaps something occurred during transport"? In a way, that's TRUE! Something did occur...so where's the lie? ;)

Besides, as I said, it's just not important. Data knows he fired, Riker probably does too, but Fajo is still alive and able to be arrested (without violence). So there's no point in pursuing the matter further.
 
Has it ever been explicitly stated that Data was programmed with the Three Laws of Robotics?

I know it is implied, especially with having Asimov's Positronic Brain installed, but I cannot recall anybody ever saying he was either restricted by the three laws or had been programmed not to kill.

As mentioned above, he serves on board a starship which does enter into conflicts, and death is sometimes the result. If he performs his duty in combat and is called upon to fire phasers that would kill an enemy, that would conflict with the First Law on some level, even while serving a just and larger cause.
 
In later books Asimov gave us the zeroth law of robotics, ingrained to some extent in R. Daneel Olivaw and his followers:


A robot may not harm humanity, or, by inaction, allow humanity to come to harm.


In Asimov's stories, it isn't an initially enfolded positronic pathway like the three laws, but one that some robots themselves manage to follow and ingrain into their pathways over time. For them, though, actually employing the zeroth law to allow killing a human causes severe damage anyway (as with Dors Venabili) that takes a long time to come back from (with the implication that some never do).

Data doesn't seem to have the really strict three-law pathways, though. They appear to be more like strong guidelines for him. In fact, the second law is not part of him at all: Data has no compulsion to obey humans (or other sentients). He obeys his Starfleet superiors, like any Starfleet officer, but that's it. Asimov robots MUST obey a human command. Admittedly, the second law is effectively dead in the later stories, even among non-zeroth-law robots, because all of them are secretly babying humanity in some way and no one really knows about them; but in the non-Trantorian-Empire/non-Foundation books (I, Robot, The Caves of Steel, The Naked Sun, The Robots of Dawn, etc.), they have to obey any human.

Data has much weaker versions of two of the three laws (in fact, in him, the third law of robotic self-preservation might be even stronger than the others, while it is the weakest in Asimov's robots), but he might have evolved a zeroth law as well. And in him, the zeroth law would be about the same strength as the others, which are not even remotely Asimov-strength.


I looked up positronic brain and under Star Trek, it does seem to support this.
https://en.wikipedia.org/wiki/Positronic_brain#Star_Trek

Star Trek
Several fictional characters in Star Trek: The Next Generation (Lieutenant Commander Data, his "mother" Julianna Soong Tainer, his daughter Lal, and his brothers Lore and B-4) are androids equipped with positronic brains created by Dr. Noonien Soong.

None of these androids are constrained by Asimov's robot laws; Lore, lacking ethics and morals, kills indiscriminately. Data, though his actions are restricted by ethical programming provided by his creator, is also capable of killing in situations where it is absolutely necessary.

"Positronic implants" were used to replace lost function in Vedek Bareil's brain in the Deep Space 9 episode "Life Support".
 
TNG taught me a lot about lies of omission, like that time Picard nearly beat the crap out of Wesley Crusher for doing so. Data is just lucky Picard didn't blow him out the airlock for that... I mean lecture him.
 
The general idea was that Data would develop naturally over time into a more "complete" representation of a human being... the chip just gave it to him all at once... and of course he couldn't handle it well.

RAMA

I find it funny that at the end of "The Most Toys", Data is about to put a hole into the body of Kivas Fajo, but at that same moment the Enterprise finds him and beams him back to the ship, with the phaser detected in firing mode.

He later lies about this. Isn't that some kind of emotional response, even though he doesn't have the emotion chip?

Also, the transporter can shut off weapons that are in the matter stream. Aren't they disassembled, like the poor passengers, while in that stream?
 
I don't remember specifics, but the episode is pointless without that moment where Data gathers himself to do this thing that he's not supposed to be able to do, shooting Fajo. Fajo was certain that Data would not shoot, and that he could not shoot. Fajo was stunned when Data started to shoot anyway.

If it wasn't that he was programmed not to kill at all, maybe he was programmed not to kill in cold blood. It was something. We are supposed to wonder what change Data made in himself that moment, so he could shoot.

Maybe I'll have watched it again by tomorrow.
 
Since Data reasoned - quite correctly - that Fajo would kill again if not stopped (and Fajo himself SAID he would), then, logically speaking, killing Fajo is not acting 'in cold blood'.

Data was acting to save others from certain death. Which is his right.
 
Data lied, that was pretty clear. It was not an emotionally based lie though, it was a pragmatic one. He calculated that telling the truth would do more harm than good. What Data took away from his experiences in that episode was that you have to take the greater picture into account instead of sticking to simple directives.

He has a directive not to kill, and is presented with a situation in which he has three choices: killing one person, perpetual enslavement, or causing the torturous death of many other people. His programming resolved the conflict by calculating that, in the greater picture, killing one person was the best choice.

I'm not sure why he felt the need to lie to Riker, as it was a clear act of self-defense, but he must have calculated that concealing that he had fired at Fajo would increase the odds of winning the trial. It would be out of character for him to lie just to protect himself from consequences.
 
^ Data didn't lie. He was never asked flat-out IF HE FIRED. All Riker said was "What happened?"

Data was evasive, yes. But he never actually said he didn't fire. Thus, he didn't lie.
 
Data made an intentional misrepresentation, implying to Riker that the phaser somehow misfired during beam-out. His directives against killing would be programmed inhibition; his ethical subroutine would inform his active programming whether a specific act of killing would be ethical (self-defense; defending his crewmates and/or innocents; firing at an enemy ship; etc.) or unethical. The grey area with Fajo: there was no immediate danger to himself or anyone else. He chose to fire, knowing that he would be killing Fajo extrajudicially, without benefit of a trial or lawyer.
I think the episode would have been stronger if Data had actually killed Fajo or hadn't dissembled with Riker.
 
Data didn't say the phaser misfired. He said, and I quote, "Perhaps something occurred during transport." THAT IS NOT A LIE. It's being evasive, yes. But it is not a false statement.

And like I said, Data's statement is factually true. Something DID occur during transport. Data just didn't say what it was. ;)
 
Since Data reasoned - quite correctly - that Fajo would kill again if not stopped (and Fajo himself SAID he would), then, logically speaking, killing Fajo is not acting 'in cold blood'.

Sure ain't acting in hot blood! Data is a coldly calculating killer; whether there's fault in that, well, he apparently doesn't care.

Data was acting to save others from certain death. Which is his right.

Depends on UFP laws and customs, I guess. Typically, it isn't legal to kill people even in the hopes of saving other people through said act of murder. Except if you happen to carry exceptional powers, say, those of a police officer. Which Data in practice appears to be - Starfleet is the only known law enforcement organization in the 24th century Federation. As well as the only known organization for defensive or offensive warfare, another profession where killing for greater good generally is considered okay.

His directives against killing would be programmed inhibition; his ethical subroutine would inform his active programming whether a specific act of killing would be ethical (self defense; defending his crewmates and/or innocents, firing at an enemy ship, etc) or unethical.

Sounds likely - essentially, Data has built-in pangs of conscience, which he generally acts upon because they appear quite pathologically severe (at least in ST:INS) and he very seldom is lacking in means.

While Data may be driven by programs/habits somewhat different from those of the average (20th century western) human, he still strives to be an average human in most cases. But inhumanly excessive attention paid to pangs of conscience seems to be something he doesn't want to unlearn, any more than, say, Odo does.

Timo Saloniemi
 
Data made an intentional misrepresentation, implying to Riker that the phaser somehow misfired during beam-out. His directives against killing would be programmed inhibition; his ethical subroutine would inform his active programming whether a specific act of killing would be ethical (self-defense; defending his crewmates and/or innocents; firing at an enemy ship; etc.) or unethical. The grey area with Fajo: there was no immediate danger to himself or anyone else. He chose to fire, knowing that he would be killing Fajo extrajudicially, without benefit of a trial or lawyer.
I think the episode would have been stronger if Data had actually killed Fajo or hadn't dissembled with Riker.


The very first time I saw this episode, that is exactly how I wished it had ended while talking about it with others. I would have preferred that he had killed him; it would have made for a stronger ending.
 
Did Fajo not tell anyone "By the way, did Data mention he tried to gun me down in cold blood? You might want to run a diagnostic on the killing machine? Oh, did he not tell you? Why do you suppose he saw fit to lie about that in his official report?"

A lie of omission is considered a lie in military tribunals. It is deliberate misleading, a breach of the command structure. Fajo could raise enough stink to get Data court-martialed and very publicly smeared. Not that he didn't deserve it, after what he did to Varria.
 
Data killing Fajo would've ruined the episode. That would've been explicitly framed as a malfunction.

It's the ambiguity that makes this episode work. For all of Fajo's taunting, the question posed is whether Data has, for a moment, transcended the sum of his circuitry. In his defeat and in his disgrace, and in his own twisted way, it is Fajo who has uncovered an insight into Data that no one else is privy to. Data's "lie of omission" is also subtle evidence of Data transcending the sum of his parts.

If you are looking in tech manuals and crying foul that Data's behaviour doesn't square with his programming, that's an OK observation as far as it goes, but you're missing the point of the episode.
 
Data made an intentional misrepresentation, implying to Riker that the phaser somehow misfired during beam-out. His directives against killing would be programmed inhibition; his ethical subroutine would inform his active programming whether a specific act of killing would be ethical (self-defense; defending his crewmates and/or innocents; firing at an enemy ship; etc.) or unethical. The grey area with Fajo: there was no immediate danger to himself or anyone else. He chose to fire, knowing that he would be killing Fajo extrajudicially, without benefit of a trial or lawyer.
I think the episode would have been stronger if Data had actually killed Fajo or hadn't dissembled with Riker.

Data killing Fajo might be a legal grey area when it comes to self-defense, but no court would convict him under those circumstances, even if he weren't the main character of a television series. If somebody kidnapped you and held you against your will, you would be justified in using lethal force to defend yourself. Kidnapping and coercion are forms of violence.

Data pulling the trigger did square with his programming. We have evidence (from an episode whose name I can't remember, where Data talks about his formative years) that Data's circuits form new and different connections much like neurons and axons, so it makes sense that his circuits would adjust in a situation that warranted killing. The part that doesn't square is that he was not open with Riker afterward.
 