
Why would Lore want Data's emotion chip?

Sentience isn't irrelevant to rights, lol. That's a gross contradiction. Sentience sits at the heart of rights, since it's a legal farce to award a non-sentient entity rights. It would be an absurdity to give a lawnmower rights.

Well, nope. That's fossil thinking - today we're seriously pondering giving animals rights that are independent of the property rights of their human owners. Tomorrow, our lawnmowers will be significantly more sentient than our poodles, and more deserving of independent rights.

We already have to face the fact that there are degrees to "sentience" or whatever we wish to call the quality of worth. We already have decided that the level of intellect is not significant, as stupid people get the same rights as smart ones, within limits. But that can't stretch forever, and the rights of a cow must introduce a metric here.

However, that doesn't yet touch the basic error you are making - that of assuming that sentience would be relevant to law. It clearly isn't, because law doesn't acknowledge the concept. Now, in common law this may not matter, but much of the world lives by Napoleonic Code nowadays, and what is not written does not exist.

There may be assumptions behind the law, but those will only be brought up when the law faces a practical test. Poodles and cows first, mowers next, and aliens and positronic androids must be down the line somewhere in the universe that features those.

Now, back to the matter at hand: The issue of Data's sentience is supposedly (according to Maddox even if by Picard's formulation) relevant because if nonsentient, Data would not be entitled to "all" the rights reserved for all "life forms" (a concept that in Trek doesn't include plants and sometimes doesn't include animals). We don't learn that he would be completely devoid of rights even if nonsentient, though. Nor do we learn whether either Maddox or Picard is right about sentience being legally relevant. The judge ultimately decides it is irrelevant, completely skipping the issue and merely choosing to give Data "the right to choose", rather than the famed set of "all" rights.

Timo Saloniemi
 
And you're a fairy-worshipping bigoted freak who not apparently but clearly hasn't read his Bible,
And for that, Timo, I have to give you a warning for flaming. It might have felt justified but you know the rules and leave me no choice.
 
God created man, and whatever rights man has are God-given/created; it doesn't matter that no version of the Bible specifically states "God-given inalienable rights".
You need to keep your Biblical perspective out of this subject and out of the TNG forum. It is 100% out of place and unwelcome. Please do not reply again in this thread or I will grant you a warning for Trolling.
 
There's no basis for thinking that two androids of identical make would have identical programming. That's something specific to biologically evolved entities. If the UFP law does not recognize such elementary issues, how can it deal with the factual variety of the universe it exists in?

20th century law just doesn't cut it here - believing in its antiquated notions about identity would be no better than believing in God as the basis of legal or moral judgement. It's a personal opinion an old curmudgeon is allowed to hold, but it cannot be applied professionally.

Timo Saloniemi
People don't have identical programming either, Timo, but we make general rules for them. The rights of one are implied in the other in an egalitarian society, which the Federation is. So I'd be inclined to think that Data would have set the precedent for Lore, and that Lore would have been granted equal status with Data--had he not been a criminal mass murderer and terrorist! That sorta puts a crimp in any I'm-a-real-live-boy scenario for Lore.

Also, the matter would actually have to be brought into court by people with standing. But I don't see a jurist like Phillipa Louvois creating new categories or opening new cans of worms when she can just claim a precedent (that Data is not Starfleet property--she says "our" property; this might be taken to exclude a future Soong lawsuit from offending public policy [and thus allow it], had Soong been inclined to do any such thing, i.e., sue to make Data officially his property). Of course, as any litigator knows, the judge you get matters.
 
Of course, the question of Data being Soong's property is irrelevant with regards to the hearing shown in MoaM, as Soong was considered dead at the time, but it might have made for an interesting wrinkle.

I'm reasonably certain that Starfleet has the authority to confiscate dangerous technologies, which raises the question of whether Data might be classifiable as such.

I'm vaguely tempted to rewatch the episode to try to determine whether the ruling is intended to apply to all Soong androids or Data specifically. If the latter, then while it might be a precedent with regards to Lore, it's hardly conclusive. And again, given Lore's homicidal behavior, even if he was given rights they probably would have been mooted almost immediately.
 
While Picard's closing speech to Philippa Louvois (the JAG officer) did expand the whole issue to include the far-reaching implications for other beings like Data, and matters of liberty and servitude, Louvois's ruling for that particular hearing was simply that "Data has the freedom to choose" whether he would undergo Maddox's procedure or not.

Kor
 
No. Sentience is tethered to law. We treat animals, spanners and people in different ways, lol, according to our assessments of their sentience. We're not blind to the differences here. We don't give spanners the same rights as humans. Animal rights is a discourse about animal protection, ensuring that animals are not made uncomfortable, injured or left in pain. Animal meat is still eaten in Trek without legal repercussions, although it's regarded as a throwback. Of course, today we have quaint American states where budgies inherit estates, but in the Trek world people still own pets--Spot the cat, for example. The right to choose is only coherent for beings who are fully sentient; i.e., Spot isn't Data's hostage.

So with Data, we either have a piece of equipment that merely simulates full sentience and can therefore be dismantled without controversy, or he is fully sentient, which must follow from being granted the right to choose. The ruling recognises Data as fully sentient. Lore gets that too, because he shares virtually all of the same traits that Data does. It's a binary equation: the Federation is an egalitarian society; it isn't stuffed with subspecies and master races, nor does it judge members of a particular species on a one-at-a-time basis. Either you and your fellows are fully sentient and entitled to full rights, or you're not, in which case you and other equipment of the same manufacture can be dismantled. And I'm pretty sure the writers intended Data to win parity of esteem with other Federation citizens in this episode. He's neither a pet nor a piece of equipment.
 
You might want to rewatch "Measure of a Man", which specifically involves the question of whether Data can be treated as Starfleet property...

Well that's kinda my point. Measure should never have happened in the first place. I mean, it's a great episode, but it's predicated on a faulty assumption - that Starfleet ever owned Data, even if he was a toaster; and if they made him an officer, then they already regarded him as a free, sentient being - or they wouldn't have included him in the chain of command, responsible for lives.

If anything, such a story would be about some villain abusing organizational powers; but Starfleet? Demanding Data's head, and later, his offspring? Is Starfleet not answerable to civilian oversight and Federation rights? Because that's what Measure and Offspring suggest.

(And yes, it's only a show)....
;)

I keep looking for a reason Lore wouldn't have "humanoid rights" - but I can't find one. I can only see his just loss of legal rights for criminal behavior - which can extend to capital punishment. Can, worms, open.
 
I don't know...does Starfleet have any regulation saying that property can't be officers? Probably not, or it would have come up during the episode.
I mean, Starfleet apparently had the right to force Tuvix to undergo a medical procedure (let's not get bogged down on this one) despite his unequivocally stating that he didn't want to undergo the procedure and at least one medical professional refusing to perform it on the grounds that it might harm his patient. In that sense, Tuvix was treated as property. I think that's a scenario that should have also resulted in a full hearing (including appointing legal counsel for Tuvix), but I digress...

As I mentioned earlier, my memory of the specifics of the ruling in MoaM seems to suggest that the ruling explicitly applies to Data himself, not all Soong-type androids, much less all androids in general. If Lore's status were to be contested, the MoaM ruling would therefore be a precedent, but not a binding one.
I'm also not sure deactivation, which can always be reversed, can be reasonably equated with capital punishment (irreversible), but that's for the jury to decide, heh.
 
I think that Lore would be presumed sentient, but if it were ever to come into question, it would deserve its own hearing. Deactivation can be likened to an induced coma: reversible, but perhaps considered "like" a medical procedure.
 
Lore took it so his brother couldn't have it... it's also a Biblical reference to Jacob stealing the birthright of his brother Esau.
This dovetails neatly into the fact that, essentially speaking, Soong-type androids are a race, and the closest thing to a head of their race recognised in some way by the Federation is Data.
Ultimately it's Data who shuts down Lore (essentially life in prison... even a death sentence) and later B4.
It's entirely likely that this could be argued as a sovereignty issue should Starfleet not agree with his course of action (Worf fights Gowron to the death, does he not? Same issue.)

In terms of all the arguing over 'God-given' human rights, well, it was accurate for the majority of people at the time it was written, and arguing or making snide remarks about it is... misguided at best. Especially as, Biblically speaking, there are plenty of moments where rights are given (the rainbow itself, now used as a symbol of diversity, has its roots in one of the earliest Biblical examples I can think of.)

Anyway...

Lore is evil. And he usurps his brother's inheritance.
 
I don't think life in prison, much less a death sentence, is an accurate analogy, given that generally when one goes through such things one is also aware of the passage of time, which is arguably part of the punishment itself.
 
As I mentioned earlier, my memory of the specifics of the ruling in MoaM seems to suggest that the ruling explicitly applies to Data himself, not all Soong-type androids, much less all androids in general.

Yes, see my post above referencing Picard's speech and Louvois's simple ruling.

Kor
 
I don't think life in prison, much less a death sentence, is an accurate analogy, given that generally when one goes through such things one is also aware of the passage of time, which is arguably part of the punishment itself.

Absolutely, it's an imperfect analogy... he could also be reassembled (it's not like he was dispersed in a transporter beam), so it's not really a death sentence... but we see Data shut off B4 in Nemesis, so he does seem to act in a... judicial manner, for want of a better word.
 
Lore's body is a shell. What matters is his programming, which isn't apparent to the court at a glance. Quite possibly, Lore and Data could be as dissimilar as wolves and humans despite the confusing external appearance - a situation obviously possible with androids even when our intuition tells us that biological species would have inner qualities dictated by their outward appearance aka evolutionary history.

It's not as if Moriarty would automatically be entitled to anything despite looking and sounding and walking like a human. Or even as if his rights would naturally flow from those given to the EMH or to Vic Fontaine or to the fourth Vulcan Love Slave from the right in the back row.

Timo Saloniemi
If Moriarty had a body, then yes, he would be considered part of “Data's” race, so to speak. Moriarty exhibits the three criteria necessary for him to be considered sentient.

DB
 
I'm going to go straight back to the topic, folks lol.
Lore was already capable of emotion. He told Data in Brothers that he didn't have to imagine; he knew how hard it had been, and that Data would be surprised. That Data might even be able to understand his "evil" brother.

Lore was jealous and wanted to deprive Data of his birthright - but what would motivate Lore to use the chip, and fundamentally alter his nature, and very possibly his moral alignment?

(Kudos on Spiner's acting choices here. Lore's emotions were much cooler than Data's emotional slapstick routine in First Contact, where he sounded a bit more like the Night Court hillbilly character).

Clearly Lore still had feelings for Soong; he took news of his father's terminal condition badly. Why would he then want to destroy the last wishes of his dying father?

Was it an act of pure mischief? A desire for redemption?

Faulty programming? Inconsistent writing?

What do you think?
I'd say there are probably a couple of factors at play in Lore choosing not only to take Data's emotion chip from him, but to use it himself.

#1, he's a maniac, and his particular emotional instability leads him to react in the extreme. My theory is that Soong simply dialed up his emotional responses too high, & couldn't correct it. He's the opposite of Data. When he is angry, it's murderous. When he's sad, it is probably depression. When he's jealous & petty, it is all consuming, & other concerns fall prey to it. He is tragically led by his extreme emotional instability.

He flies all over the place emotionally, in just his one scene with Soong in Brothers. (Really watch the nuance in that performance) His emotions are his undoing. He can't control them. He's by all rights a psychopath, or at least suffering something analogous to bipolar disorder, or wildly erratic mood swings. So when he becomes jealous of Data getting something he could never have, he uses it simply out of petty spite, which consumes him.

#2, and continuing on that point. He knows he is a psycho. He knows Soong didn't get it right, & he probably knows that he is uncontrollably led by his emotions. It's reasonable to assume he knows exactly what is wrong with him by now. In this way he is maybe the most tragic character in all of Star Trek. Nobody would want to live that way, constantly a slave to erratic emotional swings, but it's not like he can take an anti-psychotic medication, now can he?

So Soong makes the perfect emotion chip for Data. Lore already knows he's damaged goods: sentient, sapient, intelligent, but tragically flawed. What does he have to lose? Adding Data's programming could have any kind of effect, even possibly a positive one, that makes his own condition better or more bearable. Somehow, the fact that he murders Soong directly after getting it would seem to suggest it just made him crazier, lol

But basically, he's self-medicating. Using it just because it's there, & there are no other options for him.
 
If I'm answering the title question only, I think the major reasons for Lore to take that emochip are:

# To prevent Data from having it, as simple and selfish as that! There's a lot of selfishness in Lore. I've already read people comparing Data to a sociopath for not having empathy, but the one who acts more like a sociopath is Lore. If Data doesn't have (conscious) emotions, he'll obviously struggle to demonstrate empathy. Whereas Lore does have emotions, but still doesn't seem to express any form of empathy for others. He'd rather use others for his own plans.

# To assert what he imagines to be superiority over Data. Lore takes an arrogant pride in thinking he's superior to his brother because he processes emotions. So preventing Data from processing emotions (in a conscious way) allows Lore to keep feeling superior and more 'perfect'.

# To hurt their father. It is pretty obvious from Brothers that Lore still feels a lot of anger and resentment towards Noonien. He deems his father responsible for everything he went through with the colonists (which may be partly true) and clearly wants to take revenge for it.

# To manipulate Data. It's never exactly stated when Lore had that (evil genius) idea to use the emochip to control his brother, but we can imagine he already had some plans well before Descent happened. As someone who's had emotions and feelings for his whole life, he certainly realized immediately what a temptation they would be to Data. Especially when it comes to emotional satisfaction and pleasure. Someone who's not used to them would probably be vulnerable to some form of mind control. Such a brilliant yet horrific plan.
 
To be fair though, @TauCygna, those are all legitimate reasons why Lore would take the chip away from Data, but they don't sufficiently explain why he would then choose to have it put in himself. It really is a whole separate choice to use it himself, even though he already has emotions. It's almost certain to alter him in some way, & why would he want that? My theory is that it's because he knows he is flawed when it comes to emotions, & that Soong's impending death likely means there will never be any chance for him to be improved in that way. This chip, even though it isn't designed for him, offers the only viable possibility for him to be different than he currently is... for good or ill
 
To be fair though, @TauCygna, those are all legitimate reasons why Lore would take the chip away from Data, but they don't sufficiently explain why he would then choose to have it put in himself.

As I stated, I was answering strictly the title question, not paying attention to what was said previously. But your extended question is certainly legitimate. I should watch Brothers again and pay more attention to Lore's behaviour; it would certainly give me new insights on the topic. To be honest, I don't even remember Lore installing the chip in himself; I just remember him taking the chip away.
 