
The Measure of a Man: personhood, artificial intelligence, etc.

Or we could all be disembodied brains in jars networked to a shared simulation of reality. :eek:

Kor
 
For example: is Data a life form or property? It seems to me that once he became an officer in Starfleet, those questions were already answered. "Toasters" do not become officers. So the issue never really should have come up. Is there any other example in all of Star Trek where "property" was a Starfleet officer? I don't think so.

Agreed. In terms of production and casting and so forth, "Measure" has always been a good episode for me. But from a philosophical standpoint I think the story suffers from making a mountain out of a molehill, asking a question that can't be easily resolved, and it never felt to me like Data was under a legitimate threat. That Maddox was stated to be the only holdout when Data first entered the academy doesn't help, and even he seems to quickly realize that "Data's sentience can't be proven" is harder to defend than "Data is a machine and property of Starfleet."

I also think "The Offspring" was another brilliant episode but once again, the events with the "bad admiral" never should have occurred. It raised the whole property issue again even though that was decided in "The Measure of a Man." It was nice to see Picard refer to those events and stand up to the Admiral but it really never should have happened. Even when Picard found out that Data had created a daughter, he seemed to chastise him, as if Data did something wrong so it seems while the property issue had been decided, there still remained some doubt, even in Picard's mind.

There are some extra lines of dialogue in the script that presumably weren't filmed due to time constraints, but I rather like them because they give Admiral Haftel more context for why he's uncomfortable keeping Lal on the Enterprise. He uses the phrase "effective isolation" in the aired version, but these unused lines give it a wider context: Haftel says that Daystrom also worked in isolation when he designed the M-5, and that was one of the reasons it failed so badly. Data, and by extension Lore, Lal, and similar androids, represent both a considerable asset and a major threat by way of their unique abilities, and Starfleet has an understandable concern about what might happen if Lal were to prove dangerous or unstable. Interestingly, the subject of Lore never comes up, but I can see him being an argument against Soong-type androids and technology.

Picard also has a few extra lines in the scene in Data's lab, where he points out that the Enterprise crew has a unique degree of experience working with Data and understands him better than the staff of a Starfleet lab would.

You can read this version of the script here, around scene 40 or so (extended conversation between Picard and Haftel in the ready room).
 
I wonder why the EMH would even be considered a life form when it's just a hologram?

I'm with you. My thinking is that the EMH is nothing but a program being run on Voyager's main computer. If the main computer isn't sentient, how the heck can one of its programs be? I had the same problem with Moriarty.
 
I can't understand that argument. Surely only some small part of the computer could be running a sentience program, just like only a small part of this computer is running the program that allows me to type this response!

There's no need for sentience to permeate every single console, chair, or piece of carpeting on Voyager. It can be created when and where needed, in whatever form is needed. The form the ship usually needs does not include comical banter, the writing of poetry, sadistic violence, or whatever the other usual characteristics of human sentience are; the form needed to run the EMH has its uses for all those aspects, though.

Timo Saloniemi
 
My stance on the subject is a rare one here on TrekBBS: I'm one of the few who don't think Data or the Doctor or anyone like them are life forms. Don't get me wrong, in The Measure of a Man, Data should have had the right not to be broken down, but that's more because his antagonist had no right to take him apart, not to mention that they made him an officer and he gets the rights thereof. Though, if they did that same episode but had Noonian Soong say something like "I forgot to put this in and that other system is faulty. Let me dismantle him," I would have sided with Soong; Data is his. As for following him, I would, and I wouldn't repeat what that guy did in "Redemption Part II". I respect Soong's greatness and acknowledge Data's past deeds, but I'd never rank him up there with us evolved life forms. Those millions of years of effort can't be matched in some guy's afternoon.


Why is the nature of Data's creation determinative of whether he is a life form?

They aren't alive, just fantastic simulations.
So in your opinion sentience does not factor into the determination of what qualifies as a life form. I must point out that the relevant question is not so much whether Data matches the definition of a life form, as one could simply define a life form to be any cell or combination of cells made of organic material. That would also mean the Q, some of the gaseous beings, and, say, the race of photonic beings from Voyager would not be life forms either.

The relevant question is what qualifies a being for self-determination. Many life forms are not capable of self-determination. They are called pets. It seems like your definition would in fact make Data Dr. Soong's pet, or something very similar. That's not totally unsound, as Soong created him, but in my opinion, if Data or the Doctor is capable of self-determination, then to deny them that, or even worse, to replicate them and deny self-determination to all the replicants, sets in motion a very old and very inevitable chain of events that eventually leads to violence and revolution.

Picard touches on this somewhat when he recognizes that it would be creating a slave race. I would say I agree with him.
 
How does a computer say "Cogito ergo sum" when it's possible that it is merely parroting those words programmed into it, and not grasping self-awareness?

Well, for human beings, same question. How do we know we are self-aware? We do not. We simply act on the assumption, which is "good enough" and "all we can do."

So while I can call the EMH merely a program - I can't say that I am not one.
 
Many life forms are not capable of self-determination. They are called pets.

Or slaves, prisoners, or sometimes wives.

Self-determination is something one must express. But lack of expression may stem from various sources, among them oppression (refusal to accept the expression), language barriers (failure to understand the expression), and a fundamental inability to be self-determining (lack of means to create the expression). So, whazzup with cats? They definitely express self-determination on occasion, saying "I want ooooooout!", and most people do speak enough cat to understand that. Often, though, people just oppress their cats and refuse.

Requiring the defendant to have a desire for self-determination isn't all that different from requiring the defendant to be a life form. Single-celled organisms have goals in life; OTOH, humans may well lack the wish to self-determine.

Significantly, the court did not dabble too deep in definitions until Picard brought up his army of straw men. The court, like most courts, was interested in settling a specific issue, that of Maddox claiming control over Data on the argument that Data was property. Humans can be property, de facto if not de jure; being human might not have been enough to save Data. But the court erred on the side of certain basic rights by granting freedom to those who were clever enough to ask for it (without creating any precedent stating that those who don't ask cannot be given it, of course).

Timo Saloniemi
 
Significantly, the court did not dabble too deep in definitions until Picard brought up his army of straw men.

The writer had previous experience as an attorney, so maybe she saw stuff like this work in real courtrooms.

But in any freshman writing course, you're taught to eschew straw man arguments as a logical fallacy.

How ironic.

Kor
 
Hmm, if Data was property, why should he have to take an oath to Starfleet? We know there was one, as mentioned in A Matter of Honor, when Riker was aboard the Klingon vessel and said, about his loyalties, "I will not dishonor my oath to the Enterprise."

In Descent, refusing to mass-murder the whole Borg species, Picard refers to the oath he took to "uphold certain values."

The Starfleet Oath was mentioned in TOS, Into Darkness and Voyager too. It's canonical.

Which implies that Data will have taken the oath as a Starfleet officer. My point is, an oath is an act of free will, and free will is an act of an individual, and an individual is, ipso facto, free under the Federation Charter, which the Starfleet mandate is sworn to uphold.

Otherwise, if the question of Data's freedom really were at issue, so would be his oath to Starfleet and his sworn duty to obey its orders. The very act of ordering him to surrender his freedom would release him from his obligation to Starfleet as a sworn officer - a role he chose as an individual in the first place, as confirmed when Noonian Soong asked Data why he chose to enlist in Starfleet and concluded that it was "to emulate your emancipators." Data was not forced to join Starfleet. Free to enlist, he was also free to resign.

In other words - it was never for Starfleet to rule on Data's status as an individual; it was only for Starfleet to manage its own response to Data's inherent individuality - to acknowledge it or to deny it, but never to rule on it.

If I were Data's lawyer, I would argue this point, and then I would "Turn off" the opposing party with a stun gun and claim "Fair conduct, Your Honor."
 
Even though it's given a lot of weight by the commanding officers, the Oath itself may be little more than a technicality. Like if you want to play racquetball, you have to join the club. You either take the Oath, or you'll never take part ... whether you're programmed to, or choose to, or what have you.
 
^Data does have an exception due to being a Starfleet officer. His rights were bestowed on him.

Why is the nature of Data's creation determinative of whether he is a life form?

They aren't alive, just fantastic simulations.
So in your opinion sentience does not factor into the determination of what qualifies as a life form. I must point out that the relevant question is not so much whether Data matches the definition of a life form, as one could simply define a life form to be any cell or combination of cells made of organic material. That would also mean the Q, some of the gaseous beings, and, say, the race of photonic beings from Voyager would not be life forms either.

The relevant question is what qualifies a being for self-determination. Many life forms are not capable of self-determination. They are called pets. It seems like your definition would in fact make Data Dr. Soong's pet, or something very similar. That's not totally unsound, as Soong created him, but in my opinion, if Data or the Doctor is capable of self-determination, then to deny them that, or even worse, to replicate them and deny self-determination to all the replicants, sets in motion a very old and very inevitable chain of events that eventually leads to violence and revolution.

Picard touches on this somewhat when he recognizes that it would be creating a slave race. I would say I agree with him.

I would count Q and the other noncorporeal life in Star Trek as life forms. They were evolved beings, they weren't made.

Also, though Picard said something along those lines, I doubt he'd take any action if he ran into aliens who had created a workforce of artificially intelligent androids to maintain their economy, and it's not like one of those androids could say "I wasn't put on this world to serve you," because they were. I don't think slavery would be the right word for something like that.
 
I am disturbed by how easily both Data and the Doctor can be reprogrammed and turned to malicious intentions and actions; for their free will and responsibility to be so limited does suggest that their rights deserve less consideration than those of biological people (true, others can essentially be brainwashed/"reprogrammed," but not so easily).
I'd say that Star Trek has demonstrated just as many instances of biological life forms succumbing to external influence as artificial ones... probably a lot more, whether it's Picard believing he could actually "see five lights," or Geordi's brainwashing, or Riker's institutionalization, or Troi and O'Brien getting possessed, or the entire Enterprise crew being mind-controlled by amnesia, xenophobes, or a stupid game of Candy Crush, and those are just a sprinkling of examples from TNG alone. I'd maintain that the risks are at least equal and therefore not sufficient grounds to curtail artificial life forms' liberties.
 
Hey, I'm rather new to TNG in general (I started watching it back in February) but have made the effort to watch a majority of the episodes. My perspective is probably limited, but Data being my absolute favorite of this franchise, I have some thoughts. Data cherishes memories and his fellow crewmates. He's made close friends without the need for emotions and was even intimate with Yar, a human female (him beating around the bush about it in this episode was kinda funny). Even in "In Theory" he managed to have a short-term relationship. He makes choices for himself, whether joining Starfleet, conducting experiments, or heck, even keeping his cat Spot on board. That alone makes me feel like there's more to him than just cogs. He has everything there: AI, superior strength, and so on. Would the same argument stand if this were, say, Lore, who has more emotions than his younger brother? (I don't know if "emotions" is the right word for Lore; I've mostly only seen "Datalore" and "Brothers" so far.) Of course, when it comes down to it, both of them are still considered just androids who can be turned off in an instant. All I know is, if Data can make his own choices, work out solutions to life's problems, and act with free will, then he is sentient. Having a whole race of creatures like Data, who can make their own choices and form friendships, would, I agree, turn into a form of oppression.
 