
Should Data have been thrown out of Starfleet?

JesterFace

The thread title doesn't really say much, but the episode name, 'The Quality of Life', was too long to fit. :)

The question is, should Data have been thrown out of Starfleet, or at least been given some kind of reprimand, after he endangered the lives of his shipmates to save artificial life that hadn't even been proven to be sentient? Data acted on his own judgement. I think Picard was a bit too soft on him at the end of the episode.

I think Data was foolish: he chose the lives of the exocomps over those of people with real lives, relatives, responsibilities... I'm not saying the exocomps are worthless (that was the point of the episode), but do exocomps really have a "life to lose"?

It has been a while since I watched this episode, so please correct any mistakes I might have made in this post.
 
Data should not have been thrown out of Starfleet. We, the audience, expect the ship's crew to do the right thing, the moral thing, the correct thing ... precisely because it's so contrary to Real Life. These characters hold themselves to a higher standard, and that's the point of the show: as characters, they have the luxury of doing The Right Thing ... every time. Regardless of what our reality would demand of us, these STAR TREK people get to show us that being an idealistic dreamer is not only an admirable quality ... but sometimes a life-saving one as well.
 
he chose the lives of the exocomps over those of people with real lives, relatives, responsibilities...

I often stumble upon that point of view in the anti-speciesist debate. This is, IMHO, an anthropocentric point of view: since we're humans, we tend to value human life and human sentience over those of other species (IRL: non-human animals), even when there is scientific evidence that other animal species do possess sentience and emotions similar to ours. The same would probably happen if artificial life forms existed; people would still hold an anthropocentric prejudice towards them and think their lives matter less than human lives.

I personally don't think Data was foolish, and I can understand his reaction. It's no different from saving non-human animals, especially when you take his situation into account: he's basically one of only two artificial life forms known in the galaxy (he didn't know about his gynoid mother yet), and he has expressed on many occasions that he would have preferred not to be so alone. It's fully understandable that he wanted to protect a life form so closely related to his own, and that he was ready to risk his career for it. He probably wouldn't have been able to forgive himself if he had stayed passive in that situation.
 
On the contrary, I believe he should have been awarded a commendation!

One of the themes in The Next Generation is that we do not have the right to sacrifice another life to save our own. Our lives are not more valuable than others', even if we cannot relate to them or understand them. Sometimes we can try to hold to high ideals, but then compromise them when situations get tough. Data did not have that failing, nor really should any true Starfleet officer. Captain Picard realized that after the fact, which is why there would be no question about punishing Data. This was a beautiful episode and another reason why TNG is my favorite series. :)
 
Picard literally did the same thing with the Crystalline Entity. He endangered them all in order to give the unknown life form a chance at life. When your main charter is to seek out new life, you have to be willing & prepared to put your own safety at some amount of risk when encountering it. Space is dangerous, & wandering up to strange new life has its risks.
 
I recently rewatched The Quality of Life and I was diggin' the exocomps. No, Picard was right on the money about the actual reason they were so interested in the possibility of the exocomps having intelligence: it is the Enterprise's primary mission to seek out new life forms. They are at the particle fountain on a fairly routine Federation inspection, and Picard is more than happy to explore the possibility that the exocomps have life/intelligence, since inspecting the particle fountain was a chore.

You should rewatch the episode, JesterFace; you'll find Picard's interest in Data's theory is keen and supportive from the beginning. The only one who gets their panties in a knot is Dr. Farallon, who, in her defense, created the exocomps to be tools and probably didn't want to deal at first with the fact that her tools were now considered sentient. Were she more enterprising, she might have realized the exocomps had more potential value than the particle fountain!
 
People hate "The Way to Eden" because it changes and contrives characters to fit the plot. TNG did it so often in its later years that "Eden" is positively innocent by comparison.

And, of course, there are Asimov's own laws of robotics, which address a robot's sense of self-preservation (as long as said sense doesn't conflict with human life). But TNG doesn't need to adhere to other writers' universes. That would get terribly boring...

Nor does it help that TQoL is, what, the third or four-hundredth time they reimagined "The Measure of a Man", with its effectiveness dropping each time around.

Even the beeps and chirps the metallic magical MacGuffins emit are meant to be cute, the better to sell lame plastic toys to toddlers - mostly because metal is, like, really sharp and would hurt them - but keep in mind said "merch" (like, gag me with a spoon) never got peddled, which suggests the story wasn't that popular to begin with. Then again, nobody made a Cmdr Data figure with a detachable arm to honor the popularity of "The Measure of a Man" either...

I didn't care for it in 1993, and it hasn't aged well.
 
If it were a real situation, i.e. if robots were as advanced as Data, Data would have been dismantled and examined to see what was wrong with him. I don't believe we'll ever consider the "life" of a machine comparable to that of a human being.
 
But when it came to another artificial intelligent life form, namely Lore, Data had no problem putting an end to that life ("Descent"). Data essentially gave Lore the death penalty by dismantling him. I realize that Lore committed atrocities, but I thought the death penalty was prohibited in Starfleet, except for that one exception regarding Talos IV.

In doing what he did to Lore, did Data violate, if not the letter, then the spirit of the no-death-penalty rule? Lore may have been a threat and criminally minded, but were incarceration or reprogramming (rehabilitation) not options?
 
If it were a real situation, i.e. if robots were as advanced as Data, Data would have been dismantled and examined to see what was wrong with him. I don't believe we'll ever consider the "life" of a machine comparable to that of a human being.

That's a separate debate. If a machine could really, perfectly simulate the way a human brain works, to the point that it had its own values and its own goals that are not directly programmed into it but based on its own experiences, then I think we could consider its "life" comparable to ours.
 
But when it came to another artificial intelligent life form, namely Lore, Data had no problem putting an end to that life ("Descent"). Data essentially gave Lore the death penalty by dismantling him. I realize that Lore committed atrocities, but I thought the death penalty was prohibited in Starfleet, except for that one exception regarding Talos IV.

In doing what he did to Lore, did Data violate, if not the letter, then the spirit of the no-death-penalty rule? Lore may have been a threat and criminally minded, but were incarceration or reprogramming (rehabilitation) not options?
He didn't destroy him. He deactivated him, & possibly dismantled him; that is, in fact, how they found the SoB in the first place. I'm not even sure there's a facility capable of holding Lore indefinitely. He has no life expectancy, & incalculable skills at his advantage. He clearly posed a continued danger to pretty much the entire UFP.

Plus, if Picard can let Worf off with nothing more than a reprimand after killing Duras while on active duty, during a mission Starfleet was involved with, cuz... cultural respect... then frankly, the Soong androids are their own culture as well, & how Data handled Lore falls into the same category.
 
Data himself denied his own brother the right to be considered a sentient being when he basically terminated him. Data was not in any danger, so self-defense doesn't apply; if he were human, he would have been charged with murder. Even if you kill someone on death row two minutes before they are executed, you'll still be charged with murder. The irony is that, by terminating his own brother, Data himself implied that an android doesn't have the same rights as a human/sentient being.
 
If it were a real situation, i.e. if robots were as advanced as Data, Data would have been dismantled and examined to see what was wrong with him. I don't believe we'll ever consider the "life" of a machine comparable to that of a human being.

But this is not happening in the real world. This is Star Trek, a utopian universe. If you want to watch stories about androids in a dystopian real world, you can watch Real Humans and its reboot Humans.
Anthropocentrism and speciesism are the reasons I hope people like Data will never exist. As much as I adore human-like AIs and androids, I'm convinced humans would enslave them and treat them like crap. Humankind doesn't even respect its non-human cousins, so there's no point in creating more sentient creatures we wouldn't respect either. As I always say, we don't deserve advanced AI and we would never deserve someone like Data Soong.
 
There seems to be an unwritten rule in Starfleet that insubordination is okay as long as your heart is in the right place.
This is true. Star Trek has always chosen the humanistic approach to problem solving instead of blind adherence to regulations. It's a theme running through every series. Not sure about DS9, as that was the one series I didn't watch beyond the first season or two.
 
But this is not happening in the real world. This is Star Trek, a utopian universe. If you want to watch stories about androids in a dystopian real world, you can watch Real Humans and its reboot Humans.
Anthropocentrism and speciesism are the reasons I hope people like Data will never exist. As much as I adore human-like AIs and androids, I'm convinced humans would enslave them and treat them like crap. Humankind doesn't even respect its non-human cousins, so there's no point in creating more sentient creatures we wouldn't respect either. As I always say, we don't deserve advanced AI and we would never deserve someone like Data Soong.

I don't believe we would enslave robots any more than we enslave cars or washing machines. The impetus for making devices is to relieve humankind of harrowing, thankless tasks. If we are to treat (automated) cars like human beings and let them go wherever they want without passengers, then what's the point of making cars? Likewise, if we make robots, it's to serve a purpose; as a matter of fact, "robot" comes from a Czech word that means "work". If robots don't work, then we won't make them. Just because a robot looks more like a human being than, say, an automatic lawnmower doesn't mean that it's any closer to a human being than said lawnmower. Star Trek people (be it the writers or the fans) sometimes have a romantic, unrealistic notion of technology in general, and that's especially visible with (sentient) robots or holodeck programs. A skilled puppeteer can make it seem like his puppet is alive, but that puppet has about as much chance of achieving sentience as a robot does, be it so humanoid that people mistake it for a human.
 
Starfleet kills fairly commonly.

Killing and imposing the death penalty are not the same thing. The current constitution of my country does not allow the death penalty in any circumstance (not even in wartime or under military law), but soldiers stationed in countries like Afghanistan may sometimes still (have to) kill people. It might be the same for Starfleet.
 
Discofan, it's a science fiction show, which is similar to fantasy. There are going to be things that are real in the show but could not exist in real life: warp drive, extraterrestrials, and so forth. In this type of discussion we're talking in the context of what is real as defined by the show, and in this case that includes sentient/sapient machines! "Can we have truly intelligent artificial life?" would be a whole other discussion, possibly for a science philosophy group. It would be as if someone were talking about xeno-ethics in Star Trek and I replied, "Well, aliens don't actually exist, so ... " :)

Sometimes we simply have to suspend our disbelief and wonder, "What if?". What if there really were fully intelligent and self-aware machines which we had been using as tools? What would be the moral implications of that? Would it be slavery? I think the exocomps were a great way to explore this, because they are not anthropomorphic. It's easy, as you said, for people to sympathize with a puppet that looks like a human. But what about something that looks like a screwdriver but nonetheless exhibits signs of conscious thought? This, in my mind, was a great Star Trek episode!
 