
Should Data have been thrown out of Starfleet?

I think I said "intelligent" because it seems like such a commonly used word when discussing science fiction concepts, such as "The Search for Extraterrestrial Intelligence", etc. I have found that even when I use "sentient", some people complain that I didn't say "sapient". I don't know how to win sometimes! :)
 
@Marynator : I understand ^^
To go a bit more in depth on my idea, I'd use sentience rather than intelligence as the main criterion, because there is no formal definition of intelligence.
We often say intelligence is about adaptive skills, but we ourselves wouldn't meet that milestone in some environments (e.g. modern-day Europeans dropped in the middle of a desert with nothing but a knife and a bottle of water).
We also often say intelligence is about language, but when we say that, we tend to think only in terms of human spoken language and forget that many different species have complex, efficient languages not based on syllables and words.
Another common misunderstanding is to think that intelligence implies the capacity to understand our own rights - which is not the case. Young children, babies, people in a coma and people with specific mental disabilities don't have a proper understanding of their own legal rights and status. That's where we draw the line between moral patients (people who have rights even if they can't understand them - like the exocomps) and moral agents (people who have the same rights AND are conscious of having said rights - like Data).
The common point between Data and the exocomps would definitely be sentience more than intelligence, as their adaptive skills, communication and level of self-awareness are different, much like a human being and, let's say, a dog.
 
Like the smart bomb in Voyager, you mean?
I'd forgotten that, but yeah. Dangit. Isn't there any neat idea I come up with that hasn't been done already? lol
If the exocomp were instead a Human, do you feel the scene would have played out the same, or different? Can you (ethically) send a person into danger to save multiple people who are at risk?
If their being sent is the only solution & they have agreed to do so, yes. Isn't that what Troi's whole deal amounts to in Thine Own Self?

The fact that Data volunteers & Riker refuses makes no sense. He absolutely CAN & sometimes must order him to endanger himself. He refuses to do so only because the exocomp option is available. He clearly doesn't accept that they have rights. Had this happened & there'd been no exocomps to rely on, might he have considered Data's choice to volunteer then? He might.
If the exocomps had continued to refuse, and the three men had been killed, would Data have still been correct in his actions?
As unpleasant as that outcome is, pretty much. The men signed up to be placed in dangerous, life-threatening situations, for a cause. The exocomps swore no such oath. Requiring them to die to save the others is unjust, once it's established that they are indeed self-aware, conscious, intelligent beings. It would be no different than someone making Data do the same before he'd joined Starfleet himself.
 
I don't think I like the "in the real world" comparison; shouldn't we see Star Trek as reality in the future?

Some things could or couldn't happen just because it's a television series, but I think of it as the real deal: things that could happen in the real future.
Star Trek's past history does not match our own, so no, it's not our future reality.
 
Yes, I feel that Commander Data would unequivocally have been correct in his decision. It would have been very sad if the crew members had died, but we do not have the right to force another intelligent being to sacrifice its life, even to save our own.
You do when you all work for Starfleet.
 
How do you know that something is aware? And before answering, think about how easy it is to fake awareness, just as you can play the part of a murderer without being one yourself. A non-aware device could give the same responses to questions or stimuli as a conscious being without being conscious itself.

The point is, for cases like Data there's no clear answer to that question.
 
The point is, for cases like Data there's no clear answer to that question.
Actually, there's no clear answer for anyone. Philosophers have been stumbling over the concepts of consciousness and self-awareness for centuries. Yet we keep considering ourselves conscious and self-aware.
 
You do when you all work for Starfleet.
It's one thing when people have chosen to take risks for duty and are then ordered to do so. Someone, however, could not be forced to join Starfleet against their will and then be made to sacrifice themselves.
 
It's one thing when people have chosen to take risks for duty and are then ordered to do so. Someone, however, could not be forced to join Starfleet against their will and then be made to sacrifice themselves.
Read about conscription in WW1 and WW2.
 
We're living the best years of humanity. We're at the apogee of technology and resources. Two or three decades from now, we'll remember this as the golden age.
Luckily most of us won't have to deal with it.
High Fives all around!
 
Actually, there's no clear answer for anyone. Philosophers have been stumbling over the concepts of consciousness and self-awareness for centuries. Yet we keep considering ourselves conscious and self-aware.

Exactly, but whether our consciousness is true consciousness or simply an illusion arising from chemistry, it's objectively in our best survival interest to place value on our own lives.

If we ever look elsewhere in the galaxy and find life that thinks like our own, logically, it must also have value. The quality that makes our lives more valuable than those of animals, and thus makes it morally acceptable to kill an animal but not another human, must also make other species' lives equally valuable if they have that same quality. The only other explanation is that we're all a bunch of selfish, mass-murdering monsters and morality is a comforting lie. No, no, that can't be it.

And if that value can be given to alien life, why not machine life too, if we can observe it may also possess the same quality that makes human life special?
 
The thing that makes our lives more valuable than those of animals must also make other species' lives equally valuable. The only other explanation is that we're all a bunch of selfish, mass-murdering monsters and morality is a comforting lie. No, no, that can't be it.
I'm very sorry, but I can't tell whether you're being sarcastic... That's the drawback of not being a native English speaker :-/
 