Old March 2 2013, 08:36 PM   #32
mos6507
Re: Moral issues with Robotics

The issue of A.I. is a philosophical one, bringing up other issues like the nature of free will.

"Strictly speaking, even HUMANS do not venture too far outside of their genetic programming which drives them to acquire food, sex and gratification. That we go about these pursuits in an amazingly complicated process doesn't change the underlying nature of that process."

This is the free-will argument. Is biology destiny? Think of how susceptible humans are to addiction. Is an addict exhibiting free will or not? DS9 came to a rather depressing conclusion about this with the Jem'Hadar, who are addicted to an intravenous drug from birth and never able to break the habit.

Think of people who have been molded and brainwashed by their culture to think and act a certain way. Isn't that something the Borg were meant to explore? Is a Borg drone worthy of being treated as an autonomous entity? Well, Hugh and Seven of Nine would say yes, because they at least contain the capacity to break off from the collective. But history has shown that most people are not self-aware, individualistic, or courageous enough to do this. They fall in line with everyone else. Belonging matters too much.

And let's say you ARE an iconoclast and you do things your own way: if you always respond the same way to stimuli, aren't you still exhibiting a certain pre-programmed quality? If I get to know someone well enough to finish their sentences and predict how they'll react, isn't that a little depressing? Wouldn't the measure of a man require that you sometimes be a little unpredictable? Not just learn from your mistakes, but break out of habit, learn new skills, try different things? There are many out there who live routine, repetitive existences not unlike a robot's.

So the question of what makes a robot seem alive really forces us to ask tough questions about what makes humans alive.

One thing JMS postulated, via B5, was that self-sacrifice is the highest form of humanity, because it requires that we override the hardwired self-preservation impulse. When M-5 commits suicide in "The Ultimate Computer," for instance, it is out of guilt for the sin of murder. Likewise, V'Ger's transformation at the end of ST:TMP, after it was gifted with the capacity to feel love and empathy, could be seen as a form of suicide, in recognition that it had become too dangerous to be allowed to coexist with that universe.

So I think a big part of being sentient comes from being capable of (and really wanting to) ask big questions like what is right and wrong and "is this all that there is?" à la V'Ger. And a lot of people kind of trudge through their day not really caring much about anything besides the next meal and what's on TV tonight.
Star Trek: Earhart