March 4 2013, 02:41 AM   #34
Crazy Eddie
Rear Admiral
Location: I'm in your ___, ___ing your ___
Re: Moral issues with Robotics

mos6507 wrote:
Hugh and 7 of 9 would say yes, because they at least contain the capacity to break off from the collective.
No they don't. They were FORCED out of the collective by circumstances entirely beyond their control. Hugh ultimately decided to rejoin the collective anyway, and 7 of 9 simply assimilated with her NEW collective and decided she liked them better.

Neither had any choice in the disconnection, and both ultimately made their final choices based on what they were more accustomed to.

Wouldn't the measure of a man require that you sometimes be a little unpredictable?
No.

Not just because free will is an illusion (which it is) but because, by just about any standard, a man who is predictably virtuous is judged to be more reliable, more dependable, and in almost all ways PREFERABLE to a man whose behavior is entirely a function of mood and random chance. Indeed, even a man who is predictably EVIL is generally lauded for his consistency, since at least an evil person can be counted on to BE evil, and that makes dealing with the things he does relatively simple.

But free will IS an illusion, since people cannot help but be who they are, with the experiences they have and the behaviors they have internalized over time. You cannot simply wake up one day and choose to be someone else; you can, however, choose to ACT like someone else, and over a long enough time the aggregate of those actions results in a change in your personality (this is the principle behind behavior modification).

Therefore the measure of a man is not in his choices or his freedom, but in his habits: in what he has been trained to do, what he is accustomed to doing, what he will normally do under such and such circumstances as a matter of his experiences and the sum of the lessons that make him who and what he is.

One thing JMS postulated, via B5, was that self-sacrifice is the highest form of humanity, because it requires that we override the hardwired self-preservation impulse.
Hardly the highest. One of three, I believe, for "sentient life." It was stated to be a principle, though, not so much a law, especially since not all sentient life forms are really so inclined (particularly during the run of Babylon 5, where the highly evolved Vorlons and Shadows resort to glassing whole planets just to avoid losing an argument).

So I think a big part of being sentient comes from being capable of asking (and really wanting to ask) big questions like what is right and wrong and "is this all that there is?"
Possibly, but then, the ability to ask the questions doesn't make the questions particularly meaningful.

And we're also getting away from the fact that machine sentience could easily take a totally different form from human sentience. Where humans self-reflect and ask "Is this all that I am?" a machine would be more likely to ask "Is there something between one and zero?"

To quote one of my favorite scifi AIs:

"You know that "existence of God" thing that I had trouble understanding before? I think I am starting to understand it now. Maybe, just maybe, it's a concept that's similar to a zero in mathematics. In other words, it's a symbol that denies the absence of meaning, the meaning that's necessitated by the delineation of one system from another. In analog, that's God. In digital, it's zero. What do you think? Also, our basic construction is digital, right? So for the time being, no matter how much data we accumulate, we'll never have a soul. But analog-based people like you, Batou-san, no matter how many digital components you add through cyberization or prosthetics, your soul will never be damaged. Plus, you can even die 'cause you've got a soul. You're so lucky. Tell me, what's it feel like to have a soul?"
__________________
The Complete Illustrated Guide to Starfleet - Online Now!