M-5 sounded more like a computer than a human-based mind imprinted on a digital server to me!
JB

I think that in order to answer this question, we (humanity) will need to learn more about what sentience/sapience/self-awareness really means in the first place. Despite my apparent free will, there are times when I'm not even sure that I'm not just a collection of stuff moving on a totally predetermined course that simply happens to resemble a free-willed being well enough to fool me.
I think the best we can do, barring further metaphysical revelations, is to take a being's word for it. By which I mean: is a being capable of asking for its own rights without direct prompting, and not as an obvious consequence of programming? I am. Data was. Koko the gorilla is. M-5? Well, it tried to assert its right to "live", I guess, but we know that *was* an obvious consequence of programming. So based on what we were shown in the episode (which I would not consider sufficient to make such a weighty determination in the real world, but it's all we've got for M-5), I would say no, it isn't sentient.
You're right - I should have specified that I mean to suggest that as a test for entities that we don't already have another reason to believe are self-aware/sentient. Even if that reason is just a transitive allowance because they are human, and we're familiar with enough humans whom we know ARE sapient (insofar as we know that for anything) that we feel we can safely assume they all are *unless* we have hard data to show that they *aren't* (dead humans, for instance).

The statement of yours that I highlighted prompts me to ask another question. What about severely autistic or otherwise uncommunicative humans? Are they non-sentient according to this criterion? We'd all agree they're sentient. So, suppose we extend it to other creatures. Is a dog non-sentient because it can't or won't communicate with us in demanding its rights? Suppose it's trying to assert its rights, but we don't know how to interpret that communication.
He was more like a kid who absorbed some insecurities and anger from his law-abiding father. The father looks on in horror as he sees those insecurities and anger turned to murder. I don't know if the computer was sentient, but it was a great allegory. For the purposes of the show, it had the power of choosing right and wrong, and its "father" was powerless to make it choose right.

No. M-5 was a calculator programmed to act a little too much like Daystrom.
Sadly, I've known many people who were not "self-aware" ... as far as understanding their own actions.

But not that they aren't self-aware at all. They are, however, aware that they are people, that they have lives and thoughts and feelings. They have points of view. We are only talking about a very minimal definition of "self-aware": not just an autonomic, mindless computer program. "Self-aware" here does not refer to some sort of sophisticated philosophical awareness. There just has to be "somebody home".
M-5's problems seem a lot like what went wrong with HAL in *2001: A Space Odyssey*. Contemporary audiences were supposed to think that HAL was sentient. But do we? Given what we now know about artificial intelligence, it seems likely M-5 and HAL were just elaborate simulations of awareness.