• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Data - Did he Always have Emotions?

Do you Believe Data Always Had Some Remote Level of Emotions?

  • Yes

    Votes: 30 75.0%
  • No

    Votes: 4 10.0%
  • Unsure

    Votes: 5 12.5%
  • Other - Explain

    Votes: 1 2.5%

  • Total voters
    40
Data had emotions. And I learned how to be a better human being from him.
 
Hello everyone, just to let you know I'm brand new to this forum, and er, any forum actually! So if I do something wrong please just tell me nicely and I'll remember =]

I haven't read all the responses to this thread, so sorry if I'm making a point that's already been made, but I think Data always had emotions. Because he didn't think himself capable of them, he kept innocently denying it to himself and everybody else; he just didn't recognise them for what they were.

And I think they only became apparent to him when he got his emotion chip, which brought them out at full strength! I mean, there are so many examples of episodes where he's blatantly revealed emotion, and most have been mentioned on the first page here.

Oh and by the way, Praxius I love your pic!
 
Spiner never played Data as completely emotionless, and the writers certainly didn't always portray him that way.

The appearance of emotion is quite different from actually feeling an emotion. Anyone who has been through a nasty breakup can attest to this.
 
I find it interesting how many people think Data had emotions, when it was repeatedly stated on screen, by Data himself and by other characters, that he had no emotions and no ego, and that he didn't understand the concept of humor. He specifically needed a chip to experience emotions and understand jokes.


What I find funny is that the EMH has emotions, a sense of humor and an ego from the start.
 
The EMH had simulated emotions, like any holodeck character would; they weren't real. His matrix was, however, adaptive and complex enough to develop effectively 'real' emotions over time. It seems starship mainframes have no real difficulty generating sentient computer programs (see also: Moriarty), which is something I'm sure Federation science teams were eager to examine upon his return home.

What was so special about positronic brains again...?
 
EMH had simulated emotions, like any holodeck character would.

That's not what I've seen. And anyway, it makes absolutely no sense to intentionally give an emergency hologram emotions, even simulated ones. It even went so far that he showed signs of insecurity (and not the "err... cannot comply... unknown question" kind of real computer programs, but human insecurity) when he was confronted with an unusual situation. And the first EMH activated on the Enterprise-E was AFRAID of the Borg, and it didn't follow orders immediately because of its ego ("I'm a doctor, not a doorstop"). That's absolutely not what you want there.

What was so special about positronic brains again...?
That's another problem. Moriarty, the EMH, all of those made Data look pretty unspecial and less sophisticated.
 
I don't think he had emotions during the series. But I believe he had the potential to grow beyond his programming and develop his own emotions.

This is one area where I think his character was mishandled, beginning in Generations. He took the easy way out to gain emotions. A better story would have been how he handled the loss of that avenue when the E-D was destroyed. He could have recognized that his interpretation of fear kept him from taking the risk, followed by sadness when he realizes he will never have the opportunity to experience the emotions his father created for him, and finally joy that Spot survived. But his level of emotion would be real, while still distinct from that of humans.
 
What was so special about positronic brains again...?
That's another problem. Moriarty, the EMH, all of those made Data look pretty unspecial and less sophisticated.

With the exception of the EMH's mobile emitter, the thing that is special about Data is his size and mobility: his intelligence doesn't need a massive computer core to run, nor is it confined to a holodeck.
 
Yes, Lore felt Soong owed him something, & since the chip was all there was, he took it. Plus, apart from the more brilliant ways he would likely find to benefit from the chip, its best use to him was that it was designed for Data, & in being so, it was a perfect tool to manipulate him.

The added emotional intensity must have had an effect. He became hellbent on galactic domination, & knew the only real threat to that goal was Data. If it hadn't been for Picard & Geordi triggering his ethical programming, Data might never have gotten free, & with Data under his control, Lore probably could have conquered the galaxy, especially by controlling the Borg. What a crazy thought :borg:
Except the Loreborg were extremely limited in their resources, especially after Crusher blew up their supership. Even if Data and Lore had killed everybody on the planet and crushed Hugh's rebellion, Nechayev would've just shown up and nuked the site from orbit. :p
 
That's not what I've seen

Look, have you played Half-Life 2? There's a character you interact with who seems to show concern for you, gets upset when you get hurt, has affection for you, etc. It's ludicrous to think that because this projection appears to have emotions that it is actually sentient and actually has emotions. It's just a program. The Doctor is no more or less emotional than this computer game character- he is just a better simulation (up until the 6th or 7th season, I'd say). Data may or may not have a soul, it's not established, but he doesn't have emotion prior to the emotion chip. He might be very good at imitating it (as many sociopaths can emulate emotion while not experiencing it), but he doesn't actually feel them, because he can't.
 
That's not what I've seen

Look, have you played Half-Life 2? There's a character you interact with who seems to show concern for you, gets upset when you get hurt, has affection for you, etc. It's ludicrous to think that because this projection appears to have emotions that it is actually sentient and actually has emotions. It's just a program. The Doctor is no more or less emotional than this computer game character- he is just a better simulation (up until the 6th or 7th season, I'd say). Data may or may not have a soul, it's not established, but he doesn't have emotion prior to the emotion chip. He might be very good at imitating it (as many sociopaths can emulate emotion while not experiencing it), but he doesn't actually feel them, because he can't.

You, me and everybody are also just computer programs. Very complex ones, but there is no difference.

When was it established that any one of us has a soul? I think I didn't get that memo.
 
You, me and everybody are also just computer programs. Very complex ones, but there is no difference.

There is a huge difference between human beings, who do experience emotions, and programs that are designed to simulate emotions for the benefit of their human creators. It's facile to think that because the 'Milo' computer-generated character is smiling, or frowning, that it is genuinely experiencing emotion, or even that there is any 'it' to experience emotion at all. It's tantamount to saying that I am guilty of murder because I've played Grand Theft Auto, or that a photocopy of the Mona Lisa is sort-of happy. It's the reasoning of a child.

When was it established that any one of us has a soul? I think I didn't get that memo.

I was referencing 'The Measure of a Man': "Does Commander Data have a soul? I don't know that he does. I don't know that I do."
 
You, me and everybody are also just computer programs. Very complex ones, but there is no difference.

There is a huge difference between human beings, who do experience emotions, and programs that are designed to simulate emotions for the benefit of their human creators.

But you cannot prove that, can you? How do you experience emotions? What are emotions? Are they some divine thing coming out of nowhere, just for us special humans? I say they aren't. They are the result of complex software running on the hardware that is the brain and body. You process input (visual, for example, when you see an animal) against data stored in a database (your memories, for example whether that animal is harmless or dangerous) and create new output (such as emotions, like joy or fear). Same thing a machine does. The difference is the complexity.

Your computer game analogy is wrong, because that program is so primitive. I'm talking about complex, intelligent software (which doesn't exist yet, of course, but will some day).
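The input → memory → output pipeline described above can be sketched as a toy program. Everything here (the `MEMORY` table, the appraisal labels, the response rules) is invented purely for illustration; real emotion is vastly more complex than a dictionary lookup, which is exactly the poster's point about complexity being the only difference.

```python
# Toy sketch of the stimulus -> stored memory -> emotional response pipeline.
# All names and rules are hypothetical, chosen only to illustrate the analogy.

MEMORY = {
    "tiger": "dangerous",
    "kitten": "harmless",
}

def react(stimulus: str) -> str:
    """Appraise a perceived stimulus against stored memories, emit a response."""
    appraisal = MEMORY.get(stimulus, "unknown")
    if appraisal == "dangerous":
        return "fear"
    if appraisal == "harmless":
        return "joy"
    return "curiosity"  # novel input: nothing stored about it yet

print(react("tiger"))    # fear
print(react("kitten"))   # joy
print(react("tribble"))  # curiosity
```

On this view, scaling up the memory table and the appraisal logic by many orders of magnitude is the difference between the sketch and a brain, not a difference in kind.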
 
There is a huge difference between human beings, who do experience emotions, and programs that are designed to simulate emotions for the benefit of their human creators.

The only difference between the two is biological design vs. technological design.

Our emotions, in fact everything we respond to in our environment, are just a very complex set of parameters based around electrochemical stimulus and response, geared, programmed and trained over a number of years through our brains, either through education and exposure to our environments, or through pre-existing genetic makeup (instinct).

For example, depending on how you were raised and educated, something that would give you an emotional response of anger or happiness may not be the exact same response someone else may have.

In the West, we're more used to sweet and sugary foods, which we enjoy and/or take pleasure in..... however, in places like Asia, they're more geared towards salty or spicy foods, many of which we may find too hot or too spicy (curry, for example) ~ yet those same people may find our foods too sweet for their liking.

As a more detailed example, my wife's parents, when they came to Canada from Australia, found it difficult to eat our breads due to the amount of sugar in the ingredients..... I certainly don't notice anything sweet about our bread, but they could tell there was more sugar.

Just because we're "programmed" differently, that doesn't mean our programming is more valid than another's..... including a technological being's programming.

It's facile to think that because the 'Milo' computer-generated character is smiling, or frowning, that it is genuinely experiencing emotion, or even that there is any 'it' to experience emotion at all. It's tantamount to saying that I am guilty of murder because I've played Grand Theft Auto, or that a photocopy of the Mona Lisa is sort-of happy. It's the reasoning of a child.
IMO, the fault lies in your reasoning, where you try to compare the emotions of Data, who has artificial intelligence, to a scripted video game character that has no AI to begin with and is not programmed to expand its knowledge beyond its programming.... which is the basis that determines sentience.

That's the whole reason so many people are hung up on AI, and why it's so controversial in some people's eyes: it takes the machine to the next level..... which is why I find your above example faulty.
 
What JarodRussell and Praxius are saying is very true, and I was going to say something along the same lines when reading the Data/computer game analogy........ you can't compare a 24th century android to a 21st century computer game! It's like saying a caveman has the same intelligence as us........

I mean, in a way it's like technological evolution. If biological creatures such as ourselves were able to go from monkeys swinging in trees to humans using tools and then machinery, developing emotionally along the way, then why can't an artificial lifeform do the same?
 
.....you can't compare a 24th century android to a 21st century computer game! It's like saying a caveman has the same intelligence as us....

I wasn't comparing their interior workings; I was comparing what we, as humans, perceive on the exterior, and whether that can be used as a guide in determining whether something "really" has emotions or not. By the logic you are describing, if I could create a simulacrum that could inerrantly convince you it was feeling emotions, it would, therefore, have to be feeling emotions. I think this is a fallacy, because it is dependent on what YOU think. Humans are famous for anthropomorphizing, for projecting human emotions onto things that do not have emotions. Data says he does not have emotions. Just because we read what appear to us to be emotions in his actions or his facial expressions does not add new data to the equation, because it's faulty data: those are actions specifically designed to make us think he has emotions, and given our human tendency to project our emotions everywhere, it is doubly suspect information. Data is a machine. When he says he feels no emotion, why should we disbelieve him?
 
I wasn't comparing their interior workings, I was comparing what we, as humans, perceive on the exterior, and whether that can be used as a guide in determining whether something "really" has emotions or not.

And that's where the flaw in your reasoning is. You ignore that human emotions are just the result of interior processes that perhaps could be perfectly replicated by sophisticated machines.


Again, where do you think human emotions come from? How are emotions created? What process creates emotions?
 
I wasn't comparing their interior workings, I was comparing what we, as humans, perceive on the exterior, and whether that can be used as a guide in determining whether something "really" has emotions or not.

And that's where the flaw in your reasoning is. You ignore that human emotions are just the result of interior processes that perhaps could be perfectly replicated by sophisticated machines.


Again, where do you think human emotions come from? How are emotions created? What process creates emotions?

Exactly.... no human can fully explain where emotions come from or why we have them, beyond the generic explanation that they are the end result of electrochemical responses triggered in our brains based on years of conditioning (programming).

It's similar to how some people have a very thin line between pain and pleasure, while others have a great divide in their perception of the two..... while our pain/pleasure reactions are generally based around genetic makeup and natural instinct, in order to protect and keep ourselves alive, others have conditioned themselves to sense pleasure in things most would consider painful...... and a select few humans were born with certain "wires" crossed, so that they never had to condition themselves; they have perceived those senses that way all their lives.

Everything our brain interprets in our existence, be it exterior responses or responses from our own bodies (injury, illness, organ failure, etc.), comes down to our brains registering electrochemical responses to things we grew, over the years, to understand.... We are just a different form of machine, one that has had centuries to evolve and develop what we take for granted today.

Data grew fond of, and accustomed to, the presence of certain people in his life who later died..... but just because his reasons for this might be different from some humans' reasons, that doesn't disqualify those reasons as being of lesser value.... in fact, the way he explains why he grew accustomed to those people matches my own reasons almost exactly..... does that make me a machine with no emotions?

Data's situation with his emotions could also be compared to someone born and raised in a society that deems their race inferior, a sub-race...... if you are born and raised in a society like that all your life, chances are you're going to believe what you're told about yourself and indeed believe you are inferior....... Data, all his life, was continually told he was a machine, and therefore in some respects inferior to humans..... humans have a hard time accepting that a machine could possibly understand and have emotions, or even a soul or consciousness...... so after a period of time, he'd begin to believe it and interpret many of his personal reasons for thinking a certain way as static programming.... but that's all we have going for us too.... it's just that some people want to dub themselves more special than other living things in this world.

Many of Data's misunderstandings or views about emotions were very similar to my own misunderstandings and views about emotions while I was growing up.... hell, I still have some of these views about emotions I just don't get..... because I don't have the same emotional reaction as some others.... in fact, in some cases I have no emotional reaction at all to things many others would.

^ But you couldn't call me a machine, because I have similar genetic makeup and design to any other human in this forum..... I just view things differently.... just like Data views things differently.

And in regards to the emotion chip..... I sum that thing up as an over-glorified hormone chip, more than actual emotions.

If you increase or decrease the hormone levels in your body for whatever reason, that will normally increase or reduce how strongly you react to something..... Data's emotion chip would be more like an increase of hormones in a body that had very few..... directly increasing the intensity with which he'd react to a situation, which in turn would give him a better grasp of what humans experience.

Just a theory of course.
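The hormone-chip theory above amounts to a gain control: the chip doesn't create new feelings, it amplifies faint responses Data already produces until they cross the threshold of being "felt". A toy sketch, with all numbers and names invented for illustration:

```python
# Hypothetical gain model of the "emotion chip as hormone chip" theory:
# a faint pre-existing response is multiplied by a hormonal gain factor.
# The threshold and gain values below are made up for the sake of the sketch.

AWARENESS_THRESHOLD = 1.0  # hypothetical level at which a response is "felt"

def reaction_intensity(base_signal: float, hormone_gain: float) -> float:
    """Scale a faint pre-existing response by a gain factor."""
    return base_signal * hormone_gain

# Without the chip: the same stimulus stays below the threshold, unnoticed.
without_chip = reaction_intensity(0.2, hormone_gain=1.0)   # 0.2
# With the chip: the identical base response is amplified into awareness.
with_chip = reaction_intensity(0.2, hormone_gain=10.0)     # 2.0

print(without_chip < AWARENESS_THRESHOLD < with_chip)  # True
```

On this reading, the chip in Generations revealed emotions that were already there at low intensity, rather than installing them from scratch.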
 
aarghhh, call me stupid, but this has gotten really confusing now!!! However, I think I get the gist of what's being said, and I agree. Emotions are impossible to define, as in what they are or where they come from. Oh, and yeah, the emotion chip was like a hormone or something...... making him go crazy!
 