Spiner never played Data as completely emotionless, and the writers certainly didn't always portray him that way.
Oh and by the way, Praxius I love your pic!
The EMH had simulated emotions, like any holodeck character would.
That's another problem. Moriarty, the EMH, all of those made Data look pretty unspecial and less sophisticated. What was so special about positronic brains again...?
Except the Loreborg were extremely limited in their resources, especially after Crusher blew up their supership. Even if Data and Lore had killed everybody on the planet and crushed Hugh's rebellion, Nechayev would've just shown up and nuked the site from orbit.

Yes, Lore felt he was owed something by Soong, & since the chip was all there was, he took it. Beyond the more brilliant ways he would likely have found to benefit from the chip, its best use to him was that it was designed for Data, & being so, it was a perfect tool for manipulating him.
The added emotional intensity must have had an effect. He became hellbent on galactic domination, & knew the only real threat to that goal was Data. If it hadn't been for Picard & Geordi triggering his ethical programming, Data might never have gotten free, & with Data under his control, Lore probably could have conquered the galaxy, especially by controlling the Borg. What a crazy thought!
That's not what I've seen
Look, have you played Half-Life 2? There's a character you interact with who seems to show concern for you, gets upset when you get hurt, has affection for you, etc. It's ludicrous to think that because this projection appears to have emotions, it is actually sentient and actually has emotions. It's just a program. The Doctor is no more or less emotional than this computer game character - he is just a better simulation (up until the 6th or 7th season, I'd say). Data may or may not have a soul, it's not established, but he doesn't have emotions prior to the emotion chip. He might be very good at imitating them (as many sociopaths can emulate emotion while not experiencing it), but he doesn't actually feel them, because he can't.
You, me and everybody are also just computer programs. Very complex ones, but there is no difference.
When was it established that anyone of us has a soul? I think I didn't get that memo.
There is a huge difference between human beings, who do experience emotions, and programs that are designed to simulate emotions for the benefit of its human creators.
IMO, the fault in your reasoning lies in comparing the emotions of Data, who has artificial intelligence, to a scripted video game character that has no AI to begin with and is not programmed to expand its knowledge beyond its programming.... which is the basis that determines sentience.

It's facile to think that because the 'Milo' computer-generated character is smiling, or frowning, it is genuinely experiencing emotion, or even that there is any 'it' to experience emotion at all. It's tantamount to saying that I am guilty of murder because I've played Grand Theft Auto, or that a photocopy of the Mona Lisa is sort-of happy. It's the reasoning of a child.
.....you can't compare a 24th-century android to a 21st-century computer game! It's like saying a cave man is equal in intelligence to us....
I wasn't comparing their interior workings, I was comparing what we, as humans, perceive on the exterior, and whether that can be used as a guide in determining whether something "really" has emotions or not.
And that's where the flaw in your reasoning is. You ignore that human emotions are themselves just the result of interior processes, processes that could perhaps be perfectly replicated by a sufficiently sophisticated machine.
Again, where do you think human emotions come from? How are emotions created? What process creates emotions?