• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

If Data couldn't feel emotions, was he truly conscious?

The thing is that none of us can truly imagine a state of being completely unemotional, because such a state in a human being is highly pathological. An unemotional person couldn't be trusted with anything, not even with their own life. Some treatments for deep depressive states can leave people unemotional, but they have to be restrained for their own good (these states, of course, are only temporary); otherwise they could kill someone, or themselves, as a way to solve a simple problem.
 
Even though the Borg used to say he wasn't needed in their new order (per Locutus), why is the Queen so hot to do some form of inversion on the trope?

The Queen was manipulating him - the Borg still didn't consider Data necessary in terms of assimilation, but Data had locked out the Enterprise's main computers and they needed to manipulate him into unlocking them.
Emotion is neither morality nor ethics and is often the quickest way to blind oneself from either.
Empathy has nothing to do with either morality or ethics; it has only to do with emotion. If you truly can't feel emotion, you won't care (in the actual sense of the word) if you see someone die in front of your eyes. Fear is also essential to our survival. Without it, you won't do what's necessary to avoid danger.
 
Is Data programmed with empathy? He does have it. He isn't coldly logical. And he has the ability to form friendships, bonds, loyalty (beyond the chain of command), and so forth. It's too bad we didn't get to see more ethical dilemmas for him in stories, such as was done for the Doctor (as in "Latent Image").
There's a huge gap between Data and the Doctor, even though they're both a hundred percent artificial. The Doctor started as an asocial jerk, but he was always, for all intents and purposes, a human being. In fact, some actual human beings are very close to the Doctor at his worst. Data, on the other hand, is odd in pretty much everything he does; for example, he's supposed to have encyclopedic knowledge, but he can't even use an expression without changing its wording. IOW, Zimmerman could run circles around Soong when it comes to creating an A.I.!!!
 
Are creatures like the Jem'Hadar, Prophets, and Founders conscious even though they don't feel emotions like humans? The question is more complex than a simple no.
 
Did "The Matrix" steal that idea from DS9 or vice versa? (the one where they're all unconscious, undergoing testing by the Dominion)
That's easy to answer.

"The Matrix" was released in 1999

That episode aired around 1993

So, who do you think could have stolen from the other?
 
The Matrix is hardly original at all. There was a wonderful movie which I love called The Thirteenth Floor; I'm not sure if it came out before The Matrix or around the same time, but it has similar themes as well.
 
The idea of an A.I. that destroys its creator is hardly novel. In the sci-fi world it goes back to Mary Shelley's Frankenstein, a being that first outsmarts and then kills his own creator. However, in mythology it goes much further back, to Oedipus, for example, who killed his own father. In fact, the idea that someday we could create something or someone that would end up killing us has always been present in our collective psyche.
However, I am confident that when the day comes that we create a really sophisticated A.I., we will have put enough safeguards in it to prevent that from happening. Reality tends to be better than myths when those myths are bad or catastrophic; it also tends not to be as good as those myths when they are too optimistic.
 
I think we were talking about the idea of existing in a world that's not the objective world (i.e. VR), but that one goes back to ancient philosophy, so....
 
Consciousness in this respect entails self-awareness. A computer holds lots of information and can juggle that information very quickly, but it doesn't know anything, much less that it is a computer. Data is aware of himself, and of life and death as it applies to humans; in one episode he even comments on how long he expects to function before he wears out, so he is aware he has a finite existence.
 
The idea of an A.I. that destroys its creator is hardly novel. ....

Dwar Ev ceremoniously soldered the final connection with gold. The eyes of a dozen television cameras watched him and the subether bore throughout the universe a dozen pictures of what he was doing.
He straightened and nodded to Dwar Reyn, then moved to a position beside the switch that would complete the contact when he threw it. The switch that would connect, all at once, all of the monster computing machines of all the populated planets in the universe -- ninety-six billion planets -- into the supercircuit that would connect them all into one supercalculator, one cybernetics machine that would combine all the knowledge of all the galaxies.
Dwar Reyn spoke briefly to the watching and listening trillions. Then after a moment's silence he said, "Now, Dwar Ev."
Dwar Ev threw the switch. There was a mighty hum, the surge of power from ninety-six billion planets. Lights flashed and quieted along the miles-long panel.
Dwar Ev stepped back and drew a deep breath. "The honor of asking the first question is yours, Dwar Reyn."
"Thank you," said Dwar Reyn. "It shall be a question which no single cybernetics machine has been able to answer."
He turned to face the machine. "Is there a God?"
The mighty voice answered without hesitation, without the clicking of a single relay.
"There is now."
Sudden fear flashed on the face of Dwar Ev. He leaped to grab the switch.
A bolt of lightning from the cloudless sky struck him down and fused the switch shut.
(Fredric Brown, "Answer")