• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

moriarty

Well, goldfish are probably sentient too, just really, really stupid.

Thing is, the holodeck is created to simulate reality; it is not reality. We don't think that most holographic characters seen on the show are actually sentient (nor do the characters in the show think so). It would be pretty horrible if they were; Worf's practice opponents would be created just to die in agony. No, they're just video game characters: they appear real, but are not. So it could be reasonably argued that the EMH and Moriarty merely simulate sentience, i.e. appear sentient but actually are not.
 
They can do anything that regular life forms can do. And they can apparently have sex (The Naked Now) and reproduce (Blink Of An Eye).

However, even Data considers himself technically an artificial life form.
 
Data has a better case based on the fact that his unique intelligence was present from creation and evolves independently of any other influence. It is therefore extremely difficult to duplicate.

There are three problems with holographic sentience. Firstly, if you ask the computer to create a sentient life-form, is it really creating a sentient life-form, or is it merely creating a convincing facsimile of one? When you ask the computer to create a man who believes in God... does he really believe in God?

The second problem goes back to Data and his independent evolution. Can a hologram be created and evolve... without the computer?

And finally, if the computer can create true sentience simply by being asked to, then doesn't that demonstrate that the computer KNOWS what true sentience actually is, and therefore must also be sentient (how can you manufacture sentience outside of nature without knowing what sentience is because you possess it?)?
 
I put it simply like this: No matter what creature it may be, if it knows it exists, it is sentient...

Does the aforementioned goldfish know it exists? Maybe not, and if that's the case, it's not sentient.
 
I put it simply like this: No matter what creature it may be, if it knows it exists, it is sentient...
This depends on your definition of 'knowing'. Does a computer 'know' the information stored in it? But assuming that the answer is 'no', and 'knowing' denotes consciousness of knowledge, then the statement is of course correct.

Does the aforementioned goldfish know it exists? Maybe not, and if that's the case, it's not sentient.
A goldfish is unlikely to ponder its existence much, as it lacks the intellect to do so. It does, however, probably feel pain, hunger, even fear. It perceives its surroundings with its senses. As long as it has subjective experiences, it is sentient.
 
Knowing about one's existence also requires the intelligence to understand what that existence is, so intelligence is also needed for sentience.

- self awareness
- intelligence

These two things are needed; maybe something else too?
 
@JesterFace, no, not really.

Self-awareness is a terribly vaguely defined concept, and even philosophers do not really agree on what it actually means. Most often, however, it is used to mean the existence of rudimentary meta-cognition, that is, cognition about cognition (it seems you're using it in that sense here). That is not really required for sentience, though. As I said, sentience is merely the capacity to experience qualia. Think about babies: they do not really have the intellectual capacity to understand their existence, yet they have subjective experiences such as feelings and sensory experiences.
 
I put it simply like this: No matter what creature it may be, if it knows it exists, it is sentient...

But when this creature's sentience is artificially manufactured by another creature, how do you differentiate between being sentient and mimicking sentience?

Like I said before, if you program a hologram to believe in God, is it really believing in God?
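To make the distinction concrete, here's a minimal sketch (the class, names, and behaviour are purely my own illustration, not anything from the show): a character whose "belief" is nothing but a stored string in a lookup table. It will assert the belief on demand, yet there is nothing behind the assertion that could count as believing.

```python
class Hologram:
    """A toy holographic character whose 'beliefs' are just canned responses."""

    def __init__(self, beliefs):
        # beliefs is a plain lookup table; nothing here models understanding
        self.beliefs = beliefs

    def ask(self, topic):
        # Returns a stored answer; the character *appears* to hold the belief
        return self.beliefs.get(topic, "I have no opinion on that.")


# "Program a hologram to believe in God":
moriarty = Hologram({"God": "Yes, I believe in God."})
print(moriarty.ask("God"))       # the programmed assertion
print(moriarty.ask("unicorns"))  # the default reply
```

The point of the sketch is that the outward behaviour (the sentence "Yes, I believe in God.") is identical whether or not anything inside genuinely believes; the question in this thread is whether a sufficiently advanced version of the same trick ever stops being a trick.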
 
But when this creature's sentience is artificially manufactured by another creature, how do you differentiate between being sentient and mimicking sentience?

Like I said before, if you program a hologram to believe in God, is it really believing in God?

Even if an artificial creature is created by, let's say, a human, that doesn't make its sentience any less real. Aren't we all created in some fashion? Humans have evolved from other beings; our sentience was created along the way.

About that God thing... I don't know, I don't always know what to think of the whole thing in general, so I'm not the one to answer that.
 
How about if we don't use God? Let's use unicorns or green fairies. If you program the AI to believe in unicorns... does it really believe in unicorns?
 
Perhaps a more important question: does it matter whether their belief is "real" as long as they'll behave as though it is?
 
Whether an artificial creature is created by, lets say, a human, it doesn't make the sentience any less real,

If something artificial has the appearance of sentience, then that does make the sentience less real, because it wasn't naturally occurring. It therefore requires that you examine the claim rather than just accepting it; otherwise your criterion for sentience isn't actually sentience, it's merely "can you mimic sentience well enough to trick me?". If a magician successfully reads a person's mind, we don't accept that mind-reading is therefore a real thing.

aren't we all created in some fashion... Humans have evolved from other beings, our sentience is created along the way.

But ours is naturally occurring sentience (and, as yet, the only example of sentience), therefore we have nothing to compare it to and therefore (quite rightly) conclude it to be real. If we met aliens, we would probably accept their sentience pretty quickly based on what we know of our own. If, however, the sentience is artificially created, we would know that it has never occurred in nature, and consequently we would require a new set of criteria (for fear of being tricked by our own brilliant creations).
 
That's the thing: can artificial sentience be labelled real because it's basically naturally developed, if it's created by beings that are naturally created? I don't know what I'm saying anymore... what is natural, what is not...

Simply, if one knows of its own existence, it's a sentient being, no matter where that intelligence originated from. Is sentience less real depending where it came from?
 
When a loser doesn't want to work, files the paperwork for a metric ton of student loans, but isn't handy enough to learn real skills, they get a degree in philosophy. In the Star Trek future there is no work and education is free. 95 percent of humanity are expected to spend their time "playing" or fighting like hell to be part of the 5% who keep the replicators running and the borders held.

EVERYONE on Earth, and on every human colony, has a PhD in philosophy, so they're making the question of whether they're allowed to keep their slaves as slaves, even if those slaves might be people, a more difficult question to answer than it really is.
 
When a loser doesn't want to work, files the paperwork for a metric ton of student loans, but isn't handy enough to learn real skills, they get a degree in philosophy.

Because nothing of any value has ever come from trying to answer questions like "how can I tell whether something is true?" or "how can one live ethically?" or "how can one tell whether one understands an idea?"
 
Simply, if one knows of its own existence, it's a sentient being, no matter where that intelligence originated from. Is sentience less real depending where it came from?

If the putative intelligence came from charlatans who wanted to trick you then yes, it is less real. If my dog came up to you and started a conversation through a translation box about how it had become self-aware and discovered language, then it would have met your criteria for sentience.

It would not have met mine.
 
@hux, translation box or no, your dog certainly is sentient. What it is not is sapient.

@Guy Gardener, are we enslaving the characters in the Sims game?
 
Simply, if one knows of its own existence, it's a sentient being, no matter where that intelligence originated from. Is sentience less real depending where it came from?
You are correct but missing the point. People are arguing that perhaps the holograms actually do not know that they exist. That they are merely advanced chatbots, which may seem like real persons, but actually are not. They know nothing and feel nothing, any more than a raid boss in World of Warcraft does. Do you think that Worf's practice opponents in the holodeck feel pain and terror?
 
According to Wikipedia, science fiction has its own special (different) definition for the word "sentient" (because Star Trek got it wrong 30 years ago, no one corrected them, and a generation was lost), where "sentient" means sentient, sapient, self-aware, and a few other words starting with S.

Classically, sentient means being able to sense/feel shit outside of you, self-aware means that you know that you exist, and sapient means that you are capable of constructive reason. Philosophy is often about pitting those three, and a dozen other qualities, against each other to figure out if you're a person with a soul or just meat going through the motions.
 
^All very well-stated distinctions here; yet our definitions are based on the biological paradigm of:
1 intelligence : 1 discrete entity

You could conceivably nullify these labels by creating an AI that is unaware of certain aspects of its existence, by compartmentalizing its processing centers. Is it self-aware?

It could be programmed with completely erroneous sensors. It could believe it's immobile when it is not, or mobile when it isn't.

It can be out of control of its own responses, or even feedback wrong information to itself. It could be completely out of touch with itself and the world.

Or, it could outpace biological perspectives completely. It could perceive only one color - or beyond the visible spectrum into radio, for example. It could perceive individuals, their statistical likelihoods and their life histories, genealogies, the size of their jackets, boots and sunglasses, and lots more.

It could link and decouple with other like components at will (like Odo's Great Link). The intelligence could function as one entity, or several as needed - like a transforming mega-robot, physically or mentally.

Its S-ness (sapience/sentience/self-awareness) could be attenuated to almost any degree - with or without awareness or control. Can something so easily manipulated really be said to "have" these traits? It may exhibit them, but there's no single "it" there.

We have no choice but to be individuals. AI is not limited by this constraint (though our perception of them may be). Hence, it would be an error to look at a Cmdr Data or EMH, and not see that it is both much less than - and much more than - one entity, an individual. Again: hubris.

This raises more philosophical questions, like whether it would value being locked into a single entity.

BTW, philosophy may not produce value directly, but it does indirectly: as in the shaping of a free society that then transforms the whole planet. Yesterday I heard someone say "Why teach Language when you can teach Physics?" A fair question. To which was replied, "Try teaching Physics without Language." Yeah, when everything's working, you don't "need" it.... But when the stuff breaks down, you see that it is actually pretty damned relevant.

Personally I think it comes down to scope of concern (does it matter to me, my family, industry, society, history, species, etc.) - something of ourselves we may project onto others, but don't necessarily perceive accurately from other intelligences. Price of individuality, I suppose. Shifting values for shifting priorities or motives; and lots of assumptions that what matters to us now should always matter to others. You know, Human Bias. But - here's a thought - universities are not in the business of job training.
 