• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Morality and the Holodeck

It seems that Federation law deals with specific incidents and not generalities.

...And I can appreciate judges shying away from establishing precedent. It's not this that I object to - it's the fact that the UFP should have centuries of experience with this very thing: specific incidents and judges-shying-away-from-making-decisions. Why is it so hard every time? There should be general precedent for specificity!

If nothing else, they could whip out the manual for morally complex decisions, with the appendix on what went wrong previously. Or call the Doctor. Or something. Something they know works.

Indeed. VOY's "Flesh and Blood" contained holograms that are sentient (most of the ones that were fighting the Hirogen) and some that are clearly stated not to be (the mining holograms that Iden stole from the Nuu'Bari).

...And "The Big Goodbye" might feature an in-between variant, in that Cyrus Redblock would get just a smidgen of extra initiative and free will (perhaps because the Jarada probe damaged some sort of an inhibitor) but essentially remain a "pre-written" character. Sentience should come in degrees, really. (And for all we know, it does, and we are just blind to those aspects of existence that our own type of sentience cannot touch - while, say, Vulcans have indeed found some deeper world of logic and are more sentient than us, rather than merely smarter and snobbier.)

Timo Saloniemi
 
...And "The Big Goodbye" might feature an in-between variant, in that Cyrus Redblock would get just a smidgen of extra initiative and free will....

Free will is a completely different question from sentience. We imagine that every person is the true source of his or her thoughts and actions, and yet the facts tell us that free will is an illusion. But that aside, I take your point.
 
Trek screwed this up by bringing up the nonsense of "sentient" hologram characters. The holodeck characters should be like video game characters, and therefore there's no ethical issue, but by the time of Voyager, they made it seem like they were creating new intelligent life every time someone wrote a new program.


And don't even get me started on how absurd Holodoc is as a concept...
 
Trek screwed this up by bringing up the nonsense of "sentient" hologram characters. The holodeck characters should be like video game characters, and therefore there's no ethical issue, but by the time of Voyager, they made it seem like they were creating new intelligent life every time someone wrote a new program.


And don't even get me started on how absurd Holodoc is as a concept...

According to one theory of reality addressed in The Holographic Universe, we are all holograms. So the Doctor is hardly an absurd concept.
 
That Moriarty episode should best be forgotten about. It's stupid.

"Computer, create a character that is a match for Data." and kawoops, there's a sentient, self aware being, created in a couple of seconds. And then he also had emotions, which Data didn't have.

First it's stupid from a storytelling POV, and then it's actually impossible. A simpler computer program CANNOT create a more complex one, that has been proven. The Enterprise holodeck computer software would need to be at least as complex as Moriarty, meaning it would need to be self aware, sentient, with desires and feelings.

According to one theory of reality addressed in The Holographic Universe, we are all holograms. So the Doctor is hardly an absurd concept.

The Doctor makes no sense on several levels. First of all, his programming is too complex for the purpose. There's no way such a program would ever be allowed to act as an emergency replacement. He has an attitude, a terrible bedside manner, an ego you can hurt, and so forth. It's just ridiculous.

Then the writers seriously needed a lecture in computer science. They kept conflating the hologram (which is just a visual representation) with the program itself (which is the core model). But in fact the Doctor could appear as a simple text message on a console while his entire program remains intact.
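That separation of model from presentation is easy to sketch. Here is a minimal, purely illustrative example (every class name and string below is hypothetical, not anything from the show): the same core program drives two completely different front ends, so swapping the hologram for a console readout leaves the program untouched.

```python
class EMHModel:
    """The core program: state and logic, with no idea how it is displayed."""
    def __init__(self):
        self.greeting = "Please state the nature of the medical emergency."

    def diagnose(self, symptom):
        # Stand-in for the actual medical logic.
        return f"Diagnosis for '{symptom}': treatment pending."


class HologramView:
    """Full visual projection: one possible front end."""
    def render(self, text):
        return f"[holographic avatar says] {text}"


class ConsoleView:
    """Plain text on a sickbay console: same program, different front end."""
    def render(self, text):
        return f"SICKBAY CONSOLE> {text}"


doctor = EMHModel()
for view in (HologramView(), ConsoleView()):
    # Identical model output, rendered two different ways.
    print(view.render(doctor.diagnose("plasma burn")))
```

Delete either view class and the model still works; that is the point the post is making about the Doctor surviving without his projection.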
 
Who says the computers in Trek aren't sentient? Maybe they don't display human-like sentience and don't act like they're human, but they might just be thinking deeper thoughts than humans and don't care to interact with or emulate their human "masters" the way Data does.
 
A simpler computer program CANNOT create a more complex one, that has been proven.
Naah. Dilettantes with zero understanding of the true potential or nature of "computing" (or whatever it will be called in the future) think that some equations that happen to be self-consistent apply to a specific situation when in all likelihood they don't apply to anything at all. Theologians have been doing that for ages; now it's the mathematicians' turn, but their angels don't dance any prettier on the pin, even if they get all the steps right to the infinitenth decimal.

You only need to look out of the window to see complexity emerge from simplicity. Information theory has nothing to do with it, and generally fails before the start.
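For a concrete toy version of that point: elementary cellular automata show a trivially simple rule generating intricate behaviour (Rule 110 is even known to be Turing-complete). A minimal sketch in Python, nothing Trek-specific about it:

```python
# Rule 110: each cell's next state depends only on itself and its two
# neighbours, via a fixed 8-entry lookup table -- about as simple as a
# program gets, yet the patterns it produces are Turing-complete.
RULE = 110
TABLE = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

def step(cells):
    """Apply one Rule 110 update, treating cells beyond the edge as 0."""
    padded = [0] + cells + [0]
    return [TABLE[(padded[i - 1], padded[i], padded[i + 1])]
            for i in range(1, len(padded) - 1)]

# Start from a single live cell and watch structure appear.
row = [0] * 40 + [1] + [0] * 40
for _ in range(20):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

The generating rule fits in one line; the output it produces does not reduce back to anything that simple, which is the "complexity from simplicity" the post is gesturing at.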

As for the idea of "feelings" being a complex thing that is difficult to achieve... That's awfully sentimental and all that, but not very likely. Feelings, emotions and other primitive reactions are probably the most trivial part of any AI routine, as they are just well-learned, carefully orchestrated make-believe - at least in us humans, the current benchmark for AIs. "Ethics" is even simpler - any silly program can develop an arbitrary set of rules and see if it helps with survival, then modify them as needed. What's interesting about AI is not sentience, i.e. sentimentality. What's interesting is sapience, i.e. intelligence.

Timo Saloniemi
 
Um, how is a holodoc absurd as a concept? I'm curious.


the EMH CONCEPT was not absurd, but the idea of the way Holodoc developed on Voyager was absurd. The writers seemed to think that just leaving a somewhat sophisticated computer program running and interacting with the crew for a while would VOILA! give you a self-aware lifeform that could respond and interact just the way a Human could.:lol:
 
Um, how is a holodoc absurd as a concept? I'm curious.


the EMH CONCEPT was not absurd, but the idea of the way Holodoc developed on Voyager was absurd. The writers seemed to think that just leaving a somewhat sophisticated computer program running and interacting with the crew for a while would VOILA! give you a self-aware lifeform that could respond and interact just the way a Human could.:lol:

It is just that sort of sentientism that prompted the Doctor to write "Photons Be Free."
 
The Doctor, for example, is basically a program running on Voyager's computer. Is Voyager's computer sentient? If not, then how can one of its subroutines be sentient?
How would this follow? I live in a city, and believing that the city is sentient is not a requirement for believing that I am. I live in a body, and there's no requirement for believing my thighbone is sentient. And so forth.

Yeaaaahhh, with respect, that's a pretty ridiculous comparison. You're not a program being run by the city.

Also, my computer can run a combat AI that outwits me in a strategy game. That doesn't mean my word processor would be capable of fighting me, or that the computer at large would have AI.

Hmmm. Okay, I'll have to mull that one over.
 
The Doctor, for example, is basically a program running on Voyager's computer. Is Voyager's computer sentient? If not, then how can one of its subroutines be sentient?
How would this follow? I live in a city, and believing that the city is sentient is not a requirement for believing that I am. I live in a body, and there's no requirement for believing my thighbone is sentient. And so forth.

Yeaaaahhh, with respect, that's a pretty ridiculous comparison. You're not a program being run by the city.

No, he's an intelligent being that exists in the infrastructure of the city, just as the Doctor is an intelligent being that exists in the infrastructure of the computer. That's the analogy as I understand it.
 
Yup, plus who and what I am as a person is largely dictated by the social environment, which is the city, but the city doesn't need to be a mini-me, a maxi-me or a God in order to yield that outcome. It can be something else altogether.

The writers seemed to think that just leaving a somewhat sophisticated computer program running and interacting with the crew for a while would VOILA! give you a self-aware lifeform that could respond and interact just the way a Human could

The thing is, though, the EMH was designed to do just that. It's a highly interactive program that performs its designed and designated work through human interaction, i.e. by speaking with patients and understanding their problems. It is also supposed to learn by doing. It's only natural that it develop an interest in opera when it needs to understand why the patient is ranting about Mozart from the next cabin over rattling his nerves, and why two placebos and a reassuring consultation in the morning are the proper cure.

Timo Saloniemi
 
It's an interesting issue.

What would the morality be of creating a self-aware holodeck program whose greatest pleasure in existence is to please the users of the holodeck? So they would be as normal as you or I, but if a 'real' person asked them to, they'd happily hold a naked flame to their skin. In fact, being denied the opportunity to do so once asked would cause them severe stress, as would never being asked to do anything.

Is it possible to wrong such a being?
 
And that's not even a particularly exotic setup. Children on Earth are raised to a variety of standards which require them to hold certain beliefs - respect your parents, share and help, don't masturbate, certainly don't copulate, etc. Is it possible to wrong a person who has been taught to turn the other cheek by beating him up real good? Is it possible to wrong a person who has been taught that paradise awaits beyond this mortal coil by murdering him? Those are acts of mercy, even if they feel a bit twisted, because the setup is so twisted.

Timo Saloniemi
 
Yup, plus who and what I am as a person is largely dictated by the social environment, which is the city, but the city doesn't need to be a mini-me, a maxi-me or a God in order to yield that outcome. It can be something else altogether.

But you are not created by the city, nor dependent upon the city from instant to instant for your existence. You may exist outside the city, move freely to another location of your own free will, go for hours or days without the infrastructure of the city...
 
In one Voyager episode, there was a Captain Proton holosim linked to a universe of light-only beings - so a hologram migration to that realm, with modifications to the programs, might allow freedom of movement...
 
But you are not created by the city, nor dependent upon the city from instant to instant for your existence. You may exist outside the city, move freely to another location of your own free will, go for hours or days without the infrastructure of the city...
All of which is pretty much consistent with the EMH case.

The computer is just a very loose framework inside which various things such as EMH programs can exist. The vast majority of the computer is dedicated to programs as radically different from the EMH as I am from a taxi cab or a newspaper stand. And the EMH can hop inside a detached-operations module if need be, or spread his existence to the corridors, although the clumsy Starfleet hardware for this is not installed as default and needs help from Torres (say, "Living Witness" and "The Killing Game"). Thankfully, future technology comes to the rescue.

Thinking that the Doctor is some sort of a miniature version of the big Computer is fundamentally misguided, as both entities are vastly more flexible than that.

Timo Saloniemi
 
If you can prove they are sentient, it is immoral.

But if we accept that one holodeck character (Professor Moriarty) is sentient, doesn't the burden of proof now shift to those who maintain that holodeck characters are not sentient? What if we conclude that they are not sentient and we are wrong? Then the character who worried about what would happen to him and his family when the program ended had good reason to worry.

The computer programs that are presented to us as sentient are generally the ones that are self-aware and capable of rewiring themselves.

The argument for machines possibly being sentient lies in the comparison: humans are biological machines, and yet we are sentient, so why can't another type of machine be? For that qualification to be met, their programming must have the necessary complexity and the ability to dynamically adjust itself to novel situations. There's a better argument for Data, the Doc or the Hirogen holograms than there is for Moriarty. But that qualification certainly doesn't apply to just any random hologram.

It's an interesting issue.

What would the morality be of creating a self-aware holodeck program whose greatest pleasure in existence is to please the users of the holodeck? So they would be as normal as you or I, but if a 'real' person asked them to, they'd happily hold a naked flame to their skin. In fact, being denied the opportunity to do so once asked would cause them severe stress, as would never being asked to do anything.

Is it possible to wrong such a being?

If it was self aware it would be capable of changing its goals and creating its own desires. If this were the case, perhaps the program would ask to stop pleasing people by hurting itself, and if it was denied that right, then it would be wronged.
 
Um, how is a holodoc absurd as a concept? I'm curious.


the EMH CONCEPT was not absurd, but the idea of the way Holodoc developed on Voyager was absurd. The writers seemed to think that just leaving a somewhat sophisticated computer program running and interacting with the crew for a while would VOILA! give you a self-aware lifeform that could respond and interact just the way a Human could.:lol:

It is just that sort of sentientism that prompted the Doctor to write "Photons Be Free."


yeah, the entire "hologram rights" subplot of late Voyager was nonsense, as was the Vic Fontaine nonsense on DS9.

Holographic programs are like video game characters. I don't care how complex, life-like or well-designed a video game character is, or how long it's left to run; it's still a programmed character. The only similarity between that and a Human "programmed" by genetics or environment is metaphorical, not substantive.
 