• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Vic Fontaine's Sentience

I would certainly hope he'd be MORE qualified for sentience than the EMH - at least in terms of such basic qualities as concern for his fellow crewmates. The EMH was perhaps one of the most self-centered main characters in all of Trek, on numerous occasions putting his own personal goals and agendas before those of Voyager. I mean, the guy committed TREASON, 'fer cryin' out loud. And almost abandoned the Voyager crew completely on a few occasions. Not to mention the rumpus he caused during his self-serving quest to be a novelist. :rolleyes:

Personally, I think they shoulda hit CTRL-ALT-DEL on the EMH around season 4. :p

Vic Fontaine was a much nicer guy.
 
Very valid points, PKTrekGirl. It was rather funny that the Doc lacked a bedside manner, and one might rationalize that he lacked one because he was only designed for use in triage scenarios, but one would have thought he would have modified his subroutines to become a little less... irksome. At least, someone should have brought it up.
 
You would think so, but it was pretty well suggested that the EMH was not only designed with Zimmerman's appearance, but pretty much his personality (or lack thereof) too.
 
You would think so, but it was pretty well suggested that the EMH was not only designed with Zimmerman's appearance, but pretty much his personality (or lack thereof) too.

Can't you let me have my retcons? :( :p

One might argue that Zimmerman didn't see the initial need to do anything other than copy his own personality, cantankerous that it was, because of the EMH's designed limited role.

So there. ;)
 
As advanced as the people in the 24th century are, I think they still have a very poor understanding of what sentience even is.

One thing's for sure: the Doctor and Vic Fontaine were very vulnerable compared to Data. Data had his own dedicated hardware, but the holograms were programs on generalized storage devices. They could have been deleted as casually as one could delete Windows XP.

If Vic is sentient, that little "jack in the box" Felix inserted in "Badda Bing Badda Bang" was incredibly unethical. It argues against Vic being sentient by design.
 
You know, that's a good potential argument against their sentience: they aren't autonomous, they aren't 'physically' there without the holoprojectors, and when they're turned off they don't 'exist.'
 
He could have been... but I am glad they didn't delve into that subject with DS9.

We already saw that theme play out with Data, the Nanites, Moriarty, and even the Enterprise itself in TNG.

And then Voyager sort of rehashed the argument with the EMH and Seven of Nine, to some extent.

In fact, I would have preferred if DS9 had left out holodeck stories entirely.

It was cool when it was first introduced in TNG and they had some interesting stories around that technology but by the time TNG ended the holodeck stories were very played out.

Voyager, especially, should have avoided holodeck stories since it took away from the premise of crew struggling to survive on the basic necessities, and it often seemed to be a crutch.

Y'know... instead of telling stories about Voyager's actual dilemmas in the DQ they could just play in the Holodeck all day.
 
I like what they did with Vic on DS9, and Sandrine's and the tropical resort on Voyager. I could do without all the crisis-in-the-holodeck episodes, though. If it was that freaking dangerous, it wouldn't be used for recreation.
 
Very valid points, PKTrekGirl. It was rather funny that the Doc lacked a bedside manner, and one might rationalize that he lacked one because he was only designed for use in triage scenarios, but one would have thought he would have modified his subroutines to become a little less... irksome. At least, someone should have brought it up.

I am not sure. If you are able to modify the parameters of an artificial being's personality at will, then I'd say that being doesn't have sentience. It is merely a piece of software that is self-aware, but it's still your piece of software, to be changed and modified at your convenience.

The only way I would consider an artificial being to be sentient is if we couldn't change its personality post-creation outside of its own personal experiences. You cannot re-program it, or you end up with merely another being. If Data were tampered with to be friendlier, he wouldn't be Data anymore.

It is a fine but definite line to draw. A program whose behavior can be changed easily will never grow into an actual sentient artificial being. If you could change at will your favourite food, or what kind of girls you find attractive, how could you define yourself? The moment you change these parameters, you become somebody else who isn't you. And that somebody else might have a different idea about what he wants to like or dislike.

What if somebody else had control over your parameters?
 
What if somebody else had control over your parameters?

That's far too fine a line for me. Humans are ultimately always changing as well, and they change in response to external stimuli. "Reprogramming" is an important characteristic of our existence; if we lacked the power to change, we'd IMHO be more like machines, not less.

...Not to try and claim that being a machine is incompatible with being sentient, of course. The difference between a machine and a human is vague as well, unless one goes very specific and only accepts certain types of machine as the comparison point.

Timo Saloniemi
 
What if somebody else had control over your parameters?

That's far too fine a line for me. Humans are ultimately always changing as well, and they change in response to external stimuli. "Reprogramming" is an important characteristic of our existence; if we lacked the power to change, we'd IMHO be more like machines, not less.

But there is a difference between any biological life form's capacity to adapt and evolve over time and what I was talking about. The former is a reaction to stimuli in your environment, and you sometimes have to go through a process to genuinely earn those changes. Provided an artificial life form like Data or the Doctor were not tampered with, I would agree that they too could evolve, learn, and change their view of the world over time.

But the latter, "reprogramming", means either a direct external influence over what you think and want, or a blunt, wholesale means of changing the way you are. You could genuinely turn yourself into a rampaging murderer, a gagh-loving effeminate man, or the most effective lover in the world just by setting those parameters, and all of these sentiments would be genuine, since you changed yourself. The previous personality would cease to exist.

Now, I don't think somebody whose character is open to that kind of influence could be considered "sentient". There isn't any room for proper evolution, since any problem the person might encounter will simply be solved by blunt self-engineering.

Most of the artificial life forms I consider sentient in science fiction are the ones that aren't tampered with: Andrew, Giskard, Daneel, Wall-E, Data, etc. We see them as true sentient beings.

When you look at GLaDOS, however, she seems to change her parameters about what she wants out of you on a quasi-permanent basis. She can easily switch personality cores, and thus ends up being non-sentient in my mind.

(I need more examples of self-programming AIs, or AIs that are regularly tampered with.)
 
Wasn't there some guy in one of the Mirror eps (I'm thinking 'The Emperor's New Cloak') that looked like holo-Vic but was a real person?
I've never seen this episode, but I've heard that if you look closely, sparks shoot out of him when he's shot, implying mirror-Vic is an android instead of a hologram.
 
Very valid points, PKTrekGirl. It was rather funny that the Doc lacked a bedside manner, and one might rationalize that he lacked one because he was only designed for use in triage scenarios, but one would have thought he would have modified his subroutines to become a little less... irksome. At least, someone should have brought it up.

I am not sure. If you are able to modify the parameters of an artificial being's personality at will, then I'd say that being doesn't have sentience. It is merely a piece of software that is self-aware, but it's still your piece of software, to be changed and modified at your convenience.

The only way I would consider an artificial being to be sentient is if we couldn't change its personality post-creation outside of its own personal experiences. You cannot re-program it, or you end up with merely another being. If Data were tampered with to be friendlier, he wouldn't be Data anymore.

It is a fine but definite line to draw. A program whose behavior can be changed easily will never grow into an actual sentient artificial being. If you could change at will your favourite food, or what kind of girls you find attractive, how could you define yourself? The moment you change these parameters, you become somebody else who isn't you. And that somebody else might have a different idea about what he wants to like or dislike.

What if somebody else had control over your parameters?
The human brain can be modified at will, though, if you can just get through someone's head and gain access to it. The brain is basically a biological computer. It can be modified by chemical means. LSD rewires the brain, crisscrossing signals, which is why sights can have smells, sounds have sights, etc. Drugs can modify behavior and emotions. Every time someone takes an antidepressant or is shot up with a "truth serum", the properties of the brain are being altered, if only temporarily. Electroshock therapy uses an electrical shock to change how the brain works. Physical surgery can change someone's personality entirely in the case of a lobotomy. Scientists are even developing ways to interface the brain with machines- how will we be able to reprogram the brain when we fully develop that?

By virtue of the human brain being basically a computer, albeit a complex one, we are just as alterable as an android or hologram. The difference is that it's a hell of a lot easier to do it to a hologram, because we know how to get at the code, and the code is of our own invention, not nature's, so we understand it.
 
The human brain can be modified at will, though, if you can just get through someone's head and gain access to it. The brain is basically a biological computer. It can be modified by chemical means. LSD rewires the brain, crisscrossing signals, which is why sights can have smells, sounds have sights, etc. Drugs can modify behavior and emotions. Every time someone takes an antidepressant or is shot up with a "truth serum", the properties of the brain are being altered, if only temporarily. Electroshock therapy uses an electrical shock to change how the brain works. Physical surgery can change someone's personality entirely in the case of a lobotomy. Scientists are even developing ways to interface the brain with machines- how will we be able to reprogram the brain when we fully develop that?

By virtue of the human brain being basically a computer, albeit a complex one, we are just as alterable as an android or hologram. The difference is that it's a hell of a lot easier to do it to a hologram, because we know how to get at the code, and the code is of our own invention, not nature's, so we understand it.

That is the main conflict in my theory, isn't it? Well, about the current means of changing one's way of being, I have to say I doubt that we have the means to change someone permanently. LSD, serums, etc. have merely temporary effects (except for addiction syndromes, of course), and even permanent scarring like a lobotomy is uncontrolled.

For the moment, we cannot effectively control how a person thinks and reacts. And, when you think about it, the moment we are able to permanently change how people think, what they like, and what they want, we will be losing free will over our lives. Oh, damn it. Does free will equal sentience? Does sentience equal free will?

What is the relation between those two...?
 
Wasn't there some guy in one of the Mirror eps (I'm thinking 'The Emperor's New Cloak') that looked like holo-Vic but was a real person?
I've never seen this episode, but I've heard that if you look closely, sparks shoot out of him when he's shot, implying mirror-Vic is an android instead of a hologram.

Oh, that's interesting. I have only watched that episode once. It's DS9's 'Threshold' for me.

The human brain can be modified at will, though, if you can just get through someone's head and gain access to it. The brain is basically a biological computer. It can be modified by chemical means. LSD rewires the brain, crisscrossing signals, which is why sights can have smells, sounds have sights, etc. Drugs can modify behavior and emotions. Every time someone takes an antidepressant or is shot up with a "truth serum", the properties of the brain are being altered, if only temporarily. Electroshock therapy uses an electrical shock to change how the brain works. Physical surgery can change someone's personality entirely in the case of a lobotomy. Scientists are even developing ways to interface the brain with machines- how will we be able to reprogram the brain when we fully develop that?

By virtue of the human brain being basically a computer, albeit a complex one, we are just as alterable as an android or hologram. The difference is that it's a hell of a lot easier to do it to a hologram, because we know how to get at the code, and the code is of our own invention, not nature's, so we understand it.

That is the main conflict in my theory, isn't it? Well, about the current means of changing one's way of being, I have to say I doubt that we have the means to change someone permanently. LSD, serums, etc. have merely temporary effects (except for addiction syndromes, of course), and even permanent scarring like a lobotomy is uncontrolled.

For the moment, we cannot effectively control how a person thinks and reacts. And, when you think about it, the moment we are able to permanently change how people think, what they like, and what they want, we will be losing free will over our lives. Oh, damn it. Does free will equal sentience? Does sentience equal free will?

What is the relation between those two...?

I think we're fast approaching the point where genetic alteration of the mind will be a viable medical procedure, not unlike the notion of reprogramming an artificial life form. Certainly, based on Dr. Bashir, it's doable in Trek's 24th century. That's why I don't think 'reprogrammability' is a good argument against sentience. I still think VGR's EMH overstated his case for holographic rights a bit, though.
 
Please state the nature of the musical emergency!
"My first officer was the victim of a drive-by rapping!"
"Weapon?"
"An overly tricked-out car sound system with bass on maximum setting!"
"There's nothing I can do. He's dead, Captain."
 
"Sigh - it's days like this that make me wish I had 20 Borg trying to bust in the door..."
 