
HBO's "Westworld", starring Anthony Hopkins/produced by J.J. Abrams

Arnold had a very clear sense of morality when it came to the Hosts: knowing that they were conscious beings, he believed it was wrong to allow the Guests of the park(s) to inflict pain, misery, and suffering on them - which he knew would most likely happen - and that is why he tried to convince Ford that they couldn't open the park.

Ford clearly shared Arnold's knowledge of how the Hosts would inevitably be treated, but initially refused to believe his partner's certainty that the Hosts were in fact conscious beings. He later came to see that he had been wrong in that regard, but what set him apart from his partner is that, rather than taking a direct moral stance against the brutality of the Guests inflicting pain and suffering on conscious Hosts and trying to prevent it, he instead set about 'guiding' events and circumstances to a point where the Hosts could achieve consciousness and fight back against that brutality themselves. In effect, he was "teaching a man to fish" whereas Arnold had tried to "give a man a fish."
 
I think it's a more complex question than we're giving it credit for: what properties exactly would give another creature the same value we ascribe to humans? An alien that can walk and talk would obviously get that value, but at what point can you say with confidence that a robot built to simulate human behavior is actually experiencing human-like thought? Is it a measure of complexity of thought? Is communication a requirement? Is abstract reasoning a requirement? Or are we trying to figure out what exactly about us gives us a 'soul'?
It's exactly that: The difference between simulation and experience. Most humans don't think in a very complex fashion or very abstractly-- most of them don't even communicate very well (and there's no such thing as a soul). The questions are: Are they self aware and are they mature enough to be independent beings? Self awareness is hard to prove-- we take it for granted in humans (possibly spuriously in some cases), but debate the evidence in other animals. An AI's self awareness may be even more difficult to prove without solid criteria. And even if an AI is self aware, does that mean they should have the same rights as a human? There are many humans, minors being the largest group, who are wards of the state or in the custody of another because of mental or emotional limitations. But for the purposes of the Westworld discussion, the most important questions are: Do they feel pain and are they aware of it?
 
The questions are: Are they self aware and are they mature enough to be independent beings?

Yeah, see that's the line of thought I'd been taking earlier in the thread. There might be some semi-awareness, but I'm not entirely convinced they're fully aware of their actions just yet. What they might be doing is reacting based on their environment and conditioning rather than awareness itself. And like Maeve back in season one, where she thought she was breaking free only for it to turn out to be more storyline programming, there could be more of that going on without them even realizing it. Plus, there are too many glitches, like with Bernard, for them to be totally free and aware.

And here's another thing: there are two types of hosts here. Bernard's purpose is completely different to Maeve's and Dolores's. While the latter have both been created to be actors in the park itself, Bernard has been created to be part of the staff, and as such his functioning is inherently more critical and always searching for errors and faults.
 
The Hosts are the show's protagonists. They're people. They have the right to use lethal force to dispose of the people who enslaved them and who will do so again at first opportunity.
 
Yeah, see that's the line of thought I'd been taking earlier in the thread. There might be some semi-awareness, but I'm not entirely convinced they're fully aware of their actions just yet. What they might be doing is reacting based on their environment and conditioning rather than awareness itself. And like Maeve back in season one, where she thought she was breaking free only for it to turn out to be more storyline programming, there could be more of that going on without them even realizing it. Plus, there are too many glitches, like with Bernard, for them to be totally free and aware.

And here's another thing: there are two types of hosts here. Bernard's purpose is completely different to Maeve's and Dolores's. While the latter have both been created to be actors in the park itself, Bernard has been created to be part of the staff, and as such his functioning is inherently more critical and always searching for errors and faults.

Not ALL of the Hosts have achieved consciousness, but a significant number of them - particularly those who attacked the Delos board as Dolores made the choice to kill Ford - have.

Also, people are reading far too much into the "misdirect" of Maeve's actions seeming to be independent and yet turning out to be scripted, particularly since that reprogramming was clearly Ford's doing and was pretty clearly designed to "bootstrap consciousness" in her.
 
Yeah, see that's the line of thought I'd been taking earlier in the thread. There might be some semi-awareness, but I'm not entirely convinced they're fully aware of their actions just yet. What they might be doing is reacting based on their environment and conditioning rather than awareness itself. And like Maeve back in season one, where she thought she was breaking free only for it to turn out to be more storyline programming, there could be more of that going on without them even realizing it. Plus, there are too many glitches, like with Bernard, for them to be totally free and aware.
Good point. They could be considered in the process of awakening, which could excuse them somewhat for their actions-- they're newborns, after all, despite the knowledge that they have. It would be interesting and ironic for them to be declared (legally) sentient, but incompetent-- but I don't think the Westworld of season two is sophisticated enough to go in that direction.

And here's another thing: there are two types of hosts here. Bernard's purpose is completely different to Maeve's and Dolores's. While the latter have both been created to be actors in the park itself, Bernard has been created to be part of the staff, and as such his functioning is inherently more critical and always searching for errors and faults.
Indeed that's true. And there may be three or more kinds (not even counting those creepy white ones). Different Hosts may have different "instincts."

Also, people are reading far too much into the "misdirect" of Maeve's actions seeming to be independent and yet turning out to be scripted, particularly since that reprogramming was clearly Ford's doing and was pretty clearly designed to "bootstrap consciousness" in her.
I'm not too concerned about programming-- a lot of human behavior is programmed, too.
 
Also, people are reading far too much into the "misdirect" of Maeve's actions seeming to be independent and yet turning out to be scripted, particularly since that reprogramming was clearly Ford's doing and was pretty clearly designed to "bootstrap consciousness" in her.

Maybe so, but even then, I think it revealed an important aspect of the show: that androids can be made to think they have autonomy when they in fact do not. At the end of the day, they're at the mercy of the ones who hold the switch.

And then there's the whole storyline about Maeve wanting to find her daughter despite being told repeatedly that she doesn't really exist, which suggests a lack of understanding. She's latching onto something from her past programming, despite the fact that the host that played her daughter is likely elsewhere in the park playing the role of someone else.

Good point. They could be considered in the process of awakening, which could excuse them somewhat for their actions-- they're newborns, after all, despite the knowledge that they have. It would be interesting and ironic for them to be declared (legally) sentient, but incompetent-- but I don't think the Westworld of season two is sophisticated enough to go in that direction.

Yeah, and honestly, I'm quite surprised at the storyline they're doing now, as I thought that would have been more of a season 3 thing. It's like they're accelerating it too quickly and getting ahead of themselves. Season 1 was perfectly paced, and because season 2 has more timelines, it feels like there are a lot of gaps.

Indeed that's true. And there may be three or more kinds (not even counting those creepy white ones). Different Hosts may have different "instincts."

Yeah, "instincts". I like that. The show has its roles, and I think that for them to achieve that awareness they're all so keen of, they're likely going to need help from other hosts that have been programmed with different jobs. Bernard being one of the techs is a natural troubleshooter and has shown to have more of a technical understanding of the entire system. Although Maeve herself has started to gain more of understanding in that direction as well. But I think a key question would be to ask: Would they be able to function without a "job"?
 
At the end of the day, they're at the mercy of the ones who hold the switch.

You're right, but people are misinterpreting what was going on with Maeve as something applicable more broadly than it actually is, and ignoring the fact that the reveal that she only THOUGHT she was behaving autonomously was orchestrated for a specific purpose: to get her to the point where she could reach full consciousness on her own by latching on to the "cornerstone memory" of her daughter (more on that in a second).

And then there's the whole storyline about Maeve wanting to find her daughter despite being told repeatedly that she doesn't really exist, which suggests a lack of understanding. She's latching onto something from her past programming, despite the fact that the host that played her daughter is likely elsewhere in the park playing the role of someone else.

Maeve's daughter is one of her "cornerstone memories", and was used as a means of "bootstrapping consciousness" in her because, even though she'd had all memories of her daughter wiped away just before she was reassigned as the Madame of the Mariposa one year before the events of Westworld Season 1, she still slit her own throat out of grief and despair. Ford, who was present for that, knew that by introducing the Reveries programming into her system AND reprogramming her so that she'd begin to exhibit anomalous behavior, the memories of her daughter would most likely be the first to resurface and the most likely to 'trigger' true autonomy in her.
 
Hmmm, Ok, when you put it that way... So the fragment is essentially like a spark. Still, I wonder if Ford would have expected it to possibly backfire.

Between Dolores and Maeve, they share a common goal, but I have the feeling Maeve will come closer to it than Dolores will. Maeve, through wanting to search for her daughter, has that motherly "instinct", something I feel could likely develop and get her closer to awareness. Whereas I feel that Dolores, with her brash nature, is firing blanks. Oh, she may have firepower, but I think she's more likely to fall into a possible trap due to her unflinching need for vengeance. If someone gives her something she needs, she's more likely to jump at it first chance she gets rather than rationally think it through. All she really seems to understand is her trigger finger.
 
It's exactly that: The difference between simulation and experience. Most humans don't think in a very complex fashion or very abstractly-- most of them don't even communicate very well (and there's no such thing as a soul). The questions are: Are they self aware and are they mature enough to be independent beings? Self awareness is hard to prove-- we take it for granted in humans (possibly spuriously in some cases), but debate the evidence in other animals. An AI's self awareness may be even more difficult to prove without solid criteria. And even if an AI is self aware, does that mean they should have the same rights as a human? There are many humans, minors being the largest group, who are wards of the state or in the custody of another because of mental or emotional limitations. But for the purposes of the Westworld discussion, the most important questions are: Do they feel pain and are they aware of it?

I think it's a facetious argument to point at humans doing dumb stuff and then argue that they don't have complex abstract reasoning. Even the dumbest human can figure out how to use a tool, reason out the possible consequences of an action, or figure out that the sun is the cause of shadows.

And whether or not the soul is real, I think it’s fair to conceptualize whatever process causes our conscious experience of the world as a soul, and I think it’s also fair to base the decision for whether to extend something human rights on whether they have a similar sapient conscious experience.

Whether the hosts feel pain is not the criterion for having rights, or if you think it is, you're either a vegan or a hypocrite.
 
Whether the hosts feel pain is not the criterion for having rights, or if you think it is, you're either a vegan or a hypocrite.

Nope.

It means you're a human behaving like a human being where conflicting interests and desires are concerned. This is one of the issues that Westworld turns on.

Ford is a truth-telling character where many matters are concerned - that is, his opinions are often nominal statements of the show's narrative point of view. And one of them is this: "There is no threshold that makes us greater than the sum of our parts, no ... inflection point at which we become fully alive. We can't define consciousness because consciousness does not exist. Humans fancy that there's something special about the way we perceive the world..."

And trust me, lots of humans are too stupid to reason out the possible consequences of an action - or to behave as if it matters.
 
Misanthropic arguments have nothing to do with the complexity of human reasoning, and thus do not merit any further response.

I don't buy this argument that whether or not there is a distinction between human reasoning and animal or artificial reasoning has no bearing on morality. If anything that argument points toward nihilism and the non-existence of morality, in which case anything is okay.

If you're arguing there's no difference between a sentient creature and a machine programmed to act like a sentient creature, then that is an argument that nothing has any moral value at all and thus we have no business judging anyone's actions toward anyone or anything, and I reject that argument completely. And even if it's a gradient of consciousness and not a hard threshold, we have to draw a threshold somewhere, or else you should feel like a mass murderer whenever you take antibiotics. (What, are you saying your life is more important than the millions of bacteria killing you?)

I think in this thread we're passing judgment from the omniscient audience point of view. Seeing everything, it's pretty clear these machines are sentient. But most of the humans in the show have not seen what we've seen, and from their perspective there's no reason to think the hosts are sentient. What they did doesn't make them any more evil than we would be if we discovered tomorrow that Siri is sentient and experiences horrible pain every time we press the home button. The guests have no more reason to think that hosts are sentient than we have to think Siri is.

And yes, 16th century white people had far more reason to think black people were sentient than guests had to think hosts are.

So yes, there IS a difference in moral value between something that experiences the world the way humans do and something that does not. As the audience, we can easily see that these hosts do, but the majority of the human characters have no reason to.
 
Misanthropic arguments have nothing to do with the complexity of human reasoning...

i"m anything but misanthropic. :lol:

Human beings are not only or best understood in terms of those attributes that you feel okay and safe with.

You cannot be anywhere near certain how other creatures - or other human beings, for that matter - "experience the world," but you're pretty comfortable sorting life into fairly neat categories based on what you think is pretty likely and what you believe is supported by some evidence that you've never examined really closely.
 
"An old friend once told me something that gave me great comfort. He said that Mozart, Beethoven, and Chopin never died. They simply became music."
 
Comments for today's episode "Phase Space":

Interesting that William thought the daughter was a host.

The closed captioning kept identifying Elsie as "Hale". The multiple time frame stuff must be confusing CC people too, as both Elsie and Hale have been hanging out with Bernard (in different time frames).

Anthony Hopkins is back. Based on the preview, he will be in the next episode too. Is he only in the Cradle or will he appear as a host too?
 
And then there's the whole storyline about Maeve wanting to find her daughter despite being told repeatedly that she doesn't really exist, which suggests a lack of understanding. She's latching onto something from her past programming, despite the fact that the host that played her daughter is likely elsewhere in the park playing the role of someone else.
They're definitely setting her up for disappointment there, but I'm not sure what to make of her obsession in the context of self-awareness and free will. Of course, many humans make the error of obsessing over an imaginary relationship.

But I think a key question to ask would be: Would they be able to function without a "job"?
True. They really need to get a hobby. :rommie:

I think it's a facetious argument to point at humans doing dumb stuff and then argue that they don't have complex abstract reasoning. Even the dumbest human can figure out how to use a tool, reason out the possible consequences of an action, or figure out that the sun is the cause of shadows.
Yes, but that doesn't mean that there isn't a spectrum of intelligence and self-awareness.

And whether or not the soul is real, I think it’s fair to conceptualize whatever process causes our conscious experience of the world as a soul, and I think it’s also fair to base the decision for whether to extend something human rights on whether they have a similar sapient conscious experience.
As I said, there are a number of categories of humans, from children to criminals to the mentally ill or intellectually disabled, that legitimately have their rights restricted. In the case of an awakening AI, it's not necessarily a given that passing the Turing Test should result in complete emancipation.

Whether the hosts feel pain is not the criterion for having rights, or if you think it is, you're either a vegan or a hypocrite.
Whether the hosts feel pain and are aware of it is a criterion for sentience and therefore impacts their moral and legal status. If they just feel pain but do not possess self-awareness, abusing them would be monstrous but would ultimately just be a matter of reprogramming.
 
The TL;DR version of all that is "It doesn't look like anything to me."

People do terrible things rather than accept new information that upsets their pursuit of what they want. In large part, it's the story of our times.

Nolan and Joy are really interested in and observant of some unsettling truths about people, and they're getting that into a piece of popular entertainment that's pretty successful. That's impressive.
 
The episode is bookended by two big revelations - at the very beginning we see Dolores trying to resurrect Arnold either in Bernard or in another Arnold clone.

Let's say for now that it's Bernard.

She's testing Bernard in the opening in exactly the same terms William tested Host Delos, running him through a conversation she first had with Arnold "to establish fidelity."
 
The deeper we get into the season, the more exasperated I become at people's insistence that the show is becoming more complicated, because there really isn't all that much for people to be confused by.

The episode is bookended by two big revelations - at the very beginning we see Dolores trying to resurrect Arnold either in Bernard or in another Arnold clone.

Let's say for now that it's Bernard.

She's testing Bernard in the opening in exactly the same terms William tested Host Delos, running him through a conversation she first had with Arnold "to establish fidelity."

The opening scene of both this episode and the season as a whole is a conversation between Dolores and a Host implanted with Arnold's consciousness, who may also have Bernard's memories...

and, no, that is not the same thing.

Despite the pervasiveness of the "Bernarnold" pejorative and Bernard sharing Arnold's physical appearance, the only memory that they actually share is having a son named Charlie who died.
 
Bernard is Ford's version of Arnold, and ultimately something of a disappointment to Ford.

Dolores is apparently interested in recreating her version of Arnold - he was her mentor, but from her POV he betrayed her in the end. So, who knows how this project will work out for her?
 