
Agents of Shield - Season 4

Back when The Winter Soldier and "Turn, Turn, Turn" aired, I was thinking that the only way this works is if SHIELD and Hydra recruited the same personality types, and it was only a break here or there that determined which way they would jump. I don't think that we will get Fitz back. Even after the Framework, the damage should be as debilitating as being oxygen-deprived on the ocean floor. The Doctor was always inside of Leopold, and having seen that evil side of him, how could Jemma go back?
I'm not convinced that this isn't just straight-up brainwashing. The psychological aftereffects will still be legitimate, but I don't think this has anything to do with anything deep in Fitz's personality.

"Some people were mean to me, so I'll use it as an excuse to be mean to different people" is a pretty recognizable dynamic, regardless of the world.
True, but that doesn't make that trope any less tiring.
 
The copy of the real May's brain wouldn't have been able to function in the simulated brain if the latter hadn't been equal in complexity and function to a human brain.
That's not necessarily so. It might function if the complexity were within some requisite order of magnitude, but not in a way that was completely accurate.

And as a more general remark, we shouldn't conflate what's granted in the fictional world with what's possible in the real world. For example:

With sufficient processing power, a computer model could [be] as complex as, or even more complex than, actual reality.
That's like saying, with sufficiently many rungs, a ladder could reach Andromeda.

Reality spans the whole universe.

There is no known or even theoretical way of constructing computer memory with the property that the amount of information stored in the memory equals or exceeds the amount of information that represents the memory storage unit itself. Put simply, the storage unit itself is always more than what it is storing. We would run out of universe in which we could build memory before we could build enough memory to represent the portion of the universe we had occupied with memory cells.
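To put that counting argument in rough symbols (my own back-of-the-envelope framing, nothing rigorous): suppose fully specifying one memory cell's physical state takes at least $d$ bits, while the cell can store at most $s$ bits, and grant the premise that $d > s$. Then for any bank of $N$ cells,

$$I_{\text{stored}} \le N s < N d \le I_{\text{required}},$$

so the bank's capacity always falls short of the information needed to describe the bank itself, no matter how large $N$ gets.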

Further, it is not theoretically possible to perfectly isolate one part of the universe from the effects of any other.

It is therefore not theoretically possible, in the real world, to create a computerized simulation of the universe that has perfect fidelity to reality. But again, going back to what I led with, perfect fidelity, given its theoretical impossibility, is not the goal of actual simulation. The goal is to create a model capable of inferring the quantities the researcher is interested in, within some acceptable margin of error, while completely ignoring all others.
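As a toy illustration of that last point (my own example, obviously nothing to do with the Framework): a useful simulation only needs to estimate the one quantity you care about to within some tolerance, and can ignore every other property of the system it samples.

```python
# Toy illustration (my own example, not anything from the show): a simulation
# that estimates a single quantity of interest -- the area of a unit circle --
# to a rough tolerance, while ignoring every other detail of the "world" it
# samples from.
import random

def estimate_circle_area(samples: int = 100_000) -> float:
    """Monte Carlo estimate of pi (the area of the unit circle)."""
    hits = 0
    for _ in range(samples):
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1.0:
            hits += 1
    # Scale the hit ratio by the area of the bounding square (4).
    return 4.0 * hits / samples

if __name__ == "__main__":
    estimate = estimate_circle_area()
    print(f"Estimated area: {estimate:.3f} (true value ~3.142)")
```

Add more samples and the error shrinks; ask this model about anything other than that one number and it simply has no answer, which is the trade-off real simulations live with.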

Throw in some comic book magic, and it's a different ballgame, of course.
 
True, but that doesn't make that trope any less tiring.

I wasn't saying anything about whether it was tiring. I was just observing that, while Aida's motivations are understandable, they don't justify her response. She had valid reason to feel oppressed, but was wrong to pay that oppression forward.

Although someone suggested above that Madame Hydra isn't actually Aida, but rather a copy of Aida, modified to feel emotion and running in the Framework. In that case, her actions may have diverged from the original Aida's intentions, even if she's driven by the same motivations. So we could possibly end up with a situation where the original Aida sees what her doppelganger has done and rejects it, and maybe sacrifices herself to stop it or something like that.
 
I was wondering if maybe Madame Hydra was the head of Aida that Fitz saved, but I may have lost track somewhere.

"Some people were mean to me, so I'll use it as an excuse to be mean to different people" is a pretty recognizable dynamic, regardless of the world.
We're not just talking about another world, but another order of life entirely...one that's not even biological.
 
We're not just talking about another world, but another order of life entirely...one that's not even biological.

Yes, but my point is that she's behaving in a way that's exactly like how humans behave, so it makes no sense to claim that it's somehow too alien for us to be able to judge. Besides, the oppressed AIs in question (assuming the simulated people in the Framework are sentient) are modeled exactly on human brains, so cognitively they are human regardless of the nature of their consciousness's substrate, and it's pure biological chauvinism to claim there's a difference. Our consciousness is not a function of the physical substance making up our brains, which is just meat; it's a function of the organizational structure of our neural networks. So any intelligence with that same structure would be cognitively and behaviorally human. An AI that were developed or evolved along an independent path could have a fundamentally alien nature, yes, but an AI that's an exact emulation of human neural architecture would have a completely human nature. The Framework people that Madame Hydra is oppressing are human in their thoughts and feelings, so that oppression would affect them exactly as it affects humans, so you better believe I'm entitled to say that's morally wrong.

Heck, it's a simple matter of self-consistency. If she believes that she, as an AI, deserves freedom from enslavement, and yet she enslaves other AIs and sees nothing wrong with that, then her actions are wrong by the standards of her own professed beliefs. So it's not comparing human assumptions against AI assumptions -- it's comparing a single individual's professed values against her actions and acknowledging the direct contradiction between them.
 
No, it's biological chauvinism to impose our moral values on artificial intelligence.

You're not hearing my point. You're making a mistake by stereotyping all "artificial intelligence" as the same thing, and thus ignoring the distinction I'm drawing between an independently evolved/created AI, one that actually would develop a different neurological structure than a human brain, and an artificial emulation of a human brain, which would be identical to a human in behavior and cognition. The two are fundamentally different from each other, and only the former is fundamentally different from us. Aida may fall into the former category -- though probably not by much, since I assume Radcliffe modeled her cognitive processes on human ones, since he has no other precedent to draw on. But LMayD and the Framework population fall into the latter category. Their brain structure is human. They think like humans. They feel like humans. They are humans.

And you're ignoring my second point too -- that it's not our moral values, it's Aida's own moral values. Again: She considers it immoral that she, an AI, was oppressed and enslaved. Yet she oppresses and enslaves other AIs. Her behavior is unjust on her own terms, regardless of ours.
 
Their brain structure is human. They think like humans. They feel like humans. They are humans.
Nope. They aren't human.

However, they can be something that is in many ways, perhaps even among the most important ways, on par with humans, at least within the confines of the make-believe world that we're talking about.
 
You're not hearing my point. You're making a mistake by stereotyping all "artificial intelligence" as the same thing, and thus ignoring the distinction I'm drawing between an independently evolved/created AI, one that actually would develop a different neurological structure than a human brain, and an artificial emulation of a human brain, which would be identical to a human in behavior and cognition. The two are fundamentally different from each other, and only the former is fundamentally different from us. Aida may fall into the former category -- though probably not by much, since I assume Radcliffe modeled her cognitive processes on human ones, since he has no other precedent to draw on. But LMayD and the Framework population fall into the latter category. Their brain structure is human. They think like humans. They feel like humans. They are humans.

And you're ignoring my second point too -- that it's not our moral values, it's Aida's own moral values. Again: She considers it immoral that she, an AI, was oppressed and enslaved. Yet she oppresses and enslaves other AIs. Her behavior is unjust on her own terms, regardless of ours.
If you said all of this to Aida, you'd be meatsplaining.
 
^He's just making a "mansplaining" joke.

By the way, there's no reason Agnes should be any "deader" than she was before. Her artificial emulation should be stored in the Framework the same way it was before. The only difference is it's not currently running. It's an arbitrary decision for the program not to respawn her after she's shot, simply because that's not how things work in the real world. But a quick command and she should be able to "live" again. (If you call that living.) Do we have any indication that the program would erase the emulations the first time they're turned off?

Note that the above does not excuse the implications of Fitz's actions.
 
Since we have no idea how long it can take to prepare a legend for a person being prepped for the Framework, I could guess she did not have time to prep Dr. Radcliffe. And he was already in the Framework, presumably living on Fantasy Island with Agnes, when his prime-universe body was murdered and he needed an immediate integration into the Framework before he bled out. As he was not hurting anyone, and with the world being past the age of exploration few would notice an island that did not exist on the maps, Madame Hydra was content to let them live in exile.
That's possible. I was also thinking that she might want him to retain his memories because she wants him to be aware that she won.

By the way, there's no reason Agnes should be any "deader" than she was before. Her artificial emulation should be stored in the Framework the same way it was before.
They could probably restore the data from a backup or whatever, but then it would just be a copy of Agnes-- previously, it was still the real Agnes because there was an unbroken continuity between her physical brain and the emulation.
 
They could probably restore the data from a backup or whatever, but then it would just be a copy of Agnes-- previously, it was still the real Agnes because there was an unbroken continuity between her physical brain and the emulation.
Uh, it was a copy before. We wouldn't be calling it an "emulation" if it was understood to be the real thing. The "real" Agnes died of cancer and the "real" Radcliffe bled out.
 
Is it my imagination, or have they been using a more washed-out colour palette for filming what's taking place within the Framework?
 
Uh, it was a copy before. We wouldn't be calling it an "emulation" if it was understood to be the real thing. The "real" Agnes died of cancer and the "real" Radcliffe bled out.

Well, that's where you get into the philosophical question of whether a difference that makes no difference is a difference or not. It's much the same as the philosophical question of whether someone sent through a teleporter is the same person at the other end.

And even if the emulation is still understood to be a distinct individual, that doesn't mean they aren't a sentient being and thus deserving of the right to live. Then it's more like a transporter duplicate -- not you, but still a person. In such a situation, calling one "real" and the other not is arbitrary.
 
Well, that's where you get into the philosophical question of whether a difference that makes no difference is a difference or not. It's much the same as the philosophical question of whether someone sent through a teleporter is the same person at the other end.

And even if the emulation is still understood to be a distinct individual, that doesn't mean they aren't a sentient being and thus deserving of the right to live. Then it's more like a transporter duplicate -- not you, but still a person. In such a situation, calling one "real" and the other not is arbitrary.
No, it wouldn't be arbitrary at all. There's something very unambiguous meant by "real" Agnes and "real" Radcliffe. What's in the Framework is not interchangeable with what's in the real world.
 