I can understand being upset about how it handled ST5, but what do you mean by "bigoted"?
The Mirror Universe has gotten a few different interpretations. Aside from the on-air version, there's:
- The DC Comics (first series) story
- Diane Duane's Dark Mirror
- IDW's mirror universe comics series
- and a few others escaping my memory at the moment.
As I recall, David Mack's novels pretty much kept continuity with the on-air Mirror Universe.
Curiously enough, Dark Mirror featured a mirror Geordi La Forge, but no novel in the modern novelverse has featured the mirror version of Geordi.
Through the sentiment he expressed about his memento of Tasha Yar from "Skin of Evil."
Of course, one of the reasons the whole "no emotion" thing was stupid and misguided was because Data always obviously did have emotions, no matter how much he insisted he didn't. He had affinities and preferences and hopes and dislikes. Even saying "I wish I had emotions" is a contradiction in terms, because just having that wish means you do feel something.

It was deeply wrong to say that Data's emotions didn't exist just because he didn't express them outwardly in easily recognizable forms like laughing or crying or yelling. There are people on the autistic spectrum who can't show emotion in that way, but they definitely have the same emotions as anyone else. That's what Data was originally conceived to be like -- someone who had the capacity to feel, just not to express it very well. The mistake Piller and the later writers made was to confuse the expression of emotion with its existence.

(Also to regurgitate the hackneyed and ignorant sci-fi cliche that "machines can't feel," that emotions are some uniquely human thing too complex for AIs. That's nonsense. Emotions are simple. Most animals have them, and they're very straightforward -- experience an urge, act on it. It's silly to say they can't be programmed, because they are programming -- automatic responses hardwired into the brain. What makes them complicated in humans is their interaction with our intelligence and abstraction, our ability to have thoughts and goals that come into conflict with our feelings.)
No, it doesn't. Again, where did we ever get this bizarre notion that humans have a monopoly on emotion? Anyone who has a dog should know better. Anyone who's seen animals mourning their dead should know better. (Elephants can mourn their dead for years.) And of course, in Trek, every alien species we've ever seen has emotion. Vulcans put on a facade of emotionlessness, but we know that's because their emotions are actually so overpowering that they have to control them tightly. So really, why should the exemplar of a quest for emotionalism be humans rather than, say, Romulans (since they're Vulcans who don't suppress their powerful emotions)? Or what about Betazoids? Surely a race of empaths is a far better exemplar of emotional awareness than humans. Equating "emotional" with "human" is a non sequitur both in real-world terms and in Trek terms.
And again, there are autistic-spectrum humans who don't show emotion in conventional ways. The suggestion that people who don't show emotion are less than human is very disturbing and ugly in that context.
And it ran counter to Trek's ideal of inclusion to say that a neuro-atypical character like Data should aspire to conform to more ordinary behavior rather than being satisfied with who he was. I think that today there's more awareness of neurodiversity in the culture and the media, and I think a modern version of TNG would respect Data's individuality more rather than insisting there's something wrong with him just because his psychology and outward affect are atypical.
As I understand it, there's yet to be an emotional AI created in real life. Is that just because of the limitations of AI, period, or because we still need to figure out how to write the software?
Um, no. Read what I said again. If we assume that Data doesn't have emotions, that's a piece he's missing if he wants to be like a human. The fact that other species have emotions too is beside the point. If I wanted to be a painter and tried to learn the method of Picasso exclusively (a bad analogy, but work with me here), it wouldn't matter that Van Gogh also painted.
...as I experience certain sensory input patterns, my mental pathways become accustomed to them. The inputs eventually are anticipated and even missed when absent.
It is my understanding that the approach taken to Star Trek V came out of its handling of religious beliefs, which were present in different ways in V and in "Bread and Circuses." Those portrayals assume the superiority of one group over another.
There hasn't been a self-aware AI created in real life either. The absurdity is the belief that creating sapient thought is somehow easier than creating emotion. After all, emotion evolved in animals long before intelligence did. As I said, emotion per se isn't that complicated. It's just a pre-wired response to a stimulus. Conscious thought is immensely more complicated. We haven't successfully simulated either one in computers yet, but logically emotion would be the easier one to achieve.
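To make that asymmetry concrete, here's a minimal sketch (my own hypothetical illustration, not anything from the episodes or this thread; the function name and threshold are made up): a pre-wired emotional response fits in a couple of lines, while there is no comparably short recipe for open-ended conscious thought.

```python
# Hypothetical sketch: a hardwired "fear" response is trivial to program.
def fear_reflex(threat_level: float) -> str:
    """Pre-wired stimulus -> response. No learning, no deliberation."""
    return "flee" if threat_level > 0.7 else "ignore"

print(fear_reflex(0.9))  # -> "flee", automatically, every time

# By contrast, there is no few-line function that takes a world state and
# returns open-ended reasoning about it -- which is the point: the "simple"
# faculty here is the emotional one.
```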
But that's exactly the problem -- the idea that Data "should" aspire to be something he isn't and feel there's something wrong with him because of his difference, rather than accepting himself for what he is. That kind of intolerance of neurodiversity is unworthy of Star Trek.
It is my understanding that the approach taken to Star Trek V came out of its handling of religious beliefs, which were present in different ways in V and in "Bread and Circuses." Those portrayals assume the superiority of one group over another.
Goodman talks about his reasoning behind his treatment of STV in this post. As far as I know, Goodman only took that approach because Final Frontier sucked. Not because of its approach to religious beliefs. Just that it sucked.
Yes, I always had the impression that Data was deliberately created without emotions and with a desire to be human so that he wouldn't turn out like Lore.

Also, didn't his creator design him for that express purpose? If so, how do you think that would factor into the question of whether he should've tried to emulate people or just develop as a robot?
So, how would that work? Would it be like programming an artificial animal, like, say, a cat, or something?
Read the second part. It's what Data wanted to do, whether it was for the best or not.
Also, didn't his creator design him for that express purpose? If so, how do you think that would factor into the question of whether he should've tried to emulate people or just develop as a robot?
Technically, you could say that some of the robots we've developed now actually do have emotion in a sense, in that they respond to stimuli without conscious intervention. It's at no higher level than that of, say, an insect or something, since there's no self-awareness of the sort you see in more complex lifeforms, but some emergent behavior...
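Valentino Braitenberg's classic "vehicles" thought experiment makes this vivid. The sketch below is my own hypothetical rendering of that idea, not anything from the thread: two lines of sensor-motor wiring produce behavior an observer would readily call "fear" or "aggression," with no awareness anywhere in the loop.

```python
# Hypothetical Braitenberg-style wiring: each function maps two light-sensor
# readings to (left_motor, right_motor) speeds on a differential-drive robot.

def aggression(left_sensor: float, right_sensor: float) -> tuple[float, float]:
    """Crossed excitatory wiring (Braitenberg's vehicle 2b): the vehicle
    turns toward the stimulus and charges it -- observers read 'anger'."""
    return right_sensor, left_sensor

def fear(left_sensor: float, right_sensor: float) -> tuple[float, float]:
    """Uncrossed wiring (vehicle 2a): the vehicle turns away -- 'fear'."""
    return left_sensor, right_sensor

# Light source on the left (stronger left reading): 'aggression' drives the
# right wheel harder, steering into the source; 'fear' steers away from it.
print(aggression(0.9, 0.2))  # (0.2, 0.9) -> veers left, toward the light
print(fear(0.9, 0.2))        # (0.9, 0.2) -> veers right, away from it
```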
And that sums up why I won't touch the bigoted James T. Kirk autobiography with a barge pole.
That's taking my analogy way too literally. It's just an example to illustrate how absurd it is to treat emotion as some magic fairydust that's uniquely human and that's somehow more difficult to program than actual conscious thought. The sci-fi cliche that emotions can't be programmed is stupid, because they're the closest thing living beings have to programming. You don't learn emotions. You don't choose to feel something. They're hardwired into our brains and they happen automatically in response to stimuli.
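In the same hedged spirit as the sketches above (my own hypothetical illustration, not canon and not anyone's real AI design), both halves of that claim are easy to show in a few lines: the response is automatic and unchosen, and its interaction with a deliberate goal is exactly where the felt conflict comes from.

```python
# Hypothetical: an unchosen, hardwired response preempts a deliberate plan.
def act(planned_action: str, pain_signal: float) -> str:
    if pain_signal > 0.8:       # automatic trigger -- not learned, not chosen
        return "withdraw"       # the hardwired response overrides the plan
    return planned_action       # otherwise the deliberate goal proceeds

# The conflict between goal and feeling, in one call:
print(act("keep_holding_hot_mug", pain_signal=0.95))  # -> "withdraw"
```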
But the show never treated that desire as harmful or a sign of self-loathing. It treated it as an aspiration we were supposed to sympathize with.
According to "Brothers," Soong created Data because he wanted to leave a legacy, to pass on something of himself to posterity.
After all, why create something other than human just to try to copy humans? That seems like a pointless exercise. As Isaac Asimov once said, "If robots turn out exactly like human beings, it would be a terrible waste; we've got human beings."
Well, setting aside the challenge of pulling it off in the first place, nothing would be exactly the same. Say Data did succeed in his quest. He'd still be able to think faster, would live far longer, and would experience things differently. So, he wouldn't be exactly like a human.