
Are sentient androids really THAT hard to make?

Oh, the Singularity is nonsense. As Ken MacLeod puts it, it's "the Rapture for geeks." Even if it is possible for AIs to keep improving themselves exponentially, that doesn't do us meat-based intelligences any good, since our minds need to run within the organic substrate they evolved for. And even if we could be copied into computers, those copies wouldn't be us; we'd still be mortal flesh. Also, I expect there to be limits on how powerful a brain can get, just as there are practical limits on how big or powerful a body can get.
 
And even if we could be copied into computers, those copies wouldn't be us; we'd still be mortal flesh.

Yep. Unless you believe in the existence of a scientifically measurable and physically transferable soul that can run on either meat-based or silicon-based systems, uploading doesn't really help us.

Reminds me of Spock Must Die, I think it was, in which McCoy argues that every time he uses a transporter, he dies and gets replaced by a new copy who thinks he's the same person but isn't.
 
You could make the same argument about not being the same person you were a year, a month, or a second ago. On some level, you are not. No point in McCoy getting upset about it; the continuity of the self is an illusion anyway.

That said, I could see being afraid of a transporter just because of the violence it does to the body. There must be some process that prevents it from hurting like absolute hell, and we know that process does not always succeed.

Christopher said:
Oh, the Singularity is nonsense. As Ken MacLeod puts it, it's "the Rapture for geeks." Even if it is possible for AIs to keep improving themselves exponentially, that doesn't do us meat-based intelligences any good, since our minds need to run within the organic substrate they evolved for. And even if we could be copied into computers, those copies wouldn't be us; we'd still be mortal flesh. Also, I expect there to be limits on how powerful a brain can get, just as there are practical limits on how big or powerful a body can get.

It's still a lot better than just dying, which is what we have to look forward to otherwise. As for the potential power of consciousnesses, I personally don't think there are hard constraints on how powerful a mind could be, outside of the obvious physical ones like the speed of light or the Bekenstein bound.
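(For reference, the Bekenstein bound mentioned above caps how much information any physical system can hold, given its size and energy. A rough back-of-the-envelope version, with brain-sized numbers that are my own illustration rather than anything claimed in this thread:

    % Bekenstein bound: maximum information I, in bits, for a system of
    % radius R and total energy E (including rest-mass energy)
    I \le \frac{2 \pi R E}{\hbar c \ln 2}

    % Plugging in roughly brain-sized numbers (R ~ 6.7 cm, m ~ 1.5 kg, E = m c^2)
    % gives on the order of
    I \lesssim 2.6 \times 10^{42} \ \text{bits}

So there is a hard ceiling, but it sits absurdly far above anything biology actually uses.)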
 
Reminds me of Spock Must Die, I think it was, in which McCoy argues that every time he uses a transporter, he dies and gets replaced by a new copy who thinks he's the same person but isn't.
Hm, that's an interesting argument, which reminds me of The Prestige.
 
It's still a lot better than just dying, which is what we have to look forward to otherwise.

But even if you have a reasonable facsimile of your mind in a computer somewhere, your consciousness still remains inside your skull, and is still just as mortal as ever. So it won't make any difference to you as far as your personal perceptions are concerned. In fact, you might come to resent the fact that some lousy computer program that only thinks it's you is going to live on after the real you dies.


As for the potential power of consciousnesses, I personally don't think there are hard constraints on how powerful a mind could be, outside of the obvious physical ones like the speed of light or the Bekenstein bound.

It's fallacious to treat consciousness as something separate from physicality. The brain is like any other part of the body -- an evolved organ adapted to suit a particular set of needs. And like any other organ, it's subject to physical needs and physical limitations. There's a reason we don't have 10 arms or 20 eyes or 5 hearts -- because it would simply be overkill, placing too much demand on the system for too little gain.

Evolution isn't some upward ladder toward godhood -- that's a total fantasy. Evolution is adaptation to the needs of one's environment. The optimal state for any organism is the one that's best adapted to its needs. And that means having too much of something is just as bad as having too little. Now, our consciousness evolved to suit the needs of our environment -- to process our perceptions of and interactions with the universe we live in. So it stands to reason that the amount of processing power we have is well-adapted to the needs of an entity that exists in and interacts with the physical universe. A brain that's too complex or powerful might "overshoot" the needs of existence within the universe and thus be just as unable to function as a person weighed down by dozens of extra limbs would be. The inner complexity of its thoughts might overwhelm its perception of the much less complex exterior universe and leave it incurably schizophrenic, say.

More isn't always better. The ideal is balance, not excess. A mind needs to be in balance with the environment it interacts with. Just as a planet needs to be in a "Goldilocks zone" -- not too hot, not too cold, but just right -- in order to support complex life, there may also be a "Goldilocks zone" for consciousness, not too simple and not too complex, but just right for doing the job of comprehending and interacting with our universe.
 
Yes, totally agree with Christopher. It's all about balance. And we do have a lot of real-life and fictional examples where people with extraordinary mental powers have had other mental problems or have had to resort to other ways of keeping themselves engaged and balanced.

Wesley Crusher had to become a Traveler, Sherlock Holmes had his addiction, House has his medical cases and his snarky attitude and his Vicodin, John Nash has to be medicated etc. :)
 
It's still a lot better than just dying, which is what we have to look forward to otherwise.

But even if you have a reasonable facsimile of your mind in a computer somewhere, your consciousness still remains inside your skull, and is still just as mortal as ever. So it won't make any difference to you as far as your personal perceptions are concerned. In fact, you might come to resent the fact that some lousy computer program that only thinks it's you is going to live on after the real you dies.

I only think I'm the me from yesterday, and the one who dies is only going to think he's the me from now, anyway. They don't make any difference to me as far as my personal perceptions are concerned--I nonetheless have some sentimental interest in ensuring that the future one exists.


As for the potential power of consciousnesses, I personally don't think there are hard constraints on how powerful a mind could be, outside of the obvious physical ones like the speed of light or the Bekenstein bound.
It's fallacious to treat consciousness as something separate from physicality

I totally agree.

The brain is like any other part of the body -- an evolved organ adapted to suit a particular set of needs. And like any other organ, it's subject to physical needs and physical limitations. There's a reason we don't have 10 arms or 20 eyes or 5 hearts -- because it would simply be overkill, placing too much demand on the system for too little gain.

Evolution isn't some upward ladder toward godhood -- that's a total fantasy. Evolution is adaptation to the needs of one's environment. The optimal state for any organism is the one that's best adapted to its needs. And that means having too much of something is just as bad as having too little. Now, our consciousness evolved to suit the needs of our environment -- to process our perceptions of and interactions with the universe we live in. So it stands to reason that the amount of processing power we have is well-adapted to the needs of an entity that exists in and interacts with the physical universe.
I think it stands better to reason that once intelligence evolved, selective pressures for greater intelligence rapidly faded.

A brain that's too complex or powerful might "overshoot" the needs of existence within the universe and thus be just as unable to function as a person weighed down by dozens of extra limbs would be. The inner complexity of its thoughts might overwhelm its perception of the much less complex exterior universe and leave it incurably schizophrenic, say.
There's a mechanism that prevents our own imaginations and memories from doing so.

More isn't always better. The ideal is balance, not excess. A mind needs to be in balance with the environment it interacts with. Just as a planet needs to be in a "Goldilocks zone" -- not too hot, not too cold, but just right -- in order to support complex life, there may also be a "Goldilocks zone" for consciousness, not too simple and not too complex, but just right for doing the job of comprehending and interacting with our universe.
These are nice and generally correct sentiments--I'm not sure that "balance" is a cognizable factor, and individuals certainly do not attempt to balance anything, and "unbalanced" adaptations are well represented in nature. If something exists, it just means it is capable of surviving long enough to produce offspring. Like you said, evolution is no march to godhood. It is a competitive struggle, with innumerable false moves, including ones that decrease survivability but nonetheless fail to act as a selective factor--like a peacock's feathers, parasites that kill or degrade their hosts, or manic-depressive disorder in humans.

But, at any rate, I think you're talking past me here...

To clarify, are we talking about a naturally evolving life form, or an artificial intelligence? Huge difference--and I'll concede that evolutionary factors place major constraints on the natural development of an intelligence.

I would not agree that those factors would apply to an intelligence designed and built by a natural intelligence (or by another artificial intelligence). Such an intelligence's environment can be controlled; its interactions with its environment refereed by its creators; and, assuming we understand how to implement them in the first place, it could have any qualities we desire, up to practical and physical limits.

For example, we might build a robot without an analogue to our own limbic system, with no emotion. We might also build a robot capable of vastly greater intuitive mathematical feats than the human brain, but without means to gather or analyze sensory input. We might build a robot incapable of distinguishing memory of sensory input and imagined sensory input from actual sensory input.

Any natural intelligence with these characteristics would, in a state of nature, soon be obliterated by its environment or undone by its own inability to sustain its operations. That includes the emotionless robot, which would probably wind up dead unless it were otherwise invulnerable, since an emotional response like fear is a pretty necessary component of natural life.

However, such an artificial intelligence would persist for as long as its creators deigned to provide it with the necessary resources.
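To make the modularity point concrete, here's a toy sketch in Python. It's purely illustrative: the Agent class and its "faculties" are made up for the sake of the argument, not taken from any real AI architecture, but it shows how a designer can simply leave a faculty out, something evolution never gets to do.

    # Toy illustration of a designer choosing which faculties an artificial
    # agent gets. Entirely hypothetical -- the names are made up for this post.
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Agent:
        # Each faculty is optional: the designer decides what to include.
        perceive: Optional[Callable[[str], str]] = None  # senses the environment
        feel: Optional[Callable[[str], str]] = None      # emotional response (e.g. fear)

        def step(self, stimulus: str) -> str:
            # An agent built without perception never even registers the stimulus.
            if self.perceive is None:
                return "no reaction (cannot sense environment)"
            percept = self.perceive(stimulus)
            # Without an emotion analogue there is no fear response to danger.
            if self.feel is None:
                return f"noted '{percept}', no emotional response"
            return self.feel(percept)

    # A "natural-style" agent: senses danger and reacts with fear.
    natural = Agent(
        perceive=lambda s: s,
        feel=lambda p: "flees" if "predator" in p else "carries on",
    )

    # The emotionless robot from the post above: perceives, but never fears.
    emotionless = Agent(perceive=lambda s: s, feel=None)

    print(natural.step("predator approaching"))      # flees
    print(emotionless.step("predator approaching"))  # noted '...', no emotional response

Run it and the "natural" agent flees the predator while the emotionless one just notes it and stands there, which is roughly the point about fear being a necessary part of natural life.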
 
I think it stands better to reason that once intelligence evolved, selective pressures for greater intelligence rapidly faded.

That's just a different way of expressing the same thing I said.


A brain that's too complex or powerful might "overshoot" the needs of existence within the universe and thus be just as unable to function as a person weighed down by dozens of extra limbs would be. The inner complexity of its thoughts might overwhelm its perception of the much less complex exterior universe and leave it incurably schizophrenic, say.
There's a mechanism that prevents our own imaginations and memories from doing so.

No, there isn't. That's why schizophrenia exists in the first place. We all have an inner mental life alongside our external perceptions. At healthy levels, we call it imagination. People who are more intensely imaginative tend to be writers, artists, scientists, inventors, priests, mystics, etc. But take it too far and the inner reality overwhelms the outer. Your fantasies become more real to you than the outside world. That's schizophrenia.

The only preventive mechanism here is balance -- having a degree of imagination, of internal mental activity, that's not out of balance with the needs of perceiving and interacting with the external world. Take any aspect of consciousness too far and it becomes insanity.


These are nice and generally correct sentiments--I'm not sure that "balance" is a cognizable factor, and individuals certainly do not attempt to balance anything, and "unbalanced" adaptations are well represented in nature.

Balance means avoiding runaway extremes, finding a viable middle ground between too little of something and too much of it. In evolution, those things are functions of the needs of your environment. I'm not talking about some Platonic ideal here, but about functionality, about having traits and abilities that are in balance with the demands of your environment, lifestyle, and biology. Having more than two eyes would place excessive, unnecessary demands on our brains and metabolisms, because we'd need that much more neural complexity to operate them and that much more food to power that extra organ and the extra neural processing to go with it. The cost of having extra eyes outweighs the potential benefits; it would be imbalanced. Conversely, having only one eye would save on metabolic demands, but it would deprive us of depth perception and field of view. Again, the cost would outweigh the benefits, which is imbalanced. For us, given the specifics of our environment, our behavior, and our anatomy, having two eyes is more balanced than the alternatives: the cost is commensurate with the benefit.


To clarify, are we talking about a naturally evolving life form, or an artificial intelligence? Huge difference--and I'll concede that evolutionary factors place major constraints on the natural development of an intelligence.

Any intelligence would have to be able to function within the environment it occupies. Sure, theoretically you could artificially create a brain that's vastly more complex than it would need to be to function in this universe, just as theoretically you could engineer a human with six eyes. My point is that it might not work as well because it would be overkill, out of balance with its needs. Maybe you could make a superintelligent mind, but its internal processing might be so much more complex than the inputs it's receiving from the outside world that it doesn't even notice physical reality. By our standards, it might be schizophrenic or catatonic.

Not to mention that intelligence is a complex, dynamic process. There are a lot of ways it can go wrong. The more complex the system becomes, the greater its potential for instability. So there might be a threshold of complexity beyond which runaway chaos becomes inevitable.

The point is, we just don't know enough to make assumptions. We have a very, very small sample of sapient species to examine, and no examples of sophonts more intelligent than the greatest human minds. We have no grounds for assuming that intelligence has no functional upper bound. We like to think of our intelligence as something extraordinary, something intangible that makes us special, even something divine. It follows from that to think of it as something that can be escalated without limit. But the more we learn about intelligence, the more we discover that we're not so different from other animals; we don't have much of anything they don't have, we just have more of it in one package. Intelligence is a biologically evolved trait that arose in response to survival needs. And most biological traits have their limitations on how far you can augment them before they become maladaptive. So I think it's overly romanticizing intelligence to assume it must be fundamentally different from every other trait, to assume it has no upper bound.
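(On the instability point a couple of paragraphs up: as a loose analogy only, not a model of brains, the logistic map is a textbook example of how nudging a single parameter past a threshold tips a simple deterministic system from stable behaviour into chaos. A quick sketch, with parameter values chosen purely for illustration:

    # Loose analogy, not a model of minds: the logistic map x -> r*x*(1-x)
    # goes from stable, to periodic, to chaotic as the parameter r increases.
    def logistic_orbit(r, x0=0.5, skip=500, keep=5):
        """Iterate the map, discard transients, return the last few values."""
        x = x0
        for _ in range(skip):
            x = r * x * (1 - x)
        tail = []
        for _ in range(keep):
            x = r * x * (1 - x)
            tail.append(round(x, 4))
        return tail

    for r in (2.8, 3.2, 3.5, 3.9):
        print(r, logistic_orbit(r))
    # 2.8 -> one fixed value, 3.2 -> two values, 3.5 -> four, 3.9 -> no repetition (chaos)

At r = 2.8 the orbit settles to a single value, at 3.2 it flips between two, at 3.5 between four, and by 3.9 it never repeats at all.)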
 
So it stands to reason that the amount of processing power we have is well-adapted to the needs of an entity that exists in and interacts with the physical universe. A brain that's too complex or powerful might "overshoot" the needs of existence within the universe and thus be just as unable to function as a person weighed down by dozens of extra limbs would be. The inner complexity of its thoughts might overwhelm its perception of the much less complex exterior universe and leave it incurably schizophrenic, say.
There is the thought that the human mind's natural state is schizophrenia, and that what we now perceive as the "normal" human consciousness is a relatively recent development from within, say, the last three thousand years. It's called bicameralism.
 
^I'm highly skeptical. Why would this transition have happened everywhere in the world at the same time? A lot of parts of the world were isolated from each other 3000 years ago, the Americas in particular remaining almost totally isolated until just 500 years ago. Also, I think the theorist is jumping to the conclusion that just because the storytellers thought a certain way, it means everyone did. Maybe schizophrenics were just as much a minority group in the past, but they were extolled as prophets and bards and were the generators of most mythology and lore.

And the attempt to justify the contents of ancient myths in terms of an overarching scientific theory smacks of Velikovskian thinking.
 
I had a feeling you'd dismiss bicameralism out of hand. *shrug*

Personally, I think it's an interesting theory. Impossible to prove, impossible to disprove for that matter. I don't know quite what I think of it myself; Richard Dawkins is probably right, that it's either mad genius or just plain mad. :)
 
And the attempt to justify the contents of ancient myths in terms of an overarching scientific theory smacks of Velikovskian thinking.

Of course. Shame on you Allyn for not being well versed in the teachings and thought processes of Immanuel Velikovsky. Come on Allyn! Total bush league :evil:




Wait for it...
 
^I'm highly skeptical. Why would this transition have happened everywhere in the world at the same time?

Morphogenetic fields?

As in Rupert Sheldrake? Not widely considered to be real science, as far as I know.

My knowledge of science is not nearly as good as that of a lot of my fellow posters, so I wouldn't know the details :). I simply remember reading a book that presented the theory as an explanation for the unusually rapid spread of behaviour or knowledge between members of a species across great distances. It was all theoretical, but there seemed to be some evidence (whether that evidence was worth much I don't know, but to me it seemed interesting). I found it fascinating, particularly as it overlapped with certain experiences of my own. However, science isn't my strong point and I don't follow developments to the degree that some of you do. I just brought it up because I remember the theory being interesting, and it seemed to provide a possible answer to the major problem Christopher identified with the ideas being discussed. :) I don't claim to know what we're talking about in any depth; I just felt like throwing that out there...sorry!
 