• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Are sentient androids really THAT hard to make?

And the attempt to justify the contents of ancient myths in terms of an overarching scientific theory smacks of Velikovskian thinking.
Of course. Shame on you, Allyn, for not being well versed in the teachings and thought processes of Immanuel Velikovsky. Come on, Allyn! Total bush league :evil:
I've been aware of Velikovsky since I read Sagan's Broca's Brain, where he devotes a whole chapter to Velikovsky's theories.

Curiously, on a mailing list devoted to the history of astronomy I subscribe to, Velikovsky was a recent topic of discussion.

So I'm not well-versed in his theories, but I am aware of them. They're total bunk, and it's obvious they're total bunk.

Psychological bicameralism isn't even in the same league.
 
It's still a lot better than just dying, which is what we have to look forward to otherwise.

But even if you have a reasonable facsimile of your mind in a computer somewhere, your consciousness still remains inside your skull, and is still just as mortal as ever. So it won't make any difference to you as far as your personal perceptions are concerned. In fact, you might come to resent the fact that some lousy computer program that only thinks it's you is going to live on after the real you dies.

It could be possible to transfer your consciousness into a computer by gradually replacing parts of your organic brain with electronic equivalents.

Whether that's actually feasible - meaning whether it's possible for electronics to perform the same functions as one's neurons, etc. - is unknown.
We know too little to even understand the problems - we're like ancient Greeks discussing moon landings.

As for the potential power of consciousnesses, I personally don't think there are hard constraints on how powerful a mind could be, outside of the obvious physical ones like the speed of light or the Bekenstein bound.
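For reference, the Bekenstein bound mentioned above puts a hard ceiling on how much information any physical mind could hold: the entropy $S$ of a system confined to a sphere of radius $R$ with total energy $E$ satisfies

$$S \le \frac{2 \pi k R E}{\hbar c}$$

where $k$ is Boltzmann's constant, $\hbar$ the reduced Planck constant, and $c$ the speed of light. So even in principle, a mind of finite size and energy can store only a finite number of bits.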
It's fallacious to treat consciousness as something separate from physicality. The brain is like any other part of the body -- an evolved organ adapted to suit a particular set of needs. And like any other organ, it's subject to physical needs and physical limitations. There's a reason we don't have 10 arms or 20 eyes or 5 hearts -- because it would simply be overkill, placing too much demand on the system for too little gain.

Evolution isn't some upward ladder toward godhood -- that's a total fantasy. Evolution is adaptation to the needs of one's environment. The optimal state for any organism is the one that's best adapted to its needs. And that means having too much of something is just as bad as having too little. Now, our consciousness evolved to suit the needs of our environment -- to process our perceptions of and interactions with the universe we live in. So it stands to reason that the amount of processing power we have is well-adapted to the needs of an entity that exists in and interacts with the physical universe. A brain that's too complex or powerful might "overshoot" the needs of existence within the universe and thus be just as unable to function as a person weighed down by dozens of extra limbs would be. The inner complexity of its thoughts might overwhelm its perception of the much less complex exterior universe and leave it incurably schizophrenic, say.
Recent findings in anatomy seem to indicate that our brain/intelligence has grown about as much as it can using the methods evolution applied to make us smarter.
An organic brain larger than ours won't be smarter - it will, in fact, be dumber, due to the distances between its different parts (the longer time signals take to travel between them).

It's probable that we didn't stop becoming smarter because there's no evolutionary advantage in doing so, but because evolution found no way to make us smarter - no accidental mutation gave its possessors a further advantage in this area.
Nowadays, technology has given humans new tools, inaccessible to biology and evolution.
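The signal-delay point above can be checked with a quick back-of-envelope calculation. The figures here are rough textbook values I'm assuming, not claims from the thread: fast myelinated axons conduct at roughly 100 m/s, and a human brain is roughly 0.15 m across.

```python
# Back-of-envelope check of the signal-delay argument.
# Assumed figures (approximate textbook values):
#   - fast myelinated axons conduct at ~100 m/s
#   - a human brain is ~0.15 m across

def crossing_time_ms(brain_diameter_m, conduction_speed_m_s=100.0):
    """Time for a signal to cross the brain, in milliseconds."""
    return brain_diameter_m / conduction_speed_m_s * 1000.0

human = crossing_time_ms(0.15)   # ~1.5 ms per crossing
scaled = crossing_time_ms(0.60)  # a brain 4x wider: ~6 ms per crossing

print(f"human-sized brain: {human:.1f} ms per crossing")
print(f"4x-larger brain:   {scaled:.1f} ms per crossing")
```

Even a few extra milliseconds per crossing compounds over the many round trips involved in a single thought, which is the gist of the "bigger is slower" argument.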

Also, it would help to define concepts such as "intelligence" or "consciousness".

Some time ago, intelligence was largely equated with computing power - a false assumption.
Nowadays, intelligence is defined as "goal-oriented adaptive behaviour" (yes, I know I'm only scratching the surface of definitions of intelligence). What all definitions of intelligence have in common is that they are abstract and general, and that no one knows how to translate them into concrete instructions for building an intelligent machine.

Greater intelligence might not equate with greater imagination, or with schizophrenia. Just because your IQ is higher doesn't automatically mean your imagination is more developed.
Moreover, a highly developed brain would certainly have the ability to multitask, efficiently processing both the objective and subjective worlds.

"Consciousness" - an even more elusive concept than intelligence. It's the ability to realize that one exists. No one knows how to make a computer intelligent - well, that goes double for consciousness.

Can intelligence develop far enough for the technological singularity?
I strongly doubt it. This assumes an ad infinitum exponential development of intelligence. The history of science has shown that's not how technology develops. The laws of physics impose limits on what one can do - in ANY domain.

How far can intelligence develop?

In my opinion, several aspects of human intelligence could be improved upon (and probably will be) - for example, memory (humans memorise information slowly/incompletely/inaccurately) or mathematical prowess.

In theory, one could think faster - along the lines of thinking in a second what a human thinks in an hour of concentration. However, this tends to equate intelligence with processing power (as I mentioned, an inaccurate analogy) - perhaps intelligence/creativity can't be rushed.

On the other hand, I doubt highly developed intellects could discover new, revolutionary logic principles, inaccessible to "mere" humans, that allow them to come to correct conclusions, in ways unintelligible to us.
 
Had to read this thread as I'm currently including the transfer of a living "sentience" into a revised positronic net in my fanfic.

To be brutally honest, I'm nowhere near as versed in some of the more arcane scientific (and non-scientific!) texts as many of the posters here seem to be. I simply based my theories on Star Trek, i.e. already-established canon examples of such a transfer.

Essentially, to me it means that Trek has shown it to be possible (if dangerous), and therefore I can do it. It doesn't mean, of course, that I won't try to wrap it in science (pseudo or not). :D
 
I had a feeling you'd dismiss bicameralism out of hand. *shrug*

You leave us no choice but to give you a trolling infraction for this, by your own admission here:

I had a feeling you'd dismiss bicameralism out of hand. *shrug*
I don't think you realize what a profound insult I consider that to be.
No, Christopher, actually I did.

Comment to PM, please.
 
Why would this transition have happened everywhere in the world at the same time? A lot of parts of the world were isolated from each other 3000 years ago, the Americas in particular remaining almost totally isolated until just 500 years ago.
For what it's worth, Christopher, according to the theory it didn't. The Native American populations didn't transition from bicameralism to consciousness until exposed to the Europeans, which is one of the factors in why their civilizations fell to the Spaniards and the English.

Comment to PM, please.
Not that I particularly care one way or the other, but how is that trolling? And while I sent you a PM, I'd rather discuss this in the open, as I want to know if you've picked up on this because of the open season on Christopher in the "Authors you've been turned off on" thread.

Did I expect Christopher to dismiss bicameralism out of hand? Yes, because it's so bonkers, and Christopher is the living personification of ratiocination. Is that the reason why I introduced the idea of bicameralism, to get a rise out of Christopher? Absolutely not.

I thought it was an interesting idea, because it gives us a different way of viewing consciousness, and it raises the idea that something that we view today as abnormal -- schizophrenia -- might actually have an evolutionary basis, that it's the way our brains are actually wired, and it's only through conditioning that what we conceive of as consciousness is possible. Which would have bearing and relevance to a discussion about artificial intelligence and how we define it.

Is the theory problematic? Sure. Is it provable? No.

Is it interesting? Beyond the shadow of a doubt.

If it's an apology or contrition that's wanted, look elsewhere, Rosalind. As I said in my PM to you, I don't care one way or the other.
 
Did I expect Christopher to dismiss bicameralism out of hand? Yes, because it's so bonkers, and Christopher is the living personification of ratiocination. Is that the reason why I introduced the idea of bicameralism, to get a rise out of Christopher? Absolutely not.

Fair enough, but when you say "I sure do!" in response to him saying, "I don't think you realize you just insulted me," it sounds like a textbook example of trolling: saying something that isn't objectively provocative that you know will nevertheless provoke the target. I can only assume this is one of those textual misunderstandings, because that sort of sudden, naked hostility seemed unlike you.
 