• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

The scientist planning to upload his brain to a COMPUTER

So, in other words... At best, a brain can be copied by imitating the electrical impulses in the brain, which can be done easily by sticking a needle into each and every brain cell and measuring their interactions, creating a digital model that the RAM of no computer in the world can hold. At worst, you also need to emulate an unknown number of chemical, biological and physical interactions that nobody is sure about.

That last line is the important bit—again assuming the herculean effort described would actually pan out. It was once blithely thought that DNA was a "blueprint" for an organism... then cloned animals, like CC the cat, started turning up radically different in appearance and character from their progenitors. All that other code once dismissed as "junk DNA" turned out to contain latent instructions triggered by environmental factors—"epigenetic code." Odds are there's still a long way to go, more to learn.

We've at least dabbled with genetic engineering and medical treatments, while we know nothing about how mind ties in to matter. And yet certain futurists assure us that personalities can be "uploaded" into a computer memory.

Right.

Never say never, but until the science reaches the level of engineering, it's a little hard to promise a return on the investment. (Like commercial fusion power, which was only "10 years away" 50 years ago.)
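To put some rough numbers on the "no RAM in the world" point above, here's a back-of-envelope sketch in Python. The neuron and synapse counts are commonly cited ballpark figures, and 8 bytes per synapse is an arbitrary (and optimistic) modelling assumption for illustration, not a measured requirement:

```python
# Back-of-envelope storage estimate for a naive "connectome snapshot".
# Neuron and synapse counts are commonly cited ballpark figures, and
# 8 bytes per synapse is an arbitrary (optimistic) modelling assumption.

NEURONS = 86e9             # ~86 billion neurons in a human brain
SYNAPSES_PER_NEURON = 7e3  # ~7,000 connections per neuron (rough average)
BYTES_PER_SYNAPSE = 8      # one 64-bit weight per synapse, nothing else

total_synapses = NEURONS * SYNAPSES_PER_NEURON
total_bytes = total_synapses * BYTES_PER_SYNAPSE
print(f"{total_synapses:.1e} synapses -> about {total_bytes / 1e15:.0f} PB")
```

Even this lands in the petabyte range, and it stores only one static weight per synapse—no chemistry, no dynamics, none of the "unknown number of interactions" mentioned above.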

Science Fiction is here to remind us that sometimes the impossible is precisely what happens, and all the things people thought WOULD happen turn out to be red herrings.
 
It's true that if people's intellects could survive even when their bodies don't, it could have interesting repercussions. Imagine if Mozart's intellect had continued to exist, the music he could have composed, or if Einstein were able to adapt his theories to the new data we've collected... etc. The possibilities are endless.
Good point. Granted, in the example of Mozart, we have his music, and it gives us insight into his mind, but just think if we actually had a complex model of his mind, fully active and searchable.
 
However, would Mozart's brain in limbo – with none of what I'll call, for lack of a better term, true human experience (physical, emotional, mental and the like) – still create great pieces of music? Great music cannot be played without emotion, never mind composed in an emotionless vacuum.
 
I think it would make Mozart's music even better, because after uploading he wouldn't be just Mozart, he would have become the main character from Johnny Got His Gun. Who knows, he might end up writing "One" by Metallica.
 
However, would Mozart's brain in limbo – with none of what I'll call, for lack of a better term, true human experience (physical, emotional, mental and the like) – still create great pieces of music? Great music cannot be played without emotion, never mind composed in an emotionless vacuum.

I agree with you on that. I'm saying that it would be an excellent reference for the hows and whys of his music. Creating new music? I think it would be technically flawless, but lack the emotion and heart that make up the human experience.
 
Science Fiction is here to remind us that sometimes the impossible is precisely what happens, and all the things people thought WOULD happen turn out to be red herrings.

You completely missed my last paragraph, didn't you? And science fiction doesn't "remind us" of anything, history does.
 
Science Fiction is here to remind us that sometimes the impossible is precisely what happens, and all the things people thought WOULD happen turn out to be red herrings.

You completely missed my last paragraph, didn't you? And science fiction doesn't "remind us" of anything, history does.
You have it backwards: history doesn't remind us of anything; it is what other things (like science fiction, sometimes) remind us of. For example, a politician in a speech will remind us of some historical moment: "Four score and seven years ago..."

See?
 
...and now my head hurts and I need a lie-down. ;)

What I am getting at is... Just like the physical brain and the consciousness are difficult to separate (possibly making uploading impossible), I believe emotion and thinking would be even more inseparable. I don't know enough about cognitive science, but I've always suspected rational thinking is dependent on emotion: say, your ability to make judgements on limited information is similar to the way you make emotional decisions. Your ability to read text is you "feeling" the letters and words, not drawing rational conclusions about what they are. Emotion might be different, but it's equally ingrained.

So if you managed to upload somebody, it would be difficult to make them emotionless. They would more likely experience emotions like everyone else. Except that – lacking the normal senses that normal people have – those would be emotions of horror and pain. Cameras and microphones might help, but they wouldn't feel the same, and it would be difficult to adjust to them. And you would be missing your body. Badly.
 
I don't know enough about cognitive science, but I've always suspected rational thinking is dependent on emotion: say, your ability to make judgements on limited information is similar to the way you make emotional decisions. Your ability to read text is you "feeling" the letters and words, not drawing rational conclusions about what they are.
That's an interesting statement, but what's the basis for it? I should think that reading -- recognizing written symbols and the speech sounds they represent, knowing words and their meanings, understanding grammar and syntax to make sense of sentences -- is one of the most purely logical and rational functions of the human mind, like learning mathematics (or learning to tie one's shoelaces, for that matter). Where does emotion enter into it?
 
So if you managed to upload somebody, it would be difficult to make them emotionless. They would more likely experience emotions like everyone else. Except that – lacking the normal senses that normal people have – those would be emotions of horror and pain. Cameras and microphones might help, but they wouldn't feel the same, and it would be difficult to adjust to them. And you would be missing your body. Badly.

Agreed. What a horrific existence that would be.

Scotpens, I agree that simply recognising letters and understanding grammar and syntax doesn't require emotion, but where I think emotion can enter into it is how we interpret and react to what we read, which in turn influences how we process the information and use it for future reference. This may not apply so much to, say, algebra, which you either understand or stare at in a haze of confusion (*raises hand*). However, a list of, say, historical events may need an emotional response in order to properly understand the full effects of the event on the people involved, both active participants and innocent bystanders. Something that looks logical on paper (or in a computer algorithm) may have a very different outcome once actual thinking, feeling, fully conscious human beings are involved.

Apologies if this is garbled. I have a stonking head cold and my brain is fuzzy.
 
Actually, I did mean both. On one hand, the things that you say, on the other, the actual recognition.

You don't spend time processing what you see, even the letters that you read, you just look at them and you know what they are. And we kinda suspect that the brain doesn't process every bit of information in them, and you certainly don't do any of it consciously, so some part of it is a lousy guessing game. If we're talking about letters, you don't always notice spelling mistakes, transposed letters and such. I think that extends to words and grammar, and that often leads to misunderstanding.
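The transposed-letters point is easy to play with. The toy script below (purely illustrative, with a made-up sentence) shuffles the interior letters of each word while keeping the first and last in place; the result often stays surprisingly readable, which suggests we recognise word shapes rather than reading letter by letter:

```python
import random

def scramble_interior(word, rng):
    """Shuffle a word's interior letters, keeping the first and last fixed."""
    if len(word) <= 3:
        return word  # nothing meaningful to shuffle
    interior = list(word[1:-1])
    rng.shuffle(interior)
    return word[0] + "".join(interior) + word[-1]

rng = random.Random(42)  # fixed seed so the output is repeatable
sentence = "the brain recognises whole words rather than individual letters"
print(" ".join(scramble_interior(word, rng) for word in sentence.split()))
```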

But I was mostly thinking about a slightly more complex problem – objects. When you look at a scene, you certainly aren't doing careful analysis of the shadows to deduce where the objects are, you're just knowing (i.e. feeling) where everything is. Which is why optical illusions would work, and which is also why even when you realize your brain has been fooled you still continue to see the false image and are faced with a conundrum.

I don't think emotions work that differently, at least as an effect. You see a tiger, and you know that it is not a good thing and high-priority action is necessary, and you call that knowing fear. It is not the same thing, even at first glance, since emotions are simpler, often stronger, act on a larger part of the brain, are clearer, and I suspect there's less learning involved. Some of them have a more direct link to biological processes in your body (which is not necessarily a difference). But at the same time, I think there is a lot in common between the two things.

Which is why I often imagine emotions – or the building blocks of emotions – as the building blocks of a lot of the other tasks your brain does for you. That is probably wrong, but at the same time I would be very surprised if they aren't heavily intertwined. At the very least, evolutionarily, as our brains were getting more complex, we must have gained the simpler thing first – emotions or something like them – and then the rest. It would be incredible if the rest came about independently, without being a complicated version of the proto-emotions.
 
Actually, I did mean both. On one hand, the things that you say, on the other, the actual recognition.

You don't spend time processing what you see, even the letters that you read, you just look at them and you know what they are. And we kinda suspect that the brain doesn't process every bit of information in them, and you certainly don't do any of it consciously, so some part of it is a lousy guessing game. If we're talking about letters, you don't always notice spelling mistakes, transposed letters and such. I think that extends to words and grammar, and that often leads to misunderstanding.
Yes, but computers already do all that, i.e. recognize shapes, eliminate imperfections, and guess the meaning of an incomplete or misspelled word. It's not very difficult.
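As a sketch of how little machinery the "guess the misspelled word" part needs, Python's standard difflib module can do a crude version in a few lines. The vocabulary and the 0.6 similarity cutoff here are arbitrary choices for illustration; a real spell-checker would use a full dictionary and smarter ranking:

```python
import difflib

# Toy vocabulary for illustration; a real system would use a full dictionary.
vocabulary = ["recognize", "eliminate", "imperfection", "meaning", "incomplete"]

def guess_word(typo, vocab):
    """Return the closest vocabulary word, or None if nothing is close enough."""
    matches = difflib.get_close_matches(typo, vocab, n=1, cutoff=0.6)
    return matches[0] if matches else None

print(guess_word("recogniez", vocabulary))     # transposed letters -> recognize
print(guess_word("imperfektion", vocabulary))  # misspelling -> imperfection
```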
But I was mostly thinking about a slightly more complex problem – objects. When you look at a scene, you certainly aren't doing careful analysis of the shadows to deduce where the objects are, you're just knowing (i.e. feeling) where everything is. Which is why optical illusions would work, and which is also why even when you realize your brain has been fooled you still continue to see the false image and are faced with a conundrum.

Three words: Shape recognition software.


I don't think emotions work that differently, at least as an effect. You see a tiger, and you know that it is not a good thing and high-priority action is necessary, and you call that knowing fear. It is not the same thing, even at first glance, since emotions are simpler, often stronger, act on a larger part of the brain, are clearer, and I suspect there's less learning involved. Some of them have a more direct link to biological processes in your body (which is not necessarily a difference). But at the same time, I think there is a lot in common between the two things.

Which is why I often imagine emotions – or the building blocks of emotions – as the building blocks of a lot of the other tasks your brain does for you. That is probably wrong, but at the same time I would be very surprised if they aren't heavily intertwined. At the very least, evolutionarily, as our brains were getting more complex, we must have gained the simpler thing first – emotions or something like them – and then the rest. It would be incredible if the rest came about independently, without being a complicated version of the proto-emotions.
Ever hear about fuzzy logic?
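For anyone who hasn't: fuzzy logic replaces strict true/false with degrees of membership between 0 and 1, which maps loosely onto the graded "tiger = bad, act now" judgement described above. A minimal sketch, where the membership function and every number are made up for illustration:

```python
# Minimal fuzzy-logic sketch: quantities take graded truth values in [0, 1]
# instead of strict True/False. All numbers below are invented for the example.

def dangerously_close(distance_m):
    """Degree (0..1) to which a distance counts as "dangerously close"."""
    # Assumed shape: fully "close" at 0 m, fading linearly to 0 by 50 m.
    return max(0.0, 1.0 - distance_m / 50.0)

def fuzzy_and(a, b):
    """Standard min-based fuzzy AND (Zadeh t-norm)."""
    return min(a, b)

looks_like_tiger = 0.8             # e.g. a classifier's confidence (assumed)
closeness = dangerously_close(10)  # 0.8
threat = fuzzy_and(looks_like_tiger, closeness)
print(f"threat level: {threat:.2f}")  # a graded answer, not a yes/no
```

The point is that the output is a degree of threat rather than a binary verdict, which is closer to how the "knowing fear" response is described above.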
 
So if you managed to upload somebody, it would be difficult to make them emotionless. They would more likely experience emotions like everyone else. Except that – lacking the normal senses that normal people have – those would be emotions of horror and pain. Cameras and microphones might help, but they wouldn't feel the same, and it would be difficult to adjust to them. And you would be missing your body. Badly.

Agreed. What a horrific existence that would be....

I have heard this before and am not quite convinced by it. If we ever master consciousness and can build an artificial brain, I would think an artificial body wouldn't be so hard. Given how far off a brain upload is, assuming it is ever possible, it may be that the senses we get from an artificial body would be even superior to our lowly biological meat bags.

That said, I think the more likely version of this would be to replace portions of the brain a bit at a time until it is fully artificial.
 
So if you managed to upload somebody, it would be difficult to make them emotionless. They would more likely experience emotions like everyone else. Except that – lacking the normal senses that normal people have – those would be emotions of horror and pain. Cameras and microphones might help, but they wouldn't feel the same, and it would be difficult to adjust to them. And you would be missing your body. Badly.

Agreed. What a horrific existence that would be....

I have heard this before and am not quite convinced by it. If we ever master consciousness and can build an artificial brain, I would think an artificial body wouldn't be so hard. Given how far off a brain upload is, assuming it is ever possible, it may be that the senses we get from an artificial body would be even superior to our lowly biological meat bags.

That said, I think the more likely version of this would be to replace portions of the brain a bit at a time until it is fully artificial.

This is possible, but keep in mind that it wouldn't be the people themselves but copies, which in all likelihood wouldn't even be self-aware. They would mimic self-awareness to a T, but deep down they would be as dead as a stone.
 
So if you managed to upload somebody, it would be difficult to make them emotionless. They would more likely experience emotions like everyone else. Except that – lacking the normal senses that normal people have – those would be emotions of horror and pain. Cameras and microphones might help, but they wouldn't feel the same, and it would be difficult to adjust to them. And you would be missing your body. Badly.

Agreed. What a horrific existence that would be....

I have heard this before and am not quite convinced by it. If we ever master consciousness and can build an artificial brain, I would think an artificial body wouldn't be so hard. Given how far off a brain upload is, assuming it is ever possible, it may be that the senses we get from an artificial body would be even superior to our lowly biological meat bags.

That said, I think the more likely version of this would be to replace portions of the brain a bit at a time until it is fully artificial.
We can already build artificial legs that allow you to run faster than a human with two biological legs; they look like curved blades and give you more spring. We already have early versions of artificial eyes that allow the blind to see. In a few years, we'll probably develop an artificial eye far superior to the human eye – one that can see farther, more clearly, and maybe beyond the visible spectrum. Given how terrible my eyes are, I'd probably sign up for that. Especially if I could make them glow red like the Terminator.

This is possible, but keep in mind that it wouldn't be the people themselves but copies, which in all likelihood wouldn't even be self-aware. They would mimic self-awareness to a T, but deep down they would be as dead as a stone.
What are you even basing this on? If we copy a person so perfectly that the copy believes they are that person and seems self-aware, then how are they not self-aware?
 
...
What are you even basing this on? If we copy a person so perfectly that the copy believes they are that person and seems self-aware, then how are they not self-aware?

For two major reasons:

First, emulating/simulating a thing is very different from making the thing itself. You can create a flight simulator as complex, as precise, and as realistic as you want; it will never BE A PLANE!!!

Second, self-awareness, whatever it is, is obviously something extremely complex and elaborate; before we can reproduce it, we have to understand exactly how it works and why it appears in some creatures and not in others. (We know for certain that some creatures, though possessing a working brain, are not self-aware.)

Before we get to that stage, we are about as likely to create self-awareness as to create a working car by randomly throwing mechanical pieces into the air and hoping that they'll fall into place.
 
Agreed. What a horrific existence that would be....

I have heard this before and am not quite convinced by it. If we ever master consciousness and can build an artificial brain, I would think an artificial body wouldn't be so hard. Given how far off a brain upload is, assuming it is ever possible, it may be that the senses we get from an artificial body would be even superior to our lowly biological meat bags.

That said, I think the more likely version of this would be to replace portions of the brain a bit at a time until it is fully artificial.

This is possible, but keep in mind that it wouldn't be the people themselves but copies, which in all likelihood wouldn't even be self-aware. They would mimic self-awareness to a T, but deep down they would be as dead as a stone.

Actually, we have no way to know that. We do not even have a theoretical framework for how this would be accomplished. I tend to think, on a philosophical level, that even if we master artificial intelligence to the point of self-awareness, a remote brain upload would at best be a copy of the person being uploaded – a second self you could actually have a conversation with. I do think it could be as alive as anything, if self-awareness is a sufficient condition for being alive.

However, I am not as philosophically certain that replacing portions of the brain, a bit at a time, with perfectly functioning pieces would result in just a copy. Rather, I think it is possible that it could still end up being you. There would be no loss of continuity in your consciousness, and whatever changes this caused would be more like the changes caused by your brain aging and maturing than a change that makes it not 'you'. In both cases, though, I think the possibility for self-awareness is there. I don't see why not.
 
I have heard this before and am not quite convinced by it. If we ever master consciousness and can build an artificial brain, I would think an artificial body wouldn't be so hard. Given how far off a brain upload is, assuming it is ever possible, it may be that the senses we get from an artificial body would be even superior to our lowly biological meat bags.

That said, I think the more likely version of this would be to replace portions of the brain a bit at a time until it is fully artificial.

This is possible, but keep in mind that it wouldn't be the people themselves but copies, which in all likelihood wouldn't even be self-aware. They would mimic self-awareness to a T, but deep down they would be as dead as a stone.

Actually, we have no way to know that. We do not even have a theoretical framework for how this would be accomplished. I tend to think, on a philosophical level, that even if we master artificial intelligence to the point of self-awareness, a remote brain upload would at best be a copy of the person being uploaded – a second self you could actually have a conversation with. However, I think it could be as alive as anything, if self-awareness equals being alive.

However, I am not as philosophically certain that replacing portions of the brain, a bit at a time, with perfectly functioning pieces would result in just a copy. Rather, I think it is possible that it could still end up being you. There would be no loss of continuity in your consciousness, and whatever changes this caused would be more like the changes caused by your brain aging and maturing than a change that makes it not 'you'. In both cases, though, I think the possibility for self-awareness is there. I don't see why not.
On a philosophical, totally unproven level, I think that the me of, say, one year ago (just an example, not an actual figure) is dead and no longer exists; he's been replaced gradually by the current me, one little bit at a time, so slowly that he didn't notice it until it was too late. I don't believe in a continuity of the ego. In people with a serious degenerative illness like Alzheimer's it is obvious: they are not the people we used to know. In us it is not so obvious, but that doesn't make it any less real.

So don't fear death, because you've died already many times since the day you were born and you don't even know it.
 