• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Uploading yourself can be fun, aka....

With that rationale, no one should have sailed from Europe to the Americas. Science back then said the earth was flat. Trying to sail around the world would be a waste of money, right?

Wrong, especially if you're referring to Columbus. His mistake was using a wildly incorrect estimate of the Earth's size. But even the ancient Greeks knew the Earth was spherical.

The rationale Crazy Eddie was explaining is completely understandable once one realizes that modern science is a highly competitive endeavor. Peer review (although it has its faults) was set up to keep researchers focused and to ensure that limited funding goes to studies that might actually be helpful.
 
^But how can we really know if a study can be helpful without the study, especially in regards to all the things we don't know about the brain.
 
With that rationale, no one should have sailed from Europe to the Americas.
Actually, that WAS the rationale that led to the discovery of the Americas. Columbus wanted to find a trade route to India that wouldn't involve sailing through the notoriously stormy and shark-infested waters around the southern tip of Africa. Knowing that the Earth was spherical (everyone knew that at the time), he assumed that if he traveled far enough west he would eventually arrive at India from the opposite direction. No one had previously attempted this voyage because 1) ocean-going vessels of previous eras never had the range to pull it off, and 2) despite the common conception of maritime navigators determining their position by tracking stars and complex geometry, they were actually REALLY bad at it and depended more on navigation by landmarks, such as island formations, shorelines and known currents. Traveling across open ocean was something most sailors tried to avoid because it was really easy to get lost.

Columbus' only mistake was in drastically underestimating the size of the Earth. It wasn't until several decades later that it was realized the land Columbus had discovered was nowhere near India after all.

Science back then said the earth was flat.
No it didn't.

^But how can we really know if a study can be helpful without the study, especially in regards to all the things we don't know about the brain.
Because only a SCIENTIFIC study would be useful in the pursuit of actual SCIENCE, which is basically the neurologists' complaint. They're taking issue with the fact that Europe is spending an assload of money on what may turn out to be a trans-humanist art project, which would actually cause more harm than good when it comes to real scientists trying to get funding for similar non-bullshit projects.

To use your Columbus example: a scientific study on 15th century sailing ships is a lot more useful than, say, a photographic study, even if the latter produces way more interesting pictures. In the same vein, a scientific experiment -- which the human brain project was originally supposed to be -- would be a lot more useful than a study.
 
It won't be me. Only a copy.
And there are a dozen reasons why I would want to have a copy of myself in a supercomputer. There are a lot of things I want to see happen in the world and having a digital version of myself would allow the meat-version of me to focus on more personal issues while super-me explores my loftier ambitions.

Also: I'm not a religious person or anything, but I feel it might be a good idea to save my soul.
It wouldn't be "super you". It would be "super copy of you at the time of transfer". That copy may achieve those loftier goals, but you'd just be living vicariously through the achievements of that copy.
All true. I'm perfectly comfortable with that.
 
While the notion of mind uploading itself is crazy, I find the claim that the uploaded copy is not you to be unsupported by science and a vain exercise in self-importance. What we know about our consciousness is that it consists of the information stored and processed in our neurons. The matter composing it makes a difference only insofar as it governs the way the processing happens, but there is no evidence of you being tied to these particular bits of matter, or of there being an independent soul. As far as our knowledge goes, if you replaced all the matter in you with identical but different atoms, you'd still be the same person. And it doesn't have to be completely identical, as you yourself aren't completely identical to what you were some time ago. The problem with uploading your mind to a computer is that there would have to be some unprecedented and presently unthinkable breakthrough to get you close to an identical copy, if such a thing is even physically possible or practical. And worse, if you do get close to identical, but not close enough, that would qualify as torture.

Now of course, mind uploading itself is a vain exercise in self-importance. You leave enough parts of yourself behind by making a difference in the world: teaching your kids, changing the lives of the people close to you, or, if you're extremely lucky, leaving inventions, written works and art behind, or even being personally remembered. That's already more than the world needs. If you are that important, one day someone who thinks similarly to you will be born, will get inspired by your work, and will carry it on, and do so better than an ageing, disintegrating digital brain would. If you aren't that important, you lived your life, you left your small footprint, so why do you want to extend it beyond that – what makes you more special than all the other people like you who will always be there?

I personally don't want some very old me who can't deal with the future world and can't contribute to be put in a computer. The years would have changed me to the point that this guy won't even be me any more, and likewise he won't care for the continued existence of this digital brain which will also change quickly from what he is. What will I do in that computer really? Complain that they are ruining the world by polygamous marriage and changing the versioning scheme of the Linux kernel again? I think Cleverbot can do that.
 
While the notion of mind uploading itself is crazy, I find the claim that the uploaded copy is not you to be unsupported by science and a vain exercise in self-importance.

It's not about self importance. It's about existence. When I die only a copy will remain. I will be dead.

Here's a thought experiment. If you believe the copy is you, would "original you" be willing to commit suicide at that point? After all, "copy you" would still exist.

It's basically the same argument that always arises in the "would you use a transporter?" threads.
 
Well, that's more of a Thomas Riker argument than a transporter argument – he's a different person not because he's a copy, but because both survived and diverged and are now measurably different. You could upload yourself moments before dying. Or after you have died. There's no rational measure as of yet which, given a perfect copy, could be used to distinguish the two and claim it is a different person.

Now why the hell that would be a better idea than improving medicine, I don't know. If I haven't used the already long time of my life in a good way – which I haven't – I don't see why I should be allotted more of it, let alone an infinite amount of it. Yeah, sure, given an infinite amount of time even I would end up doing something good eventually, but why waste all those computing resources on such an improbable goal? Might as well try infinite digital monkeys with typewriters. Then again, I suspect I am probably worth more than high-frequency trading. Hm...
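For the curious, here's a toy back-of-the-envelope calculation of why "eventually" is doing so much work in the monkeys-with-typewriters argument (the phrase, alphabet size and typing speed are my own made-up assumptions, not anything from the thread):

```python
# Toy estimate: how long until one random-typing "digital monkey" produces
# a short target phrase? Assumed setup: 27-character alphabet (a-z plus
# space), uniform random keystrokes, a billion keystrokes per second.
phrase = "to be or not to be"
alphabet_size = 27

# Expected keystrokes before the phrase appears is on the order of
# alphabet_size ** len(phrase): each window of 18 keystrokes matches
# with probability (1/27) ** 18.
expected_keystrokes = alphabet_size ** len(phrase)

seconds = expected_keystrokes / 1e9  # at a billion keystrokes per second
years = seconds / (60 * 60 * 24 * 365)
print(f"~{expected_keystrokes:.2e} keystrokes, ~{years:.2e} years")
```

Even for an 18-character phrase, the expected wait runs to billions of years at that typing rate – which is the sense in which "given infinite time" is a terrible funding pitch.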

It makes much more sense to freeze yourself, at least you might get to see the future before finally dying. And you will entertain people and wow historians. (Well not really, because there are thousands of frozen people, so very few will pay attention to you, but you do get to see the future. Totally worth it.)
 
If you asked me before the divergence, sure. If one of us had to die, it would make zero difference to me which one – original and copy are the same to me, i.e. me. Although even then it isn't a completely fair question, as having yourself twice is an advantage.

If you asked me after the divergence, no, for the same reason I'd refuse to have part of my memory erased – although that refusal would be reinforced by an irrational fear that the minute for which I've been independently alive is of great significance. Even that's probably not true.
 
I don't consider the copy any less me than the original, particularly from the POV before the point of divergence, because in that moment I am the past of both. If the original vanishing is me dying, the copy vanishing is also me dying.

Say you gave me a real-life transporter and offered me a journey to an interesting place. Of course, before walking into it I would be terrified that there is a real chance I might be wrong and die. The same way I am sometimes terrified at home, because there's a real chance an earthquake could collapse the humongous building above my head. But just as I go home without any hesitation, I would walk into the transporter without any hesitation, because I am certain that the chance of a different person walking out at the other end is minuscule. (Or if they are different, they are no more different than I would be a minute from now anyway.)

And even that fear would quickly be replaced by another – that the original keeps on living for a moment in which it feels the molecular disintegration – so I'd ask the operator to put me under anaesthesia. Preferably one that makes me completely unconscious. Really, after imagining the potential pain of a semi-conscious brain disintegrating, I would not give a damn about philosophical debates about originals and copies.
 
I'll re-ask it, to try to cut down on all the bobbing and weaving. You've been duplicated. You and your Thomas Riker are standing side by side. The finger points at you, and not your Thomas, to step into the disintegration booth, so that only one of you remains. You'd be willing to go?
 
I answered that one:

If you asked me after the divergence, no

I would be unwilling to get disintegrated, even though this is probably just as irrational because we are still almost the same person, and in theory I should go into the disintegrating booth without hesitation. But I never said I was rational. ;)
 
I answered that one:

If you asked me after the divergence, no

I would be unwilling to get disintegrated, even though this is probably just as irrational because we are still almost the same person, and in theory I should go into the disintegrating booth without hesitation. But I never said I was rational. ;)

Fair enough, but why the qualification of after versus before?

Seems to me, if you have a different answer afterwards, that you should admit that answering in the affirmative before the divergence isn't the most honest reply. The whole point of sojourner's line of questioning, as I see it, is to point out that there is a je ne sais quoi aspect to personal experience and existence that is just as real as the aspects that we can quantify scientifically, both at the present time as well as at any time in the future when there might be human duplication machines [according to present-day thought experiment]. The fact that you are standing over here and your Thomas Riker is standing over there would only support the reality of that aspect.

The only alternative I can see would be if both you and your Thomas were simultaneously aware of each other's experiences, implying a bizarre quasi-schizophrenic type of consciousness presently unknown to science. Destroying one of the two might be desirable to cure this quite possibly debilitating condition. But we have implicitly ruled out such an alternative in our thought experiments as out of the question, I think.
 
Back to the thread topic: we are somewhat agreed that the uploaded brain is a copy. But assuming the machine capable of uploading your brain is also capable of writing the learned information back into the brain, does the copy become you?

It's like asking: is the Neo bending the spoon inside the Matrix a copy, or Neo himself?
 
Fair enough, but why the qualification of after versus before?

Seems to me, if you have a different answer afterwards, that you should admit that answering in the affirmative before the divergence isn't the most honest reply.
As I said, even before the divergence I would fear the possibility that I might be wrong. After the divergence my experience would seem to confirm the fear, and I would be overwhelmed by it, whether it is justified or not. Besides, by then both of me would have independent experiences of their own, which is a real thing, no matter how small. That's why I would ask the operator to make me unconscious, just in case, so that neither of me experiences anything once there are two. In an actual world where transporters were real, that would make me seem eccentric.

OTOH, if you offered to upload my brain to a computer after my death, and I accepted the offer, I probably wouldn't fear death for the rest of my life. Refusing brain uploading is a lot harder than agreeing to go through a transporter. But the emotion behind this is not a very good argument that the uploaded brain is me. Similarly, the emotion behind the transporter thing is not a very good argument that the copy/uploaded brain/whatever is not me. I can feel that disintegration of the original kills me, or that brain uploading makes me immortal, but feelings alone don't make those things true.

You know who might have a better perspective? The people praying to all deities that somebody forgets to pay for the Internet connection of my uploaded brain after I've been disintegrated, so that I stop posting.
 