Re: Consciousness [OT]
- Posted by Dan Moyer <DANIELMOYER at prodigy.net> Sep 10, 2003
Al, I think one of us is missing the point :)

In relation to copying a human brain in order to get a copy of a human mind, the fact that the copy might be indistinguishable from the original is not as important as the fact that it is a COPY, i.e., not the original, at least from the viewpoint of the actual original. In other words, as I've already asked: if a copy of your brain/mind were made and put into a robotic body, and one or the other (you or the copy) was going to be destroyed, and YOU (the actual original) were given the opportunity to say which would be destroyed, would you actually say, "ok, kill me, I don't care; after all, my COPY will continue to exist, and it thinks it's me anyway, so that's all that matters"?

So, in that situation, would *you* honestly volunteer to be ended? If you *wouldn't*, then I think my point that the copy is not the original is made; if you would, then I think you need to think more carefully about it :)

----- Original Message -----
From: "Al Getz" <Xaxo at aol.com>
To: <EUforum at topica.com>
Sent: Wednesday, September 10, 2003 7:08 AM
Subject: RE: Consciousness [OT]

Hey Dan,

You brought up some very interesting points.

Dan Moyer wrote:
> Ok, notice the word DUPLICATED, as in COPY. A COPY of a mind. A copy
> is a copy is a copy is a... COPY. A copy, no matter how perfectly
> identical, is still a copy; it is NOT the original. So, if a COPY of
> YOUR brain/mind were made, and put into an android/robot, and you
> (the actual original) were told that either the COPY of you, or YOU,
> were going to be destroyed, and you could, if you wanted, CHOOSE
> WHICH, could you honestly say that you would choose for YOU to be
> destroyed, thinking that somehow "you" would continue to live "in"
> the copy? I know *I* wouldn't, because it wouldn't be ME that
> continued, it would be a *copy* of me that THOUGHT it was me! But
> *I* would be dead.
> For the purposes of what I was talking about, a 'copy' would be an
> exact copy such that it would not be distinguishable from the
> original. This is entirely possible in a deterministic world. For
> example:
>
> Copy
> Copy
> Copy
>
> Which word above was typed first, and even if you knew, would it
> matter?

Well, I think you've set up a straw man. The question is about human minds, not typed words, and to any reasonable human mind, the fact that a copy of oneself exists *would* matter, at the very least in terms of family, property, job, etc.

> Personally I'd forget about any mumbo-jumbo imaginary "soul", and
> just consider the mind/brain question.
>
> Can you prove absolutely that a soul is really mumbo-jumbo, or is it
> possible that something within the quantum world has an influence
> over this entity?

If someone believes in a "soul", it's up to THEM to prove it actually exists. If I say that there is still in existence a Tasmanian Tiger, & you say "no there isn't", I'd have no business asking *you* to prove there isn't; I'd have to prove that there IS.

> > until the moment just 'after' it's being brought to consciousness,
>
> I disagree. Consider sensory-deprivation tank experiments... with no
> external stimuli, "hallucinations" occur; what are they? Essentially,
> when no stimuli are present, neurons will SPONTANEOUSLY fire, which
> means, I would think, RANDOMLY, which would make the two entities,
> the real original & the copy, have different hallucinations in that
> situation. I would think the same would apply to dreaming. (And, I
> suspect, this spontaneous firing could be the origin of "will",
> perhaps, too?)

In a deterministic world there is no random function.
If it can be proved that the smallest contribution to thought patterns is the atom, the brain, in all its complexity, is only a deterministic system, where an exact copy will not only behave exactly as the original, it will do so at the same time for the same exact input, keeping in exact step with the original for every moment in time since its creation!

Ok, I *think* I understand that, BUT: while I don't disagree that someday a copy of a human brain down to the cellular functioning level might be made, I don't really have any reason to assume that there could be enough memory storage in the universe to exactly define it down to the atomic & sub-atomic (and quantum?) levels. Besides, to "look" at something closely enough to make an absolutely exact copy inevitably *changes* that thing, so the data you record will not be true anymore anyway, and you wouldn't get an actually identical copy. (Heisenberg Uncertainty Principle: you can't use real-world actions to perceive real-world phenomena without affecting & thereby changing the thing you're trying to perceive; you can know the position, *or* the momentum, of a sub-atomic particle, but not both precisely at the same time, because ascertaining the position changes the way it moves.)

> > I guess if they woke up and saw each other somehow, they would
> > begin to process different information unless they were fooled into
> > thinking they were looking into a mirror like on so many comedy
> > shows. Once one of them made a different move, they would each
> > process different information and possibly realize that although
> > they have the same memories, same parents, etc, they aren't the
> > same person.
> >
> > Also, I can see that my assumption that the brain doesn't take
> > advantage of any sub-atomic activity was a bit of a leap, because
> > from what I have read so far, within the scope of human
> > understanding this isn't certain yet.
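Al's lockstep claim, and Dan's "spontaneous firing" objection, can both be made concrete with a toy simulation. This is just an illustrative sketch (in Python, for convenience, and obviously nothing like a real brain): the `run_brain` function, its seed, and its update constants are all made up for the example. A purely deterministic state-update loop stands in for the brain, and an independent random source per instance stands in for spontaneous neuron firing:

```python
import random

def run_brain(seed, steps, noise=False):
    """Toy 'brain': a deterministic state-update loop.

    With noise=False the trajectory is fully determined by the starting
    state, so an exact copy stays in lockstep with the original forever.
    With noise=True each instance draws from its own independent random
    source (standing in for spontaneous neuron firing), and the copies
    drift apart.
    """
    rng = random.Random()  # independent noise source per instance
    state = seed
    trajectory = []
    for _ in range(steps):
        # purely deterministic update (a simple linear congruential step)
        state = (state * 1103515245 + 12345) % (2 ** 31)
        if noise:
            state ^= rng.getrandbits(8)  # "spontaneous firing"
        trajectory.append(state)
    return trajectory

# Two exact copies of a deterministic brain: identical at every step.
original = run_brain(seed=42, steps=1000)
copy = run_brain(seed=42, steps=1000)
print(original == copy)  # True

# Give each copy its own spontaneous noise: they diverge.
noisy_a = run_brain(seed=42, steps=1000, noise=True)
noisy_b = run_brain(seed=42, steps=1000, noise=True)
print(noisy_a == noisy_b)  # almost certainly False
```

So the argument really turns on the premise: if the brain is a closed deterministic system, Al is right that an exact copy never diverges; if there is any genuinely random input (quantum or otherwise), Dan's copies part ways immediately.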
> Yes, I've seen somewhere the suggestion that neuronal synapses don't
> actually work by a CHEMICAL action of the neurotransmitters, but by
> QUANTUM effects of the neurotransmitters on the receptors. Can't
> begin to understand that. :)

Well, that's the only way I can see a random variable entering the picture. Even so, is there an approximation that would yield reasonable results?

I don't know how to answer this question.

> Dan Moyer, the original :)
>
> (no, *I'm* the original!)
> <no, *I* am!>
> [no, you're both wrong, *I'm* the real Dan]

Take care for now,
Al (at least I think I'm him)

Dan