Freethought & Rationalism Archive. The archives are read only. |
01-19-2002, 01:20 PM | #41 |
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
|
[QUOTE]Originally posted by excreationist:
I'd get more RAM (short-term memory or working memory) - I think that is the most important thing. I don't need any more long-term storage though. My brain could be overclocked from something like 20-40 cycles per second to 1000+ Hz. It would be like upgrading parts of a computer.[/QUOTE]
Yeah, just like a computer, hehe. Say you're considering upgrading your brain/body, installing a new one into your habitat, and selling your old brain/body as a discount combo deal. (It is "used", after all.) There must be all kinds of long-term data on your brain you are not using. You wouldn't need all of that data in your new body/brain, correct? We could probably improve on the neurological structure of your brain as well. I suppose you would like this structure to be optimized for your particular use. Now we just have to figure out what you use your brain for. I gather it has something to do with pleasure. How about if your new brain/body combo is much happier than the old one?
[QUOTE]Originally posted by excreationist:
[QUOTE]Originally posted by hedonologist:
...What if you wanted to get rid of cancer, so you had a whole exact duplicate body (with brain) made, with the agreement that after you met the new body and checked to make sure it was working properly, you would take something to die. Would "you" "wake up" as the new body?[/QUOTE]
Well, the "you" who had experiences after the duplication wouldn't be in the body, but the one immediately before the duplication (who decided to be duplicated, etc.) would be in the body.
[QUOTE]Originally posted by hedonologist:
Would you use this method to get rid of cancer?[/QUOTE]
I guess... mainly so that I could meet myself.[/QUOTE]
Notice, you are referring to an "I" who would be having a different experience from who you are calling "myself". I define a person as an experiencer, so if a being is having a different experience, they are a different person.
"I" can't have a different experience from "myself".
[QUOTE]Originally posted by excreationist:
Well, assuming that the person is unconscious during the copying, I think there aren't any problems - the original is copied while unconscious and then killed. The copy is revived. But if the original is conscious during or after the copying at all, then there are problems - one version of that person knows that they will die. (At least during the copying, the only one who wakes up knows that they will be alive.)[/QUOTE]
Is there any problem with someone slipping you some drugs that will first knock you out cold (as far as we can see) and then kill you, assuming there were no chance of this not going smoothly? You apparently wouldn't *know* what happened. Materialists and dualists could have some serious symbiosis, if we just had the technology. I'm so sentimentally attached to certain brain functions of my current brain that I would sell it real cheap, with the understanding that it would be installed in a better body and allowed to do what it wants. But my old brain would be your brain once it was installed in your body, so it would be doing things the "new you" wanted to do, not things "I" wanted. And guess what? You could make the payment to your own new brain! You just take the money out of your account and put it right back. And that is just the beginning. You would get to keep all of my material possessions, money, even any friends who don't particularly mind that my old brain was someone else's! Before your old brain complains, you and I could find some way to make sure it didn't know what was going on. hehe What have you got to lose? [ January 19, 2002: Message edited by: hedonologist ] |
01-19-2002, 01:28 PM | #42 | |
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
|
Quote:
But there are a couple of related points:
1) Consciousness is a causal product of the brain. If, for example, my brain were slowly replaced by parts from another, identical brain, my consciousness would remain intact. It's a fact that my brain has the causal powers capable of producing mental events. The same could possibly hold true of other materials - computer chips, for example. Assuming they have the same sort of causal properties, they too must necessarily produce this particular consciousness... *my* consciousness.
2) Related to the first, consciousness has a first-person ontology. Its mode of existence is subjective. Now suppose we leave this particular brain alone, construct another totally identical one, and place it in someone's body. Does it follow that *I* am that other person? I'm fairly sure it does not. My consciousness is subjective; it is, after all, *my* consciousness. The other fellow, while presumably having the same content of thought, emotion, what have you... is still, quite clearly, not *me*.
For example, suppose we take two glasses of water. The macro-property we call "liquidity" is a causal result of the micro-structure. If both have identical micro-structures, and if we assume causation is necessary, they must necessarily have the same macro-properties. But it would be absurd to suggest that, because they have identical macro-properties, they are *the same thing*. Glass X is not glass Y, by definition... identical as it may be.
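Since this thread keeps reaching for computer analogies, the glass X / glass Y point has a direct parallel in programming: two objects can be structurally equal ("identical micro-structure") without being one and the same object. A minimal Python sketch of that distinction (the "glass" dictionaries are invented for illustration):

```python
import copy

# Two structurally identical "glasses": same contents, same structure.
glass_x = {"contents": "water", "volume_ml": 250}
glass_y = copy.deepcopy(glass_x)

# Equality compares structure/content - the shared "macro-properties"...
print(glass_x == glass_y)  # True: identical in every measurable respect

# ...but identity asks whether they are literally the same object.
print(glass_x is glass_y)  # False: glass X is not glass Y
```

Every objective test of content comes out the same for both, yet they remain two distinct things - which is exactly the gap the duplicate-body scenario turns on.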
|
01-19-2002, 02:18 PM | #43 | ||||
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
|
Quote:
Quote:
Quote:
Quote:
So I am trying to translate phrases like "*the same thing*" or "quite clearly, not *me*" into their implications in terms of behavior. |
01-19-2002, 09:48 PM | #44 |
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
|
As to the first point, I don't believe there are any. It's just a simple ontological distinction that needs to be drawn. One can hold a belief in consciousness and maintain a "secular worldview", or at least believe consciousness isn't some magical "ooze".
As to the second, yes, I think you could, and could go about it in the same sort of way I described: slowly replace existing neurons (or whatever) with a mechanical substitute, and report on the effects. Finally, I really don't think this issue has any bearing at all on ethical or behavioral matters. People have no intrinsic value... it is ascribed to them, or sometimes not at all... me, a clone of me, or me with a clone brain. None of these things affects what we ought not do (I attempted to work this out in my rather long-winded post on contractarianism). |
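The gradual-replacement experiment can be caricatured in code: swap each component for a behaviorally identical substitute, one at a time, checking at every step that the system's input/output behavior is unchanged. A toy Python sketch - the "stages" and their functions are entirely hypothetical stand-ins for neurons:

```python
# Toy model: a "brain" as a pipeline of simple processing stages.
original_stages = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]

def run(stages, x):
    """Feed input x through each stage in order and return the result."""
    for stage in stages:
        x = stage(x)
    return x

# Behaviorally identical "mechanical" substitutes for each stage.
substitutes = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]

stages = list(original_stages)
baseline = run(stages, 10)  # behavior before any replacement

# Replace one stage at a time, reporting on the effects at each step.
for i, sub in enumerate(substitutes):
    stages[i] = sub
    assert run(stages, 10) == baseline, f"behavior changed at stage {i}"

print(run(stages, 10) == baseline)  # True: behavior preserved throughout
```

The sketch only captures the third-person side of the experiment, of course - it says nothing about whether the first-person experience survives, which is precisely what the thread is arguing about.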
01-19-2002, 11:39 PM | #45 | ||||||||||
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886
|
[quote]Originally posted by hedonologist:
Quote:
Quote:
Quote:
Quote:
I'd also fix my body while I was at it - I'd look like a guy that girls like the look of and hopefully not be very hairy. Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
|
01-19-2002, 11:45 PM | #46 | ||
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
|
Quote:
Quote:
|
01-20-2002, 01:43 AM | #47 | ||||||||||
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
|
I put this next statement on top because it is, I think, the sort of behavior a "materialist" philosophy can lead to. When I say "materialist" philosophy, I'm speaking of the "info matters" type of materialist that tronvillain mentioned.
Quote:
How about if someone had some way of getting into your brain and fiddling with your desires so that you felt that you wanted to be their slave? You would have no *knowledge* that they were doing this. For all you know you just enjoyed nothing more than to clean their house, file their bunions, etc. Quote:
Quote:
Quote:
Quote:
So maybe some of the data will be stored in a different format - the new brain could perform any *objective* function the old one did, and more. Aren't we better off judging whether or not to upgrade our computer by some objective sort of benchmark? Quote:
Quote:
Quote:
Quote:
Quote:
[ January 20, 2002: Message edited by: hedonologist ]</p> |
01-20-2002, 02:33 AM | #48 | ||||||||||
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886
|
Quote:
Quote:
I think our large collection of memories is where we get our "common sense" intuitions and things like that from. Without it we'd be very naive and incompetent a lot of the time. Quote:
Quote:
It would help to have a lot of their learnt patterns inserted into my brain so that my "feature set" of possible patterns would be much larger. It might take many nights of dreaming to integrate the new memory fragments. Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
About the brain-transferring scenario: Say you're called "A" and I'm called "B". Originally brain A is in body A and brain B is in body B. Both brains and bodies are alive. How would you talk me (person B) into letting me put brain A into body B? |
01-20-2002, 04:37 AM | #49 | |
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
|
Jamie_L:
Quote:
[ January 20, 2002: Message edited by: tronvillain ]</p> |
|
01-20-2002, 05:25 AM | #50 | |||
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
|
hedonologist:
Quote:
Quote:
Quote:
A statement like "I would act the same as anyone else, if I had their genetics and environment" strips the word "I" of its meaning. Take away genetics and environment and what do you have left? Nothing. So it is difficult to see in what sense "we are all just like copies of one person", unless it is simply that we may have some things in common. |