Freethought & Rationalism Archive - The archives are read only.
10-31-2001, 12:16 AM | #11
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Quote:
In other threads I talked about how you could work out whether someone has a voice in their head - e.g. one idea is that you could teach them a song, then play it on a CD player while they sing along in their head. Then you plug in headphones and wear them yourself, so that they can no longer hear the music but you can, while they keep singing along in their head. Then after a random amount of time (e.g. 47.3 seconds) they sing out loud, and you see how accurately they stayed in sync. (And you could be tapping the beat to help them out.) If they can do that successfully, the simplest explanation is that they can have a voice in their head.
Quote:
Also, human-level consciousness is about the things in the movie A.I. - where it has big dreams - so it is somewhat idealistic. And it is autonomous and somewhat self-centered, and it can actively learn things. So you could talk about an unfamiliar subject and explain it to them, and they might ask you some insightful questions about it. The zombie should also show signs of being bored about things and tell you about ideas or thoughts it had. And it must have had those ideas or thoughts (or its puppet-master did, or it's just speaking randomly) - even if it doesn't have a voice in its head, it still came up with the ideas. And if it says that it thought about those ideas (and shows evidence of this) then it must have, on some level. So I'm really talking about "awareness", which is just about detecting, recognising, responding to and learning about things. Even a computer can do that - though computers aren't very sophisticated at responding to things, and aren't good at learning unfamiliar subjects (e.g. philosophy) autonomously.
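The sing-along experiment described at the top of this post can be made concrete with a toy timing model. This is purely my own illustrative sketch - the thread specifies no numbers beyond the 47.3-second example, and the tempos, function names, and tolerance here are all hypothetical. The "voice in the head" is modelled as an internal tempo that may drift from the track's real tempo, and the subject passes if they resume singing within a beat of the track's true position.

```python
def silent_drift_beats(track_bpm, inner_bpm, silent_seconds):
    """Beats of drift accumulated while the subject 'sings along' silently.

    The track really runs at track_bpm; the subject's inner voice keeps
    time at inner_bpm (both hypothetical numbers for illustration).
    """
    track_beats = track_bpm / 60.0 * silent_seconds
    inner_beats = inner_bpm / 60.0 * silent_seconds
    return abs(track_beats - inner_beats)


def passes_test(track_bpm, inner_bpm, silent_seconds, tolerance_beats=1.0):
    """Pass if, on singing out loud again, the subject is within
    tolerance_beats of the track's actual position."""
    return silent_drift_beats(track_bpm, inner_bpm, silent_seconds) <= tolerance_beats


# A steady inner voice (tempo off by only 1 bpm) survives the
# 47.3-second silent stretch; one drifting by 10 bpm does not.
print(passes_test(120, 119, 47.3))  # True  (drift ~0.79 beats)
print(passes_test(120, 110, 47.3))  # False (drift ~7.88 beats)
```

Tapping the beat for the subject, as the post suggests, would amount to periodically resetting the drift to zero, which is why it makes the test easier to pass.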
10-31-2001, 12:18 AM | #12
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
Quote:
Quote:
|
10-31-2001, 12:39 AM | #13
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
Quote:
Quote:
You would have to know an NCC (a neural correlate of consciousness) to answer this, or what I would call a “physical correlate of a being who feels”.
Quote:
It seems like you are saying that free will, in the form of will that is free of determinism, is necessary for things like walking, or whatever behavior you would choose; and that being conscious allows one to be free of determinism, thus to do things which a deterministic or randomizing machine could not do. Imagine an android who we omnisciently know is an aware and feeling being, who is discussing with his android friend whether humans are really conscious. He proposes a test of adding a million numbers in a minute, thinking that this is what would require “consciousness”. Of course a human can’t do this, so he concludes that the human is not conscious. That is like your walking experiment.
Quote:
Quote:
What if they could make computers such that, if you had an online “relationship” with one (such as in chatrooms) and were trying to figure out whether it was a computer, you would not be able to tell the difference between it and a human? Would this prove the “AI” was a feeling being?
10-31-2001, 12:41 AM | #14
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
Quote:
As you say, “by definition the zombie and human are physically identical”, so it would seem the answer would be “no” for that reason; the question is somewhat rhetorical. The obvious follow-up question is: “If there is no physical difference, how does a materialist named Bob know that Bob (himself) is not a zombie?” Why would that be an invalid question?
10-31-2001, 12:43 AM | #15 |
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
I have a question for whoever. Aren’t materialists saying that we are all zombies? If that is not what they are saying, what are they saying about what feeling is?
10-31-2001, 03:52 AM | #16
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Quote:
Quote:
Quote:
Quote:
Quote:
I would call things that don't quite meet my definition of awareness “zombies” - e.g. plants: individual plants don't autonomously develop beliefs about the world; they just follow hardcoded rules.
Quote:
There are 100 billion neurons in our heads, and each one can have an effect on the decisions we make. In the same way, weather involves many individual particles. So I think our behaviour is deterministic, but not deterministic in a simplistic way, like gravity or magnetism. Our brains can learn to be intelligent by learning the patterns in our experiences and applying these patterns (or recombinations of them) to different problem areas. Random systems can't do this (well, they can for very brief instants, but generally random systems aren't intelligent). Robots don't yet have the neural-network reasoning abilities that we have, so their behaviour is not very intelligent. My point about walking is that it involves some degree of awareness of the environment - assuming the walker can autonomously modify its behaviour to maximize its expected pleasure and/or minimize its expected suffering.
Quote:
Quote:
So basically it's about real-time, autonomous, sophisticated reasoning that involves autonomously learnt uncertain knowledge (not preprogrammed "facts") and learnt predictions about the world based on one's possible actions (and even the actions of others). So most of this intelligence is *learnt*. In the case of most A.I. systems, the programmers just program the A.I. to be clever - but it is the programmer who had insightful awareness about the world, not the A.I.
Quote:
Quote:
Anyway, in most of those possibilities awareness and insight were involved - except for the possibility that it was completely random, e.g. if I had a chat with an illiterate monkey and it happened to appear to understand me. In that case no intelligence or real awareness is involved. (The monkey is aware of some things, just not of what it appears to be aware of, judging by its random yet seemingly insightful typing.)
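The claim earlier in this post, that systems which learn patterns from experience can behave intelligently while random systems cannot, is easy to demonstrate with a toy predictor. Everything below is my own illustrative construction (none of these names or numbers come from the thread): a learner that remembers what followed each recent context quickly masters a repeating sequence, while a coin-flipping "random system" stays at chance.

```python
import random


def learned_prediction(history, k=3):
    """Predict the next symbol by recalling what most often followed
    the current k-symbol context earlier in 'experience'."""
    context = tuple(history[-k:])
    counts = {0: 0, 1: 0}
    for i in range(len(history) - k):
        if tuple(history[i:i + k]) == context:
            counts[history[i + k]] += 1
    return 0 if counts[0] >= counts[1] else 1


def accuracy(sequence, predict, k=3):
    """Fraction of next-symbol predictions that were correct."""
    hits = 0
    for i in range(k, len(sequence)):
        hits += predict(sequence[:i]) == sequence[i]
    return hits / (len(sequence) - k)


# A simple repeating 'experience': 0, 0, 1, 0, 0, 1, ...
sequence = [0, 0, 1] * 40
rng = random.Random(0)  # seeded so the run is repeatable
learner_score = accuracy(sequence, learned_prediction)
random_score = accuracy(sequence, lambda history: rng.choice([0, 1]))
print(learner_score)  # close to 1.0: the pattern was learnt
print(random_score)   # hovers around 0.5: chance exploits nothing
```

The learner only errs before it has seen enough history to recognise the context, which matches the post's point that random systems can look intelligent "for very brief instants" but cannot sustain it.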
10-31-2001, 04:52 AM | #17
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Quote:
10-31-2001, 08:04 AM | #18
Veteran Member
Join Date: Jan 2001
Location: Median strip of DC beltway
Posts: 1,888
Quote:
Does it matter? Does a massage feel any less pleasurable? Does a slap to the back of the head hurt any less? |
11-04-2001, 11:43 PM | #19
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
To Nial also - defining a zombie
Quote:
Quote:
11-04-2001, 11:45 PM | #20
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222
Quote:
Quote:
2) Feeling pleasure or pain is a purely subjective aspect of an experience. If you think some arrangement of matter other than your own body is “feeling” something, you are projecting your own subjective experience onto it, and assuming that it is a body with an experiencer, like yourself. “Seeking” and “avoidance” can be defined by objective behaviors, such as how a guided missile “behaves”, but feeling pleasure cannot be defined by objective behavior without someone projecting their own subjective experience onto the “behavior” - that is, without assuming that the “body”/machine is reacting based on subjective experience rather than just determinism or chance. What sort of “seeking” or “avoidance” would prove that there was an experiencer “in” an organism or machine?
Quote:
Quote:
Quote:
Quote:
Quote:
All value and significance is subjective in the same way. So is color. (Had to get rid of those damn parenthesis-eating smilies.)

[ November 05, 2001: Message edited by: hedonologist ]