Freethought & Rationalism Archive. The archives are read only. |
03-19-2002, 09:16 PM | #91 |
Senior Member
Join Date: Feb 2001
Location: Toronto
Posts: 808
|
The ship AI could most certainly have survival instincts; that would simply be good ship design. If something very dangerous is near, divert to the safest route away from the danger with maximum speed and mental dedication. It could turn all its senses to the danger, extrapolating vectors like a madman in order to get the ship out of harm's way. If it is self-aware it may even begin contemplating death ("I may stop existing if I don't get the gimbal online"), but it certainly won't dwell on it the way a human might (which is a very bad weakness).
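That avoidance reflex is an ordinary, programmable routine. As a minimal sketch (the function name and the flat 2-D vector representation are my own assumptions for illustration, not anything from the thread):

```python
import math

def escape_vector(ship_pos, danger_pos, max_speed):
    """Steer directly away from a detected danger at maximum speed."""
    dx = ship_pos[0] - danger_pos[0]
    dy = ship_pos[1] - danger_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        # sitting on top of the danger: pick an arbitrary escape direction
        return (max_speed, 0.0)
    # unit vector pointing away from the danger, scaled to full speed
    return (dx / dist * max_speed, dy / dist * max_speed)
```

Any direction-finding scheme would do here; the point is only that "flee the danger at full speed" requires no will or fear of death, just a reflex wired into the design.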
The computer on your desk doesn't need this skill, and from that springs the misconception that computers can't have this response. |
03-20-2002, 02:22 AM | #92 |
Veteran Member
Join Date: Dec 2001
Location: Lucky Bucky, Oz
Posts: 5,645
|
Yes, the ship can do all that, but these things can still happen; I was referring not to a lack of reaction but to the impersonal and indifferent "attitude" of the thing (my apologies to computer lovers, but volition and emotion still belong to the living). The turbulence occurring there was absolutely unexpected and abnormal. It's just an example.
In this particular case I was trying to compare intransitivity, transitivity and reflectivity with respect to the ontological status of the non-living, the living and the conscious. That's all. AVE |
03-20-2002, 02:41 AM | #93 | |
Veteran Member
Join Date: Dec 2001
Location: Lucky Bucky, Oz
Posts: 5,645
|
excreationist
Quote:
Remember the story of that acquaintance of mine. I'll tell it again, at the risk of annoying you, because for me it bears a certain significance. As I told you, this guy liked sci-fi, as I did. And one day he told me, all of a sudden and quite seriously: "You know, one day they are going to rebel." "Who?" I asked. (At the time there were many I knew who could rebel, so I didn't really know whom he was referring to.) "Computers," he said. "One day they'll become so smart that they'll rebel." That kind of "machine revolution" had always seemed far-fetched to me. So I said: "Rebel? Why would they want to do that?" "It's simple," he said. "If they improve to the point where their intelligence dwarfs ours, computers will obviously refuse to obey us anymore. They will deny our right to treat them like slaves." There is a tendency in everyone to believe that sophistication within a system brings about will as a property of that system. This hasn't been the case so far: a space rocket does not "want" anything. As for emotions... If the basic drive for self-preservation is the source of will, which I believe, it could also be the source of collateral aspects such as curiosity and the ability to learn skills that were not previously anticipated. (This reminds me of that dispute between Chomsky and, okay, I forget whom, about whether language patterns are inborn or acquired.) AVE [ March 20, 2002: Message edited by: Laurentius ] |
|
03-20-2002, 02:54 AM | #94 | |
Veteran Member
Join Date: Dec 2001
Location: Lucky Bucky, Oz
Posts: 5,645
|
Christopher Lord
Quote:
Yet I wouldn't say that AI has reached the capability of even an insect; far from it. It seems that, instinctively and genetically, insects are still much better prepared for survival than those intelligent pieces of software (for which you have my congratulations). However, the road to the mind goes through that elusive knot of will-instinct-emotion that living things are genetically equipped with. AVE |
|
03-20-2002, 03:10 AM | #95 | |
Veteran Member
Join Date: Dec 2001
Location: Lucky Bucky, Oz
Posts: 5,645
|
Synaesthesia
Quote:
I would add that we are not only able to understand things without direct reference to physical laws, but also capable of designing abstract structures and systems that elude any physical description, although they occur within the strictly material realm. And it is this manifestation of independence from strict physicalness that I have come to call the Mind. Its seeming independence and relative non-physicalness make it superior (to me) to the organic nature of the Brain. AVE [ March 20, 2002: Message edited by: Laurentius ] |
|
03-20-2002, 07:12 AM | #96 |
Regular Member
Join Date: Feb 2002
Location: Home
Posts: 229
|
Laurentius...
"No matter how complex the AI of the shuttle is, in the case of the computer we can only speak about INTRANSITIVE behavior, the automatic execution of implanted or learned operations."

In large measure, the "AI" of the shuttle is constructed to be "for us", not having a "self" of its own to "reflect" on, to use your requirement. But the "AI" of the shuttle you describe is not the AI that most AI folks would want to describe. For this reason they would be in the business of constructing its own "self" (or allowing it to evolve through fairly well understood evolutionary processes), from which the stimuli it receives are "for it". Things in the world then have meaning.

Much more is needed, I think, notably an "inner sense." To reach this point, there is probably a requirement to already have the ability to perceive things in the world through a construction or synthesis of them in space and in (near-)real-time. It is not unreasonable to assume that many if not most animals that have visual perception and rely on it strongly (and, possibly, bats) have developed this level of perception.

The distinction between most animals and humans, then, is the development of this "inner sense" which characterizes "self-reflection". As Davidson would say, this feature allows humans to lift concepts from their role as discriminators of sensory objects, so that they can be used to determine whether an error has been made. Such a concept then acts as a standard which can be reasoned about. Computers are not only deficient in their ability to perceive the world the way humans do, but also lack the ability to deal with abstract objects.

In any case, to use an example from Whitehead, the distinction between cats and humans is that cats can be "captured" by what they perceive. Cats respond to stimuli much more directly than humans do.
Humans have organized the world around their intelligence in such a way that, except for reflexes, which bypass the core regions of the brain to make their response, and learned behavior, which is more in keeping with animal behavior, humans can "rise above" all this through consciousness and direct things at a higher level, possibly even interfering with what is going on. All this takes time and, as we now know, the "reality" we represent in consciousness lags about half a second behind what is actually going on in the world. owleye |
03-20-2002, 08:50 AM | #97 | ||
Veteran Member
Join Date: Dec 2001
Location: Lucky Bucky, Oz
Posts: 5,645
|
Owleye
Quote:
Quote:
Self-reflection and much beyond that was done there. What AI would do that, by the way? AVE [ March 20, 2002: Message edited by: Laurentius ] |
||
03-20-2002, 02:03 PM | #98 | ||||
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886
|
Quote:
Quote:
Quote:
Quote:
|
||||
03-21-2002, 08:20 AM | #99 |
Regular Member
Join Date: Feb 2002
Location: Home
Posts: 229
|
Laurentius...
The chimp (he) could be said to have been able to deduce the intentions of the other chimp (she), and could have reasoned that she overlooked the box, and in this way added his own insight into the mixture. That he did not tell her about the box may have been because of a strong desire to keep this information to himself so that he could get the food and not her. Of course, there was no guarantee that the box contained food. If this is evidence that he recognized an error on her part, using "triangulation", then it is possible that it is evidence for intelligence. But that a chimp may be this intelligent is not particularly alarming. One further question might arise here, though: whether chimps are able to recognize their own errors, having been taught what's right and wrong (and not just true and false). If so, this would suggest they could be brought to justice for their misdeeds. At present I know of few who would go that far. But who knows.

The question of being fooled (by the Turing test, for example) is tricky. Davidson, among others, thinks there is some validity to it, though it is insufficient as it stands; additional evidence beyond the bare behavior would be required. If you believe the chimp is demonstrating mental behavior (self-reflection), then it must be because the behavior reveals it. If the behavior reveals it, it is reasonable to suppose that such behavior could be programmed. Note that I'm not defending Eliza, or any other computer program's claim that mental activity is going on (nor was Joseph Weizenbaum, who wrote the program partly to make that very point). I'm merely saying that behavior that exhibits intelligence could be programmed.
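To illustrate that last point: Weizenbaum's Eliza produced apparently engaged conversation from nothing but pattern matching and substitution. Here is a toy sketch of the mechanism; the particular rules and names are invented for this example, not taken from the real Eliza script:

```python
import re

# Invented rules for illustration; the real Eliza had a large script of these.
# Each rule pairs a pattern with a response template that reuses captured text.
RULES = [
    (r"I feel (.*)", "Why do you feel {0}?"),
    (r"I am (.*)", "How long have you been {0}?"),
    (r".*", "Please tell me more."),  # catch-all fallback
]

def respond(utterance):
    """Return the first matching canned response, Eliza-style."""
    for pattern, template in RULES:
        match = re.match(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
```

Nothing in this loop understands anything, yet its output can pass, briefly, for interested conversation, which is exactly why exhibited behavior alone is weak evidence.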
The missing ingredient is sensory perception, which embodies your notion of self-reflection through a (fairly veridical) representation of the world in outer experience, as well as a representation of the inner world through inner sense, unified in the way that only a (self-)consciousness could unify them. All this inner intelligence would have to be connected to it in some procedural way. That is, we have to have an intelligence that is an intelligence about something -- and only the possession of consciousness makes that possible. owleye |
03-21-2002, 09:50 AM | #100 | ||
Guest
Posts: n/a
|
Laurentius,
Quote:
It is unavoidable that the mind be amenable to physical description if it is indeed a physical entity. I'm not sure how it can be both matter and independent of matter. I assume you are not holding an outright contradiction, so I am at a loss in interpreting your assertion. owleye Quote:
The advantage of the Turing test is that passing it requires a level of sophistication so great that "faking it" would be vastly more difficult than actually reproducing the various functional characteristics of intelligent human minds. The disadvantage of the Turing test is that it is SO demanding and so powerful that programs which in some respects have flexibility and intelligence will likely fail it. Regards, Synaesthesia |
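For concreteness, the imitation game itself is a simple protocol: a judge sees only the answers, never the respondents. A toy harness, with every name and the (deliberately naive) judging heuristic invented for this sketch:

```python
def imitation_game(judge, respondent_a, respondent_b, questions):
    """Run a toy Turing test: collect both transcripts, then let the
    judge, who sees only the answers, guess the human ('A' or 'B')."""
    answers_a = [respondent_a(q) for q in questions]
    answers_b = [respondent_b(q) for q in questions]
    return judge(questions, answers_a, answers_b)

def naive_judge(questions, answers_a, answers_b):
    """Invented heuristic: guess that the wordier transcript is the human's."""
    words = lambda answers: sum(len(a.split()) for a in answers)
    return "A" if words(answers_a) >= words(answers_b) else "B"
```

The whole difficulty Synaesthesia describes lives in the respondent, not the harness: surviving an unrestricted judge over arbitrary questions is what makes faking harder than the real thing.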
||