FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Old 07-12-2002, 07:25 PM   #11
Regular Member
 
Join Date: May 2002
Location: North America
Posts: 203
Post

Synaesthia:

Quote:
So if we made a big machine that functionally operated in much the same manner as humans, and exhibited the same capabilities, it would certainly be conscious.
What would it mean to say that a human brain and some 'big machine' are functionally equivalent?
Taffy Lewis is offline  
Old 07-12-2002, 07:27 PM   #12
Regular Member
 
Join Date: May 2002
Location: North America
Posts: 203
Post

John Page asks a good question: Why is being organic necessary to feel?

What are the necessary conditions for having the capacity for experiential states?
Taffy Lewis is offline  
Old 07-12-2002, 07:37 PM   #13
Regular Member
 
Join Date: May 2002
Location: North America
Posts: 203
Post

Here is an interesting quote from Colin McGinn's Mysterious Flame:

Quote:
....we might have commonsense reasons for believing in other minds, as well as that only organic brains could be conscious. Maybe we cannot definitively answer these questions, but we might be able to find reasons for favoring one view over the other, however tentatively. Is it then more likely that other people have minds than that they do not? And is it more likely that brains have to be organic to produce minds than that they do not? I think the answer to both questions is yes. It is likely that others have minds because they are similar to me behaviorally and physiologically, so it would be anomalous if they were not similar in the further respect of having a mind like me. They certainly give me a strong impression of having minds. Similarly, it is reasonable to assume that there is a strong connection between consciousness and organic tissue for the simple reason that there are no actual exceptions to this rule. All the cases of consciousness we know of are associated with organic brains. Induction therefore suggests that this assumption is backed by some kind of necessary truth. It would be different if certain complex crystals exhibited signs of consciousness, having little crystalline eyes and ears and ways of behaving. Then we would have empirical reason to believe that we could build an inorganic machine with consciousness. But in actual fact all the known cases of consciousness are organic in nature, so it is reasonable to suppose that organic tissue is necessary, although in ways we do not comprehend. (pp. 199-200)
[ July 12, 2002: Message edited by: Taffy Lewis ]
Taffy Lewis is offline  
Old 07-12-2002, 07:52 PM   #14
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

Induction therefore suggests that this assumption is backed by some kind of necessary truth? Perhaps, but I suspect it is not the one he seems to think. The most likely explanation for the observation "All the cases of consciousness we know of are associated with organic brains" is "The only significant replicators known to exist are organic." To jump to the conclusion that there is some unknown causal power in the collection of atoms we label "organic" is completely unwarranted.
tronvillain is offline  
Old 07-12-2002, 09:25 PM   #15
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Post

Taffy Lewis:
...Could there be a robot or computer which is *not conscious* yet capable of performing any action or engaging in any behavior that a human can?
Well firstly I'll quote my definitions of awareness and consciousness:
Quote:
The hierarchy of intelligent systems:

1. Processing Systems [or Programmed Systems]
...receive [or detect], process and respond to input.

2. Aware Systems
...receive input and respond according to its goals/desires and beliefs learnt through experience about how the world works
(self-motivated, acting on self-learnt beliefs) ["self" refers to the system as a whole]

This learning can lead to more sophisticated self-motivated intelligence. The following is taken straight from Piaget's Stages of Cognitive Development (http://chiron.valdosta.edu/whuitt/col/cogsys/piaget.html). I hope to eventually integrate this with my generalized framework.

2. Sensorimotor stage (Infancy).
In this period (which has 6 stages), intelligence is demonstrated through motor activity without the use of symbols. Knowledge of the world is limited (but developing) because it's based on physical interactions/experiences. Children acquire object permanence at about 7 months of age (memory). Physical development (mobility) allows the child to begin developing new intellectual abilities. Some symbolic (language) abilities are developed at the end of this stage.

3. Pre-operational stage (Toddler and Early Childhood).
In this period (which has two substages), intelligence is demonstrated through the use of symbols, language use matures, and memory and imagination are developed, but thinking is done in a nonlogical, nonreversible manner. Egocentric thinking predominates.

4. Concrete operational stage (Elementary and early adolescence).
In this stage (characterized by 7 types of conservation: number, length, liquid, mass, weight, area, volume), intelligence is demonstrated through logical and systematic manipulation of symbols related to concrete objects. Operational thinking develops (mental actions that are reversible). Egocentric thought diminishes.

5. Formal operational stage (Adolescence and adulthood).
In this stage, intelligence is demonstrated through the logical use of symbols related to abstract concepts. Early in the period there is a return to egocentric thought. Only 35% of high school graduates in industrialized countries obtain formal operations; many people do not think formally during adulthood.
Awareness happens at level 2... consciousness happens in the later stages (perhaps after they're a toddler).
Those definitions involve things that you can test - you can test to see if the computer can actively learn new behaviours... you could see if it is at stage 5 by seeing if it can have philosophical discussions with you that last for *hours* where it shows evidence of new learning. If it satisfies my definitions then it would truly be aware and conscious. I don't think beliefs can be preprogrammed in if they are self-learnt. (By "self" I mean the system as a whole.) If they are self-learnt and it is acting on them, they are its real beliefs - they aren't fake. Same with self-learnt problem-solving strategies, etc. Basically I'm saying that it wouldn't be able to fully appear to be human and not be conscious. It might appear conscious to someone naive, but not to someone who knows how to test these things. (e.g. you've got to see if it can actively learn new beliefs and problem-solving strategies)
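[Editor's note: the distinction between levels 1 and 2 of the hierarchy above can be sketched as a toy program. This is an illustration only, not anything posted in the thread; the class names, stimuli, and the pain-avoidance "goal" are all hypothetical stand-ins.]

```python
class ProcessingSystem:
    """Level 1: responds to input with a fixed, preprogrammed rule."""
    def respond(self, stimulus):
        # The rule is built in; nothing here was learnt by the system itself.
        return "flinch" if stimulus == "heat" else "ignore"


class AwareSystem:
    """Level 2: forms beliefs about the world through experience
    and responds according to its own goal (here: avoiding pain)."""
    def __init__(self):
        self.beliefs = {}  # stimulus -> believed outcome; learnt, not preprogrammed

    def experience(self, stimulus, outcome):
        # Learning: the system updates its own beliefs from what happens to it.
        self.beliefs[stimulus] = outcome

    def respond(self, stimulus):
        # Acting on self-learnt beliefs: avoid anything believed to cause pain.
        return "avoid" if self.beliefs.get(stimulus) == "pain" else "explore"


a = AwareSystem()
print(a.respond("flame"))      # no belief yet -> "explore"
a.experience("flame", "pain")  # learns from experience
print(a.respond("flame"))      # now acts on its learnt belief -> "avoid"
```

The testable difference the post describes is visible here: the level-1 system's behaviour never changes, while the level-2 system demonstrably acquires a new behaviour from experience.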

Also, can there be conscious computers or conscious machines which are not organically based?
Of course
excreationist is offline  
Old 07-12-2002, 11:58 PM   #16
Contributor
 
Join Date: May 2002
Location: Saint Paul, MN
Posts: 24,524
Post

I have no idea. My instinct is to believe that anything which is genuinely indistinguishable from a conscious entity is, in fact, conscious, but I can't tell.

I can't imagine how we'd *know*; after all, if, as we assume, the behavior is outwardly indistinguishable, what's the grounds for claiming it's not conscious? We can't say "oh, we know how it works, and there's no conscious parts"; careful study of human brains reveals no "conscious" part. We're a bunch of chemicals and electrical impulses, and yet, each of us knows that at least *ONE* of us is conscious.
seebs is offline  
Old 07-13-2002, 02:54 AM   #17
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Quote:
Why is being organic necessary to feel?
Well, I don't really know if it is necessary, but I can't really imagine anything else.

Suppose people build a complex machine that becomes self-aware. Maybe it could even be programmed to exhibit signs of anger at appropriate times. Maybe it could program itself to exhibit such signs at times that are in its best interest. Do computations that produce signs that we humans would recognise as anger actually produce "feelings" of anger within the machine? Will all of its "neural" circuits fire at once? Will adrenaline flood its "bloodstream"? Will it "feel" its muscles tense, and its heart race?

I doubt it; ergo, ersatz consciousness - from an organic perspective anyway. (What other perspective is there?)

SB

[ July 13, 2002: Message edited by: snatchbalance ]
snatchbalance is offline  
Old 07-13-2002, 04:07 AM   #18
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Post

in reply to snatchbalance:

AI's that satisfy my definitions for awareness and learn to become conscious (high-level awareness) would have "drives" which they *constantly* try and satisfy - sometimes in new ways that they've learnt... I don't think it is necessary for them to behave exactly as we do to be conscious... e.g. they don't need to have malfunctioning tear ducts (cry) when they feel great loss, etc.

[ July 13, 2002: Message edited by: excreationist ]
excreationist is offline  
Old 07-13-2002, 06:18 AM   #19
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

excreationist,

I agree that a conscious being does not need to behave exactly like we do for it to be conscious. However, I would say that it must have SOME recognisable behaviors before we would consider any being to be conscious.

I would go on to say that it may be possible to build a machine that mimics consciousness, but that that machine would not be conscious in the way that an organic being would be conscious. Hence, I would still refer to it as an ersatz form of consciousness - being the elitist organic organism that I am.

SB
snatchbalance is offline  
Old 07-13-2002, 07:42 AM   #20
Banned
 
Join Date: Jun 2002
Location: Montréal
Posts: 367
Post

Synaesthesia : ..."exhibited the same capabilities, it would certainly be conscious".

Did Searle not show this to be false? The machine would only be a symbolic manipulator.
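[Editor's note: the "symbolic manipulator" point being invoked here is Searle's Chinese Room argument. A toy sketch of that idea, not Searle's own formulation: a program that maps input symbols to output symbols by rule lookup, producing fluent-looking replies with no grasp of what any symbol means. The rulebook entries below are hypothetical stand-ins for the room's instructions.]

```python
# A pure symbol manipulator: shapes in, shapes out, meaning never enters.
RULEBOOK = {
    "ni hao": "ni hao ma?",   # a greeting gets a greeting-shaped reply
    "xie xie": "bu ke qi",    # thanks gets a you're-welcome-shaped reply
}

def room(symbols):
    # The "operator" matches the incoming shape against the rulebook;
    # unmatched input gets a stock "please say that again" shape.
    return RULEBOOK.get(symbols, "qing zai shuo yi bian")

print(room("ni hao"))  # a plausible reply, produced by lookup alone
```

The argument's point is that passing behavioral tests this way shows only correct symbol shuffling; whether that could ever amount to understanding is exactly what the thread is disputing.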

* * *
Sammi says :

The basic machine (the kernel of the NEW MACHINE) must necessarily collect information about itself, its basics, in order TO TAKE the first step on the road to consciousness.

Our current CPU technology will never qualify "current computers" to be conscious. No matter how hard we try, Searle will always crop up to dismiss our quasi claims of machine consciousness.

Sammi Na Boodie (sounds good! does it not?)
Mr. Sammi is offline  
 

This custom BB emulates vBulletin® Version 3.8.2
Copyright ©2000 - 2015, Jelsoft Enterprises Ltd.