FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Old 07-15-2002, 03:26 AM   #41
Veteran Member
 
Join Date: Mar 2002
Location: Canton, Ohio
Posts: 2,082
Post

Taffy,

In order for a machine to exhibit human "consciousness", it would have to be constructed by reference to homology, not analogy. The homology of human consciousness remains a topic of wild speculation.

Ierrellus
PAX
Ierrellus is offline  
Old 07-15-2002, 04:38 AM   #42
Banned
 
Join Date: Jun 2002
Location: Montréal
Posts: 367
Post

Humans are almost entirely unaware, at the conscious level, of their low-level processors. That does not mean, however, that the human is blind to those parts at the unconscious level.

Again you fellas (no women) are being selective in your connotations. You are all basing your replies on the assumption that the only things that perform acts of consciousness are conscious. You ignore completely the supporting staff of consciousness.

Again I will take my example from nature. We are blind to how the eyes see; we are not conscious of the mechanisms or the means of sight. This does not imply, however, that the mechanism of sight is blind to what it is doing or what it is seeing; on the contrary, it can report conditions of itself, conditions of its network, conditions of its topology...

Suddenly your eyes hurt and start to fail, yet you were never conscious of the cause. How can this be?

Sammi Na Boodie (I am back)
Mr. Sammi is offline  
Old 07-15-2002, 08:13 AM   #43
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Tron,

Quote:
If having a body similar to that of a human is necessary for human style consciousness (it is not clear that it is) then a virtual body could be realized for an artificial intelligence.
Either you are evading the question, or you just don't understand the question.

Here, let me make it plain, the brain and body are part and parcel of an organism. Now, if you create, somehow, a human style brain and put it in a human body, you may have created human intelligence.

If you create a machine of electronic components, that becomes, one way or another, self aware, you may have created some type of machine intelligence.

Will the machine have feelings comparable to the human?

I really don't know why I bother.

Do you have even an inkling about the structure and function of the autonomic nervous system? Will your machine breathe? Will it know the feeling of asphyxiation?

The unconscious? So let's see, you'll provide the capacity for emotional trauma and let your machine develop phobias. This is part of a survival mechanism that predates brains of any sort. Do you think it depends on mathematical algorithms?


Talk about a laugh. Why not go and actually think about your position before posting nonsense.

Why is it not clear to you that human style consciousness requires a human style body? (Just like bat style consciousness requires a bat style body, no?)

SB

[ July 15, 2002: Message edited by: snatchbalance ]
snatchbalance is offline  
Old 07-15-2002, 08:19 AM   #44
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Sammi,

Quote:
All you dandy fellas and ladies drove right past my point. The point of "kernel information". I will only make one analogy, and the human one is, when the heart is failing, or another body part, the body knows and tells the brain.
This is the point that I've been trying to get Tron to acknowledge. He's too caught up in trying to promote his own agenda.

I would add that the sensations involved are specific to a biological organism. As for whether they can be reproduced, or even approximated, by a different type of "machine": that would be a pretty difficult proposition to support.

SB
snatchbalance is offline  
Old 07-15-2002, 09:01 AM   #45
Regular Member
 
Join Date: May 2002
Location: North America
Posts: 203
Post

tronvillain:

John Searle says some things relevant to your claim about computation. In his book The Rediscovery of the Mind, Searle has a discussion of computation in which he points out how the concept implies 'multiple realizability'. He says "To find out if an object is really a digital computer, it turns out that we do not actually have to look for 0's and 1's, etc.; rather we just have to look for something that we could treat as or count as or that could be used to function as 0's or 1's. Furthermore, to make the matter more puzzling, it turns out that this machine could be made out of just about anything."

Searle thinks that multiple realizability implies universal realizability. He says:

Quote:
Why are the defenders of computationalism not worried by the implications of multiple realizability? The answer is that they think it is typical of functional accounts that the same function admits of multiple realizations. In this respect, computers are just like carburetors and thermostats. Just as carburetors can be made of brass or steel, so computers can be made of an indefinite range of hardware materials...

But there is a difference: The classes of carburetors and thermostats are defined in terms of the production of certain physical effects. That is why, for example, nobody says you can make carburetors out of pigeons. But the class of computers is defined syntactically in terms of the assignment of 0's and 1's. The multiple realizability is a consequence not of the fact that the same physical effect can be achieved in different physical substances, but that the relevant properties are purely syntactical. The physics is irrelevant except in so far as it admits of the assignments of 0's and 1's and of state transitions between them...

But this has two consequences that might be disastrous: 1. The same principle that implies multiple realizability would seem to imply universal realizability. If computation is defined in terms of the assignment of syntax, then everything would be a digital computer, because any object whatever could have syntactical ascriptions made to it. You could describe anything in terms of 0's and 1's. 2. Worse yet, syntax is not intrinsic to physics. The ascription of syntactical properties is always relative to an agent or observer who treats certain physical phenomena as syntactical...

...we wanted to know how the brain works, specifically how it produces mental phenomena. And it would not answer that question to be told that the brain is a digital computer in the sense that stomach, liver, heart, solar system, and the state of Kansas are all digital computers. The model we had was that we might discover some fact about the operation of the brain that would show that it is a computer. We wanted to know if there was not some sense in which brains were intrinsically digital computers in a way that green leaves intrinsically perform photosynthesis or hearts intrinsically pump blood. It is not a matter of us arbitrarily or "conventionally" assigning the word "pump" to hearts or "photosynthesis" to leaves. There is an actual fact of the matter. And what we were asking is, "Is there in that way a fact of the matter about brains that would make them digital computers?" It does not answer that question to be told, yes, brains are digital computers because everything is a digital computer...

On the standard textbook definition of computation, it is hard to see how to avoid the following results: 1. For any object there is some description of that object such that under that description the object is a digital computer. 2. For any program and for any sufficiently complex object, there is some description of the object under which it is implementing the program. Thus for example the wall behind my back is right now implementing the Wordstar program, because there is some pattern of molecule movements that is isomorphic with the formal structure of Wordstar. But if the wall is implementing Wordstar, then if it is a big enough wall it is implementing any program, including any program implemented in the brain.
He goes on to say that the definition of computation could be tightened up and that might allow one to avoid the consequences he mentioned above. He says, "I think the main reason that the proponents do not see that multiple or universal realizability is a problem is that they do not see it as a consequence of a much deeper point, namely that "syntax" is not the name of a physical feature, like mass or gravity."

He goes on to say:

Quote:
....the really deep problem is that syntax is essentially an observer-relative notion. The multiple realizability of computationally equivalent processes in different physical media is not just a sign that the processes are abstract, but that they are not intrinsic to the system at all. They depend on an interpretation from outside. We were looking for some facts of the matter that would make brain processes computational; but given the way we have defined computation, there never could be any such facts of the matter. We can't, on the one hand, say that anything is a digital computer if we can assign a syntax to it, and then suppose there is a factual question intrinsic to its physical operation whether or not a natural system such as the brain is a digital computer.
And later he says:

Quote:
...notions such as computation, algorithm, and program do not name intrinsic physical features of systems. Computational states are not discovered within the physics; they are assigned to the physics.
All of this is from Chapter 9 of his book.
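
To see concretely what "assigning syntax" amounts to, here is a toy sketch of my own (none of it is Searle's; the states and numbers are invented): any sequence of micro-states can be mapped onto 0's and 1's after the fact so that it "implements" whatever bit string an observer chooses.

Code:
import random

# A "physical system": eight arbitrary, distinct micro-states.
physical_states = [random.random() for _ in range(8)]

# The "computation" an observer wants the system to implement.
target_bits = [1, 0, 1, 1, 0, 0, 1, 0]

# The observer assigns syntax after the fact: each micro-state is simply
# declared to "be" whatever bit is needed at that step.
syntax_assignment = dict(zip(physical_states, target_bits))

# Under this assignment the system "computes" the target string exactly.
assert [syntax_assignment[s] for s in physical_states] == target_bits

# A different observer could map the very same states onto any other
# string; the work is done by the mapping, not by the physics.

All of the work is done by the observer's mapping, which is exactly Searle's point that syntax is not intrinsic to the physics.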
Taffy Lewis is offline  
Old 07-15-2002, 09:04 AM   #46
Regular Member
 
Join Date: May 2002
Location: North America
Posts: 203
Post

excreationist,

Quote:
Now to work out what it would need:

"receives input"
A sub-system that extracts features from the external environment (e.g. red, green and blue light intensity detectors)

"responds"
Uses motors or muscles or something that allows it to interact with the external environment in order to test its beliefs, etc.

"according to its goals/desires and beliefs learnt through experience about how the world works"
It has long-term memories that it has accumulated that it uses to predict what it needs to do in order to attempt to satisfy its fundamental drives (e.g. seek newness, seek coherence, avoid frustration, avoid bodily injury). This learning/motivational part is pretty complex...
But the question is: what are the necessary conditions for having beliefs, desires, and intentional actions?
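
For concreteness, the quoted sub-systems reduce to something like the following bare agent loop (a sketch of my own; every name in it is invented):

Code:
import random

class ToyAgent:
    """Bare sketch of the quoted sub-systems; every name here is invented."""

    def __init__(self):
        self.memory = {}   # learnt beliefs: features -> (action, outcome)
        self.drives = ["seek_newness", "avoid_injury"]

    def sense(self, environment):
        # "receives input": extract crude features from the environment.
        return tuple(round(x, 1) for x in environment)

    def choose_action(self, features):
        # Predict from memory which action served the drives before; else explore.
        remembered = self.memory.get(features)
        return remembered[0] if remembered else random.choice(["approach", "retreat"])

    def act_and_learn(self, features, action, outcome):
        # "responds" and updates its beliefs about how the world works.
        self.memory[features] = (action, outcome)

# One pass through the loop:
agent = ToyAgent()
features = agent.sense([0.12, 0.87, 0.33])
action = agent.choose_action(features)
agent.act_and_learn(features, action, outcome=1.0)

A loop like this plainly senses, remembers, and acts; whether that suffices for beliefs, desires, and intentional actions is exactly what I am asking.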
Taffy Lewis is offline  
Old 07-15-2002, 09:54 AM   #47
Synaesthesia
Guest
 
Posts: n/a
Post

snatchbalance,
Quote:
I would add that the sensations involved are specific to a biological organism. If they can be reproduced, or even approximated, by a different type of "machine", well that would be a pretty difficult proposition to support.
As a matter of fact, this is the basic conception behind the doctrine of vitalism. You are assuming that one aspect of our organization responsible for our intelligence is the fact that we are made of the material characteristic of self-replicating organisms on Earth.

You have not justified this assumption; you have merely put forward as a matter of fact that there is something (what it is you don't say) about biology that permits consciousness, something that other sorts of organization cannot emulate.


Hey Taffy, thanks for bringing up this point!:
Quote:
He says "To find out if an object is really a digital computer, it turns out that we do not actually have to look for 0's and 1's, etc.; rather we just have to look for something that we could treat as or count as or that could be used to function as 0's or 1's. Furthermore, to make the matter more puzzling, it turns out that this machine could be made out of just about anything."
This is at first glance a very confounding consideration for functionalism, and is a very popular method of attacking reductionism. Yet Searle has (not uncharacteristically; see the Chinese room) failed to discuss the very areas most vital to his argument, the very sorts of considerations at its conceptual heart.

When he asserts that we could find a pattern of molecules in a wall structurally isomorphic to a computer program, I cannot help but wonder how the wall manages the recursive feedback and functional quirks of a computer program. How exactly would we go about finding this isomorphism?

Searle doesn't clarify what he means in the quote in question. The devil, in this case, is most certainly in the details.

Let us imagine we are something very near gods. We pick out a few trillion air molecules to emulate a brain. Since their movement is quite chaotic, we need to invoke our super-intelligence and come up with a fantastically complicated procedure to translate the movement of these molecules into sensible, coherent cognition. Now here is the devil: the procedure by which we interpret the air molecules as a brain will itself be far more complex than the system of air (or the wall), indeed, FAR more complex than the brain itself!

Now I wonder, at this point, how much of this computation is really a function of our vastly convoluted interpretive mechanism, and how much of it is the chaotic movement of air molecules, or the slow oscillation of our wall?

How much of Michelangelo's David is the marble, and how much of it is the work of Michelangelo?
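
To put a toy number on that lopsidedness (this sketch is entirely mine, not Searle's): for a stream of random readings to "implement" even a trivial recursive program, the interpreter must first compute the program's whole state trajectory and then staple it onto the readings.

Code:
import random

def fib_trace(n):
    """State trajectory of a tiny recursive program: successive Fibonacci pairs."""
    a, b, trace = 0, 1, []
    for _ in range(n):
        trace.append((a, b))
        a, b = b, a + b
    return trace

# The "wall": arbitrary readings with no internal relation to the program.
wall_states = [random.random() for _ in range(12)]

# To read the wall as implementing the program, the interpreter must first
# run the program itself and then staple its states onto the readings.
trace = fib_trace(len(wall_states))
interpretation = dict(zip(wall_states, trace))

# Every bit of recursive structure (each state depending on the previous one)
# lives in the interpretation table; none of it is in the wall's dynamics.
assert [interpretation[s] for s in wall_states] == trace

The interpretation table is at least as complex as the program trace itself, which is why I ask how much of the "computation" is really in the wall at all.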


------

Searle's conception of computation being relative to an observer is not without merit. We can treat natural events as decision algorithms, for example flipping a coin. However, he places undue stress upon the observer being conscious, a well-evolved observer. Functional processing occurred for billions of years in nervous or chemical systems in small animals and cells before we observers ever came onto the scene.

Regards,
Synaesthesia
 
Old 07-15-2002, 10:37 AM   #48
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Synaesthesia,

Quote:
As a matter of fact, this is the basic conception behind doctrine of vitalism. You are assuming that one aspect of our organization that is responsible for our intelligence is the fact that we are made of material characteristic of self-replicating organisms on earth.

You have not justified this assumption, you have merely put forward as a matter of fact that there is something (what it is you don't say) about biology that permits consciousness, something that other sorts of organization cannot emulate.
Nope, I never said anything like that. I said that consciousness, as we presently know it, is obviously an emergent property of certain arrangements of biological material.

Can other types of matter achieve consciousness? I guess so, but no one has seen it yet.

Will its nature be different from biological consciousness? I think the answer has to be yes.

I guess that I could go on to say that the sensations of biological creatures will differ from the sensations experienced by non-biological creatures. Further, I would say that whatever "feelings" or sensations may be experienced by non-biological creatures would be unrecognisable, as such, to biological creatures. Why? Because feelings are inseparable from changes in physiological states.

SB

[ July 15, 2002: Message edited by: snatchbalance ]
snatchbalance is offline  
Old 07-16-2002, 08:55 PM   #49
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

snatchbalance, permit me to quote myself:
Quote:
Well, you show your own limitations and ignorance.
Now, on to your last post:
Quote:
Either you are evading the question, or you just don't understand the question.
Exactly which question are you talking about?

Quote:
Here, let me make it plain, the brain and body are part and parcel of an organism. Now, if you create, somehow, a human style brain and put it in a human body, you may have created human intelligence.
If I take your human style brain and put it in an artificial body which merely mimics a human body (responding to and creating appropriate neural and hormonal messages), will you still be human? Yes. If it's done well enough, you won't even notice. If I take your human style brain and put it in a machine that simply simulates a human body, will you still be human? Yes. Again, if it's done well enough (extremely difficult to do), you won't even notice.

Now, what if we start in on your brain? If we gradually replace each neuron in your brain with machines that simply mimic the operation of organic neurons, down to their responses and production of neurotransmitters, will you still be human? Yes. If it's done well enough, you won't even notice. If we then do away with hormones and neurotransmitters, with each artificial neuron transmitting to a computer which then simulates their action and feeds the result to the other artificial neurons, will you still be human? Yes. Again, if it's done well enough, you won't even notice. Now, if we gradually replace each artificial neuron with a computer simulation, will you still be human? Yes. Once again, if it's done well enough, you won't even notice.

So, what do we get if we put these two scenarios together? We get you living entirely within a machine, and you didn't even notice. Exactly how are the brain and the body part and parcel of an organism now?

Quote:
If you create a machine of electronic components, that becomes, one way or another, self aware, you may have created some type of machine intelligence.
True, but that does not preclude it being similar or identical to human intelligence.

Quote:
Will the machine have feelings comparable to the human?
Maybe - it all depends on how it is built. If the architecture of its intelligence is comparable to that of a human, then yes.

Quote:
I really don't know why I bother.
I don't know. Perhaps you haven't really thought it through.

Quote:
Do you have even an inkling about the structure and function of the autonomic nervous system? Will your machine breathe? Will it know the feeling of asphyxiation?
Yes, I do have an inkling about the structure and function of the autonomic nervous system. Does the question have a point? As for breathing and knowing the feeling of asphyxiation, it is not clear that either is necessary for human style consciousness, but either could theoretically be simulated for an appropriately constructed artificial intelligence. They are, after all, simply patterns of neural impulses sent to the brain.

Quote:
The unconscious? So let's see, you'll provide the capacity for emotional trauma and let your machine develop phobias. This is part of a survival mechanism that predates brains of any sort. Do you think it depends on mathematical algorithms?
As I mentioned before, it is quite likely that it is impossible to create consciousness without an unconscious, whatever its style. It depends on patterns of neural activity in the brain, which could themselves be reproduced, or perhaps replaced given a more complete understanding. Oh, and the unconscious does not "predate brains of any sort", at least not in any sense in which I am familiar with the term.

Quote:
Talk about a laugh. Why not go and actually think about your position before posting nonsense.
I see you have taken to talking to yourself. I have no problem with that, but I would prefer you not do it in posts addressed to me.

Quote:
Why is it not clear to you that human style consciousness requires a human style body?(Just like bat style consciousness requires a bat style body, no?)
It is clear to me that human style consciousness requires a human style mind architecture. What is not clear to me is that a human style mind architecture requires a human style body.
tronvillain is offline  
Old 07-16-2002, 08:59 PM   #50
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

Taffy Lewis, I never claimed that the brain was a digital computer, which makes most of your post pointless. Everything that the brain does could be done by a digital computer, but the brain itself is not a digital computer.
tronvillain is offline  
 
