FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Go Back   FRDB Archives > Archives > IIDB ARCHIVE: 200X-2003, PD 2007 > IIDB Philosophical Forums (PRIOR TO JUN-2003)
Old 07-13-2002, 09:08 AM   #21
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Post

Quote:
Originally posted by Sammi:
<strong>Synaesthesia : ..."exhibited the same capabilities, it would certainly be conscious".

Did Searle not show this to be false? The machine would only be a symbolic manipulator.</strong>
Well this is what Searle said in <a href="http://www.mdx.ac.uk/www/psychology/cog/psy3241/searleconsc.htm#key%20questions" target="_blank">The Mystery of Consciousness</a>:
Quote:
Q1. Could a machine be conscious?

A: The brain is a machine, and is conscious. (It's a biological machine, like the liver or the heart.)

Q2. Could an artificial machine be conscious?

A: There's no reason in principle why scientists couldn't make an artificial brain, rather like an artificial heart. If all the right causal mechanisms were working properly, then the artificial brain would be conscious.

Q3. Would an artificial brain that was conscious have to be made out of brain tissue, or could it be made of silicon, transistors, etc.?

A: This is an empirical question. No-one knows the answer to it at present.

...
Anyway, I think the Chinese Room argument was mainly aimed at people who thought a computer with explicitly programmed rules in it would be conscious. The Chinese Room is like <a href="http://alice.sunlitsurf.com/alicechat.html" target="_blank">A.L.I.C.E.</a>... all of its intelligence is explicitly programmed in... it isn't self-learned. So the program doesn't have any real understanding of what it is doing.
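[Editor's sketch: a minimal toy illustration of the kind of "explicitly programmed" system described above. ALICE itself uses AIML pattern-matching; the rules and function here are purely hypothetical, just to show that every response is a hand-written lookup and nothing is learned or understood.]

```python
# Toy rule-based responder in the spirit of ALICE / the Chinese Room:
# every reply comes from hand-written rules; nothing is learned.
RULES = [
    ("hello", "Hello there! How are you?"),
    ("how are you", "I am fine. A rule told me to say so."),
    ("weather", "I have a canned opinion about the weather."),
]

def respond(utterance: str) -> str:
    """Return the first canned response whose trigger appears in the input."""
    text = utterance.lower()
    for trigger, response in RULES:
        if trigger in text:
            return response
    # No rule matched: the symbols mean nothing to the program.
    return "I do not understand."

print(respond("Hello, machine"))  # → "Hello there! How are you?"
print(respond("What is the weather like?"))
```

The point of the sketch: the mapping from input to output is entirely the programmer's doing, which is exactly the sort of system the Chinese Room intuition targets.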
excreationist is offline  
Old 07-13-2002, 09:46 AM   #22
Synaesthesia
Guest
 
Posts: n/a
Post

Sammi,
Quote:
Did Searle not show this to be false? The machine would only be a symbolic manipulator.
Hey Sammi. Searle himself claims to have shown that the brain's functionality could be emulated on a symbolic manipulator. He then simply makes a leap and assumes that symbol manipulation can never be conscious. On this point no argument is ever actually put forward; his most controversial conclusion is really only a premise.

I don't think we can detach feeling from thinking. Brain damage affecting emotional responses causes cognitive defects, and the fact that we operate much differently when conscious than when unconscious is a compelling indication that consciousness is functional.
 
Old 07-13-2002, 10:31 AM   #23
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

snatchbalance:
Quote:
Well, I don't really know if it is necessary, but I can't really imagine anything else.
I see that more as an indication of your lack of imagination than as a special insight into physical necessity.

Quote:
Suppose people build a complex machine that becomes self-aware. Maybe it could even be programmed to exhibit signs of anger at appropriate times. Maybe it could program itself to exhibit such signs at times that are in its best interest. Do computations that produce signs that we humans would recognise as anger actually produce "feelings" of anger within the machine? Will all its "neural" circuits fire at once? Will adrenaline flood its "bloodstream"? Will it "feel" its muscles tense and its heart race?
What are the human body and brain if not extremely complex machines? There is no apparent obstacle to a sufficiently powerful computer performing any action of which a human brain is capable, so the machine (or rather the entity created by the machine) could have feelings.

Quote:
I doubt it; ergo, ersatz consciousness, from an organic perspective anyway. (What other perspective is there?)
At the moment we are not aware of any other perspective, but that does not mean one could not be created from "inorganic" materials.
tronvillain is offline  
Old 07-13-2002, 10:56 AM   #24
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

Sammi:
Quote:
Did Searle not show this to be false. The machine would only be a symbolic manipulator.
For such a machine to be only a symbolic manipulator is unlikely in the extreme, given the incomprehensible size of the program which would be required. It is far more likely that the program would take shortcuts, permitting it to do things like learn and think and generate responses rather than having them be preprogrammed.

As Dennett says:
Quote:
The fact is that any program that could actually hold up its end in the conversation depicted would have to be an extraordinarily supple, sophisticated, and multilayered system, brimming with "world knowledge" and meta-knowledge and meta-meta-knowledge about its own responses, the likely responses of its interlocutor, its own "motivations" and the motivations of its interlocutor, and much, much more. Searle does not deny that programs can have all this structure, of course. He simply discourages us from attending to it. But if we are to do a good job of imagining the case, we are not only entitled but obliged to imagine that the program Searle is hand-simulating has all of this structure - and more, if only we can imagine it. But then it is no longer obvious, I trust, that there is no genuine understanding of the joke going on. Maybe the billions of actions of all those highly structured parts produce genuine understanding in the system after all. If your response to this hypothesis is that you haven't the faintest idea whether there would be genuine understanding in such a complex system, that is already enough to show that Searle's thought experiment depends, illicitly, on your imagining too simple a case, an irrelevant case, and drawing the "obvious" conclusion from it.
The argument depends on the assumption that "surely more of the same, no matter how much more, would never add up to genuine understanding." Of course, this is obviously not true when comparing organics such as insects and humans, so there is no reason to think it true when it comes to inorganics. If there were, one could simply point to a calculator and say "This calculator does not understand Chinese" and not have to resort to Searle's misdirection.
tronvillain is offline  
Old 07-13-2002, 11:00 AM   #25
Synaesthesia
Guest
 
Posts: n/a
Post

Snatch,
Quote:
Do computations that produce signs that we humans would recognise as anger actually produce "feelings" of anger within the machine? Will all its "neural" circuits fire at once? Will adrenaline flood its "bloodstream"? Will it "feel" its muscles tense and its heart race?
Surely you take the signs, the purely behavioral signs, that other participants on this BB are exhibiting as evidence of their psychological similarity to you. There's more to behavior than pounding hearts and open sweat glands.
 
Old 07-13-2002, 12:02 PM   #26
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

tron,

Quote:
What are the human body and brain if not extremely complex machines? There is no apparent obstable to a sufficiently powerful computer performing any action of which a human brain is capable, so the machine (or rather then entity created by the machine) could have feelings.
Well, when you or anyone else builds a machine, I guess based on silicon and metals, that can assimilate proteins and produce various hormones with the proper stimulation, just let me know. Such a thing may be possible, in lala land, but not around here.

In other threads you argue against the possibility of knowing how a bat could feel. Here, you argue that a machine could have feelings, supposedly comparable to our own. What a joke.

SB
snatchbalance is offline  
Old 07-13-2002, 12:10 PM   #27
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Synaesthesia,

No, emotions are not the only sign of consciousness, but they are, IMO, part and parcel of "complete" consciousness. The ability to separate from your surroundings and perform computations may be a form of consciousness, but, again IMO, a lesser form.

SB
snatchbalance is offline  
Old 07-13-2002, 04:17 PM   #28
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

snatchbalance:
Quote:
Well, when you or anyone else builds a machine, I guess based on silicon and metals, that can assimilate proteins and produce various hormones with the proper stimulation, just let me know. Such a thing may be possible, in lala land, but not around here.
Electronic equivalents could be used for hormones, substituting electronic messengers for chemical ones. There is no apparent need to resort to actual biological components to produce emotion.

Quote:
In other threads you argue against the possibility of knowing how a bat could feel. Here, you argue that a machine could have feelings, supposedly comparable to our own. What a joke.
I didn't argue that it was impossible; I simply argued that it is entirely possible that we cannot know what it is like to be a bat - we may have absolutely nothing analogous to a bat's experience of sonar. This does not preclude knowing a lot about what it is like to be a bat. Here I argue that a machine could have feelings comparable to our own, because there is no apparent obstacle - a serial machine such as a computer can act as a parallel machine, and our brain is simply an extremely large and complex parallel machine. If you think assimilating
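[Editor's sketch: a minimal toy illustration, under the assumption of simple threshold units, of the claim above that a serial machine can act as a parallel one. A serial loop computes every unit's next state from a frozen snapshot of the current state and only then swaps the results in, reproducing exactly what simultaneous parallel hardware would do. The network and weights are hypothetical.]

```python
# One synchronous ("parallel") update step of a tiny network of threshold
# units, computed serially. All next-states are derived from the current
# snapshot before any unit is overwritten, so order of visitation is
# irrelevant - just as if every unit had fired at the same instant.
def parallel_step(state, weights, threshold=0.5):
    """Serially compute one simultaneous update of every unit."""
    next_state = []
    for row in weights:                       # visit units one at a time...
        activation = sum(w * s for w, s in zip(row, state))
        next_state.append(1 if activation > threshold else 0)
    return next_state                         # ...then swap all results in at once

weights = [[0, 1, 0],
           [0, 0, 1],
           [1, 0, 0]]                         # each unit listens to one neighbour
state = [1, 0, 0]
print(parallel_step(state, weights))          # → [0, 0, 1]
```

The trade-off is only speed: the serial machine needs as many passes as there are units per simulated instant, but the computation itself is identical.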
tronvillain is offline  
Old 07-14-2002, 06:01 AM   #29
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Tron,

Quote:
Here I argue that a machine could have feelings comparable to our own, because there is no apparent obstacle - a serial machine such as a computer can act as a parallel machine, and our brain is simply an extremely large and complex parallel machine.

Except that the brain, and I guess you really mean the neocortex, does not function in isolation. Conscious computations are only a small part of the story. All of our drives (feelings) take place on a substrate of biology; this cannot be denied. To say that a computational device made out of silicon and such will have comparable drives (feelings) is a very hard contention to support, IMO.

With this being said, I don't doubt that eventually a machine that is aware, in some sense, that it is a machine can and will be created. But it will still be a machine (not that I'm denying that we are BIOLOGICAL machines). Will it have an unconscious that informs its conscious functions? Will it begin to metabolise its own parts if it gets "hungry" enough? No, whatever "drives" or "feelings" such a machine might have are truly beyond our imaginations. We will have nothing to relate them to; no common ground.

To me, they won't count as "feelings" at all.

SB
snatchbalance is offline  
Old 07-14-2002, 07:02 AM   #30
Banned
 
Join Date: Jun 2002
Location: Montréal
Posts: 367
Post

All you dandy fellas and ladies drove right past my point: the point of "kernel information". I will only make one analogy. In the human case, when the heart or another body part is failing, the body knows and tells the brain. When the bus gets scratched in a machine, the whole machine goes down; it just crashes. The implication is that a change in "physical construction" is needed for machines to attain "the sense of itself".

The difference between symbolic operations in a machine AND symbolic operations in a human is a simple one. The human is NOT-BLIND to its symbolic operations, WHEREAS a machine is blind to all its operations. I have the proof of Searle's leap-of-faith.

Sammi Na Boodie ()
Mr. Sammi is offline  
 
