FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Go Back   FRDB Archives > Archives > IIDB ARCHIVE: 200X-2003, PD 2007 > IIDB Philosophical Forums (PRIOR TO JUN-2003)
Old 07-14-2002, 09:32 AM   #31
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

snatchbalance:
Quote:
Except that the brain, and I guess you really mean the neocortex, does not function in isolation. Conscious computations are only a small part of the story. All of our drives (feelings) take place on the substrate of biology; this cannot be denied. To say that a computational device made out of silicon and such will have comparable drives (feelings) is a very hard contention to support, IMO.
No, I do not really mean the neocortex, I mean the entire brain. To say that all of our drives take place on the substrate of biology is simply to say that they depend on the action of hormones and neurons, which together make up a complex parallel network. Remember, the intelligence will probably not be the machine itself; it will be created by the machine, the product of its operation.

Quote:
With this being said, I don't doubt that eventually a machine that is aware, in some sense, that it is a machine can and will be created. But it will still be a machine (not that I'm denying that we are BIOLOGICAL machines). Will it have an unconscious that informs its conscious functions? Will it begin to metabolise its own parts if it gets "hungry" enough?
There is no reason that it should not have an "unconscious" - indeed, it is quite possible that it is impossible to construct anything conscious without one. Similarly, there is no reason it should not also get hungry and metabolize its own parts, except that such additions to the program appear totally unnecessary. If I never got hungry, would that affect my ability to experience other emotions? No. Still, you could make them hungry for something else, like information, and add positive and negative feedback comparable to being hungry or full.

Quote:
No, whatever "drives" or "feelings" such a machine might have are truly beyond our imaginations. We will have nothing to relate them to; no common ground. To me, they won't count as "feelings" at all.
While it is possible that we will have no common ground with them, it all depends on how they are built. Since we are the ones who would build them, we could have a lot in common with them.
tronvillain is offline  
Old 07-14-2002, 09:48 AM   #32
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

Sammi:
Quote:
The difference between symbolic operations in a machine AND symbolic operations in a human is a simple one. The human is NOT-BLIND to its symbolic operations WHEREAS a machine is blind to all its operations. I have the proof of Searle's leap-of-faith.
Where is this proof? I hope it is not simply your assertion that "The human is NOT-BLIND to its symbolic operations WHEREAS a machine is blind to all its operations" or your comparison between a human and a bus.

Again, I will quote Dennett:
Quote:
The fact is that any program that could actually hold up its end in the conversation depicted would have to be an extraordinarily supple, sophisticated, and multilayered system, brimming with "world knowledge" and meta-knowledge and meta-meta-knowledge about its own responses, the likely responses of its interlocutor, its own "motivations" and the motivations of its interlocutor, and much, much more. Searle does not deny that programs can have all this structure, of course. He simply discourages us from attending to it. But if we are to do a good job imagining the case, we are not only entitled but obliged to imagine that the program Searle is hand-simulating has all of this structure - and more, if only we can imagine it. But then it is no longer obvious, I trust, that there is no genuine understanding of the joke going on. Maybe the billions of actions of all those highly structured parts produce genuine understanding in the system after all. If your response to this hypothesis is that you haven't the faintest idea whether there would be genuine understanding in such a complex system, that is already enough to show that Searle's thought experiment depends, illicitly, on your imagining too simple a case, an irrelevant case, and drawing the "obvious" conclusion from it.
You apparently wish to point to a calculator (or a bus) and say "This calculator does not understand Chinese, therefore no machine can understand Chinese." I could say the same thing about a flatworm and organics, but the argument wouldn't be any better.
tronvillain is offline  
Old 07-14-2002, 10:04 AM   #33
Veteran Member
 
Join Date: Oct 2001
Location: Canada
Posts: 3,751
Post

tron, well said. Er... quoted. Dennett is wonderfully clear about this.

The other side of the coin is that humans are almost entirely "blind" to their first-person cognitive mechanisms -- including at the level of symbol processing, whatever level that is. A host of experimental data confirms this, perhaps the simplest being the serial memory task: response time varies with the length of the list, rather than the position of the probe item on the list, even though subjects report "running through the list" in the order they learned it *only* as far as the probe item.
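The serial-scan result can be put in toy-model form (my own illustration, not from the post): under an exhaustive serial scan of memory, the number of comparisons, and hence response time, grows with the length of the list, regardless of where the probe item sits in it.

```python
# Toy model of the serial memory (Sternberg) task: an exhaustive serial
# scan compares the probe against every item in the memorized list, even
# after a match is found -- which is what the response-time data suggest,
# contrary to what subjects report about their own processing.

def scan_comparisons(memory_list, probe):
    """Return (number of comparisons, whether the probe was found)."""
    comparisons = 0
    found = False
    for item in memory_list:
        comparisons += 1          # one comparison per item, no early exit
        if item == probe:
            found = True
    return comparisons, found

# Probe position doesn't matter; list length does.
print(scan_comparisons([3, 7, 9, 2], 3))        # probe first: 4 comparisons
print(scan_comparisons([3, 7, 9, 2], 2))        # probe last: still 4
print(scan_comparisons([3, 7, 9, 2, 5, 8], 5))  # longer list: 6 comparisons
```

The point of the sketch is only that the mechanism producing the behavior (exhaustive scan) can differ systematically from the first-person report (self-terminating scan).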
Clutch is offline  
Old 07-14-2002, 11:46 AM   #34
Synaesthesia
Guest
 
Posts: n/a
Post

Snatchbalance,
Quote:
To say that a computational device made out of silicon and such will have comparable drives (feelings) is a very hard contention to support, IMO.
To say that there is some vitalistic ingredient to biological matter is an even harder contention to support.

That being said, there are very prosaic reasons why a silicon emulation of the brain would be less feasible than hardware much more like the brain. The specific neurotransmitters used have very subtle effects. Modelling their precise behavior would be so fantastically complicated that the notion of having thousands of trillions of them modeled on a computer seems grossly implausible. (But not impossible - in principle.)

Sammi,
Quote:
The difference between symbolic operations in a machine AND symbolic operations in a human is a simple one. The human is NOT-BLIND to its symbolic operations WHEREAS a machine is blind to all its operations. I have the proof of Searle's leap-of-faith.
As a matter of fact humans are entirely blind to our mechanical operations. There are many layers of feedback, but we can’t directly discern their physical nature without scientific investigation. This is perfectly possible, in principle, for an emulated human. Searle’s leap of faith remains just that.
 
Old 07-14-2002, 12:01 PM   #35
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Tron,

Quote:
No, I do not really mean the neocortex, I mean the entire brain. To say that all of our drives take place of the substrate of biology is simply to say that they depend on the action of hormones and neuron, which simply make a complex parallel network. Remember, the intelligence will probably not be the machine, it will be created by the machine, the product of its operation.
Yes, well I guess you just illustrated my point: hormones and neurons work within a protein- and water-based body. Hormones cause increases in heart rate and breathing, muscles to tense, bowels to void, etc. What comparison do you have in a being of silicon circuits, with no muscles and no bowels? Like you said, the brain (and there may be more than one) works in parallel with the body; they are inseparable.

Like I keep saying, it may be a conscious machine, but it won't be an animal consciousness.

Quote:
There is no reason that it should not have an "unconscious" - indeed, it is quite possible that it is impossible to construct anything conscious without one. Similarly, there is no reason it should not also get hungry and metabolize its own parts, except that such additions to the program appear totally unnecessary. If I never got hungry, would that affect my ability to experience other emotions? No. Still, you could make them hungry for something else, like information, and add positive and negative feedback comparable to being hungry or full.

First, somehow, an unconscious will be consciously programmed into the machine. Pretty good trick.

Then, somehow, perhaps magically, silicon chips could be metabolised to provide energy; but that's not "necessary". So, consciously, a program that simulates "hunger" will be provided, only the "hunger" will be for information, not food.

And this all, somehow, is supposed to be comparable to organic experience.

I'm sorry, but it's all too much for me.

Quote:
While it is possible that we will have no common ground with them, it all depends on how they are built. Since we are the ones who would build them, we could have a lot in common with them.
I suppose so.

SB

[ July 14, 2002: Message edited by: snatchbalance ]
snatchbalance is offline  
Old 07-14-2002, 12:21 PM   #36
Regular Member
 
Join Date: Mar 2002
Location: CT
Posts: 333
Post

Synaesthesia,

Quote:
To say that there is some vitalistic ingredient to biological matter is an even harder contention to support.
If you are referring to some sort of elan vital, well, I never made any such contention that there is such a thing. In fact, I tend to reject such notions out of hand.

All I can say at this point is that it is obvious that life, and certain forms of consciousness, are emergent properties of biological matter.

I agree with the rest of your post.

sb

[ July 14, 2002: Message edited by: snatchbalance ]
snatchbalance is offline  
Old 07-14-2002, 12:40 PM   #37
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

This is pathetic. Sometimes I wonder why I am even bothering to talk to you.

snatchbalance:
Quote:
Yes, well I guess you just illustrated my point: hormones and neurons work within a protein- and water-based body. Hormones cause increases in heart rate and breathing, muscles to tense, bowels to void, etc. What comparison do you have in a being of silicon circuits, with no muscles and no bowels? Like you said, the brain (and there may be more than one) works in parallel with the body; they are inseparable.
As I have said before, there is no apparent obstacle to building something in a serial computer exactly equivalent to the complex parallel network of human neurons and hormones. If it does the same job, does it matter whether it is a chunk of organic matter or a piece of software? There is no apparent reason that it should. If having a body similar to that of a human is necessary for human-style consciousness (it is not clear that it is), then a virtual body could be realized for an artificial intelligence.
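The serial-equivalence claim can be sketched in a few lines (my own illustration; the weights and "neurons" here are made up). A serial loop visits each unit in turn, but with a double buffer the whole network updates as if every unit had fired simultaneously, so the serial machine computes exactly what the parallel one would.

```python
# Sketch: one synchronous "tick" of a parallel network, computed serially.
# A double buffer (building `new` before swapping it in) makes the serial
# updates behave as if they all happened at once.
import math

def step(activations, weights):
    """Serially compute one simultaneous update of every unit."""
    new = []
    for i in range(len(activations)):              # serial loop over units
        total = sum(w * a for w, a in zip(weights[i], activations))
        new.append(1.0 / (1.0 + math.exp(-total)))  # sigmoid activation
    return new                                      # swap in all at once

# A tiny made-up three-unit network, run for ten ticks.
weights = [[0.0, 1.0, -1.0],
           [1.0, 0.0, 1.0],
           [-1.0, 1.0, 0.0]]
state = [0.5, 0.1, 0.9]
for _ in range(10):
    state = step(state, weights)
```

Nothing here depends on the substrate: the same function could be computed by silicon, by neurons, or by hand.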

Quote:
First, somehow, an unconscious will be consciously programmed into the machine. Pretty good trick.

Then, somehow, perhaps magically, silicon chips could be metabolised to provide energy; but that's not "necessary". So, consciously, a program that simulates "hunger" will be provided, only the "hunger" will be for information, not food.

And this all, somehow, is supposed to be comparable to organic experience.
*chuckle* A rather sad little attempt at ridiculing my position. Yes, it would be a good trick to construct a system with an unconscious, but evolution has managed to do it, so obviously it can be done. Apparently the difficulty must lie with doing it consciously, as we must, rather than unconsciously, as evolution did?

As for hunger, it is not clear that it is necessary at all. I only suggested a hunger for information as a possibility if it were necessary, but is it so hard to imagine? When you have gone a while without taking in information you feel hunger pangs; you take down a book and read until you feel full. If that won't do, you could create virtual food and virtual metabolism, but it seems unlikely that it would be required. If I were never hungry and never consumed food, would I be incapable of experiencing other emotions? No.
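The kind of drive being described is just a feedback loop, and a minimal sketch makes that concrete (hypothetical illustration only; the class, thresholds, and numbers are mine, not anything from the thread): a level that decays with time, is replenished by intake, and emits "hunger" or "full" signals at the extremes.

```python
# Minimal sketch of a "hunger for information" drive: negative feedback
# when starved, positive feedback when sated. All names and thresholds
# are made up for illustration.

class InformationDrive:
    def __init__(self, capacity=100.0, decay=1.0):
        self.capacity = capacity
        self.level = capacity      # how "full" of new information we are
        self.decay = decay         # amount "burned" per time step

    def tick(self):
        """Time passes with no new input; the level falls."""
        self.level = max(0.0, self.level - self.decay)

    def consume(self, amount):
        """Take in information (read a book); the level rises."""
        self.level = min(self.capacity, self.level + amount)

    def feedback(self):
        """The signal the rest of the system acts on."""
        if self.level < 0.2 * self.capacity:
            return "hunger pangs"  # drives the system to seek input
        if self.level > 0.9 * self.capacity:
            return "full"          # suppresses further seeking
        return "neutral"

drive = InformationDrive()
for _ in range(90):
    drive.tick()                   # a while passes without new information
print(drive.feedback())            # starved: "hunger pangs"
drive.consume(85.0)                # read until full
print(drive.feedback())            # sated: "full"
```

Whether the quantity tracked is calories or novel information makes no difference to the structure of the loop, which is the point at issue.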

Quote:
I'm sorry, but it's all too much for me.
Well, you show your own limitations and ignorance.

[ July 14, 2002: Message edited by: tronvillain ]
tronvillain is offline  
Old 07-14-2002, 12:49 PM   #38
Regular Member
 
Join Date: May 2002
Location: North America
Posts: 203
Post

If being an organic system is not a necessary condition for states of consciousness then what is necessary?
Taffy Lewis is offline  
Old 07-14-2002, 09:13 PM   #39
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

Well, if we knew that, we'd know a lot more about consciousness than we do now. A computational system of sufficient power and complexity, obviously, but as for the details of its organization...
tronvillain is offline  
Old 07-15-2002, 12:55 AM   #40
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Post

Taffy Lewis:
Quote:
If being an organic system is not a necessary condition for states of consciousness then what is necessary?
This is my definition for an aware system:

"...it receives input and responds according to its goals/desires and beliefs learnt through experience about how the world works
(self-motivated, acting on self-learnt beliefs ["self" refers to the system as a whole])"

Now to work out what it would need:

"receives input"
A sub-system that extracts features from the external environment (e.g. red, green and blue light intensity detectors)

"responds"
Uses motors or muscles or something that allows it to interact with the external environment in order to test its beliefs, etc.

"according to its goals/desires and beliefs learnt through experience about how the world works"
It has accumulated long-term memories that it uses to predict what it needs to do in order to attempt to satisfy its fundamental drives (e.g. seek newness, seek coherence, avoid frustration, avoid bodily injury). This learning/motivational part is pretty complex...
excreationist is offline  