Freethought & Rationalism Archive. The archives are read only.
07-14-2002, 09:32 AM | #31
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658

snatchbalance:
Quote:
Quote:
Quote:
07-14-2002, 09:48 AM | #32
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658

Sammi:
Quote:

Again, I will quote Dennett:
Quote:
07-14-2002, 10:04 AM | #33
Veteran Member
Join Date: Oct 2001
Location: Canada
Posts: 3,751

tron, well said. Er... quoted. Dennett is wonderfully clear about this.

The other side of the coin is that humans are almost entirely "blind" to their first-person cognitive mechanisms -- including at the level of symbol processing, whatever level that is. A host of experimental data confirms this, perhaps the simplest being the serial memory task: response time varies with the length of the list, rather than with the position of the probe item on the list, even though subjects report "running through the list" in the order they learned it *only* as far as the probe item.
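The point about the serial memory task can be made concrete with a toy simulation. This is only an illustrative sketch of the two scan strategies (the function names and list contents are made up, not taken from the actual experiments): an exhaustive scan predicts a response time that tracks list length regardless of probe position, while the "run through the list until you hit it" strategy subjects report would predict a time that tracks probe position.

```python
# Toy illustration of the serial memory task result: an exhaustive serial
# scan makes the same number of comparisons no matter where the probe item
# sits, so response time tracks list length, not probe position.

def exhaustive_scan(memory_list, probe):
    """Compare the probe against every item, then answer."""
    comparisons = 0
    found = False
    for item in memory_list:      # scan the whole list regardless of matches
        comparisons += 1
        if item == probe:
            found = True          # note the match but keep scanning
    return found, comparisons

def self_terminating_scan(memory_list, probe):
    """Stop as soon as the probe is found (what introspection suggests)."""
    comparisons = 0
    for item in memory_list:
        comparisons += 1
        if item == probe:
            return True, comparisons
    return False, comparisons

if __name__ == "__main__":
    items = ["K", "R", "M", "T"]
    for probe in items:
        _, exh = exhaustive_scan(items, probe)
        _, term = self_terminating_scan(items, probe)
        print(probe, exh, term)
    # Exhaustive: always 4 comparisons (tracks list length).
    # Self-terminating: 1 to 4 comparisons (would track probe position).
```

The data fit the first function, while introspection reports the second -- which is the sense in which subjects are "blind" to their own mechanism.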
07-14-2002, 11:46 AM | #34
Guest
Posts: n/a

Snatchbalance,
Quote:

That being said, there are very prosaic reasons why a silicon emulation of the brain would be less feasible than hardware much more like the brain. The specific neurotransmitters used have very subtle effects. Modelling their precise behavior would be so fantastically complicated that the notion of having thousands of trillions of them modeled on a computer seems grossly implausible (but not impossible, in principle).

Sammi,
Quote:
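The "thousands of trillions" in the post above can be put in rough perspective with a back-of-envelope calculation. Every figure here is an order-of-magnitude assumption, not a measured value: commonly cited estimates put the brain around 10^11 neurons with on the order of 10^4 synapses each, and the update rate and per-update cost are pure guesses for illustration.

```python
# Back-of-envelope sketch of why a synapse-level simulation is daunting.
# All figures are rough order-of-magnitude assumptions, not measured values.

neurons = 1e11        # ~100 billion neurons (commonly cited estimate)
synapses_per = 1e4    # ~10,000 synapses per neuron (rough)
synapses = neurons * synapses_per   # ~1e15: "thousands of trillions"

updates_per_second = 1e3   # assume each synapse updated ~1000 times/sec
ops_per_update = 1e2       # assume ~100 operations to model transmitter dynamics

total_ops = synapses * updates_per_second * ops_per_update
print(f"synapses: {synapses:.0e}, sustained ops/sec: {total_ops:.0e}")
# ~1e20 sustained operations/sec at these assumptions -- far beyond the
# hardware of 2002, though, as the post says, not impossible in principle.
```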
07-14-2002, 12:01 PM | #35
Regular Member
Join Date: Mar 2002
Location: CT
Posts: 333

Tron,
Quote:

Like I keep saying, it may be a conscious machine, but it won't be an animal consciousness.

Quote:

First, somehow, an unconscious will be consciously programmed into the machine. Pretty good trick. Then, somehow, perhaps magically, silicon chips could be metabolised to provide energy; but that's not "necessary". So, consciously, a program that simulates "hunger" will be provided, only the "hunger" will be for information, not food. And this all, somehow, is supposed to be comparable to organic experience. I'm sorry, but it's all too much for me.

Quote:

SB

[ July 14, 2002: Message edited by: snatchbalance ]
07-14-2002, 12:21 PM | #36
Regular Member
Join Date: Mar 2002
Location: CT
Posts: 333

Synthesia,
Quote:

All I can say at that point is that it is obvious that life, and certain forms of consciousness, are emergent properties of biological matter. I agree with the rest of your post.

sb

[ July 14, 2002: Message edited by: snatchbalance ]
07-14-2002, 12:40 PM | #37
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658

This is pathetic. Sometimes I wonder why I am even bothering to talk to you.

snatchbalance:
Quote:

Quote:

As for hunger, it is not clear that it is necessary at all. I only suggested a hunger for information as a possibility if it was necessary, but is it so hard to imagine? When you have gone a while without taking in information you feel hunger pangs, you take down a book and read until you feel full. If that won't do, you could create virtual food and virtual metabolism, but it seems unlikely that it would be required. If I were never hungry and never consumed food, would I be incapable of experiencing other emotions? No.

Quote:

[ July 14, 2002: Message edited by: tronvillain ]
07-14-2002, 12:49 PM | #38
Regular Member
Join Date: May 2002
Location: North America
Posts: 203

If being an organic system is not a necessary condition for states of consciousness, then what is necessary?
07-14-2002, 09:13 PM | #39
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658

Well, if we knew that, we'd know a lot more about consciousness than we do now. A computational system of sufficient power and complexity, obviously, but as for the details of its organization...
07-15-2002, 12:55 AM | #40
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886

Taffy Lewis:
Quote:

"...it receives input and responds according to its goals/desires and beliefs learnt through experience about how the world works (self-motivated, acting on self-learnt beliefs ["self" refers to the system as a whole])"

Now to work out what it would need:

"receives input" -- A sub-system that extracts features from the external environment (e.g. red, green and blue light intensity detectors).

"responds" -- Uses motors or muscles or something that allows it to interact with the external environment in order to test its beliefs, etc.

"according to its goals/desires and beliefs learnt through experience about how the world works" -- It has long-term memories that it has accumulated that it uses to predict what it needs to do in order to attempt to satisfy its fundamental drives (e.g. seek newness, seek coherence, avoid frustration, avoid bodily injury).

This learning/motivational part is pretty complex...
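The sense/act/learn loop described in the post above can be sketched in a few lines of code. This is a minimal illustration only -- the class name, drive weights, and learning rule are all made up for the sketch, and nothing here captures the "pretty complex" learning/motivational part; it just shows the shape of a system that receives input, acts on self-learnt beliefs, and updates them from experience.

```python
# Minimal sketch of the loop described above (all names are illustrative):
# an agent that receives input, consults learnt beliefs and fundamental
# drives, acts, and updates its beliefs from the outcome.

import random

class SketchAgent:
    def __init__(self):
        # Fundamental drives from the post (the weights are made up).
        self.drives = {"seek_newness": 1.0, "seek_coherence": 1.0,
                       "avoid_frustration": 1.0, "avoid_injury": 2.0}
        # Learnt beliefs: expected drive satisfaction per (situation, action).
        self.beliefs = {}

    def sense(self, environment):
        """'Receives input': extract a crude feature from the environment."""
        return environment.get("colour", "unknown")

    def act(self, situation, actions):
        """'Responds': pick the action with the highest learnt value
        (choosing randomly among ties, which covers unexplored actions)."""
        scored = [(self.beliefs.get((situation, a), 0.0), a) for a in actions]
        best_value = max(scored)[0]
        return random.choice([a for v, a in scored if v == best_value])

    def learn(self, situation, action, outcome):
        """Update beliefs from experience (simple incremental average)."""
        key = (situation, action)
        old = self.beliefs.get(key, 0.0)
        self.beliefs[key] = old + 0.5 * (outcome - old)
```

A real system would need the feature extractors and motors to be genuine interfaces to the world, and the drives to feed back into the outcome signal; the sketch only fixes the overall shape of the loop.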